Sample records for network operating quality

  1. Research on Holographic Evaluation of Service Quality in Power Data Network

    NASA Astrophysics Data System (ADS)

    Wei, Chen; Jing, Tao; Ji, Yutong

    2018-01-01

    With the rapid development of power data networks, the power data application service system is evolving continuously and more and more service systems are being put into operation. This raises higher requirements for network quality and service quality in actual network operation and maintenance. This paper describes the status of the electric power network and its data network services. A holographic assessment model is presented to achieve comprehensive, intelligent assessment of the power data network and its quality of service during operation and maintenance. This evaluation method avoids the problems caused by traditional approaches, which assess network performance quality in isolation. The intelligent evaluation method can improve the efficiency of network operation and maintenance and guarantee the quality of real-time services in the power data network.

  2. Exploring network operations for data and information networks

    NASA Astrophysics Data System (ADS)

    Yao, Bing; Su, Jing; Ma, Fei; Wang, Xiaomin; Zhao, Xiyang; Yao, Ming

    2017-01-01

    Barabási and Albert, in 1999, formulated scale-free models based on several real networks: the World-Wide Web, the Internet, metabolic and protein networks, and language and sexual networks. Scale-free networks not only appear all around us but also rank among the highest-quality networks in the world. As is well known, high-quality information networks can transfer data feasibly and efficiently; clearly, their topological structures are very important for data safety. We build up network operations for constructing large-scale dynamic networks from smaller-scale network models having good properties and high quality. We focus on the simplest operators for formulating complex operations and are interested in how closely the resulting operations approach desired network properties.
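
    To make the cited scale-free construction concrete, the following minimal Python sketch grows a network by Barabási-Albert preferential attachment, the mechanism behind the 1999 models mentioned above; the node count and edges-per-new-node are illustrative choices, not values from the paper.

      # Minimal sketch of Barabasi-Albert preferential attachment (illustrative
      # only; the paper's own network operations are not reproduced here).
      import random

      def barabasi_albert(n, m):
          """Grow a graph to n nodes, attaching each new node to m existing
          nodes chosen with probability proportional to their degree."""
          # Start from a small complete seed of m + 1 nodes.
          edges = [(i, j) for i in range(m + 1) for j in range(i + 1, m + 1)]
          # Repeated node ids in this list make degree-proportional sampling easy.
          attachment_pool = [v for e in edges for v in e]
          for new_node in range(m + 1, n):
              targets = set()
              while len(targets) < m:
                  targets.add(random.choice(attachment_pool))
              for t in targets:
                  edges.append((new_node, t))
                  attachment_pool.extend((new_node, t))
          return edges

      print(len(barabasi_albert(1000, 2)))  # seed edges + ~m per added node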

  3. An Effect of the Co-Operative Network Model for Students' Quality in Thai Primary Schools

    ERIC Educational Resources Information Center

    Khanthaphum, Udomsin; Tesaputa, Kowat; Weangsamoot, Visoot

    2016-01-01

    This research aimed: 1) to study the current and desirable states of the co-operative network in developing the learners' quality in Thai primary schools, 2) to develop a model of the co-operative network in developing the learners' quality, and 3) to examine the results of implementation of the co-operative network model in the primary school.…

  4. Operation quality assessment model for video conference system

    NASA Astrophysics Data System (ADS)

    Du, Bangshi; Qi, Feng; Shao, Sujie; Wang, Ying; Li, Weijian

    2018-01-01

    Video conference systems have become an important support platform for smart grid operation and management, and their operation quality is of growing concern to grid enterprises. First, an evaluation indicator system covering network, business, and operation-maintenance aspects was established on the basis of the video conference system's operation statistics. Then, an operation quality assessment model combining a genetic algorithm with a regularized BP neural network was proposed; it outputs the operation quality level of the system within a time period and provides company managers with optimization advice. The simulation results show that the proposed evaluation model offers faster convergence and higher prediction accuracy than a regularized BP neural network alone, and its generalization ability is superior to that of LM-BP and Bayesian BP neural networks.
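
    The record does not give the model's details, but a rough sketch of the general idea, a genetic algorithm searching the weights of a small L2-regularized neural network, might look as follows; the network size, GA settings, and synthetic indicator data are assumptions for illustration only, and the paper's BP fine-tuning stage is omitted.

      import numpy as np

      rng = np.random.default_rng(0)
      # Synthetic data: three inputs standing in for network, business, and
      # operation-maintenance indicators; the target is a "quality level".
      X = rng.uniform(-1, 1, (200, 3))
      y = (X @ np.array([0.5, -0.3, 0.8]) + 0.1).reshape(-1, 1)

      def unpack(w):
          # A 3 -> 5 -> 1 network encoded as a flat vector of 26 weights.
          W1 = w[:15].reshape(3, 5); b1 = w[15:20]
          W2 = w[20:25].reshape(5, 1); b2 = w[25:26]
          return W1, b1, W2, b2

      def loss(w, lam=1e-3):
          W1, b1, W2, b2 = unpack(w)
          pred = np.tanh(X @ W1 + b1) @ W2 + b2
          return np.mean((pred - y) ** 2) + lam * np.sum(w ** 2)  # L2-regularized MSE

      pop = rng.normal(0, 1, (30, 26))          # population of weight vectors
      for gen in range(50):
          fit = np.array([loss(ind) for ind in pop])
          parents = pop[np.argsort(fit)[:10]]   # truncation selection
          children = []
          for _ in range(20):
              a, b = parents[rng.integers(10, size=2)]
              mask = rng.random(26) < 0.5       # uniform crossover
              children.append(np.where(mask, a, b) + rng.normal(0, 0.1, 26))  # mutation
          pop = np.vstack([parents, np.array(children)])

      best = pop[np.argmin([loss(ind) for ind in pop])]
      print("best regularized training loss:", round(float(loss(best)), 4))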

  5. 40 CFR 58.12 - Operating schedules.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) AMBIENT AIR QUALITY SURVEILLANCE Monitoring Network § 58.12 Operating schedules. State and local... part. Area-specific PAMS operating schedules must be included as part of the PAMS network description... remains once every six days. No less frequently than as part of each 5-year network assessment, the most...

  6. 40 CFR 58.12 - Operating schedules.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) AMBIENT AIR QUALITY SURVEILLANCE Monitoring Network § 58.12 Operating schedules. State and local... part. Area-specific PAMS operating schedules must be included as part of the PAMS network description... remains once every six days. No less frequently than as part of each 5-year network assessment, the most...

  7. Operation of International Monitoring System Network

    NASA Astrophysics Data System (ADS)

    Nikolova, Svetlana; Araujo, Fernando; Aktas, Kadircan; Malakhova, Marina; Otsuka, Riyo; Han, Dongmei; Assef, Thierry; Nava, Elisabetta; Mickevicius, Sigitas; Agrebi, Abdelouaheb

    2015-04-01

    The IMS is a globally distributed network of monitoring facilities using sensors from four technologies: seismic, hydroacoustic, infrasound, and radionuclide. It is designed to detect the seismic and acoustic waves produced by nuclear test explosions and the subsequently released radioactive isotopes. Monitoring stations transmit their data to the IDC in Vienna, Austria, over a global private network known as the GCI. Since 2013, the data availability (DA) requirements for IMS stations account for the quality of the data, meaning that data are excluded from the DA calculation if: - there is no input from the sensor (SHI technology); - the signal consists of constant values (SHI technology). The requirements for the DA of the radionuclide (particulate and noble gas) stations are even stricter: received data have to be analyzed, reviewed, and categorized by IDC analysts. In order to satisfy the strict data and network availability requirements of the IMS Network, the operation of the facilities and the GCI is managed by IDC Operations, which has the following main functions: - to ensure proper operation and functioning of the stations; - to ensure proper operation and functioning of the GCI; - to ensure efficient management of the stations in the IDC; - to provide network oversight and incident management. At the core of IMS Network operations are a series of tools for monitoring the stations' state of health and data quality, troubleshooting incidents, communicating with internal and external stakeholders, and reporting. The new requirements for data availability increased the importance of raw data quality monitoring. This task is addressed by developing additional tools for quickly identifying problems in data acquisition, by regular activities that check the compliance of station parameters with acquired data through scheduled calibration of the seismic network, and by review of samples by certified radionuclide laboratories. The DA for the networks of the different technologies in 2014 was: Primary Seismic (PS) network - 95.70%, Infrasound (IS) network - 97.68%, Hydroacoustic (HA) network - 88.78%, Auxiliary Seismic - 86.07%, Radionuclide Particulate - 83.01%, and Radionuclide Noble Gas - 75.06%. The IDC's strategy for further improving operations and management of the stations and meeting DA requirements is: - further development of tools and procedures to effectively identify and support troubleshooting of problems by the station operators; - effective support to station operators in developing tailored operation and maintenance plans for their stations; - focus on early identification of raw data quality problems at the station in order to support timely resolution; - an extensive training programme for station operators (a joint effort of the IDC and IMS).
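
    A minimal sketch of the quality-aware DA rule described above: segments with no sensor input or with constant (flatlined) values are excluded before availability is computed. The segment length, thresholds, and synthetic trace are illustrative assumptions.

      import numpy as np

      def data_availability(samples, seg_len=100):
          """Fraction of segments that contain usable waveform data.
          `samples` may contain np.nan where no input was received."""
          n_seg = len(samples) // seg_len
          good = 0
          for i in range(n_seg):
              seg = samples[i * seg_len:(i + 1) * seg_len]
              if np.isnan(seg).any():        # no input from the sensor
                  continue
              if np.ptp(seg) == 0:           # constant values (dead channel)
                  continue
              good += 1
          return good / n_seg

      rng = np.random.default_rng(1)
      trace = rng.normal(size=10_000)
      trace[2_000:2_500] = np.nan            # telemetry gap
      trace[7_000:7_300] = 0.0               # flatlined sensor
      print(f"DA = {data_availability(trace):.2%}")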

  8. Contrast research of CDMA and GSM network optimization

    NASA Astrophysics Data System (ADS)

    Wu, Yanwen; Liu, Zehong; Zhou, Guangyue

    2004-03-01

    With the development of mobile telecommunication networks, CDMA users have raised their expectations of network service quality, while operators have shifted their network management objective from signal coverage to performance improvement. Consequently, reasonable layout and optimization of the mobile telecommunication network, reasonable configuration of network resources, improvement of service quality, and strengthening of the enterprise's core competitiveness have all become concerns of the operator companies. This paper first reviews the workflow of CDMA network optimization. It then discusses some key issues in CDMA network optimization, such as PN code assignment and soft-handover calculation. As GSM is a cellular mobile telecommunication system similar to CDMA, the paper also presents a detailed comparison of CDMA and GSM network optimization, covering both their similarities and their differences. In conclusion, network optimization is a long-term job that runs through the whole process of network construction. Through the adjustment of network hardware (BTS equipment, RF systems, etc.) and network software (parameter optimization, configuration optimization, capacity optimization, etc.), network optimization can improve the performance and service quality of the network.

  9. Mercury Deposition Network Site Operator Training for the System Blank and Blind Audit Programs

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Lehmann, Christopher M.B.

    2008-01-01

    The U.S. Geological Survey operates the external quality assurance project for the National Atmospheric Deposition Program/Mercury Deposition Network. The project includes the system blank and blind audit programs for assessment of total mercury concentration data quality for wet-deposition samples. This presentation was prepared to train new site operators and to refresh experienced site operators to successfully process and submit system blank and blind audit samples for chemical analysis. Analytical results are used to estimate chemical stability and contamination levels of National Atmospheric Deposition Program/Mercury Deposition Network samples and to evaluate laboratory variability and bias.

  10. ICC '86; Proceedings of the International Conference on Communications, Toronto, Canada, June 22-25, 1986, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    Papers are presented on ISDN, mobile radio systems and techniques for digital connectivity, centralized and distributed algorithms in computer networks, communications networks, quality assurance and its impact on cost, adaptive filters in communications, the spread spectrum, signal processing, video communication techniques, and digital satellite services. Topics discussed include performance evaluation issues for integrated protocols, packet network operations, computer network theory and multiple access, microwave single-sideband systems, switching architectures, fiber optic systems, wireless local communications, modulation, coding, and synchronization, remote switching, software quality, transmission, and expert systems in network operations. Consideration is given to wide area networks, image and speech processing, office communications application protocols, multimedia systems, customer-controlled network operations, digital radio systems, channel modeling and signal processing in digital communications, earth station/on-board modems, computer communications system performance evaluation, source encoding, compression, and quantization, and adaptive communications systems.

  11. Chemical, physical, biochemical, and bacteriological characteristics at selected stream sites in Puerto Rico, 1976-77

    USGS Publications Warehouse

    Quinones, F.; Vasquez, Pedro; Pena-Cortes, Rafael

    1978-01-01

    In 1969, the Caribbean District of the U.S. Geological Survey, in cooperation with the Commonwealth of Puerto Rico, initiated the operation of a network to monitor some parameters indicative of water-quality changes at selected stream sites. In 1974, at the request of the Environmental Quality Board of Puerto Rico, the network was modified to conform with the Environmental Protection Agency National Water Quality Surveillance System. The purpose of the present network is to monitor changes in water quality between the upstream and downstream stations. The expanded network consisted of 58 stations. During 1976, five had been discontinued. One other was added late in 1976. Most of the stations in the original network have been maintained, thus providing some degree of continuity. The monitoring stations used in this report are shown on a map and listed in a table. The results of the network operation are summarized for the period July 1976 to August 1977. (Woodard-USGS)

  12. Upper Washita River experimental watersheds: Data screening procedure for data quality assurance

    USDA-ARS?s Scientific Manuscript database

    The presence of non-stationary conditions in long-term hydrologic observation networks is associated with natural and anthropogenic stressors or network operation problems. Detection and identification of network operation drivers is fundamental in hydrologic investigation due to changes in systemat...

  13. Data from selected U.S. Geological Survey National Stream Water-Quality Networks (WQN)

    USGS Publications Warehouse

    Alexander, Richard B.; Slack, J.R.; Ludtke, A.S.; Fitzgerald, K.K.; Schertz, T.L.; Briel, L.I.; Buttleman, K.P.

    1996-01-01

    This CD-ROM set contains data from two USGS national stream water-quality networks, the Hydrologic Benchmark Network (HBN) and the National Stream Quality Accounting Network (NASQAN), operated during the past 30 years. These networks were established to provide national and regional descriptions of stream water-quality conditions and trends, based on uniform monitoring of selected watersheds throughout the United States, and to improve our understanding of the effects of the natural environment and human activities on water quality. The HBN, consisting of 63 relatively small, minimally disturbed watersheds, provides data for investigating naturally induced changes in streamflow and water quality and the effects of airborne substances on water quality. NASQAN, consisting of 618 larger, more culturally influenced watersheds, provides information for tracking water-quality conditions in major U.S. rivers and streams.

  14. Network-aware scalable video monitoring system for emergency situations with operator-managed fidelity control

    NASA Astrophysics Data System (ADS)

    Al Hadhrami, Tawfik; Nightingale, James M.; Wang, Qi; Grecos, Christos

    2014-05-01

    In emergency situations, the ability to remotely monitor unfolding events using high-quality video feeds will significantly improve the incident commander's understanding of the situation and thereby aid effective decision making. This paper presents a novel, adaptive video monitoring system for emergency situations where the normal communications network infrastructure has been severely impaired or is no longer operational. The proposed scheme, operating over a rapidly deployable wireless mesh network, supports real-time video feeds between first responders, forward operating bases and primary command and control centers. Video feeds captured on portable devices carried by first responders and by static visual sensors are encoded in H.264/SVC, the scalable extension to H.264/AVC, allowing efficient, standards-based temporal, spatial, and quality scalability of the video. A three-tier video delivery system is proposed, which balances the need to avoid overuse of mesh nodes with the operational requirements of the emergency management team. In the first tier, the video feeds are delivered at a low spatial and temporal resolution employing only the base layer of the H.264/SVC video stream. Routing in this mode is designed to employ all nodes across the entire mesh network. In the second tier, whenever operational considerations require that commanders or operators focus on a particular video feed, a 'fidelity control' mechanism at the monitoring station sends control messages to the routing and scheduling agents in the mesh network, which increase the quality of the received picture using SNR scalability while conserving bandwidth by maintaining a low frame rate. In this mode, routing decisions are based on reliable packet delivery, with the most reliable routes being used to deliver the base and lower enhancement layers; as fidelity is increased and more scalable layers are transmitted, they are assigned to routes in descending order of reliability. The third tier of video delivery transmits a high-quality video stream including all available scalable layers using the most reliable routes through the mesh network, ensuring the highest possible video quality. The proposed scheme is implemented in a proven simulator, and the performance of the proposed system is numerically evaluated through extensive simulations. We further present an in-depth analysis of the proposed solutions and potential approaches towards supporting high-quality visual communications in such a demanding context.
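
    The tier-two routing idea, base and lower layers on the most reliable routes and higher enhancement layers on progressively less reliable ones, can be sketched as a simple assignment; the route names and reliability figures below are hypothetical.

      # Hedged sketch: scalable video layers assigned to mesh routes in
      # descending order of measured route reliability, base layer first.
      def assign_layers_to_routes(layers, routes):
          """layers: list ordered base-first; routes: {name: delivery_reliability}."""
          ranked = sorted(routes, key=routes.get, reverse=True)
          # Base and lower enhancement layers get the most reliable routes;
          # extra layers fall back to the least reliable route.
          return {layer: ranked[min(i, len(ranked) - 1)]
                  for i, layer in enumerate(layers)}

      routes = {"route_A": 0.99, "route_B": 0.93, "route_C": 0.85}
      layers = ["base", "enh_1_snr", "enh_2_snr", "enh_3_spatial"]
      print(assign_layers_to_routes(layers, routes))
      # {'base': 'route_A', 'enh_1_snr': 'route_B',
      #  'enh_2_snr': 'route_C', 'enh_3_spatial': 'route_C'}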

  15. Applying Web-Based Tools for Research, Engineering, and Operations

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    2011-01-01

    Personnel in the NASA Glenn Research Center Network and Architectures branch have performed a variety of research related to space-based sensor webs, network-centric operations, security, and delay-tolerant networking (DTN). Quality documentation and communications, real-time monitoring, and information dissemination are critical in order to perform quality research while maintaining low cost and utilizing multiple remote systems. This has been accomplished using a variety of Internet technologies, often operating simultaneously. This paper describes important features of various technologies and provides a number of real-world examples of how combining Internet technologies can enable a virtual team to act efficiently as one unit to perform advanced research in operational systems. Finally, real and potential abuses of power and the manipulation of information and information access are addressed.

  16. A network for continuous monitoring of water quality in the Sabine River basin, Texas and Louisiana

    USGS Publications Warehouse

    Blakey, J.F.; Skinner, P.W.

    1973-01-01

    Level I operations at a proposed site would monitor current and potential problems, water-quality changes in subreaches of streams, and water-quality trends in time and place. Level II operations would monitor current or potential problems only. An optimum system would require Level I operations at all nine stations. A minimum system would require Level II operations at most of the stations.

  17. Updated operational protocols for the U.S. Geological Survey Precipitation Chemistry Quality Assurance Project in support of the National Atmospheric Deposition Program

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Martin, RoseAnn

    2017-02-06

    The U.S. Geological Survey Branch of Quality Systems operates the Precipitation Chemistry Quality Assurance Project (PCQA) for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN) and the National Atmospheric Deposition Program/Mercury Deposition Network (NADP/MDN). Since 1978, various programs have been implemented by the PCQA to estimate data variability and bias contributed by changing protocols, equipment, and sample submission schemes within NADP networks. These programs independently measure the field and laboratory components that contribute to the overall variability of NADP wet-deposition chemistry and precipitation depth measurements. The PCQA evaluates the quality of analyte-specific chemical analyses from the two currently (2016) contracted NADP laboratories, the Central Analytical Laboratory and the Mercury Analytical Laboratory, by comparing laboratory performance among participating national and international laboratories. Sample contamination and stability are evaluated for the NTN and MDN by using externally field-processed blank samples provided by the Branch of Quality Systems. A colocated sampler program evaluates the overall variability of NTN measurements and the bias between dissimilar precipitation gages and sample collectors. This report documents historical PCQA operations and general procedures for each of the external quality-assurance programs from 2007 to 2016.

  18. Current progress on GSN data quality evaluation

    NASA Astrophysics Data System (ADS)

    Davis, J. P.; Gee, L. S.; Anderson, K. R.; Ahern, T. K.

    2012-12-01

    We discuss ongoing work to assess and improve the quality of data collected from instruments deployed at the 150+ stations of the Global Seismographic Network (GSN). The USGS and the IRIS Consortium are coordinating efforts to emphasize data quality following completion of the major installation phase of the GSN and recapitalization of the network's data acquisition systems, ancillary equipment and many of the secondary seismic sensors. We highlight here procedures adopted by the network's operators, the USGS' Albuquerque Seismological Laboratory (ASL) and UCSD's Project IDA, to ensure that the quality of the waveforms collected is maximized, that published metadata accurately reflect the instrument response of the data acquisition systems, and that data users are informed of the status of GSN data quality. Additional details can be found at the GSN Quality webpage (www.iris.edu/hq/programs/gsn/quality). The GSN network operation teams meet frequently to share information and techniques. While custom software developed by each network operator to identify and track known problems remains important, recent efforts are providing new resources and tools to evaluate waveform quality, including analysis provided by the Lamont Waveform Quality Center (www.ldeo.columbia.edu/~ekstrom/Projects/WQC.html), synthetic seismograms made available through Princeton University's Near Real Time Global Seismicity Portal (http://global.shakemovie.princeton.edu/home.jsp), and developments such as the IRIS DMS's MUSTANG and the ASL's Data Quality Analyzer. We conclude with the concept of station certification, a comprehensive overview of a station's performance that we have developed to communicate to data users the state of data and metadata quality. As progress is made to verify the response and performance of existing systems, as well as in the analysis of past calibration signals and waveform data, we will update information on the GSN web portals to apprise users of the condition of each GSN station's data.

  19. A Social Operational Model of Urban Adolescents' Tobacco and Substance Use: A Mediational Analysis

    ERIC Educational Resources Information Center

    Mason, Michael J.; Mennis, Jeremy; Schmidt, Christopher D.

    2011-01-01

    This study tested a mediation model of the relationships among tobacco use, social network quality (level of risk or protection in a network), and substance use (alcohol and/or illicit drugs) with a sample of 301 urban adolescents. It was theorized that social network quality would mediate the effect of tobacco use, accounting for PTSD symptoms and…

  20. U.S. Geological Survey Catskill/Delaware Water-Quality Network: Water-Quality Report Water Year 2006

    USGS Publications Warehouse

    McHale, Michael R.; Siemion, Jason

    2010-01-01

    The U.S. Geological Survey operates a 60-station streamgaging network in the New York City Catskill/Delaware Water Supply System. Water-quality samples were collected at 13 of the stations in the Catskill/Delaware streamgaging network to provide resource managers with water-quality and water-quantity data from the water-supply system that supplies about 85 percent of the water needed by the more than 9 million residents of New York City. This report summarizes water-quality data collected at those 13 stations plus one additional station operated as a part of the U.S. Environmental Protection Agency's Regional Long-Term Monitoring Network for the 2006 water year (October 1, 2005 to September 30, 2006). An average of 62 water-quality samples were collected at each station during the 2006 water year, including grab samples collected every other week and storm samples collected with automated samplers. On average, 8 storms were sampled at each station during the 2006 water year. The 2006 calendar year was the second warmest on record and the summer of 2006 was the wettest on record for the northeastern United States. A large storm on June 26-28, 2006, caused extensive flooding in the western part of the network where record peak flows were measured at several watersheds.

  1. 40 CFR 58.11 - Network technical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 5 2010-07-01 2010-07-01 false Network technical requirements. 58.11... (CONTINUED) AMBIENT AIR QUALITY SURVEILLANCE Monitoring Network § 58.11 Network technical requirements. (a)(1... A to this part when operating the SLAMS networks. (2) Beginning January 1, 2009, State and local...

  2. 40 CFR 58.11 - Network technical requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 5 2011-07-01 2011-07-01 false Network technical requirements. 58.11... (CONTINUED) AMBIENT AIR QUALITY SURVEILLANCE Monitoring Network § 58.11 Network technical requirements. (a)(1... A to this part when operating the SLAMS networks. (2) Beginning January 1, 2009, State and local...

  3. Using heuristic algorithms for capacity leasing and task allocation issues in telecommunication networks under fuzzy quality of service constraints

    NASA Astrophysics Data System (ADS)

    Huseyin Turan, Hasan; Kasap, Nihat; Savran, Huseyin

    2014-03-01

    Nowadays, every firm uses telecommunication networks in different amounts and ways in order to complete its daily operations. In this article, we investigate an optimisation problem that a firm faces when acquiring network capacity from a market in which several network providers offer different pricing and quality of service (QoS) schemes. The QoS level guaranteed by network providers and the minimum quality level of service needed for accomplishing the operations are represented as fuzzy numbers in order to handle the non-deterministic nature of the telecommunication network environment. Interestingly, the mathematical formulation of the aforementioned problem leads to a special case of the well-known two-dimensional bin packing problem, which is famous for its computational complexity. We propose two different heuristic solution procedures capable of solving the resulting nonlinear mixed-integer programming model with fuzzy constraints. Finally, the efficiency of each algorithm is tested on several test instances to demonstrate the applicability of the methodology.
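
    The record names no specific heuristic, but a greedy sketch in the same spirit, defuzzifying triangular QoS guarantees by their centroid and buying the cheapest qualifying capacity first, is shown below; the provider data and the required QoS level are invented.

      def centroid(tri):
          a, b, c = tri                  # triangular fuzzy number (low, mode, high)
          return (a + b + c) / 3.0

      def lease_capacity(providers, demand, qos_min):
          """providers: list of (name, unit_price, capacity, fuzzy_qos)."""
          ok = [p for p in providers if centroid(p[3]) >= qos_min]
          plan, remaining = [], demand
          for name, price, cap, _ in sorted(ok, key=lambda p: p[1]):  # cheapest first
              take = min(cap, remaining)
              if take > 0:
                  plan.append((name, take))
                  remaining -= take
          return plan if remaining == 0 else None   # None: demand cannot be met

      providers = [("ISP-1", 2.0, 40, (0.90, 0.95, 0.99)),
                   ("ISP-2", 1.5, 30, (0.70, 0.80, 0.90)),
                   ("ISP-3", 1.8, 50, (0.85, 0.92, 0.97))]
      print(lease_capacity(providers, demand=60, qos_min=0.9))
      # [('ISP-3', 50), ('ISP-1', 10)]  -- ISP-2's fuzzy QoS centroid falls short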

  4. User’s manual for the Automated Data Assurance and Management application developed for quality control of Everglades Depth Estimation Network water-level data

    USGS Publications Warehouse

    Petkewich, Matthew D.; Daamen, Ruby C.; Roehl, Edwin A.; Conrads, Paul

    2016-09-29

    The generation of Everglades Depth Estimation Network (EDEN) daily water-level and water-depth maps is dependent on high-quality real-time data from over 240 water-level stations. To increase the accuracy of the daily water-surface maps, the Automated Data Assurance and Management (ADAM) tool was created by the U.S. Geological Survey as part of Greater Everglades Priority Ecosystems Science. The ADAM tool is used to provide accurate quality-assurance review of the real-time data from the EDEN network and allows estimation or replacement of missing or erroneous data. This user's manual describes how to install and operate the ADAM software. The file structure and operation of the ADAM software are explained using examples.

  5. Progress and lessons learned from water-quality monitoring networks

    USGS Publications Warehouse

    Myers, Donna N.; Ludtke, Amy S.

    2017-01-01

    Stream-quality monitoring networks in the United States were initiated and expanded after passage of successive federal water-pollution control laws from 1948 to 1972. The first networks addressed information gaps on the extent and severity of stream pollution and served as early warning systems for spills. From 1965 to 1972, monitoring networks expanded to evaluate compliance with stream standards, track emerging issues, and assess water-quality status and trends. After 1972, concerns arose regarding the ability of monitoring networks to determine if water quality was getting better or worse and why. As a result, monitoring networks adopted a hydrologic systems approach targeted to key water-quality issues, accounted for human and natural factors affecting water quality, innovated new statistical methods, and introduced geographic information systems and models that predict water quality at unmeasured locations. Despite improvements, national-scale monitoring networks have declined over time. Only about 1%, or 217, of more than 36,000 US Geological Survey monitoring sites sampled from 1975 to 2014 have been operated throughout the four decades since passage of the 1972 Clean Water Act. Efforts to sustain monitoring networks are important because these networks have collected information crucial to the description of water-quality trends over time and are providing information against which to evaluate future trends.

  6. A Quality-Control-Oriented Database for a Mesoscale Meteorological Observation Network

    NASA Astrophysics Data System (ADS)

    Lussana, C.; Ranci, M.; Uboldi, F.

    2012-04-01

    In the operational context of a local weather service, data accessibility and quality-related issues must be managed by taking into account a wide set of user needs. This work describes the structure and the operational choices made for the operational implementation of a database system storing data from highly automated observing stations, together with metadata and information on data quality. Lombardy's environmental protection agency, ARPA Lombardia, manages a highly automated mesoscale meteorological network. A Quality Assurance System (QAS) ensures that reliable observational information is collected and disseminated to the users. The weather unit in ARPA Lombardia, at the same time an important QAS component and an intensive data user, has developed a database specifically aimed at: 1) providing quick access to data for operational activities and 2) ensuring data quality for real-time applications by means of an Automatic Data Quality Control (ADQC) procedure. Quantities stored in the archive include hourly aggregated observations of precipitation amount, temperature, wind, relative humidity, pressure, and global and net solar radiation. The ADQC performs several independent tests on raw data and compares their results in a decision-making procedure. An important ADQC component is the Spatial Consistency Test based on Optimal Interpolation. Interpolated and cross-validation analysis values are also stored in the database, providing further information to human operators and useful estimates in case of missing data. The technical solution adopted is based on a LAMP (Linux, Apache, MySQL and PHP) system, constituting an open-source environment suitable for both development and operational practice. The ADQC procedure itself is performed by R scripts directly interacting with the MySQL database. Users and network managers can access the database through a set of web-based PHP applications.
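
    A spatial consistency test of the kind described can be sketched as a leave-one-out comparison of each station against an estimate interpolated from its neighbours; inverse-distance weighting stands in here for the ADQC's Optimal Interpolation, and the coordinates, values, and tolerance are invented.

      import numpy as np

      def spatial_consistency_flags(xy, values, tol=3.0):
          flags = []
          for i in range(len(values)):
              d = np.linalg.norm(xy - xy[i], axis=1)
              # Zero weight for the station itself -> leave-one-out estimate.
              w = np.where(d > 0, 1.0 / np.maximum(d, 1e-9) ** 2, 0.0)
              est = np.sum(w * values) / np.sum(w)
              flags.append(bool(abs(values[i] - est) > tol))  # True = suspect
          return flags

      xy = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 0.5]], float)
      temps = np.array([10.2, 10.6, 9.9, 10.4, 15.0])   # last value is an outlier
      print(spatial_consistency_flags(xy, temps))
      # [False, False, False, False, True]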

  7. The national stream quality accounting network: A flux-based approach to monitoring the water quality of large rivers

    USGS Publications Warehouse

    Hooper, R.P.; Aulenbach, Brent T.; Kelly, V.J.

    2001-01-01

    Estimating the annual mass flux at a network of fixed stations is one approach to characterizing water quality of large rivers. The interpretive context provided by annual flux includes identifying source and sink areas for constituents and estimating the loadings to receiving waters, such as reservoirs or the ocean. Since 1995, the US Geological Survey's National Stream Quality Accounting Network (NASQAN) has employed this approach at a network of 39 stations in four of the largest river basins of the USA: The Mississippi, the Columbia, the Colorado and the Rio Grande. In this paper, the design of NASQAN is described and its effectiveness at characterizing the water quality of these rivers is evaluated using data from the first 3 years of operation. A broad range of constituents was measured by NASQAN, including trace organic and inorganic chemicals, major ions, sediment and nutrients. Where possible, a regression model relating concentration to discharge and season was used to interpolate between chemical observations for flux estimation. For water-quality network design, the most important finding from NASQAN was the importance of having a specific objective (that is, estimating annual mass flux) and, from that, an explicitly stated data analysis strategy, namely the use of regression models to interpolate between observations. The use of such models aided in the design of sampling strategy and provided a context for data review. The regression models essentially form null hypotheses for concentration variation that can be evaluated by the observed data. The feedback between network operation and data collection established by the hypothesis tests places the water-quality network on a firm scientific footing.
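
    A common form of such a regression model, used here as a hedged illustration rather than NASQAN's exact formulation, relates log concentration to log discharge plus seasonal terms and then interpolates daily concentrations for flux estimation; all data below are synthetic.

      import numpy as np

      rng = np.random.default_rng(2)
      t = np.sort(rng.uniform(0, 3, 120))            # decimal years of samples
      Q = np.exp(rng.normal(3, 0.5, 120))            # discharge at sample times
      lnC = 1.0 - 0.4 * np.log(Q) + 0.3 * np.sin(2 * np.pi * t) \
            + rng.normal(0, 0.1, 120)                # sampled chemistry

      # Fit ln(C) ~ 1 + ln(Q) + sin + cos by ordinary least squares.
      A = np.column_stack([np.ones_like(t), np.log(Q),
                           np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
      beta, *_ = np.linalg.lstsq(A, lnC, rcond=None)

      # Predict daily concentration from daily discharge, then flux = C * Q.
      t_d = np.linspace(0, 3, 1096)
      Q_d = np.exp(rng.normal(3, 0.5, t_d.size))
      A_d = np.column_stack([np.ones_like(t_d), np.log(Q_d),
                             np.sin(2 * np.pi * t_d), np.cos(2 * np.pi * t_d)])
      C_d = np.exp(A_d @ beta)
      print("coefficients:", np.round(beta, 2))
      print("mean daily flux (arbitrary units):", round(float(np.mean(C_d * Q_d)), 2))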

  8. Why do electricity policy and competitive markets fail to use advanced PV systems to improve distribution power quality?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McHenry, Mark P.; Johnson, Jay; Hightower, Mike

    The increasing pressure for network operators to meet distribution network power quality standards with increasing peak loads, renewable energy targets, and advances in automated distributed power electronics and communications is forcing policy-makers to understand new means to distribute costs and benefits within electricity markets. Discussions surrounding how distributed generation (DG) exhibits active voltage regulation, power factor/reactive power control, and other power quality capabilities are complicated by uncertainties in baseline local distribution network power quality and in how, and to whom, the costs and benefits of improved electricity infrastructure will be allocated. DG providing ancillary services that dynamically respond to the network characteristics could lead to major network improvements. With proper market structures, renewable energy systems could greatly improve power quality on distribution systems at nearly no additional cost to the grid operators. Renewable DG does have variability challenges, though this issue can be overcome with energy storage, forecasting, and advanced inverter functionality. This paper presents real data from a large-scale grid-connected PV array with large-scale storage and explores effective mitigation measures for PV system variability. As a result, we discuss useful inverter technical knowledge for policy-makers to mitigate ongoing inflation of electricity network tariff components by new DG interconnection requirements or electricity markets which value power quality and control.

  9. Why do electricity policy and competitive markets fail to use advanced PV systems to improve distribution power quality?

    DOE PAGES

    McHenry, Mark P.; Johnson, Jay; Hightower, Mike

    2016-01-01

    The increasing pressure for network operators to meet distribution network power quality standards with increasing peak loads, renewable energy targets, and advances in automated distributed power electronics and communications is forcing policy-makers to understand new means to distribute costs and benefits within electricity markets. Discussions surrounding how distributed generation (DG) exhibits active voltage regulation, power factor/reactive power control, and other power quality capabilities are complicated by uncertainties in baseline local distribution network power quality and in how, and to whom, the costs and benefits of improved electricity infrastructure will be allocated. DG providing ancillary services that dynamically respond to the network characteristics could lead to major network improvements. With proper market structures, renewable energy systems could greatly improve power quality on distribution systems at nearly no additional cost to the grid operators. Renewable DG does have variability challenges, though this issue can be overcome with energy storage, forecasting, and advanced inverter functionality. This paper presents real data from a large-scale grid-connected PV array with large-scale storage and explores effective mitigation measures for PV system variability. As a result, we discuss useful inverter technical knowledge for policy-makers to mitigate ongoing inflation of electricity network tariff components by new DG interconnection requirements or electricity markets which value power quality and control.

  10. Preface; Water quality of large U.S. rivers; results from the U.S. Geological Survey's National Stream Quality Accounting Network

    USGS Publications Warehouse

    Hirsch, Robert M.; Hooper, Richard P.; Kelly, Valerie J.

    2001-01-01

    The mission of the US Geological Survey (USGS) is to assess the quantity and quality of the earth resources of the USA and to provide information that will assist resource managers and policymakers at federal, state and local levels in making sound decisions. Characterizing the water quality of the largest rivers of the USA is a daunting prospect, especially given the resources available for the task. The most effective approach is uncertain and is legitimately a research topic. The National Stream Quality Accounting Network (NASQAN) was redesigned in 1995 to estimate the annual mass flux of constituents at a network of fixed stations in the Mississippi, Rio Grande, Colorado, and Columbia River basins. This special volume of Hydrological Processes contains a series of papers evaluating the data collected by NASQAN during its first 3 years of operation under this design. The NASQAN network complements other USGS national programs that are designed to address water quality at different scales. The National Water-Quality Assessment Program (Hirsch et al., 1988) is designed around river basins of 10 000 to 100 000 km2 (versus these NASQAN basins, which are 650 000 to 3 100 000 km2 at their most downstream stations). The USGS also operates the Hydrologic Benchmark Network that is focused on relatively pristine basins of only 10 to 100 km2 (Mast and Turk, 1999a,b; Clark et al., 2000; Mast et al., 2000).

  11. Quality control and gap-filling of PM10 daily mean concentrations with the best linear unbiased estimator.

    PubMed

    Sozzi, R; Bolignano, A; Ceradini, S; Morelli, M; Petenko, I; Argentini, S

    2017-10-15

    According to the European Directive 2008/50/CE, air quality assessment consists of measuring concentration fields and evaluating the mean, the number of exceedances, etc., of chemical species dangerous to human health. The measurements provided by an air quality ground-based monitoring network are the main information source, but the availability of these data is often limited by technical and operational problems. In this paper, the best linear unbiased estimator (BLUE) is proposed to validate pollutant concentration values and to fill gaps in the measurement time series collected by a monitoring network. The BLUE algorithm is tested using the daily mean concentrations of particulate matter with aerodynamic diameter less than 10 μm (PM10 concentrations) measured by the air quality monitoring sensors operating in the Lazio Region in Italy. The comparison between estimated and measured data shows an error comparable with the measurement uncertainty. Due to its simplicity and reliability, the BLUE will be used in the routine quality-test procedures of the Lazio air quality monitoring network.
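
    The essence of BLUE gap-filling, a weighted sum of neighbouring stations with weights derived from their covariances, can be sketched as follows; the exponential covariance model, distances, and PM10 values are illustrative assumptions, since the operational covariances are estimated from measured series.

      import numpy as np

      def blue_fill(d_neighbors, d_pairs, obs, length_scale=20.0, sigma2=1.0):
          """d_pairs: (n, n) distances among observing stations;
          d_neighbors: (n,) distances from the gap station to each of them."""
          C = sigma2 * np.exp(-d_pairs / length_scale)       # neighbor-neighbor cov
          c0 = sigma2 * np.exp(-d_neighbors / length_scale)  # neighbor-target cov
          w = np.linalg.solve(C, c0)
          w /= w.sum()            # enforce unbiasedness for a constant mean
          return w @ obs

      d_pairs = np.array([[0., 10., 25.],
                          [10., 0., 18.],
                          [25., 18., 0.]])
      d_neighbors = np.array([8., 12., 30.])
      pm10 = np.array([41.0, 38.5, 52.0])   # daily means at nearby monitors
      print(round(float(blue_fill(d_neighbors, d_pairs, pm10)), 1))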

  12. Method and apparatus for in-process sensing of manufacturing quality

    DOEpatents

    Hartman, Daniel A [Santa Fe, NM; Dave, Vivek R [Los Alamos, NM; Cola, Mark J [Santa Fe, NM; Carpenter, Robert W [Los Alamos, NM

    2005-02-22

    A method for determining the quality of an examined weld joint, comprising the steps of providing acoustical data from the examined weld joint and performing a neural network operation on the acoustical data to determine the quality of the examined weld joint produced by a friction weld process. The neural network may be trained by the steps of providing acoustical data and observable data from at least one test weld joint, and training the neural network based on the acoustical data and observable data to form a trained neural network, so that the trained neural network is capable of determining the quality of an examined weld joint based on acoustical data from that joint. Also disclosed is an apparatus having a housing, acoustical sensors mounted therein, and means for mounting the housing on a friction weld device so that the acoustical sensors do not contact the weld joint. The apparatus may sample the acoustical data necessary for the neural network to determine the quality of a weld joint.

  13. Method and Apparatus for In-Process Sensing of Manufacturing Quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartman, D.A.; Dave, V.R.; Cola, M.J.

    2005-02-22

    A method for determining the quality of an examined weld joint, comprising the steps of providing acoustical data from the examined weld joint and performing a neural network operation on the acoustical data to determine the quality of the examined weld joint produced by a friction weld process. The neural network may be trained by the steps of providing acoustical data and observable data from at least one test weld joint, and training the neural network based on the acoustical data and observable data to form a trained neural network, so that the trained neural network is capable of determining the quality of an examined weld joint based on acoustical data from that joint. Also disclosed is an apparatus having a housing, acoustical sensors mounted therein, and means for mounting the housing on a friction weld device so that the acoustical sensors do not contact the weld joint. The apparatus may sample the acoustical data necessary for the neural network to determine the quality of a weld joint.

  14. A FORCEnet Framework for Analysis of Existing Naval C4I Architectures

    DTIC Science & Technology

    2003-06-01

    best qualities of humans and computers. f. Information Weapons. Information weapons integrate the use of military deception, psychological ... operations, to include electronic warfare, psychological operations, computer network attack, computer network defense, operations security, and military ... F/A-18 (ATARS/SHARP), S-3B (SSU), SH-60 LAMPS (HAWKLINK) and P-3C (AIP, Special Projects). CDL-N consists of two antennas (one meter diameter

  15. A statistical summary of data from the U.S. Geological Survey's national water quality networks

    USGS Publications Warehouse

    Smith, R.A.; Alexander, R.B.

    1983-01-01

    The U.S. Geological Survey operates two nationwide networks to monitor water quality: the National Hydrologic Bench-Mark Network and the National Stream Quality Accounting Network (NASQAN). The Bench-Mark network is composed of 51 stations in small drainage basins which are as close as possible to their natural state, with no human influence and little likelihood of future development. Stations in the NASQAN program are located to monitor flow from accounting units (subregional drainage basins) which collectively encompass the entire land surface of the nation. Data collected in both networks include streamflow and concentrations of major inorganic constituents, nutrients, and trace metals. The goals of the two water-quality sampling programs include the determination of mean constituent concentrations and transport rates as well as the analysis of long-term trends in those variables. This report presents a station-by-station statistical summary of data from the two networks for the period 1974 through 1981. (Author's abstract)

  16. EarthScope's Transportable Array: Advancing Eastward

    NASA Astrophysics Data System (ADS)

    Busby, R. W.; Vernon, F.; Newman, R. L.; Astiz, L.

    2006-12-01

    EarthScope's Transportable Array has installed more than 200 high-quality broadband seismic stations over the last 3 years in the western US. These stations have a nominal spacing of 70 km and are part of an eventual 400 station array that migrates from west to east at a rate of 18 stations per month. The full 400 stations will be operating by September 2007. Stations have a residence time of about 2 years before being relocated to the next site. Throughout the continental US, 1623 sites are expected to be occupied. Standardized procedures and protocols have been developed to streamline all aspects of Transportable Array operations, from siting to site construction and installation to equipment purchasing and data archiving. Earned Value Management tools keep facility installation and operation on budget and schedule. A diverse, yet efficient, infrastructure installs and maintains the Transportable Array. Sensors, dataloggers, and other equipment are received and tested by the IRIS PASSCAL Instrument Center and shipped to regional storage facilities. To engage future geoscientists in the project, students are trained to conduct field and analytical reconnaissance to identify suitable seismic station sites. Contract personnel are used for site verification; vault construction; and installation of sensors, power, and communications systems. IRIS staff manages permitting, landowner communications, and station operations and maintenance. Seismic signal quality and metadata are quality-checked at the Array Network Facility at the University of California-San Diego and simultaneously archived at the IRIS Data Management Center in Seattle. Station equipment has been specifically designed for low power, remote, unattended operation and uses diverse two-way IP communications for real-time transmission. Digital cellular services, VSAT satellite, and commercial DSL, cable or wireless transport services are employed. Automatic monitoring of status, signal quality and earthquake event detection as well as operational alarms for low voltage and water intrusion are performed by a robust data acquisition package. This software is coupled with a host of network management tools and display managers operated by the Array Network Facility to allow managers, field personnel, and network operations staff to visualize array performance in real-time and to access historical information for diagnostics. Current data recording proficiency is 99.1%, with real-time telemetry averaging about 91%. EarthScope, IRIS and the USGS are working with regional seismic network operators, both existing and newly formed, to transition some of the Transportable Array stations into regional network assets. Each region has unique circumstances and interested parties are invited to exchange ideas on how this might be accomplished in their area. Contact busby@iris.edu for more information.

  17. Capturing the Interplay of Dynamics and Networks through Parameterizations of Laplacian Operators

    DTIC Science & Technology

    2016-08-24

    important vertices and communities in the network. Specifically, for each dynamical process in this framework, we define a centrality measure that ... vertices as a potential cluster (or community) with respect to this process. We show that the subset-quality function generalizes the traditional conductance ... compare the different perspectives they create on network structure. Subjects: Network Science and Online Social Networks. Keywords: Network, Community

  18. Some applications of remote sensing in atmospheric monitoring programs

    NASA Technical Reports Server (NTRS)

    Heller, A. N.; Bryson, J. C.; Vasuki, N. C.

    1972-01-01

    The applications of remote sensing in atmospheric monitoring programs are described. The organization, operations, and functions of an air quality monitoring network at New Castle County, Delaware is discussed. The data obtained by the air quality monitoring network ground stations and the equipment used to obtain atmospheric data are explained. It is concluded that correlation of the information obtained by the network will make it possible to anticipate air pollution problems in the Chesapeake Bay area before a crisis develops.

  19. Index of surface-water stations in Texas, January 1986

    USGS Publications Warehouse

    Carrillo, E.R.; Buckner, H.D.; Rawson, Jack

    1986-01-01

    As of January 1, 1986, the surface-water data-collection network in Texas operated by the U.S. Geological Survey included 386 streamflow, 87 reservoir-contents, 33 stage, 10 crest-stage partial-record, 8 periodic discharge-through-range, 38 flood-hydrograph partial-record, 11 flood-profile partial-record, 36 low-flow partial-record, 2 tide-level, 45 daily chemical-quality, 23 continuous-recording water-quality, 97 periodic biological, 19 lake-survey, 174 periodic organic- and (or) nutrient, 4 periodic insecticide, 58 periodic pesticide, 22 automatic-sampler, 157 periodic minor-elements, 141 periodic chemical-quality, 108 periodic physical-organic, 14 continuous-recording three- or four-parameter water-quality, 3 sediment, 39 periodic sediment, 26 continuous-recording temperature, and 37 national stream-quality accounting network stations. Tables describing the station location, type of data collected, and place where data are available are included, as well as maps showing the locations of most of the stations. (USGS)

  20. How does network design constrain optimal operation of intermittent water supply?

    NASA Astrophysics Data System (ADS)

    Lieb, Anna; Wilkening, Jon; Rycroft, Chris

    2015-11-01

    Urban water distribution systems do not always supply water continuously or reliably. As pipes fill and empty, pressure transients may contribute to degraded infrastructure and poor water quality. To help understand and manage this undesirable side effect of intermittent water supply--a phenomenon affecting hundreds of millions of people in cities around the world--we study the relative contributions of fixed versus dynamic properties of the network. Using a dynamical model of unsteady transition pipe flow, we study how different elements of network design, such as network geometry, pipe material, and pipe slope, contribute to undesirable pressure transients. Using an optimization framework, we then investigate to what extent network operation decisions such as supply timing and inflow rate may mitigate these effects. We characterize some aspects of network design that make them more or less amenable to operational optimization.

  1. Service offerings and interfaces for the ACTS network of earth stations

    NASA Technical Reports Server (NTRS)

    Coney, T. A.; Dobyns, T. R.; Chitre, D. M.; Lindstrom, R.

    1988-01-01

    The NASA Advanced Communications Technology Satellite (ACTS) will use a network of about 20 earth stations to operate as a Mode 1 network. This network will support two ACTS program objectives: to verify the technical performance of ACTS Mode 1 operation in GEO and to demonstrate the types and quality of services that can be provided by an ACTS Mode 1 communications system. The terrestrial interface design is a critical element in assuring that these network earth stations will meet the objectives. In this paper, the applicable terrestrial interface design requirements, the resulting interface specifications, and the associated terrestrial input/output hardware are discussed. A functional block diagram of a network earth station is shown.

  2. Definition of air quality measurements for monitoring space shuttle launches

    NASA Technical Reports Server (NTRS)

    Thorpe, R. D.

    1978-01-01

    A recommended air quality monitoring network to characterize the impact of space shuttle launch operations on ambient air quality in the Kennedy Space Center (KSC) area is described. Analysis of ground cloud processes and prevalent meteorological conditions indicates that transient HCl depositions can be a cause for concern. The system designed to monitor HCl employs an extensive network of inexpensive detectors combined with a central analysis device. An acid rain network is also recommended. A quantitative measure of the projected minimal long-term impact involves limited monitoring of NOx and particulates. All recommended monitoring is confined to KSC property.

  3. Information Product Quality in Network Centric Operations

    DTIC Science & Technology

    2005-05-01

    Figure 1. Signori et al.'s NCOCF; Figure 2. NCW Conceptual Framework ... perspective, having led to what is currently known as the Network Centric Operations ... following equation: ΔS ≥ δQ/T, where ΔS is the change in entropy, δQ is the change in heat energy, and T is some constant temperature. Whenever heat

  4. Asset deterioration and discolouration in water distribution systems.

    PubMed

    Husband, P S; Boxall, J B

    2011-01-01

    Water distribution systems function to supply treated water that is safe for human consumption and complies with increasingly stringent quality regulations. Although considered primarily an aesthetic issue, discolouration is the largest cause of customer dissatisfaction associated with distribution system water quality. Pro-active measures to prevent discolouration are sought, yet network processes remain insufficiently understood to fully justify and optimise capital or operational strategies to manage discolouration risk. Results are presented from a comprehensive fieldwork programme in UK water distribution networks that determined asset deterioration with respect to discolouration. This was achieved by quantifying the material that accumulates as cohesive layers on pipe surfaces and that, when mobilised, is acknowledged as the primary cause of discolouration. It is shown that these material layers develop ubiquitously, with defined layer strength characteristics, and at a consistent and repeatable rate dependent on water quality. For UK networks, the iron concentration in the bulk water is shown to be a potential indicator of the deterioration rate. With material layer development rates determined, management decisions that balance discolouration risk and the expenditure needed to maintain water quality integrity can be justified. In particular, the balance between capital investment, such as improving water treatment output or pipe renewal, and operational expenditure, such as the frequency of network maintenance through flushing, may be judged. While the rate of layer development is shown to be a function of water quality, the magnitude (peak or average turbidity) of discolouration incidents is shown to be dominated by hydraulic conditions. From this it can be proposed that network hydraulic management, such as regular periodic 'stressing', is a potential strategy for reducing discolouration risk. The ultimate application of this is the hydraulic design of self-cleaning networks that maintain discolouration risk below acceptable levels. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. From field notes to data portal - An operational QA/QC framework for tower networks

    NASA Astrophysics Data System (ADS)

    Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.

    2016-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is extreme. Tower networks such as AmeriFlux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the available human resources. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and that includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/Shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the AmeriFlux community and provides tools to aid continued network advancement.
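
    Two automated tests such frameworks typically standardize, a range test and a persistence (stuck-sensor) test, can be sketched as follows; the thresholds and sample series are invented.

      import numpy as np

      def qaqc_flags(series, lo, hi, persist_n=5):
          series = np.asarray(series, float)
          flags = np.zeros(series.size, dtype=int)      # 0 = pass
          flags[(series < lo) | (series > hi)] |= 1     # bit 1 = range failure
          for i in range(series.size - persist_n + 1):
              window = series[i:i + persist_n]
              if np.ptp(window) == 0:                   # unchanged for persist_n steps
                  flags[i:i + persist_n] |= 2           # bit 2 = persistence failure
          return flags

      co2 = [412.1, 412.3, 650.0, 412.2, 412.2, 412.2, 412.2, 412.2, 411.9]
      print(qaqc_flags(co2, lo=350, hi=500))
      # [0 0 1 2 2 2 2 2 0]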

  6. Heterogeneous Wireless Mesh Network Technology Evaluation for Space Proximity and Surface Applications

    NASA Technical Reports Server (NTRS)

    DeCristofaro, Michael A.; Lansdowne, Chatwin A.; Schlesinger, Adam M.

    2014-01-01

    NASA has identified standardized wireless mesh networking as a key technology for future human and robotic space exploration. Wireless mesh networks enable rapid deployment and provide coverage in undeveloped regions. Mesh networks are also self-healing, resilient, and extensible, qualities not found in traditional infrastructure-based networks, and can offer lower size, weight, and power (SWaP) than overlapped infrastructure per application. To better understand the maturity, characteristics, and capability of the technology, we developed an 802.11 mesh network consisting of a combination of heterogeneous commercial off-the-shelf devices and open-source firmware and software packages. Various streaming applications, including voice and video, were operated over the mesh network, and performance measurements were made under different operating scenarios. During testing, several issues with the current mesh network technology were identified and are outlined for future work.

  7. The Network for the Detection of Atmospheric Composition Change (NDACC): history, status and perspectives

    NASA Astrophysics Data System (ADS)

    De Mazière, Martine; Thompson, Anne M.; Kurylo, Michael J.; Wild, Jeannette D.; Bernhard, Germar; Blumenstock, Thomas; Braathen, Geir O.; Hannigan, James W.; Lambert, Jean-Christopher; Leblanc, Thierry; McGee, Thomas J.; Nedoluha, Gerald; Petropavlovskikh, Irina; Seckmeyer, Gunther; Simon, Paul C.; Steinbrecht, Wolfgang; Strahan, Susan E.

    2018-04-01

    The Network for the Detection of Atmospheric Composition Change (NDACC) is an international global network of more than 90 stations making high-quality measurements of atmospheric composition that began official operations in 1991 after 5 years of planning. Apart from sonde measurements, all measurements in the network are performed by ground-based remote-sensing techniques. Originally named the Network for the Detection of Stratospheric Change (NDSC), the name of the network was changed to NDACC in 2005 to better reflect the expanded scope of its measurements. The primary goal of NDACC is to establish long-term databases for detecting changes and trends in the chemical and physical state of the atmosphere (mesosphere, stratosphere, and troposphere) and to assess the coupling of such changes with climate and air quality. NDACC's origins, station locations, organizational structure, and data archiving are described. NDACC is structured around categories of ground-based observational techniques (sonde, lidar, microwave radiometers, Fourier-transform infrared, UV-visible DOAS (differential optical absorption spectroscopy)-type, and Dobson-Brewer spectrometers, as well as spectral UV radiometers), timely cross-cutting themes (ozone, water vapour, measurement strategies, cross-network data integration), satellite measurement systems, and theory and analyses. Participation in NDACC requires compliance with strict measurement and data protocols to ensure that the network data are of high and consistent quality. To widen its scope, NDACC has established formal collaborative agreements with eight other cooperating networks and Global Atmosphere Watch (GAW). A brief history is provided, major accomplishments of NDACC during its first 25 years of operation are reviewed, and a forward-looking perspective is presented.

  8. The Network for the Detection of Atmospheric Composition Change (NDACC): History, Status and Perspectives

    NASA Technical Reports Server (NTRS)

    Simon, Paul C.; De Maziere, Martine; Bernhard, Germar; Blumenstock, Thomas; McGee, Thomas J.; Petropavlovskikh, Irina; Steinbrecht, Wolfgang; Wild, Jeannette D.; Lambert, Jean-Christopher; Seckmeyer, Gunther; hide

    2018-01-01

    The Network for the Detection of Atmospheric Composition Change (NDACC) is an international global network of more than 90 stations making high-quality measurements of atmospheric composition that began official operations in 1991 after 5 years of planning. Apart from sonde measurements, all measurements in the network are performed by ground-based remote-sensing techniques. Originally named the Network for the Detection of Stratospheric Change (NDSC), the name of the network was changed to NDACC in 2005 to better reflect the expanded scope of its measurements. The primary goal of NDACC is to establish long-term databases for detecting changes and trends in the chemical and physical state of the atmosphere (mesosphere, stratosphere, and troposphere) and to assess the coupling of such changes with climate and air quality. NDACC's origins, station locations, organizational structure, and data archiving are described. NDACC is structured around categories of ground-based observational techniques (sonde, lidar, microwave radiometers, Fourier-transform infrared, UV-visible DOAS (differential optical absorption spectroscopy)-type, and Dobson-Brewer spectrometers, as well as spectral UV radiometers), timely cross-cutting themes (ozone, water vapour, measurement strategies, cross-network data integration), satellite measurement systems, and theory and analyses. Participation in NDACC requires compliance with strict measurement and data protocols to ensure that the network data are of high and consistent quality. To widen its scope, NDACC has established formal collaborative agreements with eight other cooperating networks and Global Atmosphere Watch (GAW). A brief history is provided, major accomplishments of NDACC during its first 25 years of operation are reviewed, and a forward-looking perspective is presented.

  9. Quality of surface water in Missouri, water year 2012

    USGS Publications Warehouse

    Barr, Miya N.

    2014-01-01

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, designed and operates a series of monitoring stations on streams and springs throughout Missouri known as the Ambient Water-Quality Monitoring Network. During the 2012 water year (October 1, 2011, through September 30, 2012), data were collected at 81 stations—73 Ambient Water-Quality Monitoring Network stations, 6 alternate Ambient Water-Quality Monitoring Network stations, and 2 U.S. Geological Survey National Stream Quality Accounting Network stations. Dissolved oxygen, specific conductance, water temperature, suspended solids, suspended sediment, fecal coliform bacteria, Escherichia coli bacteria, dissolved nitrate plus nitrite as nitrogen, total phosphorus, dissolved and total recoverable lead and zinc, and select pesticide compound summaries are presented for 78 of these stations. The stations have been classified primarily into groups corresponding to the physiography of the State, primary land use, or unique station types. In addition, a summary of hydrologic conditions in the State, including peak discharges, monthly mean discharges, and 7-day low flows, is presented.

  10. Quality of surface water in Missouri, water year 2013

    USGS Publications Warehouse

    Barr, Miya N.; Schneider, Rachel E.

    2014-01-01

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, designed and operates a series of monitoring stations on streams and springs throughout Missouri known as the Ambient Water-Quality Monitoring Network. During the 2013 water year (October 1, 2012, through September 30, 2013), data were collected at 79 stations—73 Ambient Water-Quality Monitoring Network stations, 4 alternate Ambient Water-Quality Monitoring Network stations, and 2 U.S. Geological Survey National Stream Quality Accounting Network stations. Dissolved oxygen, specific conductance, water temperature, suspended solids, suspended sediment, Escherichia coli bacteria, fecal coliform bacteria, dissolved nitrate plus nitrite as nitrogen, total phosphorus, dissolved and total recoverable lead and zinc, and select pesticide compound summaries are presented for 76 of these stations. The stations have been classified primarily into groups corresponding to the physiography of the State, primary land use, or unique station types. In addition, a summary of hydrologic conditions in the State, including peak discharges, monthly mean discharges, and 7-day low flows, is presented.

  11. Meta-manager: a requirements analysis.

    PubMed

    Cook, J F; Rozenblit, J W; Chacko, A K; Martinez, R; Timboe, H L

    1999-05-01

    The digital imaging network-picture-archiving and communications system (DIN-PACS) will be implemented in ten sites within the Great Plains Regional Medical Command (GPRMC). This network of PACS and teleradiology technology over a shared T1 network has opened the door for round-the-clock radiology coverage of all sites. However, the concept of a virtual radiology environment poses new issues for military medicine, and a new workflow management system must be developed. This workflow management system will allow issues including quality of care, availability, severe capitation, and quality of the workforce to be resolved efficiently. The design process for this management system must employ existing technology, operate over various telecommunication networks and protocols, be independent of platform operating systems, be flexible and scalable, and involve the end user in the design process from the outset. Using the unified modeling language (UML), the specifications for this new business management system were created in concert between the University of Arizona and the GPRMC. These specifications detail a management system operating through a common object request broker architecture (CORBA) environment. In this presentation, we characterize the Meta-Manager management system, including aspects of intelligence, interfacility routing, fail-safe operations, and expected improvements in patient care and efficiency.

  12. Evaluation of Earthquake Detection Performance in Terms of Quality and Speed in SEISCOMP3 Using New Modules Qceval, Npeval and Sceval

    NASA Astrophysics Data System (ADS)

    Roessler, D.; Weber, B.; Ellguth, E.; Spazier, J.

    2017-12-01

    The geometry of seismic monitoring networks, site conditions, and data availability, as well as monitoring targets and strategies, typically impose trade-offs between data quality, earthquake detection sensitivity, false detections, and alert times. Network detection capabilities typically change as the seismic noise level is altered by human activity or by varying weather and sea conditions. To give helpful information to operators and maintenance coordinators, gempa developed a range of tools to evaluate earthquake detection and network performance, including qceval, npeval, and sceval. qceval is a module that analyzes waveform quality parameters in real time and deactivates and reactivates data streams for automatic processing based on waveform quality thresholds. For example, thresholds can be defined for latency, delay, timing quality, spike and gap counts, and rms. As changes in the automatic processing have a direct influence on detection quality and speed, another tool, npeval, was designed to calculate in real time the expected time needed to detect and locate earthquakes by evaluating the effective network geometry. The effective network geometry is derived from the configuration of stations participating in the detection. The detection times are shown as an additional layer on the map and updated in real time as soon as the effective network geometry changes. A third new tool, sceval, is an automatic module that classifies located seismic events (Origins) in real time. sceval evaluates the spatial distribution of the stations contributing to an Origin. It confirms or rejects the status of Origins, adds comments, or leaves the Origin unclassified. The comments are passed to an additional sceval plug-in where the end user can customize event types. This unique identification of real and fake events in earthquake catalogues makes it possible to lower network detection thresholds. In real-time monitoring situations, operators can limit the processing to events with unclassified Origins, reducing their workload, while classified Origins can be treated specifically by other procedures. These modules have been calibrated and fully tested on several complex seismic monitoring networks in Indonesia and northern Chile.
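
    A minimal sketch of the qceval idea as described, assuming a simple threshold table: a stream is deactivated for automatic processing when any quality metric exceeds its configured threshold and reactivated once all metrics recover. The metric names and values are illustrative, not gempa's actual configuration schema.

```python
# Sketch of a threshold-based stream gate in the spirit of qceval.
# Metric names and thresholds are assumptions for illustration.

THRESHOLDS = {"latency_s": 30.0, "gaps_per_hour": 5, "rms": 1e6}

def stream_enabled(metrics: dict) -> bool:
    """Return True if all quality metrics are within their thresholds."""
    return all(metrics.get(k, 0) <= v for k, v in THRESHOLDS.items())

streams = {
    "NET.STA1..HHZ": {"latency_s": 2.1, "gaps_per_hour": 0, "rms": 3.2e3},
    "NET.STA2..HHZ": {"latency_s": 95.0, "gaps_per_hour": 12, "rms": 4.1e3},
}
for stream_id, metrics in streams.items():
    state = "active" if stream_enabled(metrics) else "deactivated"
    print(f"{stream_id}: {state}")
```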

  13. Is Your Biobank Up to Standards? A Review of the National Canadian Tissue Repository Network Required Operational Practice Standards and the Controlled Documents of a Certified Biobank.

    PubMed

    Hartman, Victoria; Castillo-Pelayo, Tania; Babinszky, Sindy; Dee, Simon; Leblanc, Jodi; Matzke, Lise; O'Donoghue, Sheila; Carpenter, Jane; Carter, Candace; Rush, Amanda; Byrne, Jennifer; Barnes, Rebecca; Mes-Messons, Anne-Marie; Watson, Peter

    2018-02-01

    Ongoing quality management is an essential part of biobank operations and the creation of high quality biospecimen resources. Adhering to the standards of a national biobanking network is a way to reduce variability between individual biobank processes, resulting in cross biobank compatibility and more consistent support for health researchers. The Canadian Tissue Repository Network (CTRNet) implemented a set of required operational practices (ROPs) in 2011 and these serve as the standards and basis for the CTRNet biobank certification program. A review of these 13 ROPs covering 314 directives was conducted after 5 years to identify areas for revision and update, leading to changes to 7/314 directives (2.3%). A review of all internal controlled documents (including policies, standard operating procedures and guides, and forms for actions and processes) used by the BC Cancer Agency's Tumor Tissue Repository (BCCA-TTR) to conform to these ROPs was then conducted. Changes were made to 20/106 (19%) of BCCA-TTR documents. We conclude that a substantial fraction of internal controlled documents require updates at regular intervals to accommodate changes in best practices. Reviewing documentation is an essential aspect of keeping up to date with best practices and ensuring the quality of biospecimens and data managed by biobanks.

  14. The IRIS DMC: Perspectives on Real-Time Data Management and Open Access From a Large Seismological Archive: Challenges, Tools, and Quality Assurance

    NASA Astrophysics Data System (ADS)

    Benson, R. B.

    2007-05-01

    The IRIS Data Management Center (DMC), located in Seattle, WA, is the largest openly accessible geophysical archive in the world, and it has a unique perspective on the data management and operational practices that get the most out of a network. Networks span broad domains in time and space, from finite deployments that monitor bridges and dams to national and international networks, such as the GSN and the FDSN, that establish a baseline for global monitoring and research; the requirements behind a well-tuned DMC archive treat these alike, building a collaborative network of networks that generations of users rely on and that adds value to the data. Funded by the National Science Foundation through the Division of Earth Sciences, IRIS is operated through member universities and in cooperation with the USGS, and the DMS facility is a bridge between a globally distributed collaboration of seismic networks and an equally distributed network of users that demand a high standard for data quality, completeness, and ease of access. I will describe the role that a perpetual archive has in the life cycle of data, and how hosting real-time data serves a dual role: acting as a hub for continuous data from approximately 59 real-time networks and distributing these (along with other data from the 40-year library of available time-series data) to researchers, while simultaneously providing shared data back to networks in real time to benefit monitoring activities. I will describe aspects of our quality-assurance framework, both passive and active, applied to 1,100 seismic stations generating over 6,000 channels of regularly sampled data arriving daily, which data providers can use as aids in operating their networks and which users can likewise use when requesting suitable data for research purposes. The goal of the DMC is to eliminate bottlenecks in data discovery and to shorten the steps leading to analysis. This involves many challenges, including keeping metadata current and providing tools for evaluating and viewing them, along with measuring and databasing other performance metrics; monitoring these closer to real time helps reduce operating costs, creates a richer repository, and eliminates problems over generations of data-usage duty cycles. I will describe a new resource, the Nominal Response Library, which aims to provide accurate and representative examples of sensor and data-logger configurations that are hosted at the DMC and constitute a high-graded subset for crafting your own metadata. Finally, I want to encourage all network operators who do not currently submit SEED-format data to an archive to consider these benefits, and I will briefly discuss how robust transfer mechanisms, including Earthworm, LISS, Antelope, NRTS, and SeisComp, can assist in contributing network data and help create this enabling virtual network of networks. In this era of high-performance Internet capacity, the process that enables others to share your data and allows you to utilize external sources of data is nearly seamless with the current mission of network operation.
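
    As a modern illustration of the ease of access described (post-dating this 2007 abstract), the ObsPy FDSN client can request waveforms and station metadata from the IRIS DMC in a few lines:

```python
# Fetch one hour of broadband data and the matching instrument response
# from IRIS via FDSN web services. Station and time are arbitrary examples.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")
t0 = UTCDateTime("2024-01-01T00:00:00")
# One hour of broadband vertical data from GSN station ANMO.
stream = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 3600)
inventory = client.get_stations(network="IU", station="ANMO", level="response")
print(stream)
```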

  15. 34 CFR 412.21 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... materials; (4) Demonstrates an understanding of the operation of the Vocational Education Curriculum... Network Directors Council described in § 412.4. (b) Plan of operation. (25 points) The Secretary reviews each application to determine the quality of the plan of operation for the project, including— (1) The...

  16. 34 CFR 412.21 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... materials; (4) Demonstrates an understanding of the operation of the Vocational Education Curriculum... Network Directors Council described in § 412.4. (b) Plan of operation. (25 points) The Secretary reviews each application to determine the quality of the plan of operation for the project, including— (1) The...

  17. 34 CFR 412.21 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... materials; (4) Demonstrates an understanding of the operation of the Vocational Education Curriculum... Network Directors Council described in § 412.4. (b) Plan of operation. (25 points) The Secretary reviews each application to determine the quality of the plan of operation for the project, including— (1) The...

  18. 34 CFR 412.21 - What selection criteria does the Secretary use?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... materials; (4) Demonstrates an understanding of the operation of the Vocational Education Curriculum... Network Directors Council described in § 412.4. (b) Plan of operation. (25 points) The Secretary reviews each application to determine the quality of the plan of operation for the project, including— (1) The...

  19. External Quality Assurance Programs Managed by the U.S. Geological Survey in Support of the National Atmospheric Deposition Program/Mercury Deposition Network

    USGS Publications Warehouse

    Latysh, Natalie E.; Wetherbee, Gregory A.

    2007-01-01

    The U.S. Geological Survey (USGS) Branch of Quality Systems operates external quality assurance programs for the National Atmospheric Deposition Program/Mercury Deposition Network (NADP/MDN). Beginning in 2004, three programs have been implemented: the system blank program, the interlaboratory comparison program, and the blind audit program. Each program was designed to measure error contributed by specific components in the data-collection process. The system blank program assesses contamination that may result from sampling equipment, field exposure, and routine handling and processing of the wet-deposition samples. The interlaboratory comparison program evaluates bias and precision of analytical results produced by the Mercury Analytical Laboratory (HAL) for the NADP/MDN, operated by Frontier GeoSciences, Inc. The HAL's performance is compared with the performance of five other laboratories. The blind audit program assesses bias and variability of MDN data produced by the HAL using solutions disguised as environmental samples to ascertain true laboratory performance. This report documents the implementation of quality assurance procedures for the NADP/MDN and the operating procedures for each of the external quality assurance programs conducted by the USGS. The USGS quality assurance information provides a measure of confidence to NADP/MDN data users that measurement variability is distinguished from environmental signals.
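
    The kind of statistic a blind-audit program yields can be sketched simply: the relative bias and precision of laboratory results against the known target concentration of a disguised solution. The data values below are hypothetical.

```python
# Sketch of blind-audit statistics: mean relative bias and relative
# standard deviation against a known target. Values are hypothetical.
import statistics

def bias_and_precision(measured, target):
    """Mean relative bias (%) and relative standard deviation (%) of
    measured values against a known target concentration."""
    rel_errors = [100.0 * (m - target) / target for m in measured]
    bias = statistics.mean(rel_errors)
    precision = statistics.stdev(rel_errors) if len(rel_errors) > 1 else 0.0
    return bias, precision

# Hypothetical mercury results (ng/L) for a disguised audit solution.
measured = [10.4, 9.8, 10.1, 10.6, 9.9]
bias, precision = bias_and_precision(measured, target=10.0)
print(f"bias = {bias:+.1f}%, precision (RSD) = {precision:.1f}%")
```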

  20. Hybrid protection algorithms based on game theory in multi-domain optical networks

    NASA Astrophysics Data System (ADS)

    Guo, Lei; Wu, Jingjing; Hou, Weigang; Liu, Yejun; Zhang, Lincong; Li, Hongming

    2011-12-01

    With increasing network size, the optical backbone is divided into multiple domains, each with its own network operator and management policy. At the same time, failures in an optical network may lead to huge data losses, since each wavelength carries a great deal of traffic. Survivability in multi-domain optical networks is therefore very important. However, existing survivable algorithms achieve only a unilateral optimization of profit, for either users or network operators; they cannot find the double-win optimal solution that considers economic factors for both users and network operators. In this paper, we therefore develop a multi-domain network model involving multiple Quality of Service (QoS) parameters. After presenting a link evaluation approach based on fuzzy mathematics, we propose a game model to find the optimal solution that maximizes the user's utility, the network operator's utility, and the joint utility of user and network operator. Since the problem of finding the double-win optimal solution is NP-complete, we propose two new hybrid protection algorithms, Intra-domain Sub-path Protection (ISP) and Inter-domain End-to-end Protection (IEP). In ISP and IEP, hybrid protection means that an intelligent algorithm based on Bacterial Colony Optimization (BCO) and a heuristic algorithm are used to solve survivability in intra-domain routing and inter-domain routing, respectively. Simulation results show that ISP and IEP have similar comprehensive utility. In addition, ISP has better resource utilization efficiency, lower blocking probability, and higher network operator's utility, while IEP has better user's utility.

  1. Economic Feasibility of Wireless Sensor Network-Based Service Provision in a Duopoly Setting with a Monopolist Operator.

    PubMed

    Sanchis-Cano, Angel; Romero, Julián; Sacoto-Cabrera, Erwin J; Guijarro, Luis

    2017-11-25

    We analyze the feasibility of providing Wireless Sensor Network-data-based services in an Internet of Things scenario from an economical point of view. The scenario has two competing service providers with their own private sensor networks, a network operator and final users. The scenario is analyzed as two games using game theory. In the first game, sensors decide to subscribe or not to the network operator to upload the collected sensing-data, based on a utility function related to the mean service time and the price charged by the operator. In the second game, users decide to subscribe or not to the sensor-data-based service of the service providers based on a Logit discrete choice model related to the quality of the data collected and the subscription price. The sinks and users subscription stages are analyzed using population games and discrete choice models, while network operator and service providers pricing stages are analyzed using optimization and Nash equilibrium concepts respectively. The model is shown to be feasible from an economic point of view for all the actors if there are enough interested final users, and it opens the possibility of developing more efficient models with different types of services.
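
    A minimal sketch of the user-subscription stage as described: users choose between the two providers (or opt out) via a Logit discrete choice model over quality and price, and one provider best-responds to a fixed competitor price by grid search. The utility form and all parameter values are assumptions.

```python
# Logit subscription shares plus a brute-force best response for one
# provider. Utility form and parameters are illustrative assumptions.
import math

def logit_shares(utilities):
    """Logit choice probabilities over the options (incl. opting out)."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

def provider1_revenue(p1, p2, q1=2.0, q2=1.5, beta=1.0, n_users=1000):
    u_none = 0.0  # utility of not subscribing
    shares = logit_shares([u_none, q1 - beta * p1, q2 - beta * p2])
    return n_users * shares[1] * p1

# Best response of provider 1 to a fixed competitor price of 1.0.
best = max((provider1_revenue(p1, p2=1.0), p1)
           for p1 in [i / 100 for i in range(1, 500)])
print(f"revenue {best[0]:.1f} at price {best[1]:.2f}")
```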

  2. Quality Control of The Norwegian Uv Monitoring Network.

    NASA Astrophysics Data System (ADS)

    Johnsen, B.; Mikkelborg, O.; Dahlback, A.; Høiskar, B. A.; Kylling, A.; Edvardsen, K.; Olseth, J. A.; Kjeldstad, B.; Ørbæk, J. B.

    A Norwegian UV-monitoring network of GUV multiband radiometers has been operating at locations between 59°N and 79°N since 1995-96. The purpose of the network is to obtain data of high scientific quality for use in further assessments related to health and environmental issues. Maintenance of measurement quality is given priority. Spectral response functions, crucial for calibrations, have been obtained for each instrument. Calibrations are traceable to the Nordic intercomparison of UV radiometers held in Sweden in June 2000. Instruments are inspected daily or weekly, and once a year they are compared to travelling standards operating side by side with the local network radiometers. This enables determination of the long-term drift in instrument responses. Over the six-year period of operation, the steadiest instrument remained stable within +/-3%, whereas the least steady showed a response drop of 23%. Comparisons with a spectroradiometer having a true cosine response demonstrate close agreement (+/-2%) for solar zenith angles less than 80°. Good cosine performance, high spectral sensitivity, and a weatherproof design demonstrate that the GUV radiometers are particularly suitable for UV monitoring at high latitudes. Complete records of corrected daily CIE-effective doses and online measurements are presented at http://uvnett.nrpa.no/. Gaps in the measurement series have been filled using a clear-sky radiative transfer model and hourly UV sky transmittances estimated from pyranometer data. Measurement data and information about the monitoring network may be found at the websites of NRPA, NILU, and the University of Oslo: http://www.nrpa.no, http://www.nilu.no/uv, and http://www.fys.uio.no/plasma/ozone/, respectively. At this stage the quality of the network has reached a satisfactory level, and it is possible to move on to using UV data in further assessments. Trend analyses and UV forecasting are topics for future work. The network is supported by the Ministries of Health and Environment and is administered by the Norwegian Radiation Protection Authority and the Norwegian Pollution Control Authority, the latter through the Norwegian Institute for Air Research.

  3. System Proposal for Mass Transit Service Quality Control Based on GPS Data

    PubMed Central

    Padrón, Gabino; Cristóbal, Teresa; Alayón, Francisco; Quesada-Arencibia, Alexis; García, Carmelo R.

    2017-01-01

    Quality is an essential aspect of public transport. In the case of regular public passenger transport by road, punctuality and regularity are criteria used to assess quality of service. Calculating metrics related to these criteria continuously over time and comprehensively across the entire transport network requires the handling of large amounts of data. This article describes a system for continuously and comprehensively monitoring punctuality and regularity. The system uses location data acquired continuously in the vehicles and automatically transferred for analysis. These data are processed intelligently by elements that are commonly used by transport operators: GPS-based tracking system, onboard computer and wireless networks for mobile data communications. The system was tested on a transport company, for which we measured the punctuality of one of the routes that it operates; the results are presented in this article. PMID:28621745

  4. System Proposal for Mass Transit Service Quality Control Based on GPS Data.

    PubMed

    Padrón, Gabino; Cristóbal, Teresa; Alayón, Francisco; Quesada-Arencibia, Alexis; García, Carmelo R

    2017-06-16

    Quality is an essential aspect of public transport. In the case of regular public passenger transport by road, punctuality and regularity are criteria used to assess quality of service. Calculating metrics related to these criteria continuously over time and comprehensively across the entire transport network requires the handling of large amounts of data. This article describes a system for continuously and comprehensively monitoring punctuality and regularity. The system uses location data acquired continuously in the vehicles and automatically transferred for analysis. These data are processed intelligently by elements that are commonly used by transport operators: GPS-based tracking system, onboard computer and wireless networks for mobile data communications. The system was tested on a transport company, for which we measured the punctuality of one of the routes that it operates; the results are presented in this article.
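
    A sketch of a punctuality metric of the kind such a system computes: the share of stop arrivals, inferred from GPS traces, that fall within an acceptance window around the scheduled time. The window and the sample data are illustrative.

```python
# Punctuality as the fraction of arrivals inside an acceptance window
# around schedule. Window bounds and timestamps are illustrative.
from datetime import datetime, timedelta

def punctuality(scheduled, actual, early=timedelta(minutes=1),
                late=timedelta(minutes=3)):
    """Fraction of arrivals no more than `early` ahead of or `late`
    behind schedule."""
    ok = sum(1 for s, a in zip(scheduled, actual) if -early <= a - s <= late)
    return ok / len(scheduled)

sched = [datetime(2017, 5, 2, 8, m) for m in (0, 10, 20, 30)]
actual = [s + timedelta(minutes=x)
          for s, x in zip(sched, (0.5, 4.0, -0.2, 2.0))]
print(f"punctuality = {punctuality(sched, actual):.0%}")
```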

  5. Research on information security system of waste terminal disposal process

    NASA Astrophysics Data System (ADS)

    Zhou, Chao; Wang, Ziying; Guo, Jing; Guo, Yajuan; Huang, Wei

    2017-05-01

    Informatization has penetrated the whole process of production and operation in electric power enterprises. It not only improves the level of lean management and quality of service, but also faces severe security risks. The internal network terminal is the outermost layer and the most vulnerable node of the internal network boundary, characterized by wide distribution, deep reach, and large numbers. The technical skill and security awareness of users and of operation and maintenance personnel are uneven, which makes the internal network terminal the weakest link in information security. Through the implementation of management, technical, and physical security measures, an internal network terminal security protection system should be established so as to fully protect internal network terminal information security.

  6. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    NASA Astrophysics Data System (ADS)

    Shaw, Amelia R.; Smith Sawyer, Heather; LeBoeuf, Eugene J.; McDonald, Mark P.; Hadjerioua, Boualem

    2017-11-01

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. The reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.
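
    A heavily simplified sketch of the optimization loop described: a cheap stand-in for the trained ANN emulator of CE-QUAL-W2 predicts dissolved oxygen (DO) from an hourly release schedule, and a small genetic algorithm maximizes a generation proxy subject to a DO constraint enforced by a penalty. The surrogate form, constants, and GA settings are all assumptions.

```python
# Toy surrogate-plus-GA loop: maximize a generation proxy subject to a
# DO constraint via a penalty. Everything here is an illustrative stand-in.
import random

HOURS, DO_LIMIT = 24, 5.0

def surrogate_do(release):
    # Hypothetical stand-in for the trained ANN: higher average release
    # slightly depresses downstream DO in this toy model.
    return 8.0 - 0.04 * sum(release) / HOURS

def fitness(release):
    power = sum(release)                      # proxy for generation value
    do = surrogate_do(release)
    penalty = 1e3 * max(0.0, DO_LIMIT - do)   # penalize DO violations
    return power - penalty

def ga(pop_size=40, gens=200, mut=0.2):
    pop = [[random.uniform(0, 100) for _ in range(HOURS)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(HOURS)
            child = a[:cut] + b[cut:]                  # one-point crossover
            if random.random() < mut:                  # random-reset mutation
                child[random.randrange(HOURS)] = random.uniform(0, 100)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
print(f"generation proxy = {sum(best):.0f}, DO = {surrogate_do(best):.2f} mg/L")
```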

  7. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    DOE PAGES

    Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.; ...

    2017-10-24

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.

  8. Hydropower Optimization Using Artificial Neural Network Surrogate Models of a High-Fidelity Hydrodynamics and Water Quality Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, Amelia R.; Sawyer, Heather Smith; LeBoeuf, Eugene J.

    Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. Here, the reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.

  9. Activities report of PTT Research

    NASA Astrophysics Data System (ADS)

    In the field of postal infrastructure research, activities were performed on postcode readers, radiolabels, and techniques of operations research and artificial intelligence. In the field of telecommunication, transportation, and information, research was conducted on multipurpose coding schemes, speech recognition, hypertext, a multimedia information server, security of electronic data interchange, document retrieval, improvement of the quality of user interfaces, domotics living support (techniques), and standardization of telecommunication protocols. In the field of telecommunication infrastructure and provisions research, activities were performed on universal personal telecommunications, advanced broadband network technologies, coherent techniques, measurement of audio quality, near-field facilities, local beam communication, local area networks, network security, coupling of broadband and narrowband integrated services digital networks, digital mapping, and standardization of protocols.

  10. Instruction of Computer Supported Collaborative Learning Environment and Students' Contribution Quality

    ERIC Educational Resources Information Center

    Akgün, Ergün; Akkoyunlu, Buket

    2013-01-01

    Along with the integration of network and communication innovations into education, those technology enriched learning environments gained importance both qualitatively and operationally. Using network and communication innovations in the education field, provides diffusion of information and global accessibility, and also allows physically…

  11. Mission-Centered Network Models: Defending Mission-Critical Tasks From Deception

    DTIC Science & Technology

    2015-09-29

    celebrities). In military applications, networked operations offer an effective way to reduce the footprint of a force, but become a center of gravity… used by trust algorithms to assess quality and trustworthiness. Technical challenge: developing standard representations for provenance that…

  12. Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods

    NASA Technical Reports Server (NTRS)

    Stolzer, Alan J.; Halford, Carl

    2007-01-01

    In a previous study, multiple regression techniques were applied to Flight Operations Quality Assurance-derived data to develop parsimonious model(s) for fuel consumption on the Boeing 757 airplane. The present study examined several data mining algorithms, including neural networks, on the fuel consumption problem and compared them to the multiple regression results obtained earlier. Using regression methods, parsimonious models were obtained that explained approximately 85% of the variation in fuel flow. In general, data mining methods were more effective in predicting fuel consumption. Classification and Regression Tree methods reported correlation coefficients of .91 to .92, and General Linear Models and Multilayer Perceptron neural networks reported correlation coefficients of about .99. These data mining models show great promise for use in further examining large FOQA databases for operational and safety improvements.
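
    The comparison described can be sketched on synthetic data: fit a linear regression and an MLP to a nonlinear fuel-flow-like response and compare the correlation between predicted and observed values on held-out data. The feature names and data below are placeholders for FOQA-derived parameters.

```python
# Linear regression vs. MLP on a synthetic nonlinear response, compared
# by predicted-vs-observed correlation. Data are placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 2000
X = rng.uniform(size=(n, 3))          # e.g. altitude, airspeed, weight (scaled)
y = 2*X[:, 0] + np.sin(6*X[:, 1]) + 0.5*X[:, 2]**2 + rng.normal(0, 0.1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for name, model in [("linear", LinearRegression()),
                    ("MLP", MLPRegressor(hidden_layer_sizes=(32, 32),
                                         max_iter=2000, random_state=0))]:
    model.fit(X_tr, y_tr)
    r = np.corrcoef(model.predict(X_te), y_te)[0, 1]
    print(f"{name}: r = {r:.3f}")
```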

  13. Neural networks with fuzzy Petri nets for modeling a machining process

    NASA Astrophysics Data System (ADS)

    Hanna, Moheb M.

    1998-03-01

    The paper presents an intelligent architecture based on a feedforward neural network with fuzzy Petri nets for modeling product quality in a CNC machining center. It discusses how the proposed architecture can be used for modeling, monitoring, and controlling a product quality specification such as surface roughness. The surface roughness represents the output quality specification of parts manufactured by a CNC machining center as a result of a milling process. The neural network approach employs selected input parameters that are defined by the machine operator via the CNC code. The fuzzy Petri net approach utilizes the exact input milling parameters, such as spindle speed, feed rate, tool diameter, and coolant (off/on), which can be obtained via the machine or a sensor system. The aim of the proposed architecture is to model the demanded quality of surface roughness as high, medium, or low.
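
    A sketch of the neural-network half of the architecture: a small feedforward classifier mapping milling parameters (spindle speed, feed rate, tool diameter, coolant on/off) to a surface-roughness class. The training data and labeling rule are synthetic, and the paper's fuzzy Petri net stage is not reproduced.

```python
# Feedforward classifier from milling parameters to a roughness class.
# Data and labeling rule are synthetic stand-ins.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([rng.uniform(500, 3000, n),   # spindle speed (rpm)
                     rng.uniform(50, 600, n),     # feed rate (mm/min)
                     rng.uniform(4, 20, n),       # tool diameter (mm)
                     rng.integers(0, 2, n)])      # coolant off/on
# Toy labeling rule: high speed + low feed + coolant -> smoother surface.
score = X[:, 0] / 3000 - X[:, 1] / 600 + 0.2 * X[:, 3]
labels = np.digitize(score, [-0.2, 0.4])          # 0=low, 1=medium, 2=high

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                  random_state=0))
clf.fit(X, labels)
print(clf.predict([[2500, 100, 10, 1]]))  # expect the "high" class (toy rule)
```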

  14. An ILP based memetic algorithm for finding minimum positive influence dominating sets in social networks

    NASA Astrophysics Data System (ADS)

    Lin, Geng; Guan, Jian; Feng, Huibin

    2018-06-01

    The positive influence dominating set problem is a variant of the minimum dominating set problem and has many applications in social networks. It is NP-hard and is receiving increasing attention. Various methods have been proposed to solve the positive influence dominating set problem; however, most of the existing work has focused on greedy algorithms, and the solution quality needs to be improved. In this paper, we formulate the minimum positive influence dominating set problem as an integer linear program (ILP), and propose an ILP-based memetic algorithm (ILPMA) for solving the problem. The ILPMA integrates a greedy randomized adaptive construction procedure, a crossover operator, a repair operator, and a tabu search procedure. The performance of ILPMA is validated on nine real-world social networks with up to 36,692 nodes. The results show that ILPMA significantly improves the solution quality and is robust.
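
    For orientation, a greedy baseline of the kind the ILPMA improves upon can be sketched as follows, under the common PIDS definition that every vertex must have at least half of its neighbors inside the set. This is an illustrative baseline, not the paper's algorithm.

```python
# Greedy PIDS construction: repeatedly add the node that satisfies the
# most still-unmet neighbor requirements. Illustrative baseline only.
import math
import networkx as nx

def greedy_pids(G):
    need = {v: math.ceil(G.degree(v) / 2) for v in G}
    covered = {v: 0 for v in G}
    S = set()
    while any(covered[v] < need[v] for v in G):
        # Pick the node whose inclusion helps the most unmet neighbors.
        best = max((v for v in G if v not in S),
                   key=lambda v: sum(1 for u in G[v] if covered[u] < need[u]))
        S.add(best)
        for u in G[best]:
            covered[u] += 1
    return S

G = nx.karate_club_graph()
S = greedy_pids(G)
print(f"PIDS size: {len(S)} of {G.number_of_nodes()} nodes")
```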

  15. Performance of several low-cost accelerometers

    USGS Publications Warehouse

    Evans, J.R.; Allen, R.M.; Chung, A. I.; Cochran, E.S.; Guy, R.; Hellweg, M.; Lawrence, J. F.

    2014-01-01

    Several groups are implementing low‐cost host‐operated systems of strong‐motion accelerographs to support the somewhat divergent needs of seismologists and earthquake engineers. The Advanced National Seismic System Technical Implementation Committee (ANSS TIC, 2002), managed by the U.S. Geological Survey (USGS) in cooperation with other network operators, is exploring the efficacy of such systems if used in ANSS networks. To this end, ANSS convened a working group to explore available Class C strong‐motion accelerometers (defined later), and to consider operational and quality control issues, and the means of annotating, storing, and using such data in ANSS networks. The working group members are largely coincident with our author list, and this report informs instrument‐performance matters in the working group’s report to ANSS. Present examples of operational networks of such devices are the Community Seismic Network (CSN; csn.caltech.edu), operated by the California Institute of Technology, and Quake‐Catcher Network (QCN; Cochran et al., 2009; qcn.stanford.edu; November 2013), jointly operated by Stanford University and the USGS. Several similar efforts are in development at other institutions. The overarching goals of such efforts are to add spatial density to existing Class‐A and Class‐B (see next paragraph) networks at low cost, and to include many additional people so they become invested in the issues of earthquakes, their measurement, and the damage they cause.

  16. Structural and functional social network attributes moderate the association of self-rated health with mental health in midlife and older adults.

    PubMed

    Windsor, Tim D; Rioseco, Pilar; Fiori, Katherine L; Curtis, Rachel G; Booth, Heather

    2016-01-01

    Social relationships are multifaceted, and different social network components can operate via different processes to influence well-being. This study examined associations of social network structure and relationship quality (positive and negative social exchanges) with mental health in midlife and older adults. The focus was on both direct associations of network structure and relationship quality with mental health, and whether these social network attributes moderated the association of self-rated health (SRH) with mental health. Analyses were based on survey data provided by 2001 (Mean age = 65, SD = 8.07) midlife and older adults. We used Latent Class Analysis (LCA) to classify participants into network types based on network structure (partner status, network size, contact frequency, and activity engagement), and used continuous measures of positive and negative social exchanges to operationalize relationship quality. Regression analysis was used to test moderation. LCA revealed network types generally consistent with those reported in previous studies. Participants in more diverse networks reported better mental health than those categorized into a restricted network type after adjustment for age, sex, education, and employment status. Analysis of moderation indicated that those with poorer SRH were less likely to report poorer mental health if they were classified into more diverse networks. A similar moderation effect was also evident for positive exchanges. The findings suggest that both quantity and quality of social relationships can play a role in buffering against the negative implications of physical health decline for mental health.

  17. A comprehensive method for GNSS data quality determination to improve ionospheric data analysis.

    PubMed

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-08-14

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies by providing the global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor quality GNSS observation data is also increasing and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters which affect the quality of ionospheric product. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely, depending on each station and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters when used in combination can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis.
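
    Two of the simpler per-station quality parameters such a method might combine, data completeness and a crude cycle-slip count, can be sketched as below. The thresholds, the 30 s sampling, and the synthetic series are assumptions.

```python
# Per-station quality parameters: completeness over a day and a crude
# cycle-slip count from jumps in a differenced phase series. Illustrative.
import numpy as np

def completeness(values, expected_epochs):
    return 100.0 * np.count_nonzero(~np.isnan(values)) / expected_epochs

def cycle_slips(phase, jump_threshold=2.0):
    """Count epochs where the differenced phase series jumps abnormally."""
    d = np.diff(phase[~np.isnan(phase)])
    return int(np.count_nonzero(np.abs(d - np.median(d)) > jump_threshold))

epochs = 2880                                  # one day at 30 s sampling
phase = np.cumsum(np.full(epochs, 0.01))       # smooth synthetic series
phase[1000:] += 3.0                            # inject a cycle slip
phase[2000:2100] = np.nan                      # inject a data outage
print(f"completeness = {completeness(phase, epochs):.1f}%")
print(f"cycle slips  = {cycle_slips(phase)}")
```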

  18. A Comprehensive Method for GNSS Data Quality Determination to Improve Ionospheric Data Analysis

    PubMed Central

    Kim, Minchan; Seo, Jiwon; Lee, Jiyun

    2014-01-01

    Global Navigation Satellite Systems (GNSS) are now recognized as cost-effective tools for ionospheric studies by providing the global coverage through worldwide networks of GNSS stations. While GNSS networks continue to expand to improve the observability of the ionosphere, the amount of poor quality GNSS observation data is also increasing and the use of poor-quality GNSS data degrades the accuracy of ionospheric measurements. This paper develops a comprehensive method to determine the quality of GNSS observations for the purpose of ionospheric studies. The algorithms are designed especially to compute key GNSS data quality parameters which affect the quality of ionospheric product. The quality of data collected from the Continuously Operating Reference Stations (CORS) network in the conterminous United States (CONUS) is analyzed. The resulting quality varies widely, depending on each station and the data quality of individual stations persists for an extended time period. When compared to conventional methods, the quality parameters obtained from the proposed method have a stronger correlation with the quality of ionospheric data. The results suggest that a set of data quality parameters when used in combination can effectively select stations with high-quality GNSS data and improve the performance of ionospheric data analysis. PMID:25196005

  19. Space Network Ground Segment Sustainment (SGSS) Project: Developing a COTS-Intensive Ground System

    NASA Technical Reports Server (NTRS)

    Saylor, Richard; Esker, Linda; Herman, Frank; Jacobsohn, Jeremy; Saylor, Rick; Hoffman, Constance

    2013-01-01

    The purpose of the Space Network Ground Segment Sustainment (SGSS) project is to implement a new, modern ground segment that will enable the NASA Space Network (SN) to deliver high-quality services to the SN community in the future. The key SGSS goals are to: (1) re-engineer the SN ground segment; and (2) enable cost efficiencies in the operability and maintainability of the broader SN.

  20. Water-resources investigations in Tennessee; programs and activities of the U.S. Geological Survey, 1987-1988

    USGS Publications Warehouse

    Quinones, Ferdinand; Balthrop, B.H.; Baker, E.G.

    1988-01-01

    This report contains summaries of 44 projects that were active in the Tennessee District during 1987 and 1988. Each summary gives the name of the project chief, the objective of the project, the progress or results of the study to date, and the name of the cooperator. Hydrologic data are the backbone of the investigations conducted by the U.S. Geological Survey (USGS). The basic data programs conducted by the Tennessee District provide the streamflow, water-quality, and groundwater-level information essential to the assessment and management of the State's water resources. Long-term streamflow, water-quality, and groundwater-level networks are operated as part of the Hydrologic Data Section. Field operations are about equally divided among field offices in Memphis, Nashville, and Knoxville. A staff of about 40 engineers, hydrologists, and hydrologic technicians operates the long-term networks and supports short-term efforts for areal investigations. The data collected as part of the networks are published in the series of annual data reports. (USGS)

  1. Integrated Neural Flight and Propulsion Control System

    NASA Technical Reports Server (NTRS)

    Kaneshige, John; Gundy-Burlet, Karen; Norvig, Peter (Technical Monitor)

    2001-01-01

    This paper describes an integrated neural flight and propulsion control system, which uses a neural network based approach for applying alternate sources of control power in the presence of damage or failures. Under normal operating conditions, the system utilizes conventional flight control surfaces. Neural networks are used to provide consistent handling qualities across flight conditions and for different aircraft configurations. Under damage or failure conditions, the system may utilize unconventional flight control surface allocations, along with integrated propulsion control, when additional control power is necessary for achieving desired flight control performance. In this case, neural networks are used to adapt to changes in aircraft dynamics and control allocation schemes. Of significant importance here is the fact that this system can operate without emergency or backup flight control mode operations. An additional advantage is that this system can utilize, but does not require, fault detection and isolation information or explicit parameter identification. Piloted simulation studies were performed on a commercial transport aircraft simulator. Subjects included both NASA test pilots and commercial airline crews. Results demonstrate the potential for improving handling qualities and significantly increasing survivability rates under various simulated failure conditions.

  2. Evaluation of coffee roasting degree by using electronic nose and artificial neural network for off-line quality control.

    PubMed

    Romani, Santina; Cevoli, Chiara; Fabbri, Angelo; Alessandrini, Laura; Dalla Rosa, Marco

    2012-09-01

    An electronic nose (EN) based on an array of 10 metal oxide semiconductor sensors was used, jointly with an artificial neural network (ANN), to predict coffee roasting degree. The flavor release evolution and the main physicochemical modifications (weight loss, density, moisture content, and surface color: L*, a*) during the roasting process of coffee were monitored at different cooking times (0, 6, 8, 10, 14, 19 min). Principal component analysis (PCA) was used to reduce the dimensionality of the sensor data set (600 values per sensor). The selected PCs were used as ANN input variables. Two types of ANN methods (multilayer perceptron [MLP] and general regression neural network [GRNN]) were used in order to estimate the EN signals. For both neural networks, the input values were the scores of the sensor data set PCs, while the output values were the quality parameters at different roasting times. Both ANNs were able to predict coffee roasting degree well, giving good prediction results for both roasting time and coffee quality parameters. In particular, GRNN showed the highest prediction reliability. At present, the evaluation of coffee roasting degree is mainly a manual operation, based largely on empirical observation of the final color, and for this reason it requires well-trained operators with long professional experience. The coupling of an e-nose and artificial neural networks (ANNs) may represent an effective route to roasting process automation and a more reproducible procedure for final coffee bean quality characterization. © 2012 Institute of Food Technologists®
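
    The processing chain described can be sketched with scikit-learn: standardize simulated 10-sensor signatures, reduce them with PCA, and regress roasting time on the leading components with a feedforward network (an MLP here; GRNN is not available in scikit-learn). The data are synthetic.

```python
# PCA-then-ANN regression of roasting time from simulated e-nose
# signatures. All data here are synthetic stand-ins.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
roast_times = np.repeat([0, 6, 8, 10, 14, 19], 30).astype(float)
# Hypothetical sensor responses drifting with roast degree, plus noise.
signatures = (roast_times[:, None] * rng.uniform(0.5, 1.5, size=(1, 10))
              + rng.normal(0, 1.0, size=(roast_times.size, 10)))

model = make_pipeline(StandardScaler(), PCA(n_components=3),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                   random_state=0))
model.fit(signatures, roast_times)
print(model.predict(signatures[:3]).round(1))   # should be near 0 min
```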

  3. Use of Whatman-41 filters in air quality sampling networks (with applications to elemental analysis)

    NASA Technical Reports Server (NTRS)

    Neustadter, H. E.; Sidik, S. M.; King, R. B.; Fordyce, J. S.; Burr, J. C.

    1974-01-01

    The operation of a 16-site parallel high-volume air sampling network with glass fiber filters on one unit and Whatman-41 filters on the other is reported. The network data and data from several other experiments indicate that (1) sampler-to-sampler and filter-to-filter variabilities are small; (2) the hygroscopic affinity of Whatman-41 filters need not introduce errors; and (3) suspended particulate samples from glass fiber filters averaged slightly, but not statistically significantly, higher than those from Whatman-41 filters. The results obtained demonstrate the practicability of Whatman-41 filters for air quality monitoring and elemental analysis.

  4. Economic Feasibility of Wireless Sensor Network-Based Service Provision in a Duopoly Setting with a Monopolist Operator

    PubMed Central

    Romero, Julián; Sacoto-Cabrera, Erwin J.

    2017-01-01

    We analyze the feasibility of providing Wireless Sensor Network-data-based services in an Internet of Things scenario from an economical point of view. The scenario has two competing service providers with their own private sensor networks, a network operator and final users. The scenario is analyzed as two games using game theory. In the first game, sensors decide to subscribe or not to the network operator to upload the collected sensing-data, based on a utility function related to the mean service time and the price charged by the operator. In the second game, users decide to subscribe or not to the sensor-data-based service of the service providers based on a Logit discrete choice model related to the quality of the data collected and the subscription price. The sinks and users subscription stages are analyzed using population games and discrete choice models, while network operator and service providers pricing stages are analyzed using optimization and Nash equilibrium concepts respectively. The model is shown to be feasible from an economic point of view for all the actors if there are enough interested final users, and it opens the possibility of developing more efficient models with different types of services. PMID:29186847

  5. The Chinese Cardiac Surgery Registry: Design and Data Audit.

    PubMed

    Rao, Chenfei; Zhang, Heng; Gao, Huawei; Zhao, Yan; Yuan, Xin; Hua, Kun; Hu, Shengshou; Zheng, Zhe

    2016-04-01

    In light of the burgeoning volume and notable variation of in-hospital outcomes of cardiac operations in China, a large patient-level registry was needed. We created the Chinese Cardiac Surgery Registry (CCSR) database in 2013 to benchmark, continuously monitor, and provide feedback on the quality of adult cardiac operations. We report on the design of this database and provide an overview of participating sites and quality of data. We established a network of participating sites, each with an adult cardiac surgery volume of more than 100 operations per year, for continuous web-based registry of in-hospital and follow-up data on coronary artery bypass grafting (CABG) and valve operations. After a routine data quality audit, we report the performance and quality of care back to the participating sites. In total, 87 centers participated and submitted 46,303 surgical procedures from January 2013 to December 2014. The timeliness rates of the short-list and in-hospital data submitted were 73.6% and 70.2%, respectively. The completeness and accuracy rates of the in-hospital data were 97.6% and 95.1%, respectively. We have provided 2 reports for each site and 1 national report regarding the performance of isolated CABG and valve operations. The newly launched CCSR, with a nationally representative network and good data quality, has the potential to act as an important platform for monitoring and improving cardiac surgical care in mainland China, as well as facilitating research projects, establishing benchmarking standards, and identifying potential areas for quality improvement (ClinicalTrials.gov No. NCT02400125). Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  6. GLOBECOM '85 - Global Telecommunications Conference, New Orleans, LA, December 2-5, 1985, Conference Record. Volumes 1, 2, & 3

    NASA Astrophysics Data System (ADS)

    Various papers on global telecommunications are presented. The general topics addressed include: multiservice integration with optical fibers, multicompany-owned telecommunication networks, software quality and reliability, advanced on-board processing, impact of new services and systems on operations and maintenance, analytical studies of protocols for data communication networks, topics in packet radio networking, CCITT No. 7 to support new services, document processing and communication, antenna technology and system aspects in satellite communications. Also considered are: communication systems modelling methodology, experimental integrated local area voice/data nets, spread spectrum communications, motion video at the DS-0 rate, optical and data communications, intelligent work stations, switch performance analysis, novel radio communication systems, wireless local networks, ISDN services, LAN communication protocols, user-system interface, radio propagation and performance, mobile satellite systems, software for computer networks, VLSI for ISDN terminals, quality management, man-machine interfaces in switching, and local area network performance.

  7. 78 FR 54173 - Approval and Promulgation of Air Quality Implementation Plans; Indiana; Maintenance Plan Update...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-03

    ... direct final rule in the Federal Register informing the public that the rule will not take effect... maintenance period, a commitment to maintain the existing monitoring network, factors and procedures to be... Network Indiana currently operates two SO2 monitors in Lake County, Indiana. Indiana has committed to...

  8. ISDN Application in the Army Environment

    DTIC Science & Technology

    1992-02-01

    Signalling System Number 7 (SS7). SS7 is a packet-switched signalling network operating in parallel with the traffic-bearing network. The current, in...for example, require SS7. Further into the future, broadband ISDN (B-ISDN) is expected to provide high-quality, full-motion video, High Definition...smaller business offices, ISDN could be a viable alternative to private networks, especially when switches are connected through SS7. ISDN, in combination

  9. An Optimization Approach to Coexistence of Bluetooth and Wi-Fi Networks Operating in ISM Environment

    NASA Astrophysics Data System (ADS)

    Klajbor, Tomasz; Rak, Jacek; Wozniak, Jozef

    The unlicensed ISM band is used by various wireless technologies; issues related to ensuring the required efficiency and quality of operation of coexisting networks therefore become essential. The paper addresses the problem of mutual interference between IEEE 802.11b transmitters (commercially named Wi-Fi) and Bluetooth (BT) devices. An optimization approach to modeling the topology of BT scatternets is introduced, resulting in more efficient utilization of an ISM environment consisting of BT and Wi-Fi networks. To achieve this, an integer linear programming (ILP) formulation is proposed. Example results presented in the paper illustrate significant benefits of the proposed modeling strategy.
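
    The abstract does not reproduce the ILP formulation, so the toy model below (written with PuLP) only illustrates the general shape of such a formulation: binary assignment variables, an interference-weight objective, and the Bluetooth limit of seven active slaves per piconet. The weights are hypothetical.

    ```python
    # Toy ILP in the spirit of the approach: assign BT slaves to piconet
    # masters so that total interference weight is minimized.
    import pulp

    slaves, masters = range(10), range(2)
    w = {(s, m): ((s + 1) * (m + 2)) % 5 + 1 for s in slaves for m in masters}

    prob = pulp.LpProblem("scatternet", pulp.LpMinimize)
    x = pulp.LpVariable.dicts("x", (slaves, masters), cat="Binary")
    prob += pulp.lpSum(w[s, m] * x[s][m] for s in slaves for m in masters)
    for s in slaves:                      # each slave joins exactly one piconet
        prob += pulp.lpSum(x[s][m] for m in masters) == 1
    for m in masters:                     # at most 7 active slaves per piconet
        prob += pulp.lpSum(x[s][m] for s in slaves) <= 7
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    print(pulp.value(prob.objective))
    ```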

  10. Water Quality in Small Community Distribution Systems. A Reference Guide for Operators

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has developed this reference guide to assist the operators and managers of small- and medium-sized public water systems. This compilation provides a comprehensive picture of the impact of the water distribution system network on dist...

  11. [Strategic thinking of the construction of national schistosomiasis laboratory network in China].

    PubMed

    Qin, Zhi-Qiang; Xu, Jing; Feng, Ting; Zhu, Hong-Qing; Li, Shi-Zhu; Xiao, Ning; Zhou, Xiao-Nong

    2013-08-01

    A schistosomiasis laboratory network and its quality assurance system have been built in China and will continue to be improved. This paper introduces the present situation of schistosomiasis diagnosis in China and expounds the basic ideas behind, and the progress of, the construction of the schistosomiasis network platform. The latter part of the paper discusses the challenges facing the construction and operation of the schistosomiasis diagnosis network platform, as well as future work.

  12. Maintaining High Quality Network Performance at the GSN: Sensor Installation Methods, New VBB Borehole Sensors and Data Quality Assessment from MUSTANG

    NASA Astrophysics Data System (ADS)

    Hafner, Katrin

    2017-04-01

    The goal of the Global Seismographic Network (GSN) is to provide the highest possible data quality and dynamic recording range in support of scientific needs. Considerable effort is made at each GSN seismic station site to achieve the lowest noise performance possible under local conditions. We continue to strive for higher data quality with a combination of new sensors and improved installation techniques. Most seismometers are installed either in 100 m deep steel-cased boreholes or in vaults tunneled underground. A few vaults are built at the surface or on the foundation of a building. All vault installations have a concrete pier, mechanically isolated from the floor, upon which the seismometers are placed. Many sites are now nearly 30 years old, and the GSN is investing in civil works at several stations to keep them in good condition or make critical repairs. Using GSN data from inception to the present, we will present analyses that demonstrate how successful these sensor installation strategies have been and describe ongoing experiments at GSN testing facilities to evaluate the most cost-effective strategy to modernize existing GSN facilities. To improve sensor performance at some vault sites, we will employ new sensor installation strategies. Years of experience operating the GSN and the USArray Transportable Array, along with focused testing of emplacement strategies, show that the vulnerability of a sensor's horizontal components to tilt can be mitigated if the sensor package is buried at even shallow depth. At selected vault installations, shallow boreholes will be drilled to accommodate recently developed borehole VBB sensor models. The incremental cost of modern VBB instruments over standard BB models is small, and we expect to be able to preserve the GSN's crucial very broad bandwidth while improving noise performance and reliability using this strategy. A crucial link in making GSN station data available to the scientific community is the IRIS Data Management Center, which not only maintains the data archive, but also provides easy, rapid, and open access to data recorded from seconds to decades ago. All data flow to the IRIS DMC through the UCSD or ASL Data Collection Centers (DCCs). The DCCs focus on delivering data to the DMC, maintaining correct metadata for GSN stations, reviewing data quality from the stations that ASL and UCSD operate, and addressing circumstances that require special data handling, such as backfilling following telemetry outages. Key to the high quality of the GSN data is the direct feedback on data quality problems identified by the DCC analysts to the network operations staff and field engineers. Aging of GSN equipment and station infrastructure has resulted in renewed emphasis on using data quality control tools such as MUSTANG. These tools allow the network operators to routinely monitor and analyze waveform data to detect and track problems and develop short- and longer-term action plans for improving network data quality. We will present summary data quality metrics for the GSN as obtained via these quality assurance tools.

  13. A water-resources data-network evaluation for Monterey County, California; Phase 3, Northern Salinas River drainage basin

    USGS Publications Warehouse

    Templin, W.E.; Schluter, R.C.

    1990-01-01

    This report evaluates existing data collection networks and possible additional data collection to monitor quantity and quality of precipitation, surface water, and groundwater in the northern Salinas River drainage basin, California. Of the 34 precipitation stations identified, 20 were active and are concentrated in the northwestern part of the study area. No precipitation quality networks were identified, but possible data collection efforts include monitoring for acid rain and pesticides. Six of ten stream-gaging stations are active. Two surface water quality sites are sampled for suspended sediment, specific conductance, and chloride; one U.S. Geological Survey NASQAN site and one site operated by California Department of Water Resources make up the four active sampling locations; reactivation of 45 inactive surface water quality sites might help to achieve objectives described in the report. Three local networks measure water levels in 318 wells monthly, during peak irrigation, and at the end of the irrigation season. Water quality conditions are monitored in 379 wells; samples are collected in summer to monitor saltwater intrusion near Castroville and are also collected annually throughout the study area for analysis of chloride, specific conductance, and nitrate. An ideal baseline network would be an evenly spaced grid of index wells with a density of one per section. When baseline conditions are established, representative wells within the network could be monitored periodically according to specific data needs. (USGS)

  14. Cognitive Networks

    DTIC Science & Technology

    2007-06-15

    ...objectives such as resource management, Quality of Service (QoS), security, or access control. The limitations of CN applications should come from the...achieving the best mode of operation in an SDR. There has been a lot of research on how to define a QoS architecture for the...

  15. A managed clinical network for cardiac services: set-up, operation and impact on patient care.

    PubMed

    Hamilton, Karen E. StC.; Sullivan, Frank M.; Donnan, Peter T.; Taylor, Rex; Ikenwilo, Divine; Scott, Anthony; Baker, Chris; Wyke, Sally

    2005-01-01

    To investigate the set-up and operation of a Managed Clinical Network for cardiac services and assess its impact on patient care. This single case study used process evaluation with observational before-and-after comparison of indicators of quality of care and costs. The study was conducted in Dumfries and Galloway, Scotland, and used a three-level framework. Process evaluation of the network set-up and operation was carried out through a documentary review of minutes, guidelines, and protocols, and transcripts of fourteen semi-structured interviews with health service personnel, including senior managers, general practitioners, nurses, cardiologists, and members of the public. Outcome evaluation of the impact of the network used interrupted time series analysis of clinical data of 202 patients aged less than 76 years admitted to hospital with a confirmed myocardial infarction one year before and one year after the establishment of the network. The main outcome measures were differences between indicators of quality of care targeted by network protocols. Economic evaluation addressed the transaction costs of the set-up and operation of the network and the resource costs of the clinical care of the 202 myocardial infarction patients from the time of hospital admission to 6 months post discharge, through interrupted time series analysis. The outcome measure was the difference in National Health Service resource use. Despite early difficulties, the network was successful in bringing together clinicians, patients and managers to redesign services, exhibiting most features of good network management. The role of the energetic lead clinician was crucial, but the network took time to develop and 'bed down'. Its primary modus operandi was the development of a myocardial infarction pathway and associated protocols. Of sixteen clinical care indicators, two improved significantly following the launch of the network and nine showed improvements that were not statistically significant. There was no difference in resource use. The Managed Clinical Network made a difference to ways of working, particularly in breaching traditional boundaries and involving the public, and made modest changes in patient care. However, it required a two-year set-up period. Managed clinical networks are complex initiatives with an increasing profile in health care policy. This study suggests that they require energetic leadership and that improvements are likely to be slow and incremental.

  16. A managed clinical network for cardiac services: set-up, operation and impact on patient care

    PubMed Central

    Hamilton, Karen E. StC.; Sullivan, Frank M.; Donnan, Peter T.; Taylor, Rex; Ikenwilo, Divine; Scott, Anthony; Baker, Chris; Wyke, Sally

    2005-01-01

    Abstract Purpose To investigate the set-up and operation of a Managed Clinical Network for cardiac services and assess its impact on patient care. Methods This single case study used process evaluation with observational before-and-after comparison of indicators of quality of care and costs. The study was conducted in Dumfries and Galloway, Scotland, and used a three-level framework. Process evaluation of the network set-up and operation was carried out through a documentary review of minutes, guidelines, and protocols, and transcripts of fourteen semi-structured interviews with health service personnel, including senior managers, general practitioners, nurses, cardiologists, and members of the public. Outcome evaluation of the impact of the network used interrupted time series analysis of clinical data of 202 patients aged less than 76 years admitted to hospital with a confirmed myocardial infarction one year before and one year after the establishment of the network. The main outcome measures were differences between indicators of quality of care targeted by network protocols. Economic evaluation addressed the transaction costs of the set-up and operation of the network and the resource costs of the clinical care of the 202 myocardial infarction patients from the time of hospital admission to 6 months post discharge, through interrupted time series analysis. The outcome measure was the difference in National Health Service resource use. Results Despite early difficulties, the network was successful in bringing together clinicians, patients and managers to redesign services, exhibiting most features of good network management. The role of the energetic lead clinician was crucial, but the network took time to develop and ‘bed down’. Its primary modus operandi was the development of a myocardial infarction pathway and associated protocols. Of sixteen clinical care indicators, two improved significantly following the launch of the network and nine showed improvements that were not statistically significant. There was no difference in resource use. Discussion and conclusions The Managed Clinical Network made a difference to ways of working, particularly in breaching traditional boundaries and involving the public, and made modest changes in patient care. However, it required a two-year set-up period. Managed clinical networks are complex initiatives with an increasing profile in health care policy. This study suggests that they require energetic leadership and that improvements are likely to be slow and incremental. PMID:16773161

  17. Developing a space network interface simulator: The NTS approach

    NASA Technical Reports Server (NTRS)

    Hendrzak, Gary E.

    1993-01-01

    This paper describes the approach used to redevelop the Network Control Center (NCC) Test System (NTS), a hardware and software facility designed to make testing of the NCC Data System (NCCDS) software efficient, effective, and as rigorous as possible prior to operational use. The NTS transmits and receives network message traffic in real-time. Data transfer rates and message content are strictly controlled and are identical to that of the operational systems. NTS minimizes the need for costly and time-consuming testing with the actual external entities (e.g., the Hubble Space Telescope (HST) Payload Operations Control Center (POCC) and the White Sands Ground Terminal). Discussed are activities associated with the development of the NTS, lessons learned throughout the project's lifecycle, and resulting productivity and quality increases.

  18. Surface-water data and statistics from U.S. Geological Survey data-collection networks in New Jersey on the World Wide Web

    USGS Publications Warehouse

    Reiser, Robert G.; Watson, Kara M.; Chang, Ming; Nieswand, Steven P.

    2002-01-01

    The U.S. Geological Survey (USGS), in cooperation with other Federal, State, and local agencies, operates and maintains a variety of surface-water data-collection networks throughout the State of New Jersey. The networks include streamflow-gaging stations, low-flow sites, crest-stage gages, tide gages, tidal crest-stage gages, and water-quality sampling sites. Both real-time and historical surface-water data for many of the sites in these networks are available at the USGS, New Jersey District, web site (http://nj.usgs.gov/), and water-quality data are available at the USGS National Water Information System (NWIS) web site (http://waterdata.usgs.gov/nwis/). These data are an important source of information for water managers, engineers, environmentalists, and private citizens.

  19. Multi-criteria anomaly detection in urban noise sensor networks.

    PubMed

    Dauwe, Samuel; Oldoni, Damiano; De Baets, Bernard; Van Renterghem, Timothy; Botteldooren, Dick; Dhoedt, Bart

    2014-01-01

    The growing concern of citizens about the quality of their living environment and the emergence of low-cost microphones and data acquisition systems triggered the deployment of numerous noise monitoring networks spread over large geographical areas. Due to the local character of noise pollution in an urban environment, a dense measurement network is needed in order to accurately assess the spatial and temporal variations. The use of consumer grade microphones in this context appears to be very cost-efficient compared to the use of measurement microphones. However, the lower reliability of these sensing units requires a strong quality control of the measured data. To automatically validate sensor (microphone) data, prior to their use in further processing, a multi-criteria measurement quality assessment model for detecting anomalies such as microphone breakdowns, drifts and critical outliers was developed. Each of the criteria results in a quality score between 0 and 1. An ordered weighted average (OWA) operator combines these individual scores into a global quality score. The model is validated on datasets acquired from a real-world, extensive noise monitoring network consisting of more than 50 microphones. Over a period of more than a year, the proposed approach successfully detected several microphone faults and anomalies.
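
    The OWA aggregation step is compact enough to sketch directly; the per-criterion scores and rank weights below are hypothetical.

    ```python
    # Ordered weighted average: scores are sorted in descending order
    # before weighting, so weights attach to ranks, not to criteria.
    import numpy as np

    def owa(scores, weights):
        s = np.sort(np.asarray(scores, dtype=float))[::-1]
        w = np.asarray(weights, dtype=float)
        return float(s @ (w / w.sum()))

    # Hypothetical quality scores for one microphone: breakdown check,
    # drift check, outlier check. Putting the largest weight last makes
    # the aggregation pessimistic: one failing criterion drags it down.
    print(owa([0.9, 0.7, 0.2], weights=[0.2, 0.3, 0.5]))  # -> 0.49
    ```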

  20. Network Compression as a Quality Measure for Protein Interaction Networks

    PubMed Central

    Royer, Loic; Reimann, Matthias; Stewart, A. Francis; Schroeder, Michael

    2012-01-01

    With the advent of large-scale protein interaction studies, there is much debate about data quality. Can different noise levels in the measurements be assessed by analyzing network structure? Because proteomic regulation is co-operative, modular, and redundant, a protein interaction network is inherently compressible. Here we propose that network compression can be used to compare false positive and false negative noise levels in protein interaction networks. We validate this hypothesis by first confirming the detrimental effect of false positives and false negatives. Second, we show that gold standard networks are more compressible. Third, we show that compressibility correlates with co-expression, co-localization, and shared function. Fourth, we also observe correlations with better protein tagging methods, physiological expression in contrast to over-expression of tagged proteins, and smart pooling approaches for yeast two-hybrid screens. Overall, this new measure is a proxy for both sensitivity and specificity and gives complementary information to standard measures such as average degree and clustering coefficients. PMID:22719828
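
    The paper defines compressibility on the graph itself; as a loose stand-in, one can compare how well serialized edge lists compress under a general-purpose compressor, which already separates a modular, redundant graph from a random one of the same size. Everything below is an illustrative proxy, not the authors' measure.

    ```python
    # Proxy for network compressibility: zlib compression ratio of a
    # serialized edge list. A modular graph (10 cliques of 20 nodes)
    # compresses better than a random graph with the same edge count.
    import random
    import zlib

    def ratio(edges):
        raw = "\n".join(f"{a} {b}" for a, b in sorted(edges)).encode()
        return len(zlib.compress(raw, 9)) / len(raw)

    random.seed(0)
    n = 200
    modular = {(i, j) for i in range(n) for j in range(i + 1, n)
               if i // 20 == j // 20}
    rnd = set()
    while len(rnd) < len(modular):
        a, b = sorted(random.sample(range(n), 2))
        rnd.add((a, b))
    print(ratio(modular), ratio(rnd))   # lower ratio = more compressible
    ```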

  1. Concentrations and annual fluxes for selected water-quality constituents from the USGS National Stream Quality Accounting Network (NASQAN) 1996-2000

    USGS Publications Warehouse

    Kelly, Valerie J.; Hooper, Richard P.; Aulenbach, Brent T.; Janet, Mary

    2001-01-01

    This report contains concentrations and annual mass fluxes (loadings) for a broad range of water-quality constituents measured during 1996-2000 as part of the U.S. Geological Survey National Stream Quality Accounting Network (NASQAN). During this period, NASQAN operated a network of 40-42 stations in four of the largest river basins of the USA: the Colorado, the Columbia, the Mississippi (including the Missouri and Ohio), and the Rio Grande. The report contains surface-water quality data, streamflow data, field measurements (e.g. water temperature and pH), sediment-chemistry data, and quality-assurance data; interpretive products include annual and average loads, regression parameters for models used to estimate loads, sub-basin yield maps, maps depicting percent detections for censored constituents, and diagrams depicting flow-weighted average concentrations. Where possible, a regression model relating concentration to discharge and season was used for flux estimation. The interpretive context provided by annual loads includes identifying source and sink areas for constituents and estimating the loadings to receiving waters, such as reservoirs or the ocean.
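
    The exact regression form is not spelled out in this summary; a common formulation for this kind of load estimation relates log concentration to log discharge and seasonal terms, for example ln C = b0 + b1 ln Q + b2 sin(2πt) + b3 cos(2πt). The sketch below fits such a model to synthetic data by ordinary least squares; all data and coefficients are hypothetical.

    ```python
    # Hypothetical concentration-discharge regression for flux estimation:
    # ln(C) = b0 + b1*ln(Q) + b2*sin(2*pi*t) + b3*cos(2*pi*t); flux = C*Q.
    import numpy as np

    rng = np.random.default_rng(1)
    t = rng.uniform(0, 1, 120)               # decimal fraction of the year
    Q = np.exp(rng.normal(3, 0.5, 120))      # discharge
    lnC = (0.5 + 0.3 * np.log(Q) + 0.2 * np.sin(2 * np.pi * t)
           + rng.normal(0, 0.1, 120))        # synthetic concentrations

    X = np.column_stack([np.ones_like(t), np.log(Q),
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    b, *_ = np.linalg.lstsq(X, lnC, rcond=None)
    C_hat = np.exp(X @ b)                    # ignores retransformation bias
    print("estimated mean flux:", np.mean(C_hat * Q))
    ```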

  2. Lessons Learned and Lessons To Be Learned: An Overview of Innovative Network Learning Environments.

    ERIC Educational Resources Information Center

    Jacobson, Michael J.; Jacobson, Phoebe Chen

    This paper provides an overview of five innovative projects involving network learning technologies in the United States: (1) the MicroObservatory Internet Telescope is a collection of small, high-quality, and low-maintenance telescopes operated by the Harvard-Smithsonian Center for Astrophysics (Massachusetts), which may be used remotely via the…

  3. Quality control in mutation analysis: the European Molecular Genetics Quality Network (EMQN).

    PubMed

    Müller, C R

    2001-08-01

    The demand for clinical molecular genetics testing has grown steadily since its introduction in the 1980s. In order to reach and maintain the agreed quality standards of laboratory medicine, the same internal and external quality assurance (IQA/EQA) criteria have to be applied as for "conventional" clinical chemistry or pathology. In 1996 the European Molecular Genetics Quality Network (EMQN) was established in order to spread QA standards across Europe and to harmonise the existing national activities. EMQN is operated by a central co-ordinator and 17 national partners from 15 EU countries; since 1998 it has been funded by the EU Commission for a 3-year period. EMQN promotes QA with two tools: disease-specific best practice meetings (BPMs) and EQA schemes. A typical BPM is focussed on one disease or group of related disorders. International experts report on the latest news of gene characterisation and function and on state-of-the-art techniques for mutation detection. Disease-specific EQA schemes are provided by experts in the field. DNA samples are sent out together with mock clinical referrals and a diagnostic question is asked. Written reports must be returned, which are marked for genotyping and interpretation. So far, three BPMs have been held and six EQA schemes are in operation at various stages. Although mutation types and diagnostic techniques varied considerably between schemes, the overall technical performance showed a high diagnostic standard. Nevertheless, serious genotyping errors have occurred in some schemes, which underlines the necessity of quality assurance efforts. The European Molecular Genetics Quality Network provides a necessary platform for the internal and external quality assurance of molecular genetic testing.

  4. U.S. Geological Survey Streamgage Operation and Maintenance Cost Evaluation...from the National Streamflow Information Program

    USGS Publications Warehouse

    Norris, J. Michael

    2010-01-01

    To help meet the goal of providing earth-science information to the Nation, the U.S. Geological Survey (USGS) operates and maintains the largest streamgage network in the world, with over 7,600 active streamgages in 2010. This network is operated in cooperation with over 850 Federal, tribal, State, and local funding partners. The streamflow information provided by the USGS is used for the protection of life and property; for the assessment, allocation, and management of water resources; for the design of roads, bridges, dams, and water works; for the delineation of flood plains; for the assessment and evaluation of habitat; for understanding the effects of land-use, water-use, and climate changes; for evaluation of water quality; and for recreational safety and enjoyment. USGS streamgages are managed and operated to rigorous national standards, allowing analyses of data from streamgages in different areas and spanning long time periods, some with more than 100 years of data. About 90 percent of USGS streamgages provide streamflow information in real time on the web. Physical measurements of streamflow are made at streamgages multiple times a year, depending on flow conditions, to ensure the highest level of accuracy possible. In addition, multiple reviews and quality assurance checks are performed before the data are finalized. In 2006, the USGS reviewed all activities, operations, equipment, support, and costs associated with operating and maintaining a streamgage program (Norris and others, 2008). A summary of the percentages of costs associated with activities required to operate a streamgage on an annual basis is presented in figure 1. This information represents what it costs to fund a 'typical' USGS streamgage and how those funds are utilized. It should be noted that some USGS streamgages have higher percentages for some categories than do others, depending on location and conditions. Forty-one percent of the funding for the typical USGS streamgage is for labor costs of the USGS staff responsible for measuring streamflow in the field and for the office time needed to quality-assure and finalize the data. It is reasonable that funding for the entire national streamgage network would closely follow the percentages shown in figure 1 as to how the funds are invested in the network. However, actual costs are specific to a particular streamgage and can vary substantially depending on location and operational issues.

  5. Ambient air monitoring plan for Ciudad Acuna and Piedra Negras, Coahuila, Mexico. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winberry, J.; Henning, L.; Crume, R.

    1998-01-01

    The Cities of Ciudad Acuna and Piedras Negras and the State of Coahuila in Mexico are interested in improving ambient air quality monitoring capabilities in the two cities through the establishment of a network of ambient air monitors. The purpose of the network is to characterize population exposure to potentially harmful air contaminants, possibly including sulfur dioxide (SO2), nitrogen oxides (NOx), ozone (O3), carbon monoxide (CO), total suspended particulate matter (TSP), particulate matter with aerodynamic diameter less than 10 micrometers (PM-10), and lead. This report presents the results of an evaluation of existing air quality monitoring equipment and facilities in Ciudad Acuna and Piedras Negras. Additionally, the report presents recommendations for developing an air quality monitoring network for PM-10, SO2, lead, and ozone in these cities, using a combination of both new and existing equipment. The human resources currently available and ultimately needed to operate and maintain the network are also discussed.

  6. Learnings from the Monitoring of Induced Seismicity in Western Canada over the Past Three Years

    NASA Astrophysics Data System (ADS)

    Yenier, E.; Moores, A. O.; Baturan, D.; Spriggs, N.

    2017-12-01

    In response to induced seismicity observed in western Canada, existing public networks have been densified and a number of private networks have been deployed to closely monitor the earthquakes induced by hydraulic fracturing operations in the region. These networks have produced an unprecedented volume of seismic data, which can be used to map pre-existing geological structures and understand their activation mechanisms. Here, we present insights gained over the past three years from induced seismicity monitoring (ISM) for some of the most active operators in Canada. First, we discuss the benefits of high-quality ISM data sets for making operational decisions and how their value largely depends on choice of instrumentation, seismic network design and data processing techniques. Using examples from recent research studies, we illustrate the key role of robust modeling of regional source, attenuation and site attributes on the accuracy of event magnitudes, ground motion estimates and induced seismicity hazard assessment. Finally, acknowledging that the ultimate goal of ISM networks is assisting operators to manage induced seismic risk, we share some examples of how ISM data products can be integrated into existing protocols for developing effective risk management strategies.

  7. External quality-assurance programs managed by the U.S. Geological Survey in support of the National Atmospheric Deposition Program/National Trends Network

    USGS Publications Warehouse

    Latysh, Natalie E.; Wetherbee, Gregory A.

    2005-01-01

    The U.S. Geological Survey, Branch of Quality Systems, operates the external quality-assurance programs for the National Atmospheric Deposition Program/National Trends Network (NADP/NTN). Beginning in 1978, six different programs have been implemented: the intersite-comparison program, the blind-audit program, the sample-handling evaluation program, the field-audit program, the interlaboratory-comparison program, and the collocated-sampler program. Each program was designed to measure error contributed by specific components in the data-collection process. The intersite-comparison program, which was discontinued in 2004, was designed to assess the accuracy and reliability of field pH and specific-conductance measurements made by site operators. The blind-audit and sample-handling evaluation programs, which also were discontinued in 2002 and 2004, respectively, assessed contamination that may result from sampling equipment and routine handling and processing of the wet-deposition samples. The field-audit program assesses the effects of sample handling, processing, and field exposure. The interlaboratory-comparison program evaluates bias and precision of analytical results produced by the contract laboratory for NADP, the Illinois State Water Survey, Central Analytical Laboratory, and compares its performance with the performance of international laboratories. The collocated-sampler program assesses the overall precision of wet-deposition data collected by NADP/NTN. This report documents historical operations and the operating procedures for each of these external quality-assurance programs. USGS quality-assurance information allows NADP/NTN data users to discern between actual environmental trends and inherent measurement variability.

  8. a Web Api and Web Application Development for Dissemination of Air Quality Information

    NASA Astrophysics Data System (ADS)

    Şahin, K.; Işıkdağ, U.

    2017-11-01

    Various studies have been carried out since 2005, under the leadership of the Ministry of Environment and Urbanism of Turkey, in order to observe the quality of air in Turkey, to develop new policies, and to develop a sustainable air quality management strategy. For this reason, a national air quality monitoring network providing air quality indices has been developed. Through this network, the quality of the air is continuously monitored, and an important information system has been constructed for taking precautions against dangerous situations. The biggest handicap of the network is data access: because of its proprietary structure, instant and time-series data acquisition and processing are difficult. Currently, no service is offered by the air quality monitoring system for exchanging information with third-party applications. Within the context of this work, a web service has been developed to enable location-based querying of current and past air quality data in Turkey. This web service is built with up-to-date and widely preferred technologies; in other words, an architecture was chosen with which applications can easily integrate. In the second phase of the study, a web-based application was developed to test the web service; this test application performs location-based acquisition of air quality data, making it possible to easily carry out operations, such as screening and examination of an area over a given time frame, that cannot be done with the national monitoring network.
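
    A minimal sketch of a location-based query endpoint of this kind is shown below (Flask); the route, parameters, and station table are hypothetical, since the abstract does not describe the service's actual interface.

    ```python
    # Minimal location-based air-quality endpoint. The /airquality route
    # and the in-memory station table are hypothetical placeholders.
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    STATIONS = [  # station id, coordinates, latest PM10 index
        {"id": "TR-IST-01", "lat": 41.01, "lon": 28.96, "pm10": 42},
        {"id": "TR-ANK-03", "lat": 39.93, "lon": 32.86, "pm10": 55},
    ]

    @app.route("/airquality")
    def airquality():
        lat, lon = float(request.args["lat"]), float(request.args["lon"])
        # crude nearest-station lookup; a real service would use a spatial index
        nearest = min(STATIONS, key=lambda s: (s["lat"] - lat) ** 2
                                            + (s["lon"] - lon) ** 2)
        return jsonify(nearest)

    if __name__ == "__main__":
        app.run()
    ```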

  9. A physical layer perspective on access network sharing

    NASA Astrophysics Data System (ADS)

    Pfeiffer, Thomas

    2015-12-01

    Unlike in copper or wireless networks, there is as yet no sharing of resources in fiber access networks, other than bit stream access or cable sharing, in which the fibers of a cable are let to one or multiple operators. Sharing optical resources on a single fiber among multiple operators or different services has not yet been applied. While this would allow for better exploitation of installed infrastructures, there are operational issues which still need to be resolved before this sharing model can be implemented in networks. Operating multiple optical systems and services over a common fiber plant, autonomously and independently from each other, can result in mutual distortions on the physical layer. These distortions will degrade the performance of the involved systems unless precautions are taken in the infrastructure hardware to eliminate them or to reduce them to an acceptable level. Moreover, the infrastructure needs to be designed to support different system technologies and to ensure a guaranteed quality of the end-to-end connections. In this paper, suitable means are proposed for fiber access infrastructures that allow for shared utilization of the fibers while safeguarding the operational needs and business interests of the involved parties.

  10. Quality of surface water in Missouri, water year 2015

    USGS Publications Warehouse

    Barr, Miya N.; Heimann, David C.

    2016-11-14

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, designed and operates a series of monitoring stations on streams and springs throughout Missouri known as the Ambient Water-Quality Monitoring Network. During water year 2015 (October 1, 2014, through September 30, 2015), data were collected at 74 stations—72 Ambient Water-Quality Monitoring Network stations and 2 U.S. Geological Survey National Stream Quality Assessment Network stations. Dissolved oxygen, specific conductance, water temperature, suspended solids, suspended sediment, Escherichia coli bacteria, fecal coliform bacteria, dissolved nitrate plus nitrite as nitrogen, total phosphorus, dissolved and total recoverable lead and zinc, and select pesticide compound summaries are presented for 71 of these stations. The stations primarily have been classified into groups corresponding to the physiography of the State, primary land use, or unique station types. In addition, a summary of hydrologic conditions in the State including peak streamflows, monthly mean streamflows, and 7-day low flows is presented.

  11. Quality of surface water in Missouri, water year 2011

    USGS Publications Warehouse

    Barr, Miya N.

    2012-01-01

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, designed and operates a series of monitoring stations on streams throughout Missouri known as the Ambient Water-Quality Monitoring Network. During the 2011 water year (October 1, 2010, through September 30, 2011), data were collected at 75 stations—72 Ambient Water-Quality Monitoring Network stations, 2 U.S. Geological Survey National Stream Quality Accounting Network stations, and 1 spring sampled in cooperation with the U.S. Forest Service. Dissolved oxygen, specific conductance, water temperature, suspended solids, suspended sediment, fecal coliform bacteria, Escherichia coli bacteria, dissolved nitrate plus nitrite, total phosphorus, dissolved and total recoverable lead and zinc, and select pesticide compound summaries are presented for 72 of these stations. The stations primarily have been classified into groups corresponding to the physiography of the State, primary land use, or unique station types. In addition, a summary of hydrologic conditions in the State including peak discharges, monthly mean discharges, and 7-day low flow is presented.

  12. Quality of surface water in Missouri, water year 2014

    USGS Publications Warehouse

    Barr, Miya N.

    2015-12-18

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, designed and operates a series of monitoring stations on streams and springs throughout Missouri known as the Ambient Water-Quality Monitoring Network. During the 2014 water year (October 1, 2013, through September 30, 2014), data were collected at 74 stations—72 Ambient Water-Quality Monitoring Network stations and 2 U.S. Geological Survey National Stream Quality Assessment Network stations. Dissolved oxygen, specific conductance, water temperature, suspended solids, suspended sediment, Escherichia coli bacteria, fecal coliform bacteria, dissolved nitrate plus nitrite as nitrogen, total phosphorus, dissolved and total recoverable lead and zinc, and select pesticide compound summaries are presented for 71 of these stations. The stations primarily have been classified into groups corresponding to the physiography of the State, primary land use, or unique station types. In addition, a summary of hydrologic conditions in the State including peak discharges, monthly mean discharges, and 7-day low flow is presented.

  13. Quality of surface water in Missouri, water year 2010

    USGS Publications Warehouse

    Barr, Miya N.

    2011-01-01

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, designs and operates a series of monitoring stations on streams throughout Missouri known as the Ambient Water-Quality Monitoring Network. During the 2010 water year (October 1, 2009 through September 30, 2010), data were collected at 75 stations-72 Ambient Water-Quality Monitoring Network stations, 2 U.S. Geological Survey National Stream Quality Accounting Network stations, and 1 spring sampled in cooperation with the U.S. Forest Service. Dissolved oxygen, specific conductance, water temperature, suspended solids, suspended sediment, fecal coliform bacteria, Escherichia coli bacteria, dissolved nitrate plus nitrite, total phosphorus, dissolved and total recoverable lead and zinc, and select pesticide compound summaries are presented for 72 of these stations. The stations primarily have been classified into groups corresponding to the physiography of the State, primary land use, or unique station types. In addition, a summary of hydrologic conditions in the State including peak discharges, monthly mean discharges, and 7-day low flow is presented.

  14. A black carbon air quality network

    NASA Astrophysics Data System (ADS)

    Kirchstetter, T.; Caubel, J.; Cados, T.; Preble, C.; Rosen, A.

    2016-12-01

    We developed a portable, power-efficient black carbon sensor for deployment in an air quality network in West Oakland, California. West Oakland is a San Francisco Bay Area residential/industrial community adjacent to regional port and rail yard facilities, and is surrounded by major freeways. As such, the community is affected by diesel particulate matter emissions from heavy-duty diesel trucks, locomotives, and ships associated with freight movement. In partnership with Environmental Defense Fund, the Bay Area Air Quality Management District, and the West Oakland Environmental Indicators Project, we are collaborating with community members to build and operate a 100-sensor black carbon measurement network for a period of several months. The sensor employs the filter-based light transmission method to measure black carbon. Each sensor node in the network transmits data hourly via SMS text messages. Cost, power consumption, and performance are considered in choosing components (e.g., pump) and operating conditions (e.g., sample flow rate). In field evaluation trials over several weeks at three monitoring locations, the sensor nodes provided black carbon concentrations comparable to commercial instruments and ran autonomously for a week before sample filters and rechargeable batteries needed to be replaced. Buildup to the 100-sensor network is taking place during Fall 2016 and will overlap with other ongoing air monitoring projects and monitoring platforms in West Oakland. Sensors will be placed along commercial corridors, adjacent to freeways, upwind of and within the Port, and throughout the residential community. Spatial and temporal black carbon concentration patterns will help characterize pollution sources and demonstrate the value of sensing networks for characterizing intra-urban air pollution concentrations and exposure to air pollution.
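
    The filter-based light-transmission method reduces, in one common formulation, to tracking optical attenuation through the filter spot and converting its rate of increase into a mass concentration. The equations and calibration values below are illustrative, not specifications of this particular sensor.

    ```python
    # One common filter-based reduction: ATN = 100*ln(I0/I), and
    # BC = A * dATN / (100 * sigma * Q * dt), with A the spot area (m^2),
    # sigma a mass attenuation cross-section (m^2/g), Q the flow (m^3/s).
    import math

    def bc_concentration(I0, I1, dt_s, area_m2=3e-5,
                         sigma_m2_per_g=16.6, flow_m3_per_s=5e-6):
        """BC mass concentration (g/m^3) from two successive transmission
        readings dt_s seconds apart through the same filter spot."""
        datn = 100.0 * math.log(I0 / I1)  # attenuation increase over dt
        return area_m2 * datn / (100.0 * sigma_m2_per_g * flow_m3_per_s * dt_s)

    # a small transmission drop over one hour -> sub-ug/m^3 concentrations
    print(bc_concentration(I0=1.000, I1=0.995, dt_s=3600) * 1e6, "ug/m^3")
    ```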

  15. Assessing measurement uncertainty in meteorology in urban environments

    NASA Astrophysics Data System (ADS)

    Curci, S.; Lavecchia, C.; Frustaci, G.; Paolini, R.; Pilati, S.; Paganelli, C.

    2017-10-01

    Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer.

  16. Reliable and Fault-Tolerant Software-Defined Network Operations Scheme for Remote 3D Printing

    NASA Astrophysics Data System (ADS)

    Kim, Dongkyun; Gil, Joon-Min

    2015-03-01

    The recent wide expansion of applicable three-dimensional (3D) printing and software-defined networking (SDN) technologies has focused a great deal of attention on efficient remote control of manufacturing processes. SDN is a renowned paradigm for network softwarization that facilitates remote manufacturing with high network performance, since SDN is designed to control network paths and traffic flows, guaranteeing improved quality of service by obtaining network requests from end-applications on demand through the separated SDN controller or control plane. However, current SDN approaches are generally focused on the control and automation of networks, which indicates a lack of management-plane development designed for a reliable and fault-tolerant SDN environment. Therefore, in addition to the inherent advantages of SDN, this paper proposes a new software-defined network operations center (SD-NOC) architecture to strengthen the reliability and fault-tolerance of SDN, in terms of network operations and management in particular. The cooperation and orchestration between SDN and SD-NOC are also introduced for SDN failover processes, based on four principal SDN breakdown scenarios derived from failures of the controller, SDN nodes, and connected links. Such failures significantly reduce network reachability to remote devices (e.g., 3D printers, super high-definition cameras, etc.) and the reliability of the relevant control processes. Our performance analysis shows that the proposed scheme can reduce the operations and management overheads of SDN, enhancing its responsiveness and reliability for remote 3D printing and control processes.

  17. A Mobility-Aware QoS Signaling Protocol for Ambient Networks

    NASA Astrophysics Data System (ADS)

    Jeong, Seong-Ho; Lee, Sung-Hyuck; Bang, Jongho

    Mobility-aware quality of service (QoS) signaling is crucial to provide seamless multimedia services in the ambient environment where mobile nodes may move frequently between different wireless access networks. The mobility of an IP-based node in ambient networks affects routing paths, and as a result, can have a significant impact on the operation and state management of QoS signaling protocols. In this paper, we first analyze the impact of mobility on QoS signaling protocols and how the protocols operate in mobility scenarios. We then propose an efficient mobility-aware QoS signaling protocol which can operate adaptively in ambient networks. The key features of the protocol include the fast discovery of a crossover node where the old and new paths converge or diverge due to handover and the localized state management for seamless services. Our analytical and simulation/experimental results show that the proposed/implemented protocol works better than existing protocols in the IP-based mobile environment.

  18. Operational Data Quality Assessment of the Combined PBO, TLALOCNet and COCONet Real-Time GNSS Networks

    NASA Astrophysics Data System (ADS)

    Hodgkinson, K. M.; Mencin, D.; Fox, O.; Walls, C. P.; Mann, D.; Blume, F.; Berglund, H. T.; Phillips, D.; Meertens, C. M.; Mattioli, G. S.

    2015-12-01

    The GAGE facility, managed by UNAVCO, currently operates a network of ~460 real-time, high-rate GNSS (RT-GNSS) stations. The majority of these RT stations are part of the EarthScope PBO network, which spans the western US Pacific-North American plate boundary. Approximately 50 are distributed throughout Mexico and the Caribbean region, funded by the TLALOCNet and COCONet projects. The entire network is processed in real time at UNAVCO using Precise Point Positioning (PPP). The real-time streams are freely available to all, and user demand has grown almost exponentially since 2010. Data usage is multidisciplinary, including tectonic and volcanic deformation studies, meteorological applications, and atmospheric science research, in addition to use by national, state, and commercial entities. Twenty-one RT-GNSS sites in California now include 200-sps accelerometers for the development of earthquake early warning systems. All categories of users of real-time streams have similar requirements: reliable, low-latency, high-rate, and complete data sets. To meet these requirements, UNAVCO tracks the latency and completeness of the incoming raw observations and is developing tools to monitor the quality of the processed data streams. UNAVCO is currently assessing the precision, accuracy, and latency of solutions from various PPP software packages. Also under review are the data formats UNAVCO distributes; for example, the PPP solutions are currently distributed in NMEA format, but other formats such as SEED or GeoJSON may be preferred by different user groups to achieve specific mission objectives. In this presentation we will share our experiences of the challenges involved in the data operations of a continental-scale, multi-project, real-time GNSS network, summarize the network's performance in terms of latency and completeness, and present comparisons of PPP solutions using different PPP processing techniques.
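
    The two stream metrics mentioned above, latency and completeness, reduce to simple bookkeeping over per-epoch arrival records; the sketch below assumes hypothetical timestamps for a nominal 1-Hz stream.

    ```python
    # Latency and completeness for one hour of a nominal 1-Hz RT-GNSS
    # stream, from hypothetical (epoch time, arrival time) records.
    import numpy as np

    def stream_metrics(epochs_s, arrivals_s, rate_hz=1.0, window_s=3600.0):
        latency = np.median(np.asarray(arrivals_s) - np.asarray(epochs_s))
        completeness = 100.0 * len(epochs_s) / (window_s * rate_hz)
        return latency, completeness

    epochs = np.arange(0.0, 3600.0)[:3540]   # 60 epochs lost in transit
    arrivals = epochs + np.random.default_rng(2).uniform(0.5, 2.0, len(epochs))
    print(stream_metrics(epochs, arrivals))  # ~1.2 s latency, 98.3% complete
    ```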

  19. Node Redeployment Algorithm Based on Stratified Connected Tree for Underwater Sensor Networks

    PubMed Central

    Liu, Jun; Jiang, Peng; Wu, Feng; Yu, Shanen; Song, Chunyue

    2016-01-01

    During underwater sensor network (UWSN) operation, node drift with the surrounding water causes network topology changes. Periodic examination and adjustment of node locations are needed to maintain good network monitoring quality for as long as possible. In this paper, a node redeployment algorithm based on a stratified connected tree for UWSNs is proposed. At every network adjustment moment, self-examination and adjustment of node locations are performed first. If a node is outside the monitored space, it returns along a straight line to the last location recorded in its memory. Next, the network topology is stratified into a connected tree that takes the sink node as the root node by broadcasting ready information level by level, which can improve the network connectivity rate. Finally, jointly considering the network coverage rate, connectivity rate, and node movement distance, the sink node performs centralized optimization on the locations of leaf nodes in the stratified connected tree. Simulation results show that the proposed redeployment algorithm can not only keep as many nodes as possible in the monitored space and maintain good network coverage and connectivity rates during network operation, but also reduce node movement distance during redeployment and prolong the network lifetime. PMID:28029124
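
    The stratification step, building a connected tree rooted at the sink by broadcasting ready information level by level, is essentially a breadth-first search; the adjacency list below is a hypothetical connectivity graph.

    ```python
    # Level-by-level stratification rooted at the sink, as a BFS.
    from collections import deque

    def stratify(adj, sink):
        """Return ({node: level}, {node: parent}) for the tree rooted at sink."""
        level, parent = {sink: 0}, {sink: None}
        queue = deque([sink])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in level:       # first 'ready' message wins
                    level[v], parent[v] = level[u] + 1, u
                    queue.append(v)
        return level, parent

    adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
    print(stratify(adj, sink=0)[0])      # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
    ```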

  20. Accountability in the Greek Higher Education System as a High-Stakes Policymaking Instrument

    ERIC Educational Resources Information Center

    Gouvias, Dionysios

    2012-01-01

    One of the main aims of the so-called common "European Higher Education Area" is the creation of a European framework for higher education (HE) qualifications and a network of "quality assurance agencies." In the light of the above processes, recent legislation in Greece on quality assurance in HE and the operation and…

  1. Technical and Organizational Lessons Learned From More Than One Decade of the International GNSS Service Global Tracking Network

    NASA Astrophysics Data System (ADS)

    Moore, A. W.

    2007-12-01

    The International GNSS Service (IGS) is a voluntary collaboration of more than 200 worldwide agencies that pool resources to generate precise GPS and GLONASS products. The foundation of the IGS is a global network of 385 permanent, continuous, geodetic-quality stations independently operated by about 100 agencies. The IGS Central Bureau develops minimum functional requirements and operational standards that enable the individual stations' data to be used coherently in global analyses, but the IGS remains vendor neutral, leaving procurement decisions and implementation details to the individual agencies. The IGS network is hence quite heterogeneous in instrumentation, station management strategies, and culture; these diversities bring both strengths and challenges in coordination. This presentation will detail the IGS's approaches, successes, and opportunities for improvement in coordinating and monitoring the collaborative network.

  2. EGU2013 SM1.4/GI1.6 session: "Improving seismic networks performances: from site selection to data integration"

    NASA Astrophysics Data System (ADS)

    Pesaresi, D.; Busby, R.

    2013-08-01

    The number and quality of seismic stations and networks in Europe continually improves, nevertheless there is always scope to optimize their performance. In this session we welcomed contributions from all aspects of seismic network installation, operation and management. This includes site selection; equipment testing and installation; planning and implementing communication paths; policies for redundancy in data acquisition, processing and archiving; and integration of different datasets including GPS and OBS.

  3. Utah's Regional/Urban ANSS Seismic Network—Strategies and Tools for Quality Performance

    NASA Astrophysics Data System (ADS)

    Burlacu, R.; Arabasz, W. J.; Pankow, K. L.; Pechmann, J. C.; Drobeck, D. L.; Moeinvaziri, A.; Roberson, P. M.; Rusho, J. A.

    2007-05-01

    The University of Utah's regional/urban seismic network (224 stations recorded: 39 broadband, 87 strong-motion, 98 short-period) has become a model for locally implementing the Advanced National Seismic System (ANSS) because of successes in integrating weak- and strong-motion recording and in developing an effective real-time earthquake information system. Early achievements included implementing ShakeMap, ShakeCast, point-to-multipoint digital telemetry, and an Earthworm Oracle database, as well as in-situ calibration of all broadband and strong-motion stations and submission of all data and metadata into the IRIS DMC. Regarding quality performance, our experience as a medium-size regional network affirms the fundamental importance of basics such as the following: for data acquisition, deliberate attention to high-quality field installations, signal quality, and computer operations; for operational efficiency, a consistent focus on professional project management and human resources; and for customer service, healthy partnerships—including constant interactions with emergency managers, engineers, public policy-makers, and other stakeholders as part of an effective state earthquake program. (Operational cost efficiencies almost invariably involve trade-offs between personnel costs and the quality of hardware and software.) Software tools that we currently rely on for quality performance include those developed by UUSS (e.g., SAC and shell scripts for estimating local magnitudes) and software developed by other organizations such as USGS (Earthworm), University of Washington (interactive analysis software), ISTI (SeisNetWatch), and IRIS (PDCC, BUD tools). Although there are many pieces, there is little integration. One of the main challenges we face is the availability of a complete and coherent set of tools for automatic processing and post-processing to assist in achieving the goals and requirements set forth by ANSS. Taking our own network—and ANSS—to the next level will require standardized, well-designed, and supported software. Other advances in seismic network performance will come from diversified instrumentation. We have recently shown the utility of incorporating strong-motion data (even from soil sites) into the routine analysis of local seismicity, and have also collocated an acoustic array with a broadband seismic station (in collaboration with Southern Methodist University). For the latter experiment, the purpose of collocated seismic and infrasound sensors is to (1) further an understanding of the physics associated with the generation and the propagation of seismic and low-frequency acoustic energy from shallow sources and (2) explore the potential for blast discrimination and improved source location using seismic and infrasonic data in a synergetic way.

  4. Automatic classification of DMSA scans using an artificial neural network

    NASA Astrophysics Data System (ADS)

    Wright, J. W.; Duguid, R.; Mckiddie, F.; Staff, R. T.

    2014-04-01

    DMSA imaging is carried out in nuclear medicine to assess the level of functional renal tissue in patients. This study investigated the use of an artificial neural network to perform diagnostic classification of these scans. Using the radiological report as the gold standard, the network was trained to classify DMSA scans as positive or negative for defects using a representative sample of 257 previously reported images. The trained network was then independently tested using a further 193 scans and achieved a binary classification accuracy of 95.9%. The performance of the network was compared with three qualified expert observers who were asked to grade each scan in the 193-image testing set on a six-point defect scale, from ‘definitely normal’ to ‘definitely abnormal’. A receiver operating characteristic analysis comparison between a consensus operator, generated from the scores of the three expert observers, and the network revealed a statistically significant (α < 0.05) performance advantage for the network over the operators. A further result from this work was that, when suitably optimized, the network achieved a negative predictive value of 100% for renal defects while still identifying 93% of the negative cases in the dataset. These results are encouraging for application of such a network as a screening tool or quality assurance assistant in clinical practice.
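
    As a rough illustration of the workflow this abstract describes (train a binary classifier, then run a ROC analysis on a held-out test set), the following Python sketch uses scikit-learn. The synthetic features, network size, and thresholding note are stand-ins, not the authors' implementation; only the 257/193 split follows the abstract.

```python
# Minimal sketch of binary scan classification followed by ROC analysis.
# Real image features would replace the synthetic matrix below.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(450, 64))       # 450 scans, 64 image features (synthetic)
y = rng.integers(0, 2, size=450)     # 1 = defect reported, 0 = normal (synthetic)

# Split roughly as in the study: 257 scans for training, 193 for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=257, test_size=193, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

# ROC analysis: scores are the network's estimated probability of a defect.
scores = net.predict_proba(X_test)[:, 1]
print("AUC:", roc_auc_score(y_test, scores))

# A screening threshold could then be lowered until no defect case is
# classified negative, trading specificity for negative predictive value.
```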

  5. Broadening the Quality and Capabilities of the EarthScope Alaska Transportable Array

    NASA Astrophysics Data System (ADS)

    Busby, R. W.

    2016-12-01

    In 2016, the EarthScope Transportable Array (TA) program will have 195 broadband seismic stations operating in Alaska and western Canada. This ambitious project will culminate in a network of 268 new or upgraded real-time seismic stations operating through 2019. The challenging environmental conditions and the remoteness of Alaska have motivated a new method for constructing a high-quality, temporary seismic network. The Alaska TA station design builds on the experience of the Lower 48 TA deployment and adds design requirements because most stations are accessible only by helicopter. The stations utilize new high-performance posthole sensors, a specially built hammer/auger drill, and lightweight lithium-ion batteries to minimize sling loads. A uniform station design enables a modest crew to build the network on a short timeline and operate it through the difficult conditions of rural Alaska. The Alaska TA deployment has increased the quality of seismic data, with some well-sited 2-3 m posthole stations approaching the performance of permanent Global Seismic Network stations emplaced in 100 m boreholes. The real-time data access, power budget, protective enclosure and remote logistics of these TA stations have attracted collaborations with NASA, NOAA, USGS, AVO and other organizations to add auxiliary sensors to the suite of instruments at many TA stations. Strong-motion sensors have been added to 18 stations near the subduction trench to complement SM stations operated by AEC, ANSS and GSN. All TA and most upgraded stations have pressure and infrasound sensors, and 150 TA stations are receiving a Vaisala weather sensor, supplied by the National Weather Service Alaska Region and NASA, capable of measuring temperature, pressure, relative humidity, wind speed/direction, and precipitation intensity. We are also installing about 40 autonomous soil temperature profile kits adjacent to northern stations. While the priority continues to be collecting seismic data, these additional strong-motion, atmospheric, and soil temperature sensors may motivate extending the operation of certain stations in cooperation with these organizations. The TA has always been amenable to partnerships in the research and education communities that extend the capabilities and reach of the EarthScope Transportable Array.

  6. Unified study of Quality of Service (QoS) in OPS/OBS networks

    NASA Astrophysics Data System (ADS)

    Hailu, Dawit Hadush; Lema, Gebrehiwet Gebrekrstos; Yekun, Ephrem Admasu; Kebede, Samrawit Haylu

    2017-07-01

    With the growth of Internet traffic, optical networks have become indispensable, providing large bandwidth, fast data transmission rates and Quality of Service (QoS) support. Currently, Optical Burst Switched (OBS)/Optical Packet Switched (OPS) networks are under study as future solutions for addressing the increasing demand of Internet traffic. However, their adoption by industry has been delayed by their high blocking probability in the intermediate nodes. Packet loss in OBS/OPS networks occurs mainly due to contention. Hence, the contribution of this study is to analyze, by simulation, the file loss ratio (FLR), packet overhead, number of disjoint paths, and processing delay of the Coded Packet Transport (CPT) scheme for OBS/OPS networks. The simulations show that the CPT scheme reduces the FLR in OBS/OPS networks for the evaluated scenarios, since the data packets are chopped into blocks for transmission over the network. Simulation results for secrecy and survivability are verified with the help of an analytical model to define the operational range of the CPT scheme.

  7. Demonstration of an SOA-assisted open metro-access infrastructure for heterogeneous services.

    PubMed

    Schmuck, H; Bonk, R; Poehlmann, W; Haslach, C; Kuebart, W; Karnick, D; Meyer, J; Fritzsche, D; Weis, E; Becker, J; Freude, W; Pfeiffer, T

    2014-01-13

    An open converged metro-access network approach allows for sharing optical-layer resources such as fibers and optical spectrum among different services and operators. We experimentally demonstrated the feasibility of such a concept by the simultaneous operation of multiple services employing different modulation formats and multiplexing techniques. Flexible access nodes incorporating semiconductor optical amplifiers are implemented to create a transparent and reconfigurable optical ring network. The impact of cascaded optical amplifiers on signal quality is studied along the ring. In addition, the influence of high-power rival signals in the same waveband and in the same fiber is analyzed.

  8. Telestroke network fundamentals.

    PubMed

    Meyer, Brett C; Demaerschalk, Bart M

    2012-10-01

    The objectives of this manuscript are to identify key components for maintaining the logistic and/or operational sustainability of a telestroke network, to identify best practices to be considered for assessment and management of acute stroke when planning for and developing a telestroke network, to show practical steps to enable progress toward implementing a telestroke solution for optimizing acute stroke care, to incorporate evidence-based practice guidelines and care pathways into a telestroke network, to emphasize technology variables and options, and to propose metrics to use when determining the performance, outcomes, and quality of a telestroke network. Copyright © 2012 National Stroke Association. Published by Elsevier Inc. All rights reserved.

  9. A new evaluation of the USGS streamgaging network

    USGS Publications Warehouse

    ,

    1998-01-01

    Since 1889, the U.S. Geological Survey (USGS) has operated a streamgaging network to collect information about the Nation's water resources. It is a multipurpose network funded by the USGS and many other Federal, State and local agencies. Individual streamgaging stations are supported for specific purposes such as water allocation, reservoir operations, or regulating permit requirements, but the data are used by others for many purposes. Collectively, the USGS streamgaging network produces valuable data that are used for current forecasting and operational decisions as well as long-term resource planning, infrastructure design, and flood hazard mitigation. The guiding principles of the network are: (1) streamgaging stations are funded by the USGS and many agencies to achieve the Federal mission goals of the USGS and the individual goals of the funding agencies; (2) data are freely available to the public and all partners; (3) USGS operates the network on behalf of all partners, which achieves economies because it eliminates the need for multiple infrastructures for testing equipment, providing training to staff, developing and maintaining the communications and database systems, and conducting quality assurance; and (4) USGS brings the capability of its national staff to bear on challenging problems such as responding to catastrophic floods or finding solutions to unique streamgaging conditions. This report has been prepared in response to a request from the U.S. House of Representatives Subcommittee on Interior Appropriations in its report to accompany H.R. 4193.

  10. Design of a national distributed health data network.

    PubMed

    Maro, Judith C; Platt, Richard; Holmes, John H; Strom, Brian L; Hennessy, Sean; Lazarus, Ross; Brown, Jeffrey S

    2009-09-01

    A distributed health data network is a system that allows secure remote analysis of separate data sets, each comprising a different medical organization's or health plan's records. Distributed health data networks are currently being planned that could cover millions of people, permitting studies of comparative clinical effectiveness, best practices, diffusion of medical technologies, and quality of care. These networks could also support assessment of medical product safety and other public health needs. Distributed network technologies allow data holders to control all uses of their data, which overcomes many practical obstacles related to confidentiality, regulation, and proprietary interests. Some of the challenges and potential methods of operation of a multipurpose, multi-institutional distributed health data network are described.

  11. Water-quality characteristics of Montana streams in a statewide monitoring network, 1999-2003

    USGS Publications Warehouse

    Lambing, John H.; Cleasby, Thomas E.

    2006-01-01

    A statewide monitoring network of 38 sites was operated during 1999-2003 in cooperation with the Montana Department of Environmental Quality to provide a broad geographic base of water-quality information on Montana streams. The purpose of this report is to summarize and describe the water-quality characteristics for those sites. Samples were collected at U.S. Geological Survey streamflow-gaging stations in the Missouri, Yellowstone, and Columbia River basins for stream properties, nutrients, suspended sediment, major ions, and selected trace elements. Mean annual streamflows were below normal during the period, which likely influenced water quality. Continuous water-temperature monitors were operated at 26 sites. The median of daily mean water temperatures for the June-August summer period ranged from 12.5 °C at Kootenai River below Libby Dam to 23.0 °C at Poplar River near Poplar and Tongue River at Miles City. In general, sites in the Missouri River basin commonly had the highest water temperatures. Median daily mean summer water temperatures at four sites (Jefferson River near Three Forks, Missouri River at Toston, Judith River near Winifred, and Poplar River near Poplar) classified as supporting or marginally supporting cold-water biota exceeded the general guideline of 19.4 °C for cold-water biota. Median daily mean temperatures at sites in the network classified as supporting warm-water biota did not exceed the guideline of 26.7 °C for warm-water biota, although several sites exceeded the warm-water guideline on several days during the summer.

  12. Data quality of seismic records from the Tohoku, Japan earthquake as recorded across the Albuquerque Seismological Laboratory networks

    USGS Publications Warehouse

    Ringler, A.T.; Gee, L.S.; Marshall, B.; Hutt, C.R.; Storm, T.

    2012-01-01

    Great earthquakes recorded across modern digital seismographic networks, such as the recent Tohoku, Japan, earthquake on 11 March 2011 (Mw = 9.0), provide unique datasets that ultimately lead to a better understanding of the Earth's structure (e.g., Pesicek et al. 2008) and earthquake sources (e.g., Ammon et al. 2011). For network operators, such events provide the opportunity to look at the performance across their entire network using a single event, as the ground motion records from the event will be well above every station's noise floor.

  13. Health information exchange: 'lex parsimoniae'.

    PubMed

    Overhage, J Marc

    2007-01-01

    The country has identified health information exchange (HIE) as an essential strategy to address our crisis of cost, quality, and safety in health care. The Nationwide Health Information Network (NHIN) will consist of a "network of networks"--interconnected local or regional HIEs. We must create policy and technical interfaces that allow these local exchanges to share data with each other. More importantly, we must create nationwide exchanges that are consistent across the country. These exchanges should be parsimonious--not overly constraining how the exchanges operate, and maintaining separation between the applications that provide functionality and the network that supports HIE.

  14. Quality Evaluation of Land-Cover Classification Using Convolutional Neural Network

    NASA Astrophysics Data System (ADS)

    Dang, Y.; Zhang, J.; Zhao, Y.; Luo, F.; Ma, W.; Yu, F.

    2018-04-01

    Land-cover classification is one of the most important products of earth observation. It profiles the physical characteristics of the land surface with temporal and distribution attributes, and contains information on both natural and man-made coverage elements, such as vegetation, soil, glaciers, rivers, lakes, marsh wetlands and various man-made structures. In recent years, the amount of high-resolution remote sensing data has increased sharply. Accordingly, the volume of land-cover classification products has grown, and with it the need to evaluate these frequently updated products, which is a major challenge. Conventionally, automatic quality evaluation of land-cover classification has relied on pixel-based classification algorithms, which make the task much harder and consequently make it difficult to keep pace with the required updating frequency. In this paper, we propose a novel quality evaluation approach for land-cover classification based on a scene-classification Convolutional Neural Network (CNN) model. By learning from remote sensing data, the randomly initialized kernels that serve as filter matrices evolve into operators with functions similar to hand-crafted operators such as the Sobel or Canny operator, while other kernels learned by the CNN model are much more complex and cannot be interpreted as existing filters. A method with a CNN as its core algorithm serves quality-evaluation tasks well, since it calculates a set of outputs that directly represent the image's membership grade for each class. An automatic quality evaluation approach for land-cover DLG-DOM coupling data (DLG for Digital Line Graphic, DOM for Digital Orthophoto Map) is introduced in this paper. The CNN model proved to be a robust method for image evaluation and motivated the idea of an automatic quality evaluation approach for land-cover classification. Based on this experiment, ideas for quality evaluation of DLG-DOM coupling land-cover classification or other kinds of labelled remote sensing data can be studied further.
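
    The following PyTorch sketch illustrates the kind of scene-classification CNN the abstract describes: stacked learned convolution kernels followed by a classifier whose outputs act as per-class membership grades. The layer sizes, patch size, and class count are illustrative assumptions, not the authors' architecture.

```python
# A minimal scene-classification CNN: learned kernels act as filter
# operators, and the final scores serve as class membership grades.
import torch
import torch.nn as nn

class SceneCNN(nn.Module):
    def __init__(self, n_classes: int = 6):
        super().__init__()
        # Learned kernels play the role of filter operators (cf. Sobel/Canny).
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):
        x = self.features(x)                  # (N, 32, 16, 16) for 64x64 input
        return self.classifier(x.flatten(1))  # class scores ("membership grades")

# One 64x64 RGB patch from a remote sensing scene (random stand-in).
patch = torch.randn(1, 3, 64, 64)
probs = SceneCNN()(patch).softmax(dim=1)      # membership grade per class
print(probs)
```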

  15. Towards an operational lidar network across the UK

    NASA Astrophysics Data System (ADS)

    Adam, Mariana; Horseman, Andrew; Turp, Myles; Buxmann, Joelle; Sugier, Jacqueline

    2015-04-01

    The Met Office has been operating a ceilometer network since 2012. This network consists of 11 Jenoptik Nimbus ceilometers (operating at 1064 nm) and 32 Vaisala ceilometers (25 CL31 operating at 910 nm and 7 CT25 operating at 905 nm). The data are available in near real time (NRT) (15 min for Jenoptik and 1 h for Vaisala). In 2014, six additional stations from Met Éireann (Ireland) were added to the network (5 CL31 and 1 CT25). Visualisation of attenuated backscatter and cloud base height are available from http://www.metoffice.gov.uk/public/lidarnet/lcbr-network.html. The main customers are the Met Office Hazard Centre, which provides a quick response to customers requiring forecast information to manage a wide variety of environmental incidents, and the London Volcanic Ash Advisory Centre (VAAC), also based at the Met Office, which monitors volcanic ash events. As a response to the strong impact of the Eyjafjallajökull eruption in 2010, the UK Civil Aviation Authority (CAA) financed a lidar-sunphotometer network for NRT monitoring of volcanic ash. This new network will consist of nine fixed sites and one mobile unit, each equipped with a lidar and a sunphotometer. The sunphotometers were acquired from Cimel Electronique (CE318-NE DPS9). The lidars were acquired from Raymetrics. They operate at 355 nm and have receiving channels at 355 nm (parallel and perpendicular polarization) and 387 nm (N2 Raman). The first two lidar systems were deployed in November 2014 at Camborne (SW England) and the data are under evaluation. The network is planned to be operational in 2016. Initially, the NRT data will consist of quick-look plots of the total range-corrected signal and volume depolarization ratio from the lidars and aerosol optical depth from the sunphotometers (including 355 nm, through interpolation). During the EGU presentation, the following features will be emphasized: IT considerations for the operational network; data quality assurance (including error estimates) for the ceilometer network on the one hand and for the sunphotometer and lidar network on the other; a technical presentation of the lidar; first results from the lidars and sunphotometers; and future considerations about other potential NRT data products (aerosol extinction and backscatter coefficients, particle linear depolarization ratio), as well as NRT ceilometer data within the Hazard Centre and VAAC framework.

  16. Inverse simulation system for manual-controlled rendezvous and docking based on artificial neural network

    NASA Astrophysics Data System (ADS)

    Zhou, Wanmeng; Wang, Hua; Tang, Guojin; Guo, Shuai

    2016-09-01

    The time-consuming experimental method for handling qualities assessment cannot meet the increasingly fast design requirements of manned space flight. As a tool for aircraft handling qualities research, the model-predictive-control structured inverse simulation (MPC-IS) has potential applications in the aerospace field for guiding astronauts' operations and evaluating handling qualities more effectively. This paper therefore establishes MPC-IS for manual-controlled rendezvous and docking (RVD) and proposes a novel artificial neural network inverse simulation system (ANN-IS) to further decrease the computational cost. The novel system was obtained by replacing the inverse model of MPC-IS with an artificial neural network. The optimal neural network was trained by the genetic Levenberg-Marquardt algorithm and finally determined by the Levenberg-Marquardt algorithm. In order to validate MPC-IS and ANN-IS, manual-controlled RVD experiments were carried out on a simulator. The comparisons between simulation results and experimental data demonstrated the validity of the two systems and the high computational efficiency of ANN-IS.
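
    The abstract names Levenberg-Marquardt as the final training algorithm. As a minimal sketch of that idea, the following Python code fits a tiny one-hidden-layer network by Levenberg-Marquardt using scipy's least-squares solver; the architecture and synthetic data are assumptions, not the paper's inverse-simulation model.

```python
# Fit a small feedforward network by Levenberg-Marquardt: express training
# as a nonlinear least-squares problem over the weight vector.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))   # e.g. relative-state inputs (synthetic)
y = np.tanh(X @ np.array([1.5, -0.7])) + 0.05 * rng.normal(size=200)

H = 5  # hidden units

def unpack(p):
    W1 = p[:2 * H].reshape(H, 2)        # input-to-hidden weights
    b1 = p[2 * H:3 * H]                 # hidden biases
    w2 = p[3 * H:4 * H]                 # hidden-to-output weights
    b2 = p[4 * H]                       # output bias
    return W1, b1, w2, b2

def residuals(p):
    W1, b1, w2, b2 = unpack(p)
    hidden = np.tanh(X @ W1.T + b1)
    return hidden @ w2 + b2 - y         # residual vector LM will minimize

p0 = 0.1 * rng.normal(size=4 * H + 1)
fit = least_squares(residuals, p0, method="lm")  # Levenberg-Marquardt
print("final SSE:", np.sum(fit.fun ** 2))
```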

  17. Integrated heart failure telemonitoring system for homecare.

    PubMed

    Lobodzinski, S Suave; Jadalla, Ahlam A

    2010-01-01

    The integrated telemonitoring system (ITS) for homecare has been designed to improve quality of care as measured by increased nursing productivity, improved patients' clinical and behavioral outcomes and reduced cost. The system incorporates managerial, organizational, operational and clinical tasks optimized for delivery of quality care through telemonitoring. A secure, multi-modal computer network that integrates homecare nurses, patients and their caregivers into one seamless environment has been developed. The network brings together a new generation of small, hand-held, wireless terminals used by nurses and patients with a HIPAA-compliant electronic patient record system at the caregiver's site. Wireless terminals use Gobi multi-standard networking technology for connectivity to any available wireless network. The unique features of ITS include a) picture recognition technology capable of extracting numeric data from the displays of in-home physiological signal monitors, including blood pressure, weight and oxygen saturation, together with transmission of lung sounds and capture of echocardiography and electrocardiography data from mobile units; b) in-home caregiver-assisted interactive examinations of signs and symptoms that include visual impressions of ankle swelling, jugular vein distension measurement, and weight gain; c) video-conference capability, facilitating face-to-face two-way communication of nursing personnel with the patients. The ITS network has been designed to improve patients' clinical and behavioral outcomes, increase nursing productivity, and reduce the cost of homecare. Patients' co-operation and compliance have been achieved through the use of easy-to-use videoconferencing terminals.

  18. Low-Cost Sensor Units for Measuring Urban Air Quality

    NASA Astrophysics Data System (ADS)

    Popoola, O. A.; Mead, M.; Stewart, G.; Hodgson, T.; McLoed, M.; Baldovi, J.; Landshoff, P.; Hayes, M.; Calleja, M.; Jones, R.

    2010-12-01

    Measurements of selected key air quality gases (CO, NO & NO2) have been made with a range of miniature low-cost sensors based on electrochemical gas sensing technology, incorporating GPS and GPRS for position and communication respectively. Two types of simple-to-operate sensor units have been designed for deployment in relatively large numbers. Mobile handheld sensor units designed for operation by members of the public have been deployed on numerous occasions, including in Cambridge, London and Valencia. Static sensor units have also been designed for long-term autonomous deployment on existing street furniture. A study was recently completed in which 45 sensor units were deployed in the Cambridge area for a period of 3 months. Results from these studies indicate that air quality varies widely both spatially and temporally. The widely varying concentrations found suggest that the urban environment cannot be fully understood using limited static-site (AURN) networks and that a higher-resolution, more dispersed network is required to better define air quality in the urban environment. The results also suggest that higher spatial and temporal resolution measurements could improve knowledge of the levels of individual exposure in the urban environment.

  19. Water-resources investigations in Tennessee; programs and activities of the U.S. Geological Survey, 1988-1989

    USGS Publications Warehouse

    Quinones, Ferdinand; Balthrop, B.H.; Baker, E.G.

    1989-01-01

    This report contains a summation of water resources projects which were active in the Tennessee District during 1988 or 1989. Given in each summary is the name of the project chief, the objective of the project, the progress or results of the study to date, and the name of the cooperator. The basic data programs conducted by the Tennessee District provide the streamflow, water-quality, and groundwater-level information essential to the assessment and management of the State's water resources. Long-term streamflow, water-quality, and groundwater-level networks are operated as part of the Hydrologic Data Section. Field operations are about equally divided among field offices in Memphis, Nashville, and Knoxville. The data collected as part of the networks are published in the series of annual data reports entitled 'Water Resources Data for Tennessee'. (USGS)

  20. Statistical modelling of networked human-automation performance using working memory capacity.

    PubMed

    Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja

    2014-01-01

    This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models; and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
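
    A minimal sketch of the first two modelling approaches named above (classical linear regression and Gaussian process regression) on synthetic supervisory-performance data, using scikit-learn. The predictor names follow the abstract, but the data, kernel, and coefficients are assumptions; the Bayesian-network approach is omitted here.

```python
# Compare a linear model (prediction near known conditions) with a Gaussian
# process (prediction with uncertainty beyond known conditions).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)
n = 120
X = np.column_stack([
    rng.uniform(1, 10, n),   # task load
    rng.uniform(0, 1, n),    # network message quality
    rng.uniform(0, 1, n),    # working-memory capacity score
])
# Synthetic performance: load hurts, quality and WM capacity help.
y = -0.4 * X[:, 0] + 2.0 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(0, 0.3, n)

lin = LinearRegression().fit(X, y)
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=[2.0, 0.5, 0.5]) + WhiteKernel(0.1)
).fit(X, y)

x_new = np.array([[7.0, 0.8, 0.6]])      # one hypothetical operator/condition
mean, std = gp.predict(x_new, return_std=True)
print("linear:", lin.predict(x_new)[0], " GP:", mean[0], "+/-", std[0])
```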

  1. Network operability of ground-based microwave radiometers: Calibration and standardization efforts

    NASA Astrophysics Data System (ADS)

    Pospichal, Bernhard; Löhnert, Ulrich; Küchler, Nils; Czekala, Harald

    2017-04-01

    Ground-based microwave radiometers (MWR) are already widely used by national weather services and research institutions all around the world. Most of the instruments operate continuously and are beginning to be implemented into data assimilation for atmospheric models. Especially their potential for continuously observing boundary-layer temperature profiles as well as integrated water vapor and cloud liquid water path makes them valuable for improving short-term weather forecasts. However, until now most MWR have been operated as stand-alone instruments. In order to benefit from a network of these instruments, standardization of calibration, operation and data format is necessary. Within the framework of TOPROF (COST Action ES1303), several efforts have been undertaken, such as uncertainty and bias assessment and calibration intercomparison campaigns. The goal was to establish protocols for providing quality-controlled (QC) MWR data and their uncertainties. To this end, standardized calibration procedures for MWR have been developed and recommendations for radiometer users compiled. Based on the results of the TOPROF campaigns, a new, high-accuracy liquid-nitrogen calibration load has been introduced for MWR manufactured by Radiometer Physics GmbH (RPG). The new load improves the accuracy of the measurements considerably and will lead to even more reliable atmospheric observations. In addition to the recommendations for set-up, calibration and operation of ground-based MWR within a future network, we will present homogenized methods to determine the accuracy of a running calibration, as well as means for automatic data quality control. This sets the stage for the planned microwave calibration center at JOYCE (Jülich Observatory for Cloud Evolution), which will be briefly introduced.

  2. Quality of surface water in Missouri, water year 2009

    USGS Publications Warehouse

    Barr, Miya N.

    2010-01-01

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, designs and operates a series of monitoring stations on streams throughout Missouri known as the Ambient Water-Quality Monitoring Network. During the 2009 water year (October 1, 2008, through September 30, 2009), data were collected at 75 stations: 69 Ambient Water-Quality Monitoring Network stations, 2 U.S. Geological Survey National Stream Quality Accounting Network stations, 1 spring sampled in cooperation with the U.S. Forest Service, and 3 stations sampled in cooperation with the Elk River Watershed Improvement Association. Dissolved oxygen, specific conductance, water temperature, suspended solids, suspended sediment, fecal coliform bacteria, Escherichia coli bacteria, dissolved nitrate plus nitrite, total phosphorus, dissolved and total recoverable lead and zinc, and select pesticide compound summaries are presented for 72 of these stations. The stations primarily have been classified into groups corresponding to the physiography of the State, primary land use, or unique station types. In addition, a summary of hydrologic conditions in the State including peak discharges, monthly mean discharges, and seven-day low flow is presented.

  3. Cardiac ultrasonography over 4G wireless networks using a tele-operated robot

    PubMed Central

    Panayides, Andreas S.; Jossif, Antonis P.; Christoforou, Eftychios G.; Vieyres, Pierre; Novales, Cyril; Voskarides, Sotos; Pattichis, Constantinos S.

    2016-01-01

    This Letter proposes an end-to-end mobile tele-echography platform using a portable robot for remote cardiac ultrasonography. Performance evaluation investigates the capacity of long-term evolution (LTE) wireless networks to facilitate responsive robot tele-manipulation and real-time ultrasound video streaming that qualifies for clinical practice. Within this context, a thorough video coding standards comparison for cardiac ultrasound applications is performed, using a data set of ten ultrasound videos. Both objective and subjective (clinical) video quality assessment demonstrate that H.264/AVC and high efficiency video coding standards can achieve diagnostically-lossless video quality at bitrates well within the LTE supported data rates. Most importantly, reduced latencies experienced throughout the live tele-echography sessions allow the medical expert to remotely operate the robot in a responsive manner, using the wirelessly communicated cardiac ultrasound video to reach a diagnosis. Based on preliminary results documented in this Letter, the proposed robotised tele-echography platform can provide for reliable, remote diagnosis, achieving comparable quality of experience levels with in-hospital ultrasound examinations. PMID:27733929

  4. Optical slotted circuit switched network: a bandwidth efficient alternative to wavelength-routed network

    NASA Astrophysics Data System (ADS)

    Li, Yan; Collier, Martin

    2007-11-01

    Wavelength-routed networks have received enormous attention because they are relatively simple to implement and implicitly offer Quality of Service (QoS) guarantees. However, they suffer from a bandwidth inefficiency problem and require complex Routing and Wavelength Assignment (RWA). Most attempts to address these issues exploit the joint use of WDM and TDM technologies. The resultant TDM-based wavelength-routed networks partition the wavelength bandwidth into fixed-length time slots organized as a fixed-length frame. Multiple connections can thus time-share a wavelength, and the grooming of their traffic leads to better bandwidth utilization. The capability of switching in both wavelength and time domains in such networks also mitigates the RWA problem. However, TDM-based wavelength-routed networks work in synchronous mode, and strict synchronization among all network nodes is required. Global synchronization for all-optical networks which operate at extremely high speed is technically challenging, and deploying an optical synchronizer for each wavelength involves considerable cost. An Optical Slotted Circuit Switching (OSCS) architecture is proposed in this paper. In an OSCS network, slotted circuits are created to utilize the wavelength bandwidth better than in classic wavelength-routed networks. The protocol is designed to avoid the need for the global synchronization required by TDM-based wavelength-routed networks.

  5. A New Black Carbon Sensor for Dense Air Quality Monitoring Networks

    PubMed Central

    Caubel, Julien J.; Cados, Troy E.; Kirchstetter, Thomas W.

    2018-01-01

    Low-cost air pollution sensors are emerging and increasingly being deployed in densely distributed wireless networks that provide more spatial resolution than is typical in traditional monitoring of ambient air quality. However, a low-cost option to measure black carbon (BC)—a major component of particulate matter pollution associated with adverse human health risks—is missing. This paper presents a new BC sensor designed to fill this gap, the Aerosol Black Carbon Detector (ABCD), which incorporates a compact weatherproof enclosure, solar-powered rechargeable battery, and cellular communication to enable long-term, remote operation. This paper also demonstrates a data processing methodology that reduces the ABCD’s sensitivity to ambient temperature fluctuations, and therefore improves measurement performance in unconditioned operating environments (e.g., outdoors). A fleet of over 100 ABCDs was operated outdoors in collocation with a commercial BC instrument (Magee Scientific, Model AE33) housed inside a regulatory air quality monitoring station. The measurement performance of the 105 ABCDs is comparable to the AE33. The fleet-average precision and accuracy, expressed in terms of mean absolute percentage error, are 9.2 ± 0.8% (relative to the fleet average data) and 24.6 ± 0.9% (relative to the AE33 data), respectively (fleet-average ± 90% confidence interval). PMID:29494528
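
    A short sketch of the precision/accuracy metric used above, mean absolute percentage error (MAPE), on synthetic black carbon readings; the reference values, fleet average, and error magnitudes are stand-ins, not the ABCD/AE33 data.

```python
# Fleet accuracy = MAPE of a unit against the reference instrument;
# fleet precision = MAPE of a unit against the fleet-average signal.
import numpy as np

rng = np.random.default_rng(3)
ae33 = rng.uniform(0.2, 2.0, size=500)                  # reference BC, ug/m^3
abcd = ae33 * (1 + rng.normal(0, 0.25, size=500))       # one sensor's readings

def mape(measured, reference):
    """Mean absolute percentage error of `measured` against `reference`."""
    return 100.0 * np.mean(np.abs(measured - reference) / reference)

fleet_avg = ae33 * (1 + rng.normal(0, 0.05, size=500))  # stand-in fleet mean
print(f"accuracy MAPE:  {mape(abcd, ae33):.1f}%")
print(f"precision MAPE: {mape(abcd, fleet_avg):.1f}%")
```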

  6. Diagnosis and Prognostic of Wastewater Treatment System Based on Bayesian Network

    NASA Astrophysics Data System (ADS)

    Li, Dan; Yang, Haizhen; Liang, XiaoFeng

    2010-11-01

    Wastewater treatment is a complicated and dynamic process. The treatment effect can be influenced by many variables in microbial, chemical and physical aspects, and these variables are always uncertain. Due to the complex biological reaction mechanisms and the highly time-varying, multivariable nature of the process, the diagnosis and prognostics of wastewater treatment systems remain difficult in practice. The Bayesian network (BN) is one of the best methods for dealing with uncertainty in the artificial intelligence field. Because of its powerful inference ability and convenient decision mechanism, the BN can be employed for model description and influencing-factor analysis of wastewater treatment systems with great flexibility and applicability. In this paper, taking a modified sequencing batch reactor (MSBR) as the analysis object, a BN model was constructed from the influent water quality, operational condition and effluent data of the MSBR, and a novel BN-based approach is proposed to analyze the influencing factors of the wastewater treatment system. The approach provides an effective tool for diagnostic and predictive analysis of the wastewater treatment system: on the basis of the influent water quality and operational condition, the effluent quality can be predicted; conversely, from the effluent quality, the influent water quality and operational condition can be inferred.
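
    The following toy sketch illustrates the two directions of inference the abstract describes: predicting effluent quality from influent quality and an operational condition (prognostic), and inferring the influent state from an observed effluent state (diagnostic). The binary variables and probability tables are illustrative assumptions, not the MSBR model.

```python
# Two-direction inference on a tiny hand-built Bayesian network.
import numpy as np

# Priors (illustrative): P(influent good), P(aeration adequate)
p_in = np.array([0.7, 0.3])        # [good, poor]
p_op = np.array([0.8, 0.2])        # [adequate, inadequate]

# Conditional table (illustrative): P(effluent good | influent, operation)
p_eff_good = np.array([[0.95, 0.6],    # influent good: op adequate / not
                       [0.70, 0.2]])   # influent poor: op adequate / not

# Prognostic (forward): probability the effluent meets the standard.
p_good = sum(p_in[i] * p_op[j] * p_eff_good[i, j]
             for i in range(2) for j in range(2))
print("P(effluent good) =", round(p_good, 3))

# Diagnostic (backward): given poor effluent, which influent state is likely?
post = np.array([p_in[i] * sum(p_op[j] * (1 - p_eff_good[i, j]) for j in range(2))
                 for i in range(2)])
post /= post.sum()                      # Bayes' rule normalization
print("P(influent good | effluent poor) =", round(post[0], 3))
```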

  7. A New Black Carbon Sensor for Dense Air Quality Monitoring Networks.

    PubMed

    Caubel, Julien J; Cados, Troy E; Kirchstetter, Thomas W

    2018-03-01

    Low-cost air pollution sensors are emerging and increasingly being deployed in densely distributed wireless networks that provide more spatial resolution than is typical in traditional monitoring of ambient air quality. However, a low-cost option to measure black carbon (BC)-a major component of particulate matter pollution associated with adverse human health risks-is missing. This paper presents a new BC sensor designed to fill this gap, the Aerosol Black Carbon Detector (ABCD), which incorporates a compact weatherproof enclosure, solar-powered rechargeable battery, and cellular communication to enable long-term, remote operation. This paper also demonstrates a data processing methodology that reduces the ABCD's sensitivity to ambient temperature fluctuations, and therefore improves measurement performance in unconditioned operating environments (e.g., outdoors). A fleet of over 100 ABCDs was operated outdoors in collocation with a commercial BC instrument (Magee Scientific, Model AE33) housed inside a regulatory air quality monitoring station. The measurement performance of the 105 ABCDs is comparable to the AE33. The fleet-average precision and accuracy, expressed in terms of mean absolute percentage error, are 9.2 ± 0.8% (relative to the fleet average data) and 24.6 ± 0.9% (relative to the AE33 data), respectively (fleet-average ± 90% confidence interval).

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Law, B E

    Research involves analysis and field direction of AmeriFlux operations, and the PI provides scientific leadership of the AmeriFlux network. Activities include the coordination and quality assurance of measurements across AmeriFlux network sites, synthesis of results across the network, organizing and supporting the annual Science Team Meeting, and communicating AmeriFlux results to the scientific community and other users. Objectives of measurement research include (i) coordination of flux and biometric measurement protocols; (ii) timely data delivery to the Carbon Dioxide Information and Analysis Center (CDIAC); and (iii) assurance of data quality of flux and ecosystem measurements contributed by AmeriFlux sites. Objectives of integration and synthesis activities include (i) integration of site data into network-wide synthesis products; and (ii) participation in the analysis, modeling and interpretation of network data products. Communications objectives include (i) organizing an annual meeting of AmeriFlux investigators for reporting annual flux measurements and exchanging scientific information on ecosystem carbon budgets; (ii) developing focused topics for analysis and publication; and (iii) developing data reporting protocols in support of AmeriFlux network goals.

  9. External quality-assurance results for the National Atmospheric Deposition Program and the National Trends Network during 1986

    USGS Publications Warehouse

    See, Randolph B.; Schroder, LeRoy J.; Willoughby, Timothy C.

    1988-01-01

    During 1986, the U.S. Geological Survey operated three programs to provide external quality-assurance monitoring of the National Atmospheric Deposition Program and National Trends Network. An intersite-comparison program was used to assess the accuracy of onsite pH and specific-conductance determinations at quarterly intervals. The blind-audit program was used to assess the effect of routine sample handling on the precision and bias of program and network wet-deposition data. Analytical results from four laboratories, which routinely analyze wet-deposition samples, were examined to determine if differences existed between laboratory analytical results and to provide estimates of the analytical precision of each laboratory. An average of 78 and 89 percent of the site operators participating in the intersite-comparison program met the network goals for pH and specific conductance, respectively. A comparison of analytical values versus actual values for samples submitted as part of the blind-audit program indicated that analytical values were slightly but significantly (α = 0.01) larger than actual values for pH, magnesium, sodium, and sulfate; analytical values for specific conductance were slightly less than actual values. The decreased precision in the analyses of blind-audit samples when compared to interlaboratory studies indicates that a large amount of uncertainty in network deposition data may be a result of routine field operations. The results of the interlaboratory comparison study indicated that the magnitude of the difference between laboratory analyses was small for all analytes. Analyses of deionized, distilled-water blanks by participating laboratories indicated that the laboratories had difficulty measuring analyte concentrations near their reported detection limits. (USGS)

  10. Agent-Based Framework for Personalized Service Provisioning in Converged IP Networks

    NASA Astrophysics Data System (ADS)

    Podobnik, Vedran; Matijasevic, Maja; Lovrek, Ignac; Skorin-Kapov, Lea; Desic, Sasa

    In a global multi-service and multi-provider market, Internet Service Providers will increasingly need to differentiate themselves through the service quality they offer and base their operation on new, consumer-centric business models. In this paper, we propose an agent-based framework for the Business-to-Consumer (B2C) electronic market, comprising Consumer Agents, Broker Agents and Content Agents, which enables Internet consumers to select a content provider in an automated manner. We also discuss how to dynamically allocate network resources to provide end-to-end Quality of Service (QoS) for a given consumer and content provider.

  11. Investigation on trophic state index by artificial neural networks (case study: Dez Dam of Iran)

    NASA Astrophysics Data System (ADS)

    Saghi, H.; Karimi, L.; Javid, A. H.

    2015-06-01

    Dam construction and surface runoff control is one of the most common approaches for supplying the water needs of human societies. However, the increasing development of social activities, and hence the subsequent increase in environmental pollutants, leads to deterioration of water quality in dam reservoirs, and the eutrophication process can be intensified. The water quality of reservoirs is therefore one of the key factors in the operation and water quality management of reservoirs. Maintaining the quality of stored water, and identifying and examining its changes over time, has been a constant human concern that involves the water authorities. Traditionally, empirical trophic state indices of dam reservoirs, often defined based on changes in the concentration of causal factors (nutrients) and their consequences (increase in chlorophyll a), have been used as an efficient tool for characterizing dam reservoir quality. In recent years, modeling techniques such as artificial neural networks have enhanced the prediction capability and accuracy of these studies. In this study, artificial neural networks were applied to analyze the eutrophication process in the Dez Dam reservoir in Iran. A feedforward neural network with one input layer, one hidden layer and one output layer was implemented using the MATLAB neural network toolbox for trophic state index (TSI) analysis of the Dez Dam reservoir. The network inputs are parameters that drive eutrophication (nitrogen-cycle and phosphorus-cycle parameters) and parameters that are changed by eutrophication (Chl a, SD, DO); the output is the TSI. Based on the estimated modified Carlson trophic state index, the Dez Dam reservoir is eutrophic from early July to mid-November and becomes mesotrophic as temperature decreases; a decrease in the water quality of the dam reservoir during the warm seasons is therefore to be expected. The results indicated that the artificial neural network (ANN) is a suitable tool for quality modeling of dam reservoirs and for tracking the increase and decrease of nutrients in the eutrophication trend.
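
    A minimal sketch of the one-hidden-layer feedforward network described above, written with scikit-learn rather than the MATLAB neural network toolbox named in the abstract. The input variables follow the abstract (nutrient parameters plus Chl a, SD, DO); the synthetic data, target formula, and network size are assumptions.

```python
# One-hidden-layer feedforward network mapping water-quality inputs to a TSI.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
n = 300
# Columns: total N, total P, chlorophyll a, Secchi depth (SD), dissolved O2
X = rng.uniform([0.1, 0.01, 1, 0.5, 4], [3.0, 0.3, 80, 6.0, 12], size=(n, 5))
# Synthetic Carlson-style TSI target driven mainly by Chl a and Secchi depth.
tsi = 30 + 9.81 * np.log(X[:, 2]) - 14.4 * np.log(X[:, 3]) + rng.normal(0, 2, n)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0),
)
model.fit(X, tsi)
print("predicted TSI for first sample:", model.predict(X[:1])[0])
```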

  12. Impact of High Power Interference Sources in Planning and Deployment of Wireless Sensor Networks and Devices in the 2.4 GHz Frequency Band in Heterogeneous Environments

    PubMed Central

    Iturri, Peio López; Nazábal, Juan Antonio; Azpilicueta, Leire; Rodriguez, Pablo; Beruete, Miguel; Fernández-Valdivielso, Carlos; Falcone, Francisco

    2012-01-01

    In this work, the impact of radiofrequency radiation leakage from microwave ovens and its effect on 802.15.4 ZigBee-compliant wireless sensor networks operating in the 2.4 GHz Industrial Scientific Medical (ISM) band is analyzed. By means of a novel radioplanning approach, based on electromagnetic field simulation of a microwave oven and determination of equivalent radiation sources applied to an in-house developed 3D ray launching algorithm, an estimate of the microwave oven's power leakage is obtained for the complete volume of an indoor scenario. The magnitude and the variable nature of the interference are analyzed, and the impact on the radio link quality of operating wireless sensors is estimated and compared with radio channel measurements as well as packet measurements. The measurement results reveal the importance of selecting an adequate 802.15.4 channel, as well as the Wireless Sensor Network deployment strategy within this type of environment, in order to optimize energy consumption and increase the overall network performance. The proposed method enables one to estimate potential interference effects in devices operating within the 2.4 GHz band in the complete scenario, prior to wireless sensor network deployment, which can aid in achieving an optimal network topology. PMID:23202228

  13. A Crowdsensing Based Analytical Framework for Perceptional Degradation of OTT Web Browsing.

    PubMed

    Li, Ke; Wang, Hai; Xu, Xiaolong; Du, Yu; Liu, Yuansheng; Ahmad, M Omair

    2018-05-15

    Service perception analysis is crucial for understanding both user experience and network quality, as well as for the maintenance and optimization of mobile networks. Given the rapid development of the mobile Internet and over-the-top (OTT) services, the conventional network-centric mode of network operation and maintenance is no longer effective. Therefore, developing an approach to evaluate and optimize users' service perception has become increasingly important. Meanwhile, the development of a new sensing paradigm, mobile crowdsensing (MCS), makes it possible to evaluate and analyze the user's OTT service perception from the end-user's point of view rather than from the network side. In this paper, the key factors that impact users' end-to-end OTT web browsing service perception are analyzed by monitoring crowdsourced user perceptions. The intrinsic relationships among the key factors and the interactions between key quality indicators (KQI) are evaluated from several perspectives. Moreover, an analytical framework of perceptional degradation and a detailed algorithm are proposed to identify the major factors that impact the perceptional degradation of the web browsing service, as well as the significance of their contributions. Finally, a case study is presented to show the effectiveness of the proposed method using a dataset crowdsensed from a large number of smartphone users in a real mobile network. The proposed analytical framework forms a valuable solution for mobile network maintenance and optimization and can help improve web browsing service perception and network quality.

  14. STBC AF relay for unmanned aircraft system

    NASA Astrophysics Data System (ADS)

    Adachi, Fumiyuki; Miyazaki, Hiroyuki; Endo, Chikara

    2015-01-01

    If a large-scale disaster similar to the 2011 Great East Japan Earthquake happens, some areas may be isolated from the communications network. Recently, unmanned aircraft system (UAS) based wireless relay communication has been attracting much attention, since it can quickly re-establish the connection between isolated areas and the network. However, the channel between the ground station (GS) and an unmanned aircraft (UA) is unreliable due to the UA's swing motion, and as a consequence the relay communication quality degrades. In this paper, we introduce space-time block coded (STBC) amplify-and-forward (AF) relaying for UAS based wireless relay communication to improve the relay communication quality. A group of UAs forms a single frequency network (SFN) to perform STBC-AF cooperative relaying. In STBC-AF relaying, only conjugation, block exchange and amplification are required at the UAs. Therefore, STBC-AF relaying improves the relay communication quality while alleviating the complexity problem at the UAs. It is shown by computer simulation that STBC-AF relaying can achieve better throughput performance than conventional AF relaying.
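
    A standard space-time block code for two cooperating transmitters is the 2x2 Alamouti scheme, which matches the abstract's point that the relays need only conjugation, block exchange and amplification. The following numpy sketch shows Alamouti encoding across two relays and the linear combining at the ground station; the channel values and noise level are illustrative, and this is a sketch of the underlying code, not the paper's full AF relay chain.

```python
# Alamouti 2x2 STBC: encode two symbols over two relays and two time slots,
# then recover both at the receiver by linear combining.
import numpy as np

rng = np.random.default_rng(5)
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)  # two unit QPSK symbols

# Encoding: slot 1 -> relay A sends s1,   relay B sends s2
#           slot 2 -> relay A sends -s2*, relay B sends s1*
h = rng.normal(size=2) + 1j * rng.normal(size=2)       # relay->GS channels
r1 = h[0] * s1 + h[1] * s2                             # received, slot 1
r2 = -h[0] * np.conj(s2) + h[1] * np.conj(s1)          # received, slot 2
noise = 0.01 * (rng.normal(size=2) + 1j * rng.normal(size=2))
r1, r2 = r1 + noise[0], r2 + noise[1]

# Linear combining recovers both symbols with diversity order two.
g = np.abs(h[0]) ** 2 + np.abs(h[1]) ** 2
s1_hat = (np.conj(h[0]) * r1 + h[1] * np.conj(r2)) / g
s2_hat = (np.conj(h[1]) * r1 - h[0] * np.conj(r2)) / g
print(np.round(s1_hat, 3), np.round(s2_hat, 3))        # close to s1, s2
```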

  15. Twenty years of measurement of polycyclic aromatic hydrocarbons (PAHs) in UK ambient air by nationwide air quality networks.

    PubMed

    Brown, Andrew S; Brown, Richard J C; Coleman, Peter J; Conolly, Christopher; Sweetman, Andrew J; Jones, Kevin C; Butterfield, David M; Sarantaridis, Dimitris; Donovan, Brian J; Roberts, Ian

    2013-06-01

    The impact of human activities on the health of the population and of the wider environment has prompted action to monitor the presence of toxic compounds in the atmosphere. Toxic organic micropollutants (TOMPs) are some of the most insidious and persistent of these pollutants. Since 1991 the United Kingdom has operated nationwide air quality networks to assess the presence of TOMPs, including polycyclic aromatic hydrocarbons (PAHs), in ambient air. The data produced in 2010 marked 20 years of nationwide PAH monitoring. This paper marks this milestone by providing a novel and critical review of the data produced since nationwide monitoring began up to the end of 2011 (the latest year for which published data is available), discussing how the networks performing this monitoring has evolved, and elucidating trends in the concentrations of the PAHs measured. The current challenges in the area and a forward look to the future of air quality monitoring for PAHs are also discussed briefly.

  16. The GSN Data Quality Initiative

    NASA Astrophysics Data System (ADS)

    Davis, J. P.; Anderson, K. R.; Gee, L. S.

    2010-12-01

    The Global Seismographic Network (GSN) is undertaking a renewed effort to assess and assure data quality that builds upon completion of the major installation phase of the GSN and recent funding to recapitalize most of the network’s equipment including data acquisition systems, ancillary equipment and secondary sensors. We highlight here work by the network operators, the USGS’ Albuquerque Seismological Lab and UCSD’s Project IDA, to ensure that the quality of the collected waveforms is maximized, that the published metadata accurately reflect the instrument response of the data acquisition systems, and that data users are informed of the status of GSN data quality. Procedures to evaluate waveform quality blend tools made available through the IRIS DMC’s Quality Analysis Control Kit (http://www.iris.washington.edu/QUACK/), analysis results provided by the Lamont Waveform Quality Center (www.ldeo.columbia.edu/~ekstrom/Projects/WQC.html), and custom software developed by each of the operators to identify and track known hardware failure modes. Each operator’s equipment upgrade schedule is updated periodically to address sensors identified as failing or problematic and for which replacements are available. Particular attention is also paid to monitoring the GPS clock signal to guarantee that the data are timed properly. Devices based on GPS technology unavailable when the GSN began 25 years ago are being integrated into operations to verify sensor orientations. Portable, broadband seismometers whose stable response can be verified in the laboratory are now co-located with GSN sensors during field visits to verify the existing GSN sensors’ sensitivity. Additional effort is being made to analyze past calibration signals and to check the system response functions of the secondary broadband sensors at GSN sites. The new generation of data acquisition systems will enable relative calibrations to be performed more frequently than was possible in the past. Additional details of this effort can be found at the GSN Quality webpage (www.iris.edu/hq/programs/gsn/quality).

  17. Environmental and Water Quality Operational Studies. General Guidelines for Monitoring Contaminants in Reservoirs

    DTIC Science & Technology

    1986-02-01

    ... especially true for the topics of sampling and analytical methods, statistical considerations, and the design of general water quality monitoring networks. ... and to the establishment and habitat differentiation of biological populations within reservoirs. Reservoir operation, especially the timing ... properties of bottom sediments, as well as specific habitat associations of biological populations of reservoirs. Thus, such heterogeneities ...

  18. UMA/GAN network architecture analysis

    NASA Astrophysics Data System (ADS)

    Yang, Liang; Li, Wensheng; Deng, Chunjian; Lv, Yi

    2009-07-01

    This paper critically analyzes the architecture of UMA, which is one of the Fixed Mobile Convergence (FMC) solutions and is also included in the 3rd Generation Partnership Project (3GPP) specifications. In the UMA/GAN network architecture, the UMA Network Controller (UNC) is the key equipment connecting the cellular core network and the mobile station (MS). A UMA network can be easily integrated into existing cellular networks without affecting the mobile core network, and can provide high-quality mobile services with preferentially priced indoor voice and data usage. This helps to improve the subscriber's experience. On the other hand, the UMA/GAN architecture helps to integrate other radio technologies, such as WiFi, Bluetooth and WiMAX, into the cellular network. This offers traditional mobile operators an opportunity to integrate WiMAX technology into their cellular networks. At the end of this article, we also analyze the potential influence of the UMA network on the cellular core network.

  19. The cost of service quality improvements: tracking the flow of funds in social franchise networks in Myanmar

    PubMed Central

    2013-01-01

    Introduction This paper examines the cost of quality improvements in Population Services International (PSI) Myanmar’s social franchise operations from 2007 to 2009. Methods The social franchise commodities studied were products for reproductive health, malaria, STIs, pneumonia, and diarrhea. This project applied ingredients-based costing for labor, supplies, transport, and overhead. Data were gathered during seven key informant interviews with staff in the central Yangon office, examination of 3 years of payroll data, examination of a time motion study conducted by PSI, and spreadsheets recording the costs of acquiring and transporting supplies. Results In 2009 PSI Myanmar’s social franchise devoted $2.02 million towards a 94% reduction in commodity prices offered to its network of over 1700 primary care providers. These providers retained 1/3 of the subsidy as revenue and passed along the other 2/3 to their patients in the course of offering subsidized care for 1.5 million health episodes. In addition, PSI Myanmar devoted $2.09 million to support a team of franchise officers who conducted quality assurance for the private providers, overseeing service quality, and to distribute medical commodities. Conclusion In Myanmar, the social franchise operated by PSI spends roughly $1.00 in quality management and retailing for every $1.00 spent subsidizing medical commodities. Some services are free, but patients also pay fees for other lines of service. Overall, patients contribute 1/6 as much as PSI does. Unlike at other NGOs, health services in social franchises like PSI’s are not all free to the patients, nor are the discounts uniformly applied. Discounts and subsidies evolve in response to public health concerns, market demand, and providers’ cost structures, as well as strategic objectives in maintaining the network and its portfolio of services. PMID:23826743

  20. The cost of service quality improvements: tracking the flow of funds in social franchise networks in Myanmar.

    PubMed

    Bishai, David; LeFevre, Amnesty; Theuss, Marc; Boxshall, Matt; Hetherington, John D; Zaw, Min; Montagu, Dominic

    2013-01-01

    This paper examines the cost of quality improvements in Population Services International (PSI) Myanmar's social franchise operations from 2007 to 2009. The social franchise commodities studied were products for reproductive health, malaria, STIs, pneumonia, and diarrhea. This project applied ingredients-based costing for labor, supplies, transport, and overhead. Data were gathered during seven key informant interviews with staff in the central Yangon office, examination of 3 years of payroll data, examination of a time motion study conducted by PSI, and spreadsheets recording the costs of acquiring and transporting supplies. In 2009 PSI Myanmar's social franchise devoted $2.02 million towards a 94% reduction in commodity prices offered to its network of over 1700 primary care providers. These providers retained 1/3 of the subsidy as revenue and passed along the other 2/3 to their patients in the course of offering subsidized care for 1.5 million health episodes. In addition, PSI Myanmar devoted $2.09 million to support a team of franchise officers who conducted quality assurance for the private providers, overseeing service quality, and to distribute medical commodities. In Myanmar, the social franchise operated by PSI spends roughly $1.00 in quality management and retailing for every $1.00 spent subsidizing medical commodities. Some services are free, but patients also pay fees for other lines of service. Overall, patients contribute 1/6 as much as PSI does. Unlike at other NGOs, health services in social franchises like PSI's are not all free to the patients, nor are the discounts uniformly applied. Discounts and subsidies evolve in response to public health concerns, market demand, and providers' cost structures, as well as strategic objectives in maintaining the network and its portfolio of services.

  1. Development and validation of a survey to measure features of clinical networks.

    PubMed

    Brown, Bernadette Bea; Haines, Mary; Middleton, Sandy; Paul, Christine; D'Este, Catherine; Klineberg, Emily; Elliott, Elizabeth

    2016-09-30

    Networks of clinical experts are increasingly being implemented as a strategy to improve health care processes and outcomes and achieve change in the health system. Few are ever formally evaluated and, when this is done, not all networks are equally successful in their efforts. There is a need to formatively assess the strategic and operational management and leadership of networks to identify where functioning could be improved to maximise impact. This paper outlines the development and psychometric evaluation of an Internet survey to measure features of clinical networks and provides descriptive results from a sample of members of 19 diverse clinical networks responsible for evidence-based quality improvement across a large geographical region. Instrument development was based on: a review of published and grey literature; a qualitative study of clinical network members; a program logic framework; and consultation with stakeholders. The resulting domain structure was validated for a sample of 592 clinical network members using confirmatory factor analysis. Scale reliability was assessed using Cronbach's alpha. A summary score was calculated for each domain and aggregate level means and ranges are reported. The instrument was shown to have good construct validity across seven domains as demonstrated by a high level of internal consistency, and all Cronbach's α coefficients were equal to or above 0.75. In the survey sample of network members there was strong reported commitment and belief in network-led quality improvement initiatives, which were perceived to have improved quality of care (72.8 %) and patient outcomes (63.2 %). Network managers were perceived to be effective leaders and clinical co-chairs were perceived as champions for change. Perceived external support had the lowest summary score across the seven domains. This survey, which has good construct validity and internal reliability, provides a valid instrument to use in future research related to clinical networks. The survey will be of use to health service managers to identify strengths and areas where networks can be improved to increase effectiveness and impact on quality of care and patient outcomes. Equally, the survey could be adapted for use in the assessment of other types of networks.
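
    As a concrete illustration of the reliability statistic quoted above, the following minimal sketch computes Cronbach's alpha for a synthetic block of Likert-scale responses; the data, the 6-item scale and the respondent count are invented for the example, and this is not the authors' analysis code.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses for one survey domain
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(100, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(100, 6)), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

    Values of roughly 0.7-0.8 and above are conventionally read as acceptable internal consistency, which is the sense in which the paper's coefficients of 0.75 or more support construct validity.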

  2. 40 CFR 63.9632 - What are the installation, operation, and maintenance requirements for my monitoring equipment?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... document is available on the EPA's Technology Transfer Network at http://www.epa.gov/ttn/emc/cem/tribo.pdf... alignment of each COMS. (3) You must operate and maintain each COMS according to § 63.8(e) and your quality... alignment audit. (4) You must determine and record the 6-minute average opacity for periods during which the...

  3. 40 CFR 63.9632 - What are the installation, operation, and maintenance requirements for my monitoring equipment?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... document is available on the EPA's Technology Transfer Network at http://www.epa.gov/ttn/emc/cem/tribo.pdf... alignment of each COMS. (3) You must operate and maintain each COMS according to § 63.8(e) and your quality... alignment audit. (4) You must determine and record the 6-minute average opacity for periods during which the...

  4. 40 CFR 63.9632 - What are the installation, operation, and maintenance requirements for my monitoring equipment?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... document is available on the EPA's Technology Transfer Network at http://www.epa.gov/ttn/emc/cem/tribo.pdf... alignment of each COMS. (3) You must operate and maintain each COMS according to § 63.8(e) and your quality... alignment audit. (4) You must determine and record the 6-minute average opacity for periods during which the...

  5. Explore the impacts of river flow and quality on biodiversity for water resources management by AI techniques

    NASA Astrophysics Data System (ADS)

    Chang, Fi-John; Tsai, Wen-Ping; Chang, Li-Chiu

    2016-04-01

    Water resources development is very challenging in Taiwan due to its diverse geographic environment and climatic conditions. To pursue sustainable water resources development, rationality and integrity are essential for water resources planning. River water quality and flow regimes are closely related to each other and affect river ecosystems simultaneously. This study aims to explore the complex impacts of water quality and flow regimes on the fish community in order to comprehend the situation of the eco-hydrological system in the Danshui River of northern Taiwan. To form an effective and comprehensive strategy for sustainable water resources management, this study first models fish diversity through implementing a hybrid artificial neural network (ANN) based on long-term heterogeneous observations of water quality, stream flow and fish species in the river. Then we use stream flow to estimate the loss of dissolved oxygen based on back-propagation neural networks (BPNNs). Finally, the non-dominated sorting genetic algorithm II (NSGA-II) is established for river flow management over the Shihmen Reservoir, which is the main reservoir in this study area. In addition to satisfying the water demands of human beings and ecosystems, we also consider water quality for river flow management. The ecosystem requirement takes the form of maximizing fish diversity, which can be estimated by the hybrid ANN. The human requirement is to provide a higher satisfaction degree of water supply, while the water quality requirement is to reduce the loss of dissolved oxygen in the river among flow stations. The results demonstrate that the proposed methodology can offer diversified alternative strategies for reservoir operation and improve reservoir operation strategies for producing downstream flows that could better meet both human and ecosystem needs as well as maintain river water quality. Keywords: Artificial intelligence (AI), Artificial neural networks (ANNs), Non-dominated sorting genetic algorithm II (NSGA-II), Sustainable water resources management, Flow regime, River ecosystem.
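
    The study's BPNN step (estimating dissolved-oxygen loss from stream flow) can be sketched generically as below. This is our illustration, not the authors' model: the data are synthetic, and scikit-learn's MLPRegressor stands in for a custom back-propagation network.

```python
# A minimal back-propagation regression sketch under stated assumptions:
# synthetic flow/DO-loss pairs, standardized inputs, one small hidden layer.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
flow = rng.uniform(5.0, 200.0, size=(500, 1))                          # synthetic flows (m^3/s)
do_loss = 8.0 * np.exp(-flow[:, 0] / 50.0) + rng.normal(0, 0.2, 500)   # toy target (mg/L)

scaler = StandardScaler()
X = scaler.fit_transform(flow)

bpnn = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                    solver="lbfgs", max_iter=2000, random_state=0)
bpnn.fit(X, do_loss)
print("predicted DO loss at 30 m^3/s:",
      bpnn.predict(scaler.transform([[30.0]]))[0])
```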

  6. Quality of service provision assessment in the healthcare information and telecommunications infrastructures.

    PubMed

    Babulak, Eduard

    2006-01-01

    The continuous increase in the complexity and heterogeneity of corporate and healthcare telecommunications infrastructures will require new methods for assessing quality of service (QoS) provision that are capable of addressing all engineering and social issues at much faster speeds. Speed and accessibility to any information at any time from anywhere will create global communications infrastructures with great performance bottlenecks that may endanger human lives, power supplies, the national economy and security. Regardless of the technology supporting the information flows, the final verdict on the QoS is made by the end user. The users' perception of a telecommunications network infrastructure's QoS provision is critical to the successful business and management operation of any organization. As a result, it is essential to assess QoS provision in the light of the user's perception. This article presents a cost-effective methodology to assess the user's perception of quality of service provision utilizing the existing Staffordshire University Network (SUN) by adding a component of measurement to the existing model presented by Walker. This paper presents real examples of CISCO networking solutions for health care givers and offers a cost-effective approach to assess QoS provision within the campus network, which could be easily adapted to any health care organization or campus network in the world.

  7. Capacity and reliability analyses with applications to power quality

    NASA Astrophysics Data System (ADS)

    Azam, Mohammad; Tu, Fang; Shlapak, Yuri; Kirubarajan, Thiagalingam; Pattipati, Krishna R.; Karanam, Rajaiah

    2001-07-01

    The deregulation of energy markets, the ongoing advances in communication networks, the proliferation of intelligent metering and protective power devices, and the standardization of software/hardware interfaces are creating a dramatic shift in the way facilities acquire and utilize information about their power usage. The currently available power management systems gather a vast amount of information in the form of power usage, voltages, currents, and their time-dependent waveforms from a variety of devices (for example, circuit breakers, transformers, energy and power quality meters, protective relays, programmable logic controllers, motor control centers). What is lacking is an information processing and decision support infrastructure to harness this voluminous information into usable operational and management knowledge to manage the health of equipment and power quality, minimize downtime and outages, and optimize operations to improve productivity. This paper considers the problem of evaluating the capacity and reliability of power systems with very high availability requirements (e.g., systems providing energy to data centers and communication networks with desired availability of up to 0.9999999). The real-time capacity and margin analysis helps operators to plan for additional loads and to schedule repair/replacement activities. The reliability analysis, based on computationally efficient sums of disjoint products, enables analysts to decide the optimum levels of redundancy, aids operators in prioritizing maintenance options for a given budget, and supports monitoring the system for capacity margin. The resulting analytical and software tool is demonstrated on a sample data center.
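
    To make the availability arithmetic concrete: with independent supply paths, redundancy multiplies out failure probabilities, which is how "seven nines" (0.9999999) becomes reachable from components that individually offer far less. The sketch below enumerates up/down patterns directly; it assumes independent, identical paths, whereas the paper's sum-of-disjoint-products method handles general network structures efficiently.

```python
# Availability of k-of-n redundant supply paths by exact enumeration
# of disjoint up/down patterns. Illustrative only; assumes independence.
from itertools import combinations
from math import prod

def k_of_n_availability(a: float, n: int, k: int) -> float:
    """Probability that at least k of n independent paths
    (each with availability a) are up."""
    total = 0.0
    for up in range(k, n + 1):
        for pattern in combinations(range(n), up):
            up_set = set(pattern)
            total += prod(a if i in up_set else 1 - a for i in range(n))
    return total

single = 0.999                                # availability of one feed
print(k_of_n_availability(single, 2, 1))      # 1-of-2 redundancy: ~0.999999
print(k_of_n_availability(single, 3, 1))      # 1-of-3 redundancy: ~0.999999999
```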

  8. The Atmospheric Mercury Network: measurement and initial examination of an ongoing atmospheric mercury record across North America

    NASA Astrophysics Data System (ADS)

    Gay, D. A.; Schmeltz, D.; Prestbo, E.; Olson, M.; Sharac, T.; Tordon, R.

    2013-04-01

    The National Atmospheric Deposition Program (NADP) developed and operates a collaborative network of atmospheric mercury monitoring sites based in North America - the Atmospheric Mercury Network (AMNet). The justification for the network was growing interest and demand from many scientists and policy makers for a robust database of measurements to improve model development, assess policies and programs, and improve estimates of mercury dry deposition. Many different agencies and groups support the network, including federal, state, tribal, and international governments, academic institutions, and private companies. AMNet has added two high elevation sites outside of continental North America in Hawaii and Taiwan because of new partnerships forged within NADP. Network sites measure concentrations of atmospheric mercury fractions using automated, continuous mercury speciation systems. The procedures that NADP developed for field operations, data management, and quality assurance ensure that the network makes scientifically valid and consistent measurements. AMNet reports concentrations of hourly gaseous elemental mercury (GEM), two-hour gaseous oxidized mercury (GOM), and two-hour particulate-bound mercury less than 2.5 microns in size (PBM2.5). As of January 2012, over 450 000 valid observations are available from 30 stations. The AMNet also collects ancillary meteorological data and information on land-use and vegetation, when available. We present atmospheric mercury data comparisons by time (3 yr) at 22 unique site locations. Highlighted are contrasting values for site locations across the network: urban versus rural, coastal versus high-elevation and the range of maximum observations. The data presented should catalyze the formation of many scientific questions that may be answered through further in-depth analysis and modeling studies of the AMNet database. All data and methods are publicly available through an online database on the NADP website (http://nadp.isws.illinois.edu/amn/). Future network directions are to foster new network partnerships and continue to collect, quality assure, and post data, including dry deposition estimates, for each fraction.

  9. The Atmospheric Mercury Network: measurement and initial examination of an ongoing atmospheric mercury record across North America

    NASA Astrophysics Data System (ADS)

    Gay, D. A.; Schmeltz, D.; Prestbo, E.; Olson, M.; Sharac, T.; Tordon, R.

    2013-11-01

    The National Atmospheric Deposition Program (NADP) developed and operates a collaborative network of atmospheric-mercury-monitoring sites based in North America - the Atmospheric Mercury Network (AMNet). The justification for the network was growing interest and demand from many scientists and policy makers for a robust database of measurements to improve model development, assess policies and programs, and improve estimates of mercury dry deposition. Many different agencies and groups support the network, including federal, state, tribal, and international governments, academic institutions, and private companies. AMNet has added two high-elevation sites outside of continental North America in Hawaii and Taiwan because of new partnerships forged within NADP. Network sites measure concentrations of atmospheric mercury fractions using automated, continuous mercury speciation systems. The procedures that NADP developed for field operations, data management, and quality assurance ensure that the network makes scientifically valid and consistent measurements. AMNet reports concentrations of hourly gaseous elemental mercury (GEM), two-hour gaseous oxidized mercury (GOM), and two-hour particulate-bound mercury less than 2.5 microns in size (PBM2.5). As of January 2012, over 450 000 valid observations are available from 30 stations. AMNet also collects ancillary meteorological data and information on land use and vegetation, when available. We present atmospheric mercury data comparisons by time (3 yr) at 21 individual sites and instruments. Highlighted are contrasting values for site locations across the network: urban versus rural, coastal versus high elevation and the range of maximum observations. The data presented should catalyze the formation of many scientific questions that may be answered through further in-depth analysis and modeling studies of the AMNet database. All data and methods are publicly available through an online database on the NADP website (http://nadp.sws.uiuc.edu/amn/). Future network directions are to foster new network partnerships and continue to collect, quality assure, and post data, including dry deposition estimates, for each fraction.

  10. Active and Reactive Power Optimal Dispatch Associated with Load and DG Uncertainties in Active Distribution Network

    NASA Astrophysics Data System (ADS)

    Gao, F.; Song, X. H.; Zhang, Y.; Li, J. F.; Zhao, S. S.; Ma, W. Q.; Jia, Z. Y.

    2017-05-01

    In order to reduce the adverse effects of uncertainty on optimal dispatch in an active distribution network, an optimal dispatch model based on chance-constrained programming is proposed in this paper. In this model, the active and reactive power of DG can be dispatched with the aim of reducing the operating cost. The effect of the operation strategy on cost is reflected in the objective, which contains the cost of network loss, DG curtailment, DG reactive power ancillary service, and power quality compensation. At the same time, the probabilistic constraints reflect the degree of operational risk. The optimal dispatch model is then simplified into a series of single-stage models, which avoids a large variable dimension and improves the convergence speed. The single-stage model is solved using a combination of particle swarm optimization (PSO) and the point estimate method (PEM). Finally, the proposed optimal dispatch model and method are verified on the IEEE 33-bus test system.
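
    A hedged sketch of the optimization pattern described: particle swarm optimization over a dispatch variable, with the chance constraint enforced as a penalty on an estimated violation probability. All functions and constants here are toy stand-ins, and the violation probability is estimated by Monte-Carlo sampling rather than the paper's point estimate method.

```python
# Toy chance-constrained dispatch via PSO with a probabilistic penalty.
import numpy as np

rng = np.random.default_rng(1)

def cost(p):                      # surrogate cost: network loss + curtailment terms
    return 0.02 * p**2 + 5.0 * np.maximum(0.0, 10.0 - p)

def violation_prob(p, n=500):     # Pr[flow limit exceeded] under load uncertainty
    load = rng.normal(50.0, 5.0, n)
    return np.mean(p + load > 70.0)

def fitness(p, eps=0.05, penalty=1e3):
    return cost(p) + penalty * max(0.0, violation_prob(p) - eps)

# minimal PSO over the scalar dispatch variable p in [0, 30]
pos = rng.uniform(0, 30, 20)
vel = np.zeros(20)
pbest = pos.copy()
pbest_f = np.array([fitness(x) for x in pos])
gbest = pbest[pbest_f.argmin()]
for _ in range(50):
    vel = (0.7 * vel + 1.5 * rng.random(20) * (pbest - pos)
                     + 1.5 * rng.random(20) * (gbest - pos))
    pos = np.clip(pos + vel, 0, 30)
    f = np.array([fitness(x) for x in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()]
print("dispatched power:", round(float(gbest), 2))
```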

  11. Achieving QoS for Aeronautical Telecommunication Networks Over Differentiated Services

    NASA Technical Reports Server (NTRS)

    Bai, Haowei; Atiquzzaman, Mohammed; Ivancic, William

    2001-01-01

    Aeronautical Telecommunication Network (ATN) has been developed by the International Civil Aviation Organization to integrate Air-Ground and Ground-Ground data communication for aeronautical applications into a single network serving Air Traffic Control and Aeronautical Operational Communications. To carry time critical information required for aeronautical applications, ATN provides different Quality of Services (QoS) to applications. ATN has, therefore, been designed as a stand-alone network, which implies building an expensive separate network for ATN. However, the cost of operating ATN can be reduced if it can be run over a public network such as the Internet. Although the current Internet does not provide QoS, the next generation Internet is expected to provide QoS to applications. The objective of this paper is to investigate the possibility of providing QoS to ATN applications when it is run over the next generation Internet. Differentiated Services (DiffServ), one of the protocols proposed for the next generation Internet, will allow network service providers to offer different QoS to customers. Our results show that it is possible to provide QoS to ATN applications when they run over a DiffServ backbone.

  12. Achieving QoS for Aeronautical Telecommunication Networks over Differentiated Services

    NASA Technical Reports Server (NTRS)

    Bai, Haowei; Atiquzzaman, Mohammed; Ivancic, William

    2001-01-01

    Aeronautical Telecommunication Network (ATN) has been developed by the International Civil Aviation Organization to integrate Air-Ground and Ground-Ground data communication for aeronautical applications into a single network serving Air Traffic Control and Aeronautical Operational Communications. To carry time critical information required for aeronautical applications, ATN provides different Quality of Services (QoS) to applications. ATN has therefore, been designed as a standalone network which implies building an expensive separate network for ATN. However, the cost of operating ATN can be reduced if it can be run over a public network such as the Internet. Although the current Internet does not provide QoS, the next generation Internet is expected to provide QoS to applications. The objective of this paper is to investigate the possibility of providing QoS to ATN applications when it is run over the next generation Internet. Differentiated Services (DiffServ), one of the protocols proposed for the next generation Internet, will allow network service providers to offer different QoS to customers. Our results show that it is possible to provide QoS to ATN applications when they run over a DiffServ backbone.
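
    The DiffServ mechanism both versions of this paper rely on boils down to marking packets with a DSCP code point so that routers apply a matching per-hop behavior. A minimal sketch of a sender marking its traffic for Expedited Forwarding follows; the address and port are placeholders, the Expedited Forwarding choice is illustrative, and the IP_TOS socket option is exposed on Linux-like platforms but not everywhere.

```python
# Mark a UDP flow with the Expedited Forwarding DSCP (standard library only).
import socket

EF_DSCP = 46                  # Expedited Forwarding code point (RFC 3246)
tos = EF_DSCP << 2            # DSCP occupies the upper 6 bits of the TOS byte

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, tos)
sock.sendto(b"time-critical ATC message", ("192.0.2.10", 5000))  # placeholder peer
sock.close()
```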

  13. Flight Hardware Fabricated for Combustion Science in Space

    NASA Technical Reports Server (NTRS)

    OMalley, Terence F.; Weiland, Karen J.

    2005-01-01

    NASA Glenn Research Center's Telescience Support Center (TSC) allows researchers on Earth to operate experiments onboard the International Space Station (ISS) and the space shuttles. NASA's continuing investment in the required software, systems, and networks provides distributed ISS ground operations that enable payload developers and scientists to monitor and control their experiments from the Glenn TSC. The quality of scientific and engineering data is enhanced while the long-term operational costs of experiments are reduced because principal investigators and engineering teams can operate their payloads from their home institutions.

  14. 75 FR 69671 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-15

    ... behavior. All 147 networked crisis centers will complete the Web-based Crisis Center Survey annually. The Survey requests information about organizational structure, staffing, scope of services, call center operations, quality assurance, community outreach/marketing, telephone equipment, data collection, and...

  15. Data on the key performance indicators for quality of service of GSM networks in Nigeria.

    PubMed

    Popoola, Segun I; Atayero, Aderemi A; Faruk, Nasir; Badejo, Joke A

    2018-02-01

    In this data article, the Key Performance Indicators (KPIs) for Quality of Service (QoS) of Global System for Mobile Communications (GSM) networks in Nigeria are provided and analyzed. The data provided in this paper contain the Call Setup Success Rate (CSSR), Drop Call Rate (DCR), Stand-alone Dedicated Channel (SDCCH) congestion, and Traffic Channel (TCH) congestion for the four GSM network operators in Nigeria (Airtel, Etisalat, Glo, and MTN). These comprehensive data were obtained from the Nigerian Communications Commission (NCC). Significant differences in each of the KPIs across the four quarters of each year were presented based on Analysis of Variance (ANOVA). The values of the KPIs were plotted against the months of the year for better visualization and understanding of data trends across the four quarters. Multiple comparisons of the mean quarterly differences of the KPIs were also presented using Tukey's post hoc test. Public availability, and further interpretation and discussion, of this useful information will assist the network providers, the Nigerian government, local and international regulatory bodies, policy makers, and other stakeholders in ensuring access of people, machines, and things to high-quality telecommunications services.
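
    The statistical workflow described (one-way ANOVA across quarters followed by Tukey's post hoc comparisons) can be reproduced generically with SciPy and statsmodels, as sketched below on synthetic Drop Call Rate figures; the numbers are invented and are not the NCC data.

```python
# One-way ANOVA across quarters, then Tukey's HSD, on synthetic DCR samples.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(3)
quarters = ["Q1", "Q2", "Q3", "Q4"]
dcr = {q: rng.normal(mu, 0.15, 12)            # 12 monthly samples per quarter
       for q, mu in zip(quarters, [1.2, 1.0, 1.4, 1.1])}

f_stat, p_val = f_oneway(*dcr.values())
print(f"ANOVA: F={f_stat:.2f}, p={p_val:.4f}")

values = np.concatenate(list(dcr.values()))
labels = np.repeat(quarters, 12)
print(pairwise_tukeyhsd(values, labels, alpha=0.05))   # which quarters differ
```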

  16. Evaluation of Bridges Subjected to Military Loading and Dynamic Hydraulic Effects: Review of Design Regulations, Selection Criteria, and Inspection Procedures for Bridge Railing Systems

    DTIC Science & Technology

    2011-08-01

    ... may be required by the United States Customs and Immigration Services, in connection with the operation of an international bridge or tunnel. ERDC ... significant effect on the operation, service quality, and safety of road networks by restricting the traffic volume and vehicle weight that can be ...

  17. Pricing Resources in LTE Networks through Multiobjective Optimization

    PubMed Central

    Lai, Yung-Liang; Jiang, Jehn-Ruey

    2014-01-01

    The LTE technology offers versatile mobile services that use different numbers of resources. This enables operators to provide subscribers or users with differential quality of service (QoS) to boost their satisfaction. On one hand, LTE operators need to price the resources high for maximizing their profits. On the other hand, pricing also needs to consider user satisfaction with allocated resources and prices to avoid “user churn,” which means subscribers will unsubscribe services due to dissatisfaction with allocated resources or prices. In this paper, we study the pricing resources with profits and satisfaction optimization (PRPSO) problem in the LTE networks, considering the operator profit and subscribers' satisfaction at the same time. The problem is modelled as nonlinear multiobjective optimization with two optimal objectives: (1) maximizing operator profit and (2) maximizing user satisfaction. We propose to solve the problem based on the framework of the NSGA-II. Simulations are conducted for evaluating the proposed solution. PMID:24526889

  18. Pricing resources in LTE networks through multiobjective optimization.

    PubMed

    Lai, Yung-Liang; Jiang, Jehn-Ruey

    2014-01-01

    The LTE technology offers versatile mobile services that use different numbers of resources. This enables operators to provide subscribers or users with differential quality of service (QoS) to boost their satisfaction. On one hand, LTE operators need to price the resources high for maximizing their profits. On the other hand, pricing also needs to consider user satisfaction with allocated resources and prices to avoid "user churn," which means subscribers will unsubscribe services due to dissatisfaction with allocated resources or prices. In this paper, we study the pricing resources with profits and satisfaction optimization (PRPSO) problem in the LTE networks, considering the operator profit and subscribers' satisfaction at the same time. The problem is modelled as nonlinear multiobjective optimization with two optimal objectives: (1) maximizing operator profit and (2) maximizing user satisfaction. We propose to solve the problem based on the framework of the NSGA-II. Simulations are conducted for evaluating the proposed solution.
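
    At the heart of the NSGA-II framework used in both versions of this paper is non-dominated sorting: retaining the candidates for which no other candidate is at least as good on both objectives. A minimal sketch for the two objectives here (operator profit and user satisfaction, both maximized) follows, with invented candidate evaluations.

```python
# Extract the Pareto front for two maximized objectives (toy candidates).
from typing import List, Tuple

def pareto_front(points: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Return the non-dominated points when both objectives are maximised."""
    front = []
    for p in points:
        dominated = any(q[0] >= p[0] and q[1] >= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

# (profit, satisfaction) evaluated for hypothetical price levels
candidates = [(10.0, 0.9), (14.0, 0.7), (12.0, 0.8), (9.0, 0.95), (13.0, 0.6)]
print(pareto_front(candidates))   # the trade-off curve handed to the operator
```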

  19. Glenn's Telescience Support Center Provided Around-the-Clock Operations Support for Space Experiments on the International Space Station

    NASA Technical Reports Server (NTRS)

    Malarik, Diane C.

    2005-01-01

    NASA Glenn Research Center's Telescience Support Center (TSC) allows researchers on Earth to operate experiments onboard the International Space Station (ISS) and the space shuttles. NASA's continuing investment in the required software, systems, and networks provides distributed ISS ground operations that enable payload developers and scientists to monitor and control their experiments from the Glenn TSC. The quality of scientific and engineering data is enhanced while the long-term operational costs of experiments are reduced because principal investigators and engineering teams can operate their payloads from their home institutions.

  20. Integrating Predictive Modeling with Control System Design for Managed Aquifer Recharge and Recovery Applications

    NASA Astrophysics Data System (ADS)

    Drumheller, Z. W.; Regnery, J.; Lee, J. H.; Illangasekare, T. H.; Kitanidis, P. K.; Smits, K. M.

    2014-12-01

    Aquifers around the world show troubling signs of irreversible depletion and seawater intrusion as climate change, population growth, and urbanization lead to reduced natural recharge rates and overuse. Scientists and engineers have begun to re-investigate the technology of managed aquifer recharge and recovery (MAR) as a means to increase the reliability of the diminishing and increasingly variable groundwater supply. MAR systems offer the possibility of naturally increasing groundwater storage while improving the quality of impaired water used for recharge. Unfortunately, MAR systems remain fraught with operational challenges related to the quality and quantity of recharged and recovered water, stemming from a lack of data-driven, real-time control. Our project seeks to ease the operational challenges of MAR facilities through the implementation of active sensor networks, adaptively calibrated flow and transport models, and simulation-based meta-heuristic control optimization methods. The developed system works by continually collecting hydraulic and water quality data from a sensor network embedded within the aquifer. The data are fed into an inversion algorithm, which calibrates the parameters and initial conditions of a predictive flow and transport model. The calibrated model is passed to a meta-heuristic control optimization algorithm (e.g. a genetic algorithm) to execute the simulations and determine the best course of action, i.e., the optimal pumping policy for current aquifer conditions. The optimal pumping policy is applied manually or autonomously. During operation, sensor data are used to assess the accuracy of the optimal prediction and augment the pumping strategy as needed. At laboratory scale, a small (18"H x 46"L) and an intermediate (6'H x 16'L) two-dimensional synthetic aquifer were constructed and outfitted with sensor networks. Data collection and model inversion components were developed and sensor data were validated by analytical measurements.

  1. Long-term observations of tropospheric particle number size distributions and equivalent black carbon mass concentrations in the German Ultrafine Aerosol Network (GUAN)

    NASA Astrophysics Data System (ADS)

    Birmili, W.; Weinhold, K.; Merkel, M.; Rasch, F.; Sonntag, A.; Wiedensohler, A.; Bastian, S.; Schladitz, A.; Löschau, G.; Cyrys, J.; Pitz, M.; Gu, J.; Kusch, T.; Flentje, H.; Quass, U.; Kaminski, H.; Kuhlbusch, T. A. J.; Meinhardt, F.; Schwerin, A.; Bath, O.; Ries, L.; Wirtz, K.; Fiebig, M.

    2015-11-01

    The German Ultrafine Aerosol Network (GUAN) is a cooperative atmospheric observation network, which aims at improving the scientific understanding of aerosol-related effects in the troposphere. The network addresses research questions dedicated to both climate- and health-related effects. GUAN's core activity has been the continuous collection of tropospheric particle number size distributions and black carbon mass concentrations at seventeen observation sites in Germany. These sites cover various environmental settings including urban traffic, urban background, rural background, and Alpine mountains. In association with partner projects, GUAN has implemented a high degree of harmonisation of instrumentation, operating procedures, and data evaluation procedures. The quality of the measurement data is assured by laboratory intercomparisons as well as on-site comparisons with reference instruments. This paper describes the measurement sites, instrumentation, quality assurance and data evaluation procedures in the network as well as the EBAS repository, where the data sets can be obtained (doi:10.5072/guan).

  2. Long-term observations of tropospheric particle number size distributions and equivalent black carbon mass concentrations in the German Ultrafine Aerosol Network (GUAN)

    NASA Astrophysics Data System (ADS)

    Birmili, Wolfram; Weinhold, Kay; Rasch, Fabian; Sonntag, André; Sun, Jia; Merkel, Maik; Wiedensohler, Alfred; Bastian, Susanne; Schladitz, Alexander; Löschau, Gunter; Cyrys, Josef; Pitz, Mike; Gu, Jianwei; Kusch, Thomas; Flentje, Harald; Quass, Ulrich; Kaminski, Heinz; Kuhlbusch, Thomas A. J.; Meinhardt, Frank; Schwerin, Andreas; Bath, Olaf; Ries, Ludwig; Gerwig, Holger; Wirtz, Klaus; Fiebig, Markus

    2016-08-01

    The German Ultrafine Aerosol Network (GUAN) is a cooperative atmospheric observation network, which aims at improving the scientific understanding of aerosol-related effects in the troposphere. The network addresses research questions dedicated to both climate- and health-related effects. GUAN's core activity has been the continuous collection of tropospheric particle number size distributions and black carbon mass concentrations at 17 observation sites in Germany. These sites cover various environmental settings including urban traffic, urban background, rural background, and Alpine mountains. In association with partner projects, GUAN has implemented a high degree of harmonisation of instrumentation, operating procedures, and data evaluation procedures. The quality of the measurement data is assured by laboratory intercomparisons as well as on-site comparisons with reference instruments. This paper describes the measurement sites, instrumentation, quality assurance, and data evaluation procedures in the network as well as the EBAS repository, where the data sets can be obtained (doi:10.5072/guan).

  3. Hierarchy Bayesian model based services awareness of high-speed optical access networks

    NASA Astrophysics Data System (ADS)

    Bai, Hui-feng

    2018-03-01

    As the speed of optical access networks soars with ever increasing multiple services, the service-supporting ability of optical access networks suffers greatly from the shortage of service awareness. Aiming to solve this problem, a hierarchical-Bayesian-model-based service awareness mechanism is proposed for high-speed optical access networks. This approach builds a so-called hierarchy Bayesian model according to the structure of typical optical access networks. Moreover, the proposed scheme is able to conduct simple service awareness in each optical network unit (ONU) and to perform complex service awareness, from the whole-system view, in the optical line terminal (OLT). Simulation results show that the proposed scheme achieves better quality of service (QoS) in terms of packet loss rate and time delay.
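
    As an illustration of the hierarchical idea (our own toy construction, not the paper's model), each ONU could maintain a Bayesian posterior over service classes from simple per-packet evidence, with the OLT fusing the ONU posteriors under a conditional-independence assumption; all class names and probability tables below are invented.

```python
# Toy two-level Bayesian service awareness: ONU-level updates, OLT-level fusion.
import numpy as np

classes = ["video", "voice", "web"]
prior = np.array([0.3, 0.3, 0.4])
# Pr(packet-size bin | class) for bins [small, medium, large] (assumed values)
likelihood = np.array([[0.1, 0.3, 0.6],    # video: mostly large packets
                       [0.7, 0.2, 0.1],    # voice: mostly small packets
                       [0.3, 0.4, 0.3]])   # web: mixed

def onu_posterior(bins):
    """Posterior over service classes given observed packet-size bins."""
    post = prior.copy()
    for b in bins:
        post *= likelihood[:, b]
        post /= post.sum()
    return post

# OLT view: fuse posteriors reported by several ONUs for the same flow type,
# assuming conditionally independent observations given the class.
onu_reports = [onu_posterior([2, 2, 1]), onu_posterior([2, 1, 2])]
fused = np.prod(onu_reports, axis=0) / prior
fused /= fused.sum()
print(dict(zip(classes, fused.round(3))))
```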

  4. Synthesis of recurrent neural networks for dynamical system simulation.

    PubMed

    Trischler, Adam P; D'Eleuterio, Gabriele M T

    2016-08-01

    We review several of the most widely used techniques for training recurrent neural networks to approximate dynamical systems, then describe a novel algorithm for this task. The algorithm is based on an earlier theoretical result that guarantees the quality of the network approximation. We show that a feedforward neural network can be trained on the vector-field representation of a given dynamical system using backpropagation, then recast it as a recurrent network that replicates the original system's dynamics. After detailing this algorithm and its relation to earlier approaches, we present numerical examples that demonstrate its capabilities. One of the distinguishing features of our approach is that both the original dynamical systems and the recurrent networks that simulate them operate in continuous time. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. A Review of Distributed Control Techniques for Power Quality Improvement in Micro-grids

    NASA Astrophysics Data System (ADS)

    Zeeshan, Hafiz Muhammad Ali; Nisar, Fatima; Hassan, Ahmad

    2017-05-01

    Micro-grid is typically visualized as a small scale local power supply network dependent on distributed energy resources (DERs) that can operate simultaneously with grid as well as in standalone manner. The distributed generator of a micro-grid system is usually a converter-inverter type topology acting as a non-linear load, and injecting harmonics into the distribution feeder. Hence, the negative effects on power quality by the usage of distributed generation sources and components are clearly witnessed. In this paper, a review of distributed control approaches for power quality improvement is presented which encompasses harmonic compensation, loss mitigation and optimum power sharing in multi-source-load distributed power network. The decentralized subsystems for harmonic compensation and active-reactive power sharing accuracy have been analysed in detail. Results have been validated to be consistent with IEEE standards.

  6. KNT-artificial neural network model for flux prediction of ultrafiltration membrane producing drinking water.

    PubMed

    Oh, H K; Yu, M J; Gwon, E M; Koo, J Y; Kim, S G; Koizumi, A

    2004-01-01

    This paper describes the prediction of flux behavior in an ultrafiltration (UF) membrane system using a Kalman neuro training (KNT) network model. The experimental data were obtained by operating a pilot plant of hollow fiber UF membranes with groundwater for 7 months. The network was trained using operating conditions such as inlet pressure and filtration duration, and feed water quality parameters including turbidity, temperature and UV254. Pre-processing of raw data allowed the normalized input data to be used in sigmoid activation functions. The neural network architecture was structured by modifying the number of hidden layers, neurons and learning iterations. A KNT neural network structure with 3 layers and 5 neurons gave a good prediction of permeate flux, with a correlation coefficient of 0.997 during the learning phase. The validity of the designed model was also evaluated with other experimental data not used during the training phase, and the nonlinear flux behavior was accurately estimated, with a correlation coefficient of 0.999 and a low prediction error in the testing phase. This good flux prediction can provide preliminary criteria for membrane design and for setting the proper cleaning cycle in membrane operation. The KNT artificial neural network is also expected to predict the variation of transmembrane pressure during filtration cycles and can be applied to the automation and control of full-scale treatment plants.

  7. Data analysis-based autonomic bandwidth adjustment in software defined multi-vendor optical transport networks.

    PubMed

    Li, Yajie; Zhao, Yongli; Zhang, Jie; Yu, Xiaosong; Jing, Ruiquan

    2017-11-27

    Network operators generally provide dedicated lightpaths for customers to meet the demand for high-quality transmission. Considering the variation of traffic load, customers usually rent peak bandwidth that exceeds the practical average traffic requirement. In this case, bandwidth provisioning is unmetered and customers have to pay according to peak bandwidth. Supposing that network operators could keep track of traffic load and allocate bandwidth dynamically, bandwidth can be provided as a metered service and customers would pay for the bandwidth that they actually use. To achieve cost-effective bandwidth provisioning, this paper proposes an autonomic bandwidth adjustment scheme based on data analysis of traffic load. The scheme is implemented in a software defined networking (SDN) controller and is demonstrated in the field trial of multi-vendor optical transport networks. The field trial shows that the proposed scheme can track traffic load and realize autonomic bandwidth adjustment. In addition, a simulation experiment is conducted to evaluate the performance of the proposed scheme. We also investigate the impact of different parameters on autonomic bandwidth adjustment. Simulation results show that the step size and adjustment period have significant influences on bandwidth savings and packet loss. A small value of step size and adjustment period can bring more benefits by tracking traffic variation with high accuracy. For network operators, the scheme can serve as technical support of realizing bandwidth as metered service in the future.
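
    The simulation finding that step size and adjustment period drive the savings/loss trade-off can be illustrated with a toy version of the adjustment loop; parameter names and values below are our assumptions, not the paper's implementation.

```python
# Toy autonomic bandwidth adjustment: step the allocation toward observed
# load every `period` samples, and tally traffic lost to under-allocation.
import random

def autonomic_adjust(load, step=50, period=5, headroom=1.1, floor=100):
    alloc, dropped = floor, 0.0
    for t, demand in enumerate(load):
        dropped += max(0.0, demand - alloc)      # traffic above the allocation
        if t % period == 0:                      # adjustment instant
            target = demand * headroom
            alloc = max(floor, alloc + (step if target > alloc else -step))
    return alloc, dropped

random.seed(7)
traffic = [400 + random.gauss(0, 60) for _ in range(200)]   # Mb/s samples
for step, period in [(10, 10), (50, 5), (100, 2)]:
    alloc, lost = autonomic_adjust(traffic, step, period)
    print(f"step={step:3d} period={period:2d} -> alloc {alloc:6.1f}, lost {lost:8.1f}")
```

    Small steps and long periods track the load sluggishly (more loss, more savings); large steps and short periods track it tightly, mirroring the trade-off the authors report.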

  8. Large Scale Data Analytics of User Behavior for Improving Content Delivery

    DTIC Science & Technology

    2014-12-01

    video streaming, web browsing To Achan and Amma iv Abstract The Internet is fast becoming the de facto content delivery network of the world...operators everywhere and they seek to de - sign and manage their networks better to improve content delivery and provide better quality of experience...Anjali, Kriti and Ruta have been great company and have let me partake in their delicious homecooked food more times than I can remember. My friends

  9. Analysis of Online DBA Algorithm with Adaptive Sleep Cycle in WDM EPON

    NASA Astrophysics Data System (ADS)

    Pajčin, Bojan; Matavulj, Petar; Radivojević, Mirjana

    2018-05-01

    In order to manage Quality of Service (QoS) and energy efficiency in the optical access network, an online Dynamic Bandwidth Allocation (DBA) algorithm with an adaptive sleep cycle is presented. This DBA algorithm is able to allocate additional bandwidth to the end user within a single sleep cycle whose duration changes depending on the current buffer occupancy. The purpose of this DBA algorithm is to tune the duration of the sleep cycle according to the network load, in order to serve the end user without violating strict QoS requirements in all network operating conditions.
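
    A minimal sketch of the stated idea, assuming a simple linear mapping: the sleep-cycle duration shrinks as buffer occupancy grows, preserving QoS under load while saving energy when idle. The bounds are invented constants, not values from the paper.

```python
# Map ONU buffer occupancy to an adaptive sleep-cycle duration.
def sleep_cycle_ms(occupancy: float,
                   t_min: float = 1.0, t_max: float = 50.0) -> float:
    """Map buffer occupancy in [0, 1] to a sleep-cycle duration (ms):
    full buffers force short cycles, empty buffers allow long ones."""
    occupancy = min(max(occupancy, 0.0), 1.0)
    return t_max - (t_max - t_min) * occupancy

for occ in (0.0, 0.25, 0.75, 1.0):
    print(f"occupancy {occ:.2f} -> sleep cycle {sleep_cycle_ms(occ):4.1f} ms")
```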

  10. The influence of utility-interactive PV system characteristics to ac power networks

    NASA Astrophysics Data System (ADS)

    Takeda, Y.; Takigawa, K.; Kaminosono, H.

    Two basic experimental photovoltaic (PV) systems have been built for the study of variation of power quality, aspects of safety, and technical problems. One system uses a line-commutated inverter, while the other system uses a self-commutated inverter. A description is presented of the operating and generating characteristics of the two systems. The systems were connected to an ac simulated network which simulates an actual power distribution system. Attention is given to power generation characteristics, the control characteristics, the harmonics characteristics, aspects of coordination with the power network, and questions regarding the reliability of photovoltaic modules.

  11. Five Years of BEACO2N: First Results and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Shusterman, A.; Cohen, R. C.

    2017-12-01

    The BErkeley Atmospheric CO2 Observation Network (BEACO2N) is an ongoing greenhouse gas and air quality monitoring campaign based in the San Francisco Bay Area of Northern California. BEACO2N is a distributed network instrument consisting of low- to moderate-cost commercial sensors for CO2 and other pollutants installed on top of schools, museums, and other outreach-minded institutions. The reduced cost of each individual sensor "node" enables the deployment of a larger volume of total nodes, resulting in a web of approximately 50 sites with an average node-to-node distance of 2 km. Operating in some variation of this configuration since 2012, BEACO2N offers greater spatio-temporal coverage than any other fixed CO2 monitoring network to date. This high-resolution information allows us to faithfully represent the true heterogeneity of urban emission processes and distinguish between specific sources that are often regulated independently, but typically treated en masse by sparser, conventional surface monitors. However, maintaining and appropriately interpreting a network of BEACO2N's size presents a number of unique data quality and data coverage challenges. Here we describe the quantitative capabilities of the BEACO2N platform, first results from initial attempts at constraining greenhouse gas emission estimates, as well as other lessons learned over the first five years of operation.

  12. Automatic quality assessment of apical four-chamber echocardiograms using deep convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Abdi, Amir H.; Luong, Christina; Tsang, Teresa; Allan, Gregory; Nouranian, Saman; Jue, John; Hawley, Dale; Fleming, Sarah; Gin, Ken; Swift, Jody; Rohling, Robert; Abolmaesumi, Purang

    2017-02-01

    Echocardiography (echo) is the most common test for diagnosis and management of patients with cardiac conditions. While most medical imaging modalities benefit from a relatively automated procedure, this is not the case for echo, and the quality of the final echo view depends on the competency and experience of the sonographer. It is not uncommon that the sonographer does not have adequate experience to adjust the transducer and acquire a high quality echo, which may further affect the clinical diagnosis. In this work, we aim to aid the operator during image acquisition by automatically assessing the quality of the echo and generating the Automatic Echo Score (AES). This quality assessment method is based on a deep convolutional neural network, trained in an end-to-end fashion on a large dataset of apical four-chamber (A4C) echo images. For this project, an expert cardiologist went through 2,904 A4C images obtained from independent studies and assessed their condition based on a 6-scale grading system. The scores assigned by the expert ranged from 0 to 5, and the distribution of scores among the 6 levels was almost uniform. The network was then trained on 80% of the data (2,345 samples). The average absolute error of the trained model in calculating the AES was 0.8 ± 0.72. The computation time of the GPU implementation of the neural network was estimated at 5 ms per frame, which is sufficient for real-time deployment.
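
    A hedged sketch of the kind of model described (not the authors' exact architecture, which the abstract does not fully specify): a small convolutional regressor mapping a grayscale echo frame to a continuous quality score, trained with an absolute-error loss to match the reported metric. Input size and layer widths are assumptions.

```python
# Minimal CNN regressor for an Automatic Echo Score in [0, 5].
import tensorflow as tf

def build_aes_model(input_shape=(128, 128, 1)):
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),          # continuous quality score
    ])

model = build_aes_model()
model.compile(optimizer="adam", loss="mae")   # MAE mirrors the reported 0.8 error
model.summary()
```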

  13. 13 CFR 130.330 - Operating requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... assessment and evaluation, and internal quality control. (c) The Lead Center shall be open to the public... provide administrative services and coordination for the SBDC network, including program development... Project Officer as soon as is feasible. Other SBDC service providers shall be open during the normal...

  14. 76 FR 12126 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ... behavior. All 147 networked crisis centers will complete the Web-based Crisis Center Survey annually. The Survey requests information about organizational structure, staffing, scope of services, call center operations, quality assurance, community outreach/marketing, telephone equipment, data collection, and...

  15. Measurement results obtained from air quality monitoring system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turzanski, P.K.; Beres, R.

    1995-12-31

    An automatic air pollution monitoring system has operated in Cracow since 1991. The organization, assembly and start-up of the network was a joint effort of the US Environmental Protection Agency and the Cracow environmental protection service. At present the automatic monitoring network is operated by the Provincial Inspection of Environmental Protection. In total, seven stationary stations situated in Cracow measure air pollution. These stations are supported continuously by one semi-mobile (transportable) station, which allows the area under investigation to be modified periodically so that the three-dimensional picture of the creation and distribution of air pollutants within the Cracow area can be made more intelligible.

  16. Space-based Networking Technology Developments in the Interplanetary Network Directorate Information Technology Program

    NASA Technical Reports Server (NTRS)

    Clare, Loren; Clement, B.; Gao, J.; Hutcherson, J.; Jennings, E.

    2006-01-01

    Describes recent development of communications protocols, services, and associated tools targeted to reduce risk, reduce cost and increase the efficiency of IND infrastructure and supported mission operations. The space-based networking technologies developed: a) provide differentiated quality of service (QoS) that gives precedence to traffic that users have selected as having the greatest importance and/or time-criticality; b) improve the total value of information to users through the use of QoS prioritization techniques; c) increase operational flexibility and improve command-response turnaround; d) enable a new class of networked and collaborative science missions; e) simplify application interfaces to communications services; and f) reduce risk and cost through a common object model and automated scheduling and communications protocols. Technologies are described in three general areas: communications scheduling, middleware, and protocols. A simulation environment was additionally developed, which provides a comprehensive, quantitative understanding of the technologies' performance within the overall, evolving architecture, as well as the ability to refine and optimize specific components.

  17. Regulation of distribution network business

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roman, J.; Gomez, T.; Munoz, A.

    1999-04-01

    The traditional distribution function actually comprises two separate activities: distribution network and retailing. Retailing, which is also termed supply, consists of trading electricity at the wholesale level and selling it to the end users. The distribution network business, or merely distribution, is a natural monopoly and it must be regulated. Increasing attention is presently being paid to the regulation of distribution pricing. Distribution pricing comprises two major tasks: global remuneration of the distribution utility, and tariff setting by allocation of the total costs among all the users of the network services. In this paper, the basic concepts for establishing the global remuneration of a distribution utility are presented. A remuneration scheme is proposed which recognizes adequate investment and operation costs, promotes loss reduction and incentivizes control of the quality-of-service level. Efficient investment and operation costs are calculated by using different types of strategic planning and regression analysis models. Application examples that were used during the distribution regulation process in Spain are also presented.

  18. An Efficient Wireless Sensor Network for Industrial Monitoring and Control.

    PubMed

    Aponte-Luis, Juan; Gómez-Galán, Juan Antonio; Gómez-Bravo, Fernando; Sánchez-Raya, Manuel; Alcina-Espigado, Javier; Teixido-Rovira, Pedro Miguel

    2018-01-10

    This paper presents the design of a wireless sensor network particularly intended for remote monitoring and control of industrial parameters. The article describes the network components, protocol and sensor deployment, aimed at meeting industrial constraints and assuring reliability and low power consumption. A particular case study is presented. The system consists of a base station, gas sensing nodes, a tree-based routing scheme for the wireless sensor nodes and a real-time monitoring application that operates from a remote computer and a mobile phone. The system assures industrial safety quality, and the measurement and monitoring system achieves efficient industrial monitoring operations. The robustness of the developed system and the security of the communications have been guaranteed at both the hardware and software levels. The system is flexible and can be adapted to different environments. Testing of the system confirms the feasibility of the proposed implementation and validates the functional requirements of the developed devices, the networking solution and the power consumption management.

  19. An Efficient Wireless Sensor Network for Industrial Monitoring and Control

    PubMed Central

    Aponte-Luis, Juan; Gómez-Bravo, Fernando; Sánchez-Raya, Manuel; Alcina-Espigado, Javier; Teixido-Rovira, Pedro Miguel

    2018-01-01

    This paper presents the design of a wireless sensor network particularly intended for remote monitoring and control of industrial parameters. The article describes the network components, protocol and sensor deployment, aimed at meeting industrial constraints and assuring reliability and low power consumption. A particular case study is presented. The system consists of a base station, gas sensing nodes, a tree-based routing scheme for the wireless sensor nodes and a real-time monitoring application that operates from a remote computer and a mobile phone. The system assures industrial safety quality, and the measurement and monitoring system achieves efficient industrial monitoring operations. The robustness of the developed system and the security of the communications have been guaranteed at both the hardware and software levels. The system is flexible and can be adapted to different environments. Testing of the system confirms the feasibility of the proposed implementation and validates the functional requirements of the developed devices, the networking solution and the power consumption management. PMID:29320466

  20. Fault discovery protocol for passive optical networks

    NASA Astrophysics Data System (ADS)

    Hajduczenia, Marek; Fonseca, Daniel; da Silva, Henrique J. A.; Monteiro, Paulo P.

    2007-06-01

    All existing flavors of passive optical networks (PONs) provide an attractive alternative to legacy copper-based access lines deployed between a central office (CO) of the service provider (SP) and a customer site. One of the most challenging tasks for PON network planners is the reduction of the overall cost of employing protection schemes for the optical fiber plant while maintaining a reasonable level of survivability and reducing the downtime, thus ensuring acceptable levels of quality of service (QoS) for end subscribers. The recently growing volume of Ethernet PONs deployment [Kramer, IEEE 802.3, CFI (2006)], connected with low-cost electronic and optical components used in the optical network unit (ONU) modules, results in the situation where remote detection of faulty/active subscriber modules becomes indispensable for proper operation of an EPON system. The problem of the remote detection of faulty ONUs in the system is addressed where the upstream channel is flooded with the cw transmission from one or more damaged ONUs and standard communication is severed, providing a solution that is applicable in any type of PON network, regardless of the operating protocol, physical structure, and data rate.

  1. AS Migration and Optimization of the Power Integrated Data Network

    NASA Astrophysics Data System (ADS)

    Zhou, Junjie; Ke, Yue

    2018-03-01

    In the transformation of a data integrated network, the impact on the business it carries has always been the most important factor in judging the quality of the transformation. Given the importance of the services carried by the data network, specific design proposals must be put forward during the transformation and validated through extensive demonstration and practice to ensure that the program meets the requirements of the enterprise data network. This paper demonstrates a scheme, used in the reconstruction of a power data integrated network, for migrating point-to-point access equipment: the BGP autonomous system is migrated to the domain specified in the industry standard, and the intranet OSPF protocol is smoothly migrated to the IS-IS protocol. Through this optimized design, the power data network's traffic-forwarding performance was improved, forwarding paths were optimized, scalability was increased, the risk of potential loops was lowered, network stability was improved, and operational costs were reduced.

  2. Effect of filtration of signals of brain activity on quality of recognition of brain activity patterns using artificial intelligence methods

    NASA Astrophysics Data System (ADS)

    Hramov, Alexander E.; Frolov, Nikita S.; Musatov, Vyachaslav Yu.

    2018-02-01

    In the present work we studied the classification of human brain states corresponding to real movements of the hands and legs. For this purpose we used a supervised learning algorithm based on feed-forward artificial neural networks (ANNs) with error back-propagation, along with the support vector machine (SVM) method. We compared the quality of operator movement classification from experimentally obtained EEG signals, both without preliminary processing and after filtering in different frequency ranges up to 25 Hz. It was shown that low-frequency filtering of multichannel EEG data significantly improves the accuracy of operator movement classification.
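
    The pipeline described, low-pass filtering multichannel EEG below 25 Hz and then classifying movement type with an SVM, can be sketched as below. The signals are synthetic surrogates (amplitude-modulated 10 Hz rhythms plus noise), the sampling rate is assumed, and this is not the experimental data or the authors' code.

```python
# Low-pass filter synthetic 8-channel EEG epochs below 25 Hz, extract
# per-channel power features, and cross-validate an SVM classifier.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

fs = 250.0                                   # sampling rate (Hz), assumed
b, a = butter(4, 25.0 / (fs / 2), btype="low")

rng = np.random.default_rng(5)
def trial(amp):                              # toy 2-s, 8-channel EEG epoch
    t = np.arange(0, 2, 1 / fs)
    return amp * np.sin(2 * np.pi * 10.0 * t) + rng.normal(0, 1.0, (8, t.size))

X, y = [], []
for label, amp in [(0, 0.5), (1, 1.5)]:      # "hand" vs "leg" surrogate rhythms
    for _ in range(40):
        filtered = filtfilt(b, a, trial(amp), axis=-1)
        X.append(filtered.var(axis=-1))      # per-channel band-power feature
        y.append(label)

print(cross_val_score(SVC(kernel="rbf"), np.array(X), np.array(y), cv=5).mean())
```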

  3. The automated ground network system

    NASA Technical Reports Server (NTRS)

    Smith, Miles T.; Militch, Peter N.

    1993-01-01

    The primary goal of the Automated Ground Network System (AGNS) project is to reduce Ground Network (GN) station life-cycle costs. To accomplish this goal, the AGNS project will employ an object-oriented approach to develop a new infrastructure that will permit continuous application of new technologies and methodologies to the Ground Network's class of problems. The AGNS project is a Total Quality (TQ) project. Through use of an open collaborative development environment, developers and users will have equal input into the end-to-end design and development process. This will permit direct user input and feedback and will enable rapid prototyping for requirements clarification. This paper describes the AGNS objectives, operations concept, and proposed design.

  4. Novel Method for Detection of Air Pollution using Cellular Communication Networks

    NASA Astrophysics Data System (ADS)

    David, N.; Gao, O. H.

    2016-12-01

    Air pollution can lead to a wide spectrum of severe and chronic health impacts. Conventional tools for monitoring the phenomenon do not provide a sufficient monitoring solution in a global scale since they are, for example, not representative of the larger space or due to limited deployment as a result of practical limitations, such as: acquisition, installation, and ongoing maintenance costs. Near ground temperature inversions are directly identified with air pollution events since they suppress vertical atmospheric movement and trap pollutants near the ground. Wireless telecommunication links that comprise the data transfer infrastructure in cellular communication networks operate at frequencies of tens of GHz and are affected by different atmospheric phenomena. These systems are deployed near ground level across the globe, including in developing countries such as India, countries in Africa, etc. Many cellular providers routinely store data regarding the received signal levels in the network for quality assurance needs. Temperature inversions cause atmospheric layering, and change the refractive index of the air when compared to standard conditions. As a result, the ducts that are formed can operate, in essence, as atmospheric wave guides, and cause interference (signal amplification / attenuation) in the microwaves measured by the wireless network. Thus, this network is in effect, an existing system of environmental sensors for monitoring temperature inversions and the episodes of air pollution identified with them. This work presents the novel idea, and demonstrates it, in operation, over several events of air pollution which were detected by a standard cellular communication network during routine operation. Reference: David, N. and Gao, H.O. Using cellular communication networks to detect air pollution, Environmental Science & Technology, 2016 (accepted).

  5. Improving the power quality of a wind energy conversion system based on a doubly-fed induction machine connected to the electrical grid

    NASA Astrophysics Data System (ADS)

    Abderrahim, Iheb

    Wind power generation has grown strongly in the last decade, driving the development of wind energy conversion systems (WECS) at the levels of modeling and electrical control. Modern WECS operate at varying wind speeds and are equipped with synchronous or asynchronous generators. Among these generators, the doubly-fed induction generator (DFIG) offers several advantages, including active and reactive power capability in all four quadrants. A DFIG-based WECS also incurs lower conversion costs and minimal energy losses compared with a WECS based on a synchronous generator fed entirely through full-scale power converters. Connecting such a system to the electrical distribution network involves bidirectional power flow, clearly established in the sub-synchronous and super-synchronous operating modes of the DFIG: the grid supplies active power to the rotor in sub-synchronous mode and receives active power from the rotor in super-synchronous mode. Power quality is thus of major importance when integrating wind power into the grid. Poor waveform quality can affect network stability and could even cause major problems, all the more so where non-linear loads, such as switching power supplies and variable-speed drives, are connected to the grid. The idea of this research work is to mitigate waveform quality problems while ensuring better implementation of the DFIG, so that the whole WECS remains insensitive to external disturbances and parametric variations. The grid-side converter (GSC) must be able to compensate harmonics, current unbalance, and reactive power injected by a non-linear, unbalanced three-phase load connected to the grid. In addition to these features for improving grid operating conditions, it also handles the power flow during the different operating modes of the DFIG; this is a simple, efficient, and cost-competitive solution that saves the use of other power equipment. At the same time, the energy efficiency of the wind power conversion chain is improved by maximum power point tracking (MPPT). Our research led us to select vector control in a synchronous reference frame to achieve these objectives. The DFIG-based WECS is simulated in MATLAB SIMULINK in the presence of balanced and unbalanced non-linear three-phase loads.
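
    The abstract invokes MPPT without detailing the tracker. A generic perturb-and-observe loop, sketched below under assumed interfaces (measure_power and set_speed_ref are hypothetical callbacks, not from the thesis), conveys the idea: keep perturbing the operating point in whichever direction raises extracted power.

      def perturb_and_observe(measure_power, set_speed_ref, ref, step=0.01, iters=200):
          """Generic P&O tracker; interfaces and step size are assumptions."""
          set_speed_ref(ref)
          p_prev = measure_power()
          direction = 1.0
          for _ in range(iters):
              ref += direction * step
              set_speed_ref(ref)
              p = measure_power()
              if p < p_prev:            # power fell: reverse the perturbation
                  direction = -direction
              p_prev = p
          return ref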

  6. Internet Tomography in Support of Internet and Network Simulation and Emulation Modelling

    NASA Astrophysics Data System (ADS)

    Moloisane, A.; Ganchev, I.; O'Droma, M.

    This paper addresses Internet performance measurement data extracted through Internet tomography techniques and metrics, and how such data may be used to enhance the capacity of network simulation and emulation modelling. The advantages of network simulation and emulation as means to aid the design and development of the component networks that make up the Internet, and that are fundamental to its ongoing evolution, are highlighted. The Internet's rapid growth has spurred development of new protocols and algorithms to meet changing operational requirements such as security, multicast delivery, mobile networking, policy management, and quality of service (QoS) support. Both the development and the evaluation of these operational tools require answering many design and operational questions. Creating the technical support required by network engineers and managers in their efforts to answer these questions is in itself a major challenge. Within the Internet, the number and range of services supported continues to grow exponentially, from legacy and client/server applications to VoIP, multimedia streaming services and interactive multimedia services. Services have their own distinctive requirements and idiosyncrasies: they respond differently to bandwidth limitations, latency and jitter problems, and they generate different types of “conversations” between end-user terminals, back-end resources and middle-tier servers. To add to the complexity, each new or enhanced service introduced onto the network contends for available bandwidth with every other service. To ensure that networking products and resources are designed and developed to handle the diverse conditions encountered in real Internet environments, network simulation and emulation modelling is a valuable tool, and is becoming a critical element, in networking product and application design and development. The better these laboratory tools reflect real-world environments and conditions, the more helpful they will be to designers.

  7. The wireless networking system of Earthquake precursor mobile field observation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Teng, Y.; Wang, X.; Fan, X.; Wang, X.

    2012-12-01

    A mobile field observation network can record and transmit large amounts of data reliably and in real time, strengthening physical signal observations in specific regions and periods and thereby improving monitoring capacity and anomaly tracking capability. Because current earthquake precursor observation points are numerous and widely scattered, the networking technology is based on the McWILL broadband wireless access system. The communication system for earthquake precursor mobile field observation transmits large amounts of data reliably and in real time from the measuring points to the monitoring center, through the connection between the instruments, the wireless access system, and the precursor mobile observation management center system, thereby implementing remote instrument monitoring and data transmission. This technology has been applied to fluxgate magnetometer array geomagnetic observations in Tianzhu, Xichang, and Xinjiang, where it has enabled real-time monitoring of the working status of instruments laid out over a large area during the last two to three years of large-scale field operation. It thus provides geomagnetic field data for locally refined regions and supplies high-quality observational data for impending-earthquake tracking and forecasting. Although wireless networking is well suited to mobile field observation, being simple and flexible to deploy, it also suffers packet loss when transmitting large volumes of observational data, owing to the relatively weak wireless signal and narrow bandwidth. For high-sampling-rate instruments, this project uses data compression, which effectively solves the problem of packet loss in data transmission; control commands, status data, and observational data are transmitted with different priorities and by different means, which keeps the packet loss rate within an acceptable range and does not affect the real-time observation curves. After field running tests and applications in earthquake tracking projects, the mobile field observation wireless networking system operates normally, its functions show good operability and performance, and the quality of data transmission meets the system design requirements, playing a significant role in practical applications.
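
    A minimal sketch, under assumed message types, of the priority scheme the abstract outlines: control commands and status data are sent ahead of bulk observational data, so loss on the weak wireless link does not disturb the real-time curves.

      import queue

      PRIORITY = {"control": 0, "status": 1, "observation": 2}  # assumed ordering

      txq = queue.PriorityQueue()

      def enqueue(kind, seq, payload):
          txq.put((PRIORITY[kind], seq, payload))

      enqueue("observation", 1, b"...compressed waveform block...")
      enqueue("control", 2, b"recenter sensor")
      enqueue("status", 3, b"battery=12.1V")

      while not txq.empty():
          prio, seq, payload = txq.get()   # control first, then status, then data
          print(prio, seq, payload)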

  8. Application of Six Sigma methodology to a cataract surgery unit.

    PubMed

    Taner, Mehmet Tolga

    2013-01-01

    The article's aim is to focus on the application of Six Sigma to minimise intraoperative and post-operative complication rates in a Turkish public hospital cataract surgery unit. Implementing define-measure-analyse-improve-control (DMAIC) involves process mapping, fishbone diagrams and rigorous data collection. Failure mode and effect analysis (FMEA), Pareto diagrams, control charts and process capability analysis are applied to redress the root causes of cataract surgery failures. Insufficient skills of assistant surgeons and technicians, low quality of the IOLs used, wrong IOL placement, unsystematic sterilisation of surgery rooms and devices, and an unprioritised network system are found to be the critical drivers of intraoperative and post-operative complications. The sigma level was increased from 2.60 to 3.75 following extensive training of assistant surgeons, ophthalmologists and technicians, better quality IOLs, systematic sterilisation and air-filtering, and the implementation of a more sophisticated network system. This article shows that Six Sigma measurement and process improvement can become the impetus for cataract unit staff to rethink their process and reduce malpractice. Measuring, recording and reporting data regularly helps them to continuously monitor their overall process and deliver safer treatments. This is the first Six Sigma ophthalmology study in Turkey.
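
    For context, a common way to turn an observed defect rate into the short-term sigma level quoted here (2.60 rising to 3.75) uses the normal quantile plus the conventional 1.5-sigma shift; the complication counts in the example are invented, chosen so that the result matches the reported pre-improvement level.

      from scipy.stats import norm

      def sigma_level(defects, opportunities):
          """Short-term sigma level from defects per million opportunities (DPMO)."""
          dpmo = 1e6 * defects / opportunities
          return norm.ppf(1 - dpmo / 1e6) + 1.5   # conventional 1.5-sigma shift

      print(round(sigma_level(135, 1000), 2))   # hypothetical counts -> 2.6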

  9. Effects of equipment performance on data quality from the National Atmospheric Deposition Program/National Trends Network and the Mercury Deposition Network

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Rhodes, Mark F.

    2013-01-01

    The U.S. Geological Survey Branch of Quality Systems operates the Precipitation Chemistry Quality Assurance project (PCQA) to provide independent, external quality assurance for the National Atmospheric Deposition Program (NADP). NADP is composed of five monitoring networks that measure the chemical composition of precipitation and ambient air. PCQA and the NADP Program Office completed five short-term studies to investigate the effects of equipment performance on National Trends Network (NTN) and Mercury Deposition Network (MDN) data quality: sample evaporation from NTN collectors; sample volume and mercury loss from MDN collectors; mercury adsorption to MDN collector glassware; grid-type precipitation sensors for precipitation collectors; and the effects of an NTN collector wind shield on sample catch efficiency. Sample-volume evaporation from an NTN Aerochem Metrics (ACM) collector ranged from 1.1 to 33 percent, with a median of 4.7 percent. The results suggest that weekly NTN sample evaporation is small relative to sample volume. MDN sample evaporation occurs predominantly in western and southern regions of the United States (U.S.) and more frequently with modified ACM collectors than with N-CON Systems Inc. collectors, due to differences in airflow through the collectors. Variations in mercury concentrations, measured to be as high as 47.5 percent per week with a median of 5 percent, are associated with MDN sample-volume loss. Small amounts of mercury are also lost from MDN samples by adsorption to collector glassware, irrespective of collector type. MDN 11-grid sensors were found to open collectors sooner, keep them open longer, and cause fewer lid cycles than NTN 7-grid sensors. Wind-shielding an NTN ACM collector resulted in collection of larger quantities of precipitation while also preserving sample integrity.

  10. Qualitative Comparison of Streamflow Information Programs of the U.S. Geological Survey and Three Non-Federal Agencies

    USGS Publications Warehouse

    Norris, J. Michael; Lewis, Michael; Dorsey, Michael; Kimbrough, Robert; Holmes, Robert R.; Staubitz, Ward

    2008-01-01

    A qualitative comparison was made of the streamgaging programs of the U.S. Geological Survey (USGS) and three non-Federal agencies in terms of approximate costs and streamflow-information products produced. The three non-Federal agencies provided the USGS with detailed information on their streamgaging programs and related costs, and the USGS explored, through publicly available Web sites and one-on-one discussions, the comparability of the streamflow information produced. The type and purpose of the streamgages operated, the quality of the streamflow record produced, and cost-accounting methods have a great effect on streamgaging costs. There are many uses of streamflow information, and the information requirements for streamgaging programs differ greatly across this range of purposes. A premise of the USGS streamgaging program is that the network must produce consistent data of sufficient quality to support the broadest range of possible uses. Other networks may have a narrower range of purposes; as a consequence, their method of operation, data-quality objectives, and information delivery may differ from those of a multipurpose network. As a result, direct comparison of the overall cost (or of the cost per streamgage) among these programs is not possible. The analysis is, nonetheless, very instructive and provides USGS program managers, agency leadership, and other agency streamgaging program managers with useful insight to inform future decisions. Even though the comparison of streamgaging costs and streamflow-information products was qualitative, this analysis does offer useful insights on longstanding questions about USGS streamgaging costs.

  11. A review of international biobanks and networks: success factors and key benchmarks.

    PubMed

    Vaught, Jim; Kelly, Andrea; Hewitt, Robert

    2009-09-01

    Biobanks and biobanking networks are involved in varying degrees in the collection, processing, storage, and dissemination of biological specimens. This review outlines the approaches that 16 of the largest biobanks and biobanking networks in Europe, North America, Australia, and Asia have taken to collecting and distributing human research specimens and managing scientific initiatives while covering operating costs. Many are small operations that exist as either a single or a few freezers in a research laboratory, hospital clinical laboratory, or pathology suite. Larger academic and commercial biobanks operate to support large clinical and epidemiological studies. Operational and business models depend on the medical and research missions of their institutions and home countries. Some national biobanks operate with a centralized physical biobank that accepts samples from multiple locations. Others operate under a "federated" model where each institution maintains its own collections but agrees to list them on a central shared database. Some collections are "project-driven", meaning that specimens are collected and distributed to answer specific research questions. "General" collections are those that exist to establish a reference collection, that is, not to meet particular research goals but to be available to respond to multiple requests for an assortment of research uses. These individual and networked biobanking systems operate under a variety of business models, usually incorporating some form of partial cost recovery, while requiring at least partial public or government funding. Each has a well-defined biospecimen-access policy in place that specifies requirements that must be met, such as ethical clearance and the expertise to perform the proposed experiments, to obtain samples for research. The success of all of these biobanking models depends on a variety of factors including well-defined goals, a solid business plan, and specimen collections that are developed according to strict quality and operational controls.

  12. Mapping a Careflow Network to assess the connectedness of Connected Health.

    PubMed

    Carroll, Noel; Richardson, Ita

    2017-04-01

    Connected Health is an emerging and rapidly developing field which has the potential to transform healthcare service systems by increasing their safety, quality and overall efficiency. From a healthcare perspective, process improvement models have mainly focused on the static workflow viewpoint. The objective of this article is to study and model the dynamic nature of healthcare delivery, allowing us to identify where potential issues exist within the service system and to examine how Connected Health technological solutions may support service efficiencies. We explore the application of social network analysis (SNA) as a modelling technique which captures the dynamic nature of a healthcare service. We demonstrate how it can be used to map the 'Careflow Network' and guide Connected Health innovators to examine specific opportunities within the healthcare service. Our results indicate that healthcare technology must be correctly identified and implemented within the Careflow Network to achieve improvements in service delivery. Oftentimes, prior to making the transformation to Connected Health, researchers use modelling techniques that fail to identify where Connected Health innovation is best placed in a healthcare service network. Using SNA allows us to develop an understanding of the current operation of the healthcare system within which innovators can effect change. It is important to identify and model the resource exchanges to ensure that the quality and safety of care are enhanced, efficiencies are increased and the overall healthcare service system is improved. We have shown that dynamic models allow us to study the exchange of resources. These exchanges are often intertwined within a socio-technical context in an informal manner and are not accounted for in static models, yet they capture a truer insight into the operations of a Careflow Network.
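
    Purely as an illustration of the SNA step (the article's actual network data are not reproduced here), a careflow can be modelled as a directed graph of resource exchanges and its roles ranked by betweenness centrality to locate bottlenecks where Connected Health technology might help; the nodes and edges below are invented.

      import networkx as nx

      # Hypothetical careflow: edges denote resource/information exchanges.
      G = nx.DiGraph()
      G.add_edges_from([
          ("patient", "triage_nurse"), ("triage_nurse", "physician"),
          ("physician", "lab"), ("lab", "physician"),
          ("physician", "pharmacy"), ("pharmacy", "patient"),
      ])

      # High betweenness marks roles that broker many exchanges: candidate bottlenecks.
      centrality = nx.betweenness_centrality(G)
      for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
          print(f"{node:>12}: {score:.2f}")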

  13. [Establishing and operating a human biobank. Ethical aspects].

    PubMed

    Jahns, Roland

    2016-03-01

    Particularly in the past decade which has been marked by efforts to foster individualized/personalized medicine the need for well-characterized high-quality collections of human biological material has significantly increased. When establishing and operating a human biobank the interests and the "freedom" of biomedical research must always be weighed against the interests and rights of patients and/or donors; in this process ethical aspects should be considered systematically. In addition, the importance of quality control and quality assurance has largely increased in human biobanking, both from a scientific and even more from an ethical point of view, because donated biological materials are potentially stored for decades and (on request) might serve for currently not foreseeable biomedical research purposes. In addition, the compatibility of national human biobanks with international biobank networks becomes increasingly important.

  14. Data-centric multiobjective QoS-aware routing protocol for body sensor networks.

    PubMed

    Razzaque, Md Abdur; Hong, Choong Seon; Lee, Sungwon

    2011-01-01

    In this paper, we address Quality-of-Service (QoS)-aware routing issue for Body Sensor Networks (BSNs) in delay and reliability domains. We propose a data-centric multiobjective QoS-Aware routing protocol, called DMQoS, which facilitates the system to achieve customized QoS services for each traffic category differentiated according to the generated data types. It uses modular design architecture wherein different units operate in coordination to provide multiple QoS services. Their operation exploits geographic locations and QoS performance of the neighbor nodes and implements a localized hop-by-hop routing. Moreover, the protocol ensures (almost) a homogeneous energy dissipation rate for all routing nodes in the network through a multiobjective Lexicographic Optimization-based geographic forwarding. We have performed extensive simulations of the proposed protocol, and the results show that DMQoS has significant performance improvements over several state-of-the-art approaches.
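
    A sketch, with assumed objective ordering and field names, of what lexicographic next-hop selection can look like: candidate neighbors are compared on ordered criteria (here energy class first, then delay, then geographic progress), so a later objective only breaks ties in the earlier ones.

      def pick_next_hop(neighbors):
          """neighbors: dicts with 'energy_class', 'delay_ms', 'progress_m' (assumed)."""
          return min(
              neighbors,
              key=lambda n: (-n["energy_class"], n["delay_ms"], -n["progress_m"]),
          )

      candidates = [
          {"id": "A", "energy_class": 2, "delay_ms": 12.0, "progress_m": 4.0},
          {"id": "B", "energy_class": 2, "delay_ms": 9.5,  "progress_m": 3.1},
          {"id": "C", "energy_class": 1, "delay_ms": 5.0,  "progress_m": 6.2},
      ]
      print(pick_next_hop(candidates)["id"])   # -> "B": ties on energy, lower delay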

  15. Operating a global seismic network - perspectives from the USGS GSN

    NASA Astrophysics Data System (ADS)

    Gee, L. S.; Derr, J. S.; Hutt, C. R.; Bolton, H.; Ford, D.; Gyure, G. S.; Storm, T.; Leith, W.

    2007-05-01

    The Global Seismographic Network (GSN) is a permanent digital network of state-of-the-art seismological and geophysical sensors connected by a global telecommunications network, serving as a multi-use scientific facility for seismic monitoring and response applications, basic and applied research in solid-earth geophysics, and earth science education. A joint program of the U.S. Geological Survey (USGS), the National Science Foundation, and the Incorporated Research Institutions for Seismology (IRIS), the GSN provides near-uniform, worldwide monitoring of the Earth through 144 modern, globally distributed seismic stations. The USGS currently operates 90 GSN or GSN-affiliate stations. As a US government program, the USGS GSN is evaluated on several performance measures, including data availability, data latency, and cost effectiveness. The USGS component of the GSN, like the GSN as a whole, is in transition from a period of rapid growth to steady-state operations. The program faces the challenges of aging equipment and increased operating costs at the same time that national and international earthquake and tsunami monitoring agencies place an increased reliance on GSN data. Data acquisition of the USGS GSN is based on the Quanterra Q680 datalogger, a workhorse system that is approaching twenty years in the field, often in harsh environments. An IRIS instrumentation committee recently selected the Quanterra Q330 HR as the "next generation" GSN data acquisition system, and the USGS will begin deploying the new equipment in the middle of 2007. These new systems will address many of the issues associated with the aging Q680 while providing a platform for interoperability across the GSN. In order to address the challenge of increasing operational costs, the USGS employs several tools. First, the USGS benefits from the contributions of local host institutions. The station operators are the first line of defense when a station experiences problems, changing boards, swapping cables, and re-centering sensors. To facilitate this effort, the USGS maintains supplies of on-site spares at a number of stations, primarily those with difficult shipping or travel logistics. In addition, the USGS is moving toward the GSN standard of installing a secondary broadband sensor at each site, to serve as a backup in case of failure of the primary broadband sensor. The recent transition to real-time telemetry has been an enormous boon for station operations as well as for earthquake and tsunami monitoring. For example, the USGS examines waveforms daily for data dropouts (gaps), out-of-nominal-range data values, and overall noise levels. Higher-level quality control focuses on problems in sensitivity, timing, polarity, orientation, and general instrument behavior. These quality control operations are essential for quickly identifying problems with stations, allowing remedial or preventive maintenance that preserves data continuity and quality and minimizes catastrophic failure of the station or significant loss of data. The USGS tracks network performance using a variety of tools. Web pages with plots of waveforms (heliplots), data latency, and data availability give quick views of station status. The USGS has recently implemented other monitoring tools, such as SeisNetWatch, for evaluating station state of health.
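
    The daily checks described (gaps, out-of-nominal-range values, overall noise) can be pictured with a small stand-in like the following; real GSN quality control operates on miniSEED streams, and the nominal range used here is an assumption.

      import numpy as np

      def qc_waveform(x, lo=-8e6, hi=8e6):
          """Count gap samples (encoded as NaN) and out-of-range values; report RMS."""
          x = np.asarray(x, dtype=float)
          gaps = int(np.isnan(x).sum())
          finite = x[~np.isnan(x)]
          out_of_range = int(((finite < lo) | (finite > hi)).sum())
          return {"gap_samples": gaps, "out_of_range": out_of_range,
                  "rms": float(np.sqrt(np.mean(finite ** 2)))}

      print(qc_waveform([1.0, 2.0, np.nan, 9e6, -3.0]))  # one gap, one outlier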

  16. A statistical model for water quality predictions from a river discharge using coastal observations

    NASA Astrophysics Data System (ADS)

    Kim, S.; Terrill, E. J.

    2007-12-01

    Understanding and predicting coastal ocean water quality has benefits for reducing human health risks, protecting the environment, and improving local economies that depend on clean beaches. Continuous observations of coastal physical oceanography increase understanding of the processes which control the fate and transport of a riverine plume that potentially contains high levels of contaminants from the upstream watershed. A data-driven model of the fate and transport of river plume water from the Tijuana River has been developed using surface current observations provided by a network of HF radars operated as part of a local coastal observatory that has been in place since 2002. The model outputs are compared with water quality sampling of shoreline indicator bacteria, and the skill of an alarm for low water quality is evaluated using the receiver operating characteristic (ROC) curve. In addition, a statistical analysis of beach closures in comparison with environmental variables is discussed.
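
    A compact sketch of the alarm-skill evaluation named in the abstract: given binary water-quality exceedances and a model-derived risk score, the ROC curve and its area summarize alarm performance. The data below are synthetic, not the study's observations.

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(1)
      exceeded = rng.integers(0, 2, 200)              # bacteria over threshold? (synthetic)
      risk = exceeded * 0.5 + rng.random(200) * 0.8   # stand-in model output

      fpr, tpr, thresholds = roc_curve(exceeded, risk)
      print(f"AUC = {roc_auc_score(exceeded, risk):.2f}")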

  17. Quantifying the Impact of Feedstock Quality on the Design of Bioenergy Supply Chain Networks

    DOE PAGES

    Castillo-Villar, Krystel; Minor-Popocatl, Hertwin; Webb, Erin

    2016-03-01

    Logging residues, which refer to the unused portions of trees cut during logging, are important sources of biomass for the emerging biofuel industry and are critical feedstocks for first-type biofuel facilities (e.g., corn-ethanol facilities). Logging residues are under-utilized sources of biomass for energetic purposes. To support the scaling-up of the bioenergy industry, it is essential to design cost-effective biofuel supply chains that not only minimize costs but also consider biomass quality characteristics. Biomass quality is heavily dependent upon the moisture and ash contents. Ignoring biomass quality characteristics and their intrinsic costs may yield substantial economic losses that will only be discovered after operations at a biorefinery have begun. This paper proposes a novel bioenergy supply chain network design model that minimizes operational costs and includes biomass quality-related costs. The proposed model is unique in the sense that it supports decisions where quality is not unrealistically assumed to be perfect. The effectiveness of the proposed methodology is proven by assessing a case study in the state of Tennessee, USA. The results demonstrate that the ash and moisture contents of logging residues affect the performance of the supply chain in monetary terms. Higher-than-target moisture and ash contents incur additional quality-related costs. The quality-related costs in the optimal solution (with a final ash content of 1% and a final moisture content of 50%) account for 27% of the overall supply chain cost. In conclusion, based on the numerical experimentation, the total supply chain cost increased by 7%, on average, for each additional percent in the final ash content.
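
    To make the paper's central point concrete in miniature (this is not the authors' optimization model, and the penalty coefficients below are invented), delivered feedstock cost can be written as a base cost plus penalties for moisture and ash above their targets:

      def delivered_cost(base_cost, moisture, ash,
                         target_moisture=0.50, target_ash=0.01,
                         c_moisture=40.0, c_ash=120.0):
          """Base cost plus quality penalties; coefficients are illustrative."""
          penalty = (c_moisture * max(0.0, moisture - target_moisture)
                     + c_ash * max(0.0, ash - target_ash))
          return base_cost + penalty

      print(delivered_cost(70.0, moisture=0.55, ash=0.03))  # 70 + 2.0 + 2.4 = 74.4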

  18. Quantifying the Impact of Feedstock Quality on the Design of Bioenergy Supply Chain Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castillo-Villar, Krystel; Minor-Popocatl, Hertwin; Webb, Erin

    Logging residues, which refer to the unused portions of trees cut during logging, are important sources of biomass for the emerging biofuel industry and are critical feedstocks for first-type biofuel facilities (e.g., corn-ethanol facilities). Logging residues are under-utilized sources of biomass for energetic purposes. To support the scaling-up of the bioenergy industry, it is essential to design cost-effective biofuel supply chains that not only minimize costs but also consider biomass quality characteristics. Biomass quality is heavily dependent upon the moisture and ash contents. Ignoring biomass quality characteristics and their intrinsic costs may yield substantial economic losses that will only be discovered after operations at a biorefinery have begun. This paper proposes a novel bioenergy supply chain network design model that minimizes operational costs and includes biomass quality-related costs. The proposed model is unique in the sense that it supports decisions where quality is not unrealistically assumed to be perfect. The effectiveness of the proposed methodology is proven by assessing a case study in the state of Tennessee, USA. The results demonstrate that the ash and moisture contents of logging residues affect the performance of the supply chain in monetary terms. Higher-than-target moisture and ash contents incur additional quality-related costs. The quality-related costs in the optimal solution (with a final ash content of 1% and a final moisture content of 50%) account for 27% of the overall supply chain cost. In conclusion, based on the numerical experimentation, the total supply chain cost increased by 7%, on average, for each additional percent in the final ash content.

  19. Broadband Optical Access Technologies to Converge towards a Broadband Society in Europe

    NASA Astrophysics Data System (ADS)

    Coudreuse, Jean-Pierre; Pautonnier, Sophie; Lavillonnière, Eric; Didierjean, Sylvain; Hilt, Benoît; Kida, Toshimichi; Oshima, Kazuyoshi

    This paper provides insights on the status of the broadband optical access market and technologies in Europe and on the expected trends for the next generation of optical access networks. The final target for most operators, cities or any other player is of course FTTH (Fibre To The Home) deployment, although we can expect intermediate steps with copper or wireless technologies. Of the two candidate architectures for FTTH, PON (Passive Optical Network) is by far the most attractive and cost-effective solution. We also show that an Ethernet-based optical access network is well suited to all-IP networks without any impact on the level of quality of service. Finally, we provide feedback from an FTTH pilot network in Colmar (France) based on Gigabit Ethernet PON technology. The interest of this pilot lies in the level of functionality required for broadband optical access networks and also in the development of new home network configurations.

  20. Maintaining High Quality Data and Consistency Across a Diverse Flux Network: The Ameriflux QA/QC Technical Team

    NASA Astrophysics Data System (ADS)

    Chan, S.; Billesbach, D. P.; Hanson, C. V.; Biraud, S.

    2014-12-01

    The AmeriFlux quality assurance and quality control (QA/QC) technical team conducts short-term (<2 weeks) intercomparisons using a portable eddy covariance system (PECS) to maintain high-quality data observations and data consistency across the AmeriFlux network (http://ameriflux.lbl.gov/). Site intercomparisons identify discrepancies between the in situ and portable measurements and calculated fluxes. Findings are jointly discussed by the site staff and the QA/QC team to improve the in situ observations. Despite the relatively short duration of an individual site intercomparison, the accumulated record of all site visits (numbering over 100 since 2002) is a unique dataset. The ability to deploy redundant sensors provides a rare opportunity to identify, quantify, and understand uncertainties in eddy covariance and ancillary measurements. We present a few specific case studies from QA/QC site visits to highlight and share new and relevant findings related to eddy covariance instrumentation and operation.

  1. Entanglement distillation between solid-state quantum network nodes.

    PubMed

    Kalb, N; Reiserer, A A; Humphreys, P C; Bakermans, J J W; Kamerling, S J; Nickerson, N H; Benjamin, S C; Twitchen, D J; Markham, M; Hanson, R

    2017-06-02

    The impact of future quantum networks hinges on high-quality quantum entanglement shared between network nodes. Unavoidable imperfections necessitate a means to improve remote entanglement by local quantum operations. We realize entanglement distillation on a quantum network primitive of distant electron-nuclear two-qubit nodes. The heralded generation of two copies of a remote entangled state is demonstrated through single-photon-mediated entangling of the electrons and robust storage in the nuclear spins. After applying local two-qubit gates, single-shot measurements herald the distillation of an entangled state with increased fidelity that is available for further use. The key combination of generating, storing, and processing entangled states should enable the exploration of multiparticle entanglement on an extended quantum network. Copyright © 2017, American Association for the Advancement of Science.

  2. The NASA Micro-Pulse Lidar Network (MPLNET): Co-location of Lidars with AERONET

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Campbell, James R.; Berkoff, Timothy A.; Spinhirne, James D.; Holben, Brent; Tsay, Si-Chee

    2004-01-01

    We present the formation of a global, ground-based, eye-safe lidar network: the NASA Micro-Pulse Lidar Network (MPLNET). The aim of MPLNET is to acquire long-term observations of aerosol and cloud vertical profiles at unique geographic sites within the NASA Aerosol Robotic Network (AERONET). Network growth follows a federated approach, pioneered by AERONET, wherein independent research groups may join MPLNET with their own instrument and site. MPLNET utilizes standard instrumentation and data processing algorithms for efficient network operations and direct comparison of data between sites. The micro-pulse lidar is eye-safe, compact, and commercially available, and most easily allows growth of the network without sacrificing standardized instrumentation goals. Real-time data products (next-day) are available, and include Level 1 daily lidar signal images from the surface to ~20 km, and Level 1.5 aerosol extinction profiles at times coincident with AERONET observations. Testing of our quality-assured aerosol extinction products, Level 2, is near completion and data will soon be available. Level 3 products, continuous daylight aerosol extinction profiles, are under development and testing has begun. An overview of MPLNET will be presented. Successful methods of merging standardized lidar operations with AERONET will also be discussed, with the first 4 years of MPLNET results serving as an example.

  3. The EuroBioBank Network: 10 years of hands-on experience of collaborative, transnational biobanking for rare diseases

    PubMed Central

    Mora, Marina; Angelini, Corrado; Bignami, Fabrizia; Bodin, Anne-Mary; Crimi, Marco; Di Donato, Jeanne-Hélène; Felice, Alex; Jaeger, Cécile; Karcagi, Veronika; LeCam, Yann; Lynn, Stephen; Meznaric, Marija; Moggio, Maurizio; Monaco, Lucia; Politano, Luisa; de la Paz, Manuel Posada; Saker, Safaa; Schneiderat, Peter; Ensini, Monica; Garavaglia, Barbara; Gurwitz, David; Johnson, Diana; Muntoni, Francesco; Puymirat, Jack; Reza, Mojgan; Voit, Thomas; Baldo, Chiara; Bricarelli, Franca Dagna; Goldwurm, Stefano; Merla, Giuseppe; Pegoraro, Elena; Renieri, Alessandra; Zatloukal, Kurt; Filocamo, Mirella; Lochmüller, Hanns

    2015-01-01

    The EuroBioBank (EBB) network (www.eurobiobank.org) is the first operating network of biobanks in Europe to provide human DNA, cell and tissue samples as a service to the scientific community conducting research on rare diseases (RDs). The EBB was established in 2001 to facilitate access to RD biospecimens and associated data; it obtained funding from the European Commission in 2002 (5th framework programme) and started operation in 2003. The set-up phase, during the EC funding period 2003–2006, established the basis for running the network; the following consolidation phase has seen the growth of the network through the joining of new partners, better network cohesion, improved coordination of activities, and the development of a quality-control system. During this phase the network participated in the EC-funded TREAT-NMD programme and was involved in planning of the European Biobanking and Biomolecular Resources Research Infrastructure. Recently, EBB became a partner of RD-Connect, an FP7 EU programme aimed at linking RD biobanks, registries, and bioinformatics data. Within RD-Connect, EBB contributes expertise, promotes high professional standards and best practices in RD biobanking, and is implementing integration with RD patient registries and ‘omics' data, thus challenging the fragmentation of international cooperation in the field. PMID:25537360

  4. Post-installation activities in the Comprehensive Nuclear Test Ban Treaty (CTBT) International Monitoring System (IMS) infrasound network

    NASA Astrophysics Data System (ADS)

    Vivas Veloso, J. A.; Christie, D. R.; Hoffmann, T. L.; Campus, P.; Bell, M.; Langlois, A.; Martysevich, P.; Demirovik, E.; Carvalho, J.; Kramer, A.; Wu, Sean F.

    2002-11-01

    The provisional operation and maintenance of IMS infrasound stations after installation and subsequent certification has the objective of preparing the infrasound network for entry into force of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The goal is to maintain and fine-tune the technical capabilities of the network, to repair faulty equipment, and to ensure that stations continue to meet the minimum specifications through evaluation of data quality and station recalibration. Due to the globally dispersed nature of the network, this program constitutes a significant undertaking that requires careful consideration of possible logistic approaches and their financial implications. Currently, 11 of the 60 IMS infrasound stations are transmitting data in the post-installation Testing & Evaluation mode. Another 5 stations are under provisional operation and are maintained in post-certification mode. It is expected that 20% of the infrasound network will be certified by the end of 2002. This presentation will focus on the different phases of post-installation activities of the IMS infrasound program and the logistical challenges to be tackled to ensure cost-efficient management of the network. Specific topics will include Testing & Evaluation and Certification of Infrasound Stations, as well as Configuration Management and Network Sustainment.

  5. Data Auditor: Analyzing Data Quality Using Pattern Tableaux

    NASA Astrophysics Data System (ADS)

    Srivastava, Divesh

    Monitoring databases maintain configuration and measurement tables about computer systems, such as networks and computing clusters, and serve important business functions, such as troubleshooting customer problems, analyzing equipment failures, planning system upgrades, etc. These databases are prone to many data quality issues: configuration tables may be incorrect due to data entry errors, while measurement tables may be affected by incorrect, missing, duplicate and delayed polls. We describe Data Auditor, a tool for analyzing data quality and exploring data semantics of monitoring databases. Given a user-supplied constraint, such as a boolean predicate expected to be satisfied by every tuple, a functional dependency, or an inclusion dependency, Data Auditor computes "pattern tableaux", which are concise summaries of subsets of the data that satisfy or fail the constraint. We discuss the architecture of Data Auditor, including the supported types of constraints and the tableau generation mechanism. We also show the utility of our approach on an operational network monitoring database.
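
    A toy rendition of the tableau idea (the field names and single-attribute pattern are assumptions, not Data Auditor's API): for a constraint expected to hold on every tuple, group rows by a pattern and report per-pattern satisfaction rates, surfacing concise summaries of where the data fail.

      def pattern_tableau(rows, pattern_attr, predicate, min_support=2):
          """Per-pattern satisfaction rates for a tuple-level predicate."""
          groups = {}
          for row in rows:
              groups.setdefault(row[pattern_attr], []).append(row)
          tableau = []
          for value, grp in groups.items():
              if len(grp) < min_support:      # skip patterns with too little support
                  continue
              rate = sum(predicate(r) for r in grp) / len(grp)
              tableau.append((value, len(grp), rate))
          return sorted(tableau, key=lambda t: t[2])   # failing patterns first

      rows = [
          {"region": "east", "polls": 24}, {"region": "east", "polls": 24},
          {"region": "west", "polls": 24}, {"region": "west", "polls": 11},
      ]
      # Constraint: every device should report 24 polls per day.
      print(pattern_tableau(rows, "region", lambda r: r["polls"] == 24))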

  6. Quality of Surface Water in Missouri, Water Year 2007

    USGS Publications Warehouse

    Otero-Benitez, William; Davis, Jerri V.

    2009-01-01

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, designed and operates a series of monitoring stations on streams throughout Missouri known as the Ambient Water-Quality Monitoring Network. During the 2007 water year (October 1, 2006 through September 30, 2007), data were collected at 67 stations, including two U.S. Geological Survey National Stream Quality Accounting Network stations and one spring sampled in cooperation with the U.S. Forest Service. Dissolved oxygen, specific conductance, water temperature, suspended solids, suspended sediment, fecal coliform bacteria, dissolved nitrite plus nitrate, total phosphorus, dissolved and total recoverable lead and zinc, and selected pesticide data summaries are presented for 64 of these stations, which primarily have been classified in groups corresponding to the physiography of the State, main land use, or unique station types. In addition, a summary of hydrologic conditions in the State during water year 2007 is presented.

  7. Internet Voice Distribution System (IVoDS) Utilization in Remote Payload Operations

    NASA Technical Reports Server (NTRS)

    Best, Susan; Bradford, Bob; Chamberlain, Jim; Nichols, Kelvin; Bailey, Darrell (Technical Monitor)

    2002-01-01

    Due to limited crew availability to support science and the large number of experiments to be operated simultaneously, telescience is key to a successful International Space Station (ISS) science program. Crew, operations personnel at NASA centers, and researchers at universities and companies around the world must work closely together to perform scientific experiments on board the ISS. NASA has initiated use of Voice over Internet Protocol (VoIP) to supplement the existing HVoDS mission voice communications system used by researchers. The Internet Voice Distribution System (IVoDS) connects researchers to mission support "loops" or conferences via Internet Protocol networks such as the high-speed Internet2. Researchers use IVoDS software on personal computers to talk with operations personnel at NASA centers. IVoDS also has the capability, if authorized, to allow researchers to communicate with the ISS crew during experiment operations. IVoDS was developed by Marshall Space Flight Center with contractors A2 Technology Inc., FVC, Lockheed-Martin, and VoIP Group. IVoDS is currently undergoing field testing, with full deployment for up to 50 simultaneous users expected in 2002. Research is currently being performed to take full advantage of the digital world, namely the personal computer and Internet Protocol networks, to qualitatively enhance communications among ISS operations personnel. In addition to the current voice capability, video and data-sharing capabilities are being investigated. Major obstacles being addressed include network bandwidth capacity and strict security requirements. Techniques being investigated to reduce and overcome these obstacles include emerging audio-video protocols and network technologies, including multicast and quality of service.

  8. Logic system aids in evaluation of project readiness

    NASA Technical Reports Server (NTRS)

    Maris, S. J.; Obrien, T. J.

    1966-01-01

    Measurement Operational Readiness Requirements (MORR) assignment logic is used for determining the readiness of a complex project to go forward as planned. The system uses a logic network which assigns qualities to all important criteria in a project and establishes a logical sequence of measurements to determine what the conditions are.

  9. 78 FR 29134 - HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-17

    ... quality, clinical operations, implementation, consumer technology, nationwide health information networks and privacy and security. Other groups will be convened to address specific issues as needed. HIT...) Direct the appropriate workgroup or other special group to develop a report for the HIT Standards...

  10. DEVELOPMENT, EVALUATION AND APPLICATION OF AN AUTOMATED EVENT PRECIPITATION SAMPLER FOR NETWORK OPERATION

    EPA Science Inventory

    In 1993, the University of Michigan Air Quality Laboratory (UMAQL) designed a new wet-only precipitation collection system that was utilized in the Lake Michigan Loading Study. The collection system was designed to collect discrete mercury and trace element samples on an event b...

  11. 78 FR 18323 - Notice of Availability of a Draft Programmatic Environmental Assessment of the Proposed United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... Proposed United States Regional Climate Reference Network (USRCRN) AGENCY: National Weather Service (NWS..., is proposing to implement, operate, and manage a USRCRN. With other climate monitoring efforts..., high-quality climate data for use in climate-monitoring activities and for placing current climate...

  12. Tele-transmission of EEG recordings.

    PubMed

    Lemesle, M; Kubis, N; Sauleau, P; N'Guyen The Tich, S; Touzery-de Villepin, A

    2015-03-01

    EEG recordings can be sent for remote interpretation. This article aims to define the tele-EEG procedures and technical guidelines. Tele-EEG is a complete medical act that needs to be carried out with the same quality requirements as a local one in terms of indications, formulation of the medical request and medical interpretation. It adheres to the same quality requirements for its human resources and materials. It must be part of a medical organization (technical and medical network) and follow all rules and guidelines of good medical practices. The financial model of this organization must include costs related to performing the EEG recording, operating and maintenance of the tele-EEG network and medical fees of the physician interpreting the EEG recording. Implementing this organization must be detailed in a convention between all parties involved: physicians, management of the healthcare structure, and the company providing the tele-EEG service. This convention will set rules for network operation and finance, and also the continuous training of all staff members. The tele-EEG system must respect all rules for safety and confidentiality, and ensure the traceability and storing of all requests and reports. Under these conditions, tele-EEG can optimize the use of human resources and competencies in its zone of utilization and enhance the organization of care management. Copyright © 2015. Published by Elsevier SAS.

  13. Coordinated microgrid investment and planning process considering the system operator

    DOE PAGES

    Armendáriz, M.; Heleno, M.; Cardoso, G.; ...

    2017-05-12

    Nowadays, a significant number of distribution systems face problems accommodating more photovoltaic (PV) capacity, mainly due to overvoltages during daylight periods. This has an impact on private investment in distributed energy resources (DER), since it occurs exactly when PV prices are becoming attractive, so the opportunity for an energy transition based on solar technologies is being wasted. In particular, this network limitation is a barrier for larger consumers, such as commercial and public buildings, that aim to invest in PV capacity and start operating as microgrids connected to the MV network. To address this challenge, this paper presents a coordinated approach to the microgrid investment and planning problem, in which the system operator and the microgrid owner collaborate to improve the voltage control capabilities of the distribution network, increasing the PV potential. The results prove that this collaboration has the benefit of increasing the value of the microgrid investments while improving the quality of service of the system, and that it should be considered in the future regulatory framework.

  14. Coordinated microgrid investment and planning process considering the system operator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armendáriz, M.; Heleno, M.; Cardoso, G.

    Nowadays, a significant number of distribution systems face problems accommodating more photovoltaic (PV) capacity, mainly due to overvoltages during daylight periods. This has an impact on private investment in distributed energy resources (DER), since it occurs exactly when PV prices are becoming attractive, so the opportunity for an energy transition based on solar technologies is being wasted. In particular, this network limitation is a barrier for larger consumers, such as commercial and public buildings, that aim to invest in PV capacity and start operating as microgrids connected to the MV network. To address this challenge, this paper presents a coordinated approach to the microgrid investment and planning problem, in which the system operator and the microgrid owner collaborate to improve the voltage control capabilities of the distribution network, increasing the PV potential. The results prove that this collaboration has the benefit of increasing the value of the microgrid investments while improving the quality of service of the system, and that it should be considered in the future regulatory framework.

  15. Challenges in Wireless System Integration as Enablers for Indoor Context Aware Environments

    PubMed Central

    Aguirre, Erik

    2017-01-01

    The advent of fully interactive environments within Smart Cities and Smart Regions requires the use of multiple wireless systems. In the case of user-device interaction, which finds multiple applications such as Ambient Assisted Living, Intelligent Transportation Systems or Smart Grids, among others, large numbers of transceivers are employed in order to achieve anytime, anyplace and any-device connectivity. The resulting combination of heterogeneous wireless networks exhibits fundamental limitations derived from coverage/capacity relations, as a function of required Quality of Service parameters, required bit rate, energy restrictions and adaptive modulation and coding schemes. In this context, the inherent transceiver density poses challenges for overall system operation, since multiple-node operation increases overall interference levels. In this work, a deterministic analysis applied to variable-density wireless sensor network operation within complex indoor scenarios is presented, as a function of topological node distribution. The extensive analysis derives interference characterizations, both for conventional transceivers and for wearables, which provide relevant information in terms of individual node configuration as well as complete network layout. PMID:28704963

  16. [Audiovisual telecommunication by multimedia technology in HNO medicine. ISDN--internet--ATM].

    PubMed

    Plinkert, P K; Plinkert, B; Kurek, R; Zenner, H P

    2000-11-01

    Telemedicine includes all medical activities in diagnosis, therapeutics, or social medicine undertaken by means of an electronic transfer medium, enabling the transmission of visual and acoustic information over long distances to doctors not personally present at the place of the requested consultation. Most experience with telemedicine applications has been gained in the field of diagnosis (teleconsultation, teleradiology, telepathology) and is expanding to quality control and quality assurance. Decisive for each form of application are its availability, practicability, cost, safety, and especially the quality of audiovisual transmission. For telesurgical applications, particularly the use of minimally invasive techniques in otorhinolaryngology and head and neck surgery, high-quality transmission of audiovisual data in real time is necessary. Rapid expansion and further development of transmission technologies and networks in the last decade have produced several technologies of differing quality and cost. In this paper, we tested different transmission media for audiovisual telecommunication, namely integrated services digital network (ISDN), Internet, and asynchronous transfer mode (ATM), using real-time video transmission of typical operations in otorhinolaryngology. Their applications, costs, and future perspectives are discussed.

  17. Upper bounds of deformation in the Upper Rhine Graben from GPS data - First results from GURN (GNSS Upper Rhine Graben Network)

    NASA Astrophysics Data System (ADS)

    Masson, Frederic; Knoepfler, Andreas; Mayer, Michael; Ulrich, Patrice; Heck, Bernhard

    2010-05-01

    In September 2008, the Institut de Physique du Globe de Strasbourg (Ecole et Observatoire des Sciences de la Terre, EOST) and the Geodetic Institute (GIK) of Karlsruhe University (TH) established a transnational cooperation called GURN (GNSS Upper Rhine Graben Network). Within the GURN initiative these institutions are cooperating to establish a highly precise and highly sensitive network of permanently operating GNSS sites for the detection of crustal movements in the Upper Rhine Graben region. At the beginning, the network consisted of the permanently operating GNSS sites of SAPOS®-Baden-Württemberg, different data providers in France (e.g. EOST, Teria, RGP) and some further sites (e.g. IGS). In July 2009, the network was extended to the south when swisstopo (Switzerland) joined GURN, and to the north when SAPOS®-Rheinland-Pfalz joined. The GNSS network therefore currently consists of approx. 80 permanently operating reference sites. The presentation will discuss the current status of GURN and its main research goals, and will present first results concerning data quality as well as time series from a first reprocessing of all available data since 2002 using GAMIT/GLOBK (EOST working group) and the Bernese GPS Software (GIK working group). Based on these time series, velocity and strain fields will be calculated in the future. The GURN initiative also aims to estimate the upper bounds of deformation in the Upper Rhine Graben region.

  18. The AlpArray Seismic Network: A Large-Scale European Experiment to Image the Alpine Orogen

    NASA Astrophysics Data System (ADS)

    Hetényi, György; Molinari, Irene; Clinton, John; Bokelmann, Götz; Bondár, István; Crawford, Wayne C.; Dessa, Jean-Xavier; Doubre, Cécile; Friederich, Wolfgang; Fuchs, Florian; Giardini, Domenico; Gráczer, Zoltán; Handy, Mark R.; Herak, Marijan; Jia, Yan; Kissling, Edi; Kopp, Heidrun; Korn, Michael; Margheriti, Lucia; Meier, Thomas; Mucciarelli, Marco; Paul, Anne; Pesaresi, Damiano; Piromallo, Claudia; Plenefisch, Thomas; Plomerová, Jaroslava; Ritter, Joachim; Rümpker, Georg; Šipka, Vesna; Spallarossa, Daniele; Thomas, Christine; Tilmann, Frederik; Wassermann, Joachim; Weber, Michael; Wéber, Zoltán; Wesztergom, Viktor; Živčić, Mladen

    2018-04-01

    The AlpArray programme is a multinational, European consortium to advance our understanding of orogenesis and its relationship to mantle dynamics, plate reorganizations, surface processes and seismic hazard in the Alps-Apennines-Carpathians-Dinarides orogenic system. The AlpArray Seismic Network has been deployed with contributions from 36 institutions from 11 countries to map physical properties of the lithosphere and asthenosphere in 3D and thus to obtain new, high-resolution geophysical images of structures from the surface down to the base of the mantle transition zone. With over 600 broadband stations operated for 2 years, this seismic experiment is one of the largest simultaneously operated seismological networks in the academic domain, employing hexagonal coverage with station spacing at less than 52 km. This dense and regularly spaced experiment is made possible by the coordinated coeval deployment of temporary stations from numerous national pools, including ocean-bottom seismometers, which were funded by different national agencies. They combine with permanent networks, which also required the cooperation of many different operators. Together these stations ultimately fill coverage gaps. Following a short overview of previous large-scale seismological experiments in the Alpine region, we here present the goals, construction, deployment, characteristics and data management of the AlpArray Seismic Network, which will provide data that is expected to be unprecedented in quality to image the complex Alpine mountains at depth.

  19. Development of a Zigbee platform for bioinstrumentation.

    PubMed

    Cifuentes, Carlos A; Gentiletti, Gabriel G; Suarez, Marco J; Rodriguez, Luis E

    2010-01-01

    This paper presents the development of a network platform that connects multiple individual wireless devices for transmitting bioelectric and biomechanical signals, for application in a hospital network or for continuous monitoring in a patient's daily life. The Zigbee platform was developed in three stages: (1) hardware development, including the construction of a prototype network node and the integration of sensors; (2) evaluation, in order to define the specifications of each node and the communication range; and (3) implementation of the Zigbee network for bioinstrumentation based on the ZigBee Health Care (ZHC) public application profile. Finally, this work presents experimental results based on measurements of lost packets and LQI (Link Quality Indicator), and the Zigbee platform configuration for bioinstrumentation in operation.
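
    A minimal sketch of the two reported metrics, assuming received frames carry a sequence number and an LQI value (the frame format is an assumption): packet loss rate from gaps in the sequence numbers, and mean LQI over received frames.

      def link_stats(frames):
          """frames: list of (seq_number, lqi) tuples for frames actually received."""
          seqs = [s for s, _ in frames]
          expected = max(seqs) - min(seqs) + 1        # frames sent in the window
          loss_rate = 1.0 - len(set(seqs)) / expected
          mean_lqi = sum(l for _, l in frames) / len(frames)
          return loss_rate, mean_lqi

      print(link_stats([(1, 200), (2, 190), (4, 185), (5, 210)]))  # seq 3 lost -> 0.2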

  20. Achieving excellence in veterans healthcare--a balanced scorecard approach.

    PubMed

    Biro, Lawrence A; Moreland, Michael E; Cowgill, David E

    2003-01-01

    This article provides healthcare administrators and managers with a framework and model for developing a balanced scorecard and demonstrates the remarkable success of this process, which brings focus to leadership decisions about the allocation of resources. This scorecard was developed as a top management tool designed to structure multiple priorities of a large, complex, integrated healthcare system and to establish benchmarks to measure success in achieving targets for performance in identified areas. Significant benefits and positive results were derived from the implementation of the balanced scorecard, based upon benchmarks considered to be critical success factors. The network's chief executive officer and top leadership team set and articulated the network's primary operating principles: quality and efficiency in the provision of comprehensive healthcare and support services. Under the weighted benchmarks of the balanced scorecard, the facilities in the network were mandated to adhere to one non-negotiable tenet: providing care that is second to none. The balanced scorecard approach to leadership continuously ensures that this is the primary goal and focal point for all activity within the network. To that end, systems are always in place to ensure that the network is fully successful on all performance measures relating to quality.

  1. Isolated Operation at Hachinohe Micro-Grid Project

    NASA Astrophysics Data System (ADS)

    Takano, Tomihiro; Kojima, Yasuhiro; Temma, Koji; Simomura, Masaru

    To counter global warming, renewable energy sources such as wind, solar and biomass generation are increasing dramatically. Cogeneration systems are also ever-growing, as they save energy costs for consumers in factories, buildings and homes where large thermal loads are expected. As these dispersed generators grow in number, their negative impact on the power quality of commercial power systems becomes non-negligible, because their unstable output causes network voltage and frequency fluctuations. Micro-grid technology has come to the fore to solve this problem, and many demonstrative field tests are now under way all over the world. This paper presents the control paradigm and its application to the Hachinohe micro-grid project, focusing especially on power quality during isolated operation, on which strict conditions are imposed.

  2. Operation of remote mobile sensors for security of drinking water distribution systems.

    PubMed

    Perelman, Lina; Ostfeld, Avi

    2013-09-01

    The deployment of fixed online water quality sensors in water distribution systems has been recognized as one of the key components of contamination warning systems for securing public health. This study proposes to explore how the inclusion of mobile sensors for inline monitoring of various water quality parameters (e.g., residual chlorine, pH) can enhance water distribution system security. Mobile sensors equipped with sampling, sensing, data acquisition, wireless transmission and power generation systems are being designed, fabricated, and tested, and prototypes are expected to be released in the very near future. This study initiates the development of a theoretical framework for modeling mobile sensor movement in water distribution systems and integrating the sensory data collected from stationary and non-stationary sensor nodes to increase system security. The methodology is applied and demonstrated on two benchmark networks. The performance of different sensor network designs is compared for fixed and for combined fixed and mobile sensor networks. Results indicate that complementing online sensor networks with inline monitoring can increase detection likelihood and decrease mean time to detection. Copyright © 2013 Elsevier Ltd. All rights reserved.
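
    The two figures of merit named here are straightforward to reproduce once simulated contamination scenarios have been run. A minimal sketch follows; the per-scenario detection delays and scenario counts are invented numbers, not results from the study.

        # Minimal sketch: detection likelihood and mean time to detection
        # (MTTD) over a set of simulated contamination scenarios.
        def detection_metrics(detection_delays_min, n_scenarios):
            likelihood = len(detection_delays_min) / n_scenarios
            mttd = (sum(detection_delays_min) / len(detection_delays_min)
                    if detection_delays_min else float("inf"))
            return likelihood, mttd

        # Illustrative comparison: fixed sensors alone detect 6 of 10
        # scenarios; adding inline mobile sensors detects 8, and sooner.
        print(detection_metrics([12, 35, 18, 40, 22, 30], 10))
        print(detection_metrics([9, 20, 14, 25, 18, 22, 15, 28], 10))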

  3. Interactions between neurons in the frontal cortex and hippocampus in cats trained to select reinforcements of different value in conditions of cholinergic deficiency.

    PubMed

    Dolbakyan, E E; Merzhanova, G Kh

    2007-09-01

    An operant food-related conditioned reflex was developed in six cats using the "active choice" protocol: short-latency pedal presses were followed by presentation of a low-quality reinforcement (bread-meat mix), while long-latency pedal presses were followed by presentation of a high-quality reinforcement (meat). Animals differed in terms of their food-procuring strategies, displaying "self-control," "ambivalence," or "impulsivity." Multineuron activity was recorded from the frontal cortex and hippocampus (field CA3). Cross-correlation analysis of interneuronal interactions within the structures studied (local networks) and between them (distributed networks) showed that the numbers of interneuronal interactions in both local and distributed networks were maximal in animals with "self-control." Following systemic administration of the muscarinic cholinoreceptor blockers scopolamine and trihexyphenidyl, the numbers of interneuronal interactions decreased, while "common source" influences increased. This correlated with impairment of the reproduction of the selected strategy, primarily affecting the animals' self-controlled behavior. These results show that the "self-control" strategy is determined by the organization of local and distributed networks in the frontal cortex and hippocampus.

  4. I/O performance evaluation of a Linux-based network-attached storage device

    NASA Astrophysics Data System (ADS)

    Sun, Zhaoyan; Dong, Yonggui; Wu, Jinglian; Jia, Huibo; Feng, Guanping

    2002-09-01

    In a Local Area Network (LAN), clients are permitted to access files on high-density optical disks via a network server. However, the quality of the read service offered by a conventional server is unsatisfactory, because the server performs multiple functions and must handle too many callers. This paper develops a Linux-based Network-Attached Storage (NAS) server. The operating system (OS), composed of an optimized kernel and a miniaturized file system, is stored in flash memory. After initialization, the NAS device is connected to the LAN. The administrator and users can then configure and access the server through web pages, respectively. In order to enhance the quality of access, the management of the buffer cache in the file system is optimized. Several benchmark programs were run to evaluate the I/O performance of the NAS device. Since data recorded on optical disks are usually accessed for reading, our attention is focused on the reading throughput of the device. The experimental results indicate that the I/O performance of our NAS device is excellent.
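
    For context, the kind of sequential-read throughput evaluated here can be measured with a very small benchmark. The sketch below is a minimal illustration, not the benchmark programs the authors used: the file path and block size are arbitrary, and a real benchmark would also flush or bypass the OS buffer cache between runs.

        import time

        # Minimal sketch: sequential read throughput in MB/s over one file.
        # A real benchmark must also defeat OS caching to measure the device.
        def read_throughput(path, block_size=1 << 20):
            total = 0
            start = time.perf_counter()
            with open(path, "rb") as f:
                while chunk := f.read(block_size):
                    total += len(chunk)
            elapsed = time.perf_counter() - start
            return (total / (1 << 20)) / elapsed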

  5. A participatory sensing approach to characterize ride quality

    NASA Astrophysics Data System (ADS)

    Bridgelall, Raj

    2014-03-01

    Rough roads increase vehicle operation and road maintenance costs. Consequently, transportation agencies spend a significant portion of their budgets on ride-quality characterization to forecast maintenance needs. The ubiquity of smartphones and social media, and the emergence of a connected vehicle environment present lucrative opportunities for cost-reduction and continuous, network-wide, ride-quality characterization. However, there is a lack of models to transform inertial and position information from voluminous data flows into indices that transportation agencies currently use. This work expands on theories of the Road Impact Factor introduced in previous research. The index characterizes road roughness by aggregating connected vehicle data and reporting roughness in direct proportion to the International Roughness Index. Their theoretical relationships are developed, and a case study is presented to compare the relative data quality from an inertial profiler and a regular passenger vehicle. Results demonstrate that the approach is a viable alternative to existing models that require substantially more resources and provide less network coverage. One significant benefit of the participatory sensing approach is that transportation agencies can monitor all network facilities continuously to locate distress symptoms, such as frost heaves, that appear and disappear between ride assessment cycles. Another benefit of the approach is continuous monitoring of all high-risk intersections such as rail grade crossings to better understand the relationship between ride-quality and traffic safety.
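
    To make the aggregation idea concrete, the sketch below computes a per-segment roughness proxy from smartphone accelerometer samples, with a speed normalization so faster traversals do not inflate the index. This is a simplified illustration only; it is not the published Road Impact Factor formulation, and the segment length and sample format are assumptions.

        import math

        # Simplified sketch: RMS vertical acceleration per fixed-length road
        # segment, divided by speed. Not the actual Road Impact Factor.
        def roughness_proxy(samples, speed_mps, segment_m=100.0):
            """samples: (distance_m, vertical_accel_mps2) pairs."""
            segments = {}
            for d, az in samples:
                segments.setdefault(int(d // segment_m), []).append(az)
            return {seg: math.sqrt(sum(a * a for a in v) / len(v)) / speed_mps
                    for seg, v in segments.items()}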

  6. Artificial neural network modeling of dissolved oxygen in reservoir.

    PubMed

    Chen, Wei-Bo; Liu, Wen-Cheng

    2014-02-01

    Water quality is one of the key factors in the operation and management of reservoirs. Dissolved oxygen (DO) in the water column is essential for microorganisms and a significant indicator of the state of aquatic ecosystems. In this study, two artificial neural network (ANN) models, a back propagation neural network (BPNN) and an adaptive neural-based fuzzy inference system (ANFIS), and a multilinear regression (MLR) model were developed to estimate the DO concentration in the Feitsui Reservoir of northern Taiwan. The input variables of the neural networks are water temperature, pH, conductivity, turbidity, suspended solids, total hardness, total alkalinity, and ammonium nitrogen. The performance of the ANN models and the MLR model was assessed through the mean absolute error, root mean square error, and correlation coefficient computed from the measured and model-simulated DO values. The results reveal that the ANN estimation performance was superior to that of the MLR. Comparing the BPNN and ANFIS models on these performance criteria, the ANFIS model is better than the BPNN model at predicting DO values. The results show that a neural network, particularly the ANFIS model, is able to predict DO concentrations with reasonable accuracy, suggesting that neural networks are a valuable tool for reservoir management in Taiwan.
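
    The three performance criteria used for the model comparison are standard and easy to reproduce. The following is a minimal NumPy sketch; the DO values are invented for illustration and are not data from the Feitsui Reservoir.

        import numpy as np

        # Minimal sketch of the three criteria: MAE, RMSE and the correlation
        # coefficient between measured and model-simulated DO values.
        def performance(measured, simulated):
            m, s = np.asarray(measured), np.asarray(simulated)
            mae = np.mean(np.abs(m - s))
            rmse = np.sqrt(np.mean((m - s) ** 2))
            r = np.corrcoef(m, s)[0, 1]
            return mae, rmse, r

        # Illustrative DO values in mg/L.
        print(performance([8.1, 7.4, 6.9, 9.0], [7.9, 7.6, 6.5, 8.8]))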

  7. High-Density, High-Resolution, Low-Cost Air Quality Sensor Networks for Urban Air Monitoring

    NASA Astrophysics Data System (ADS)

    Mead, M. I.; Popoola, O. A.; Stewart, G.; Bright, V.; Kaye, P.; Saffell, J.

    2012-12-01

    To monitor air quality in highly granular environments such as urban areas, which are spatially heterogeneous with variable emission sources, measurements need to be made at appropriate spatial and temporal scales. Current routine air quality monitoring networks are generally composed either of sparse, expensive installations (incorporating e.g. chemiluminescence instruments) or of higher-density, low-time-resolution systems (e.g. NO2 diffusion tubes). Either approach may fail to capture important effects such as pollutant "hot spots" or to adequately capture spatial (or temporal) variability. As a result, analyses based on data from traditional low-spatial-resolution networks, such as personal exposure estimates, may be inaccurate. In this paper we present details of a sophisticated, low-cost, multi-species (gas phase, speciated PM, meteorology) air quality measurement network methodology, incorporating GPS and GPRS, which has been developed for high-resolution air quality measurements in urban areas. Sensor networks developed in the Centre for Atmospheric Science (University of Cambridge) incorporated electrochemical gas sensors configured for use in urban air quality studies operating at parts-per-billion (ppb) levels. It has been demonstrated that these sensors can be used to measure key air quality gases such as CO, NO and NO2 at the low ppb mixing ratios present in the urban environment (estimated detection limits <4 ppb for CO and NO and <1 ppb for NO2; Mead et al., submitted Aug. 2012). Based on this work, a state-of-the-art multi-species instrument package for deployment in scalable sensor networks has been developed which has general applicability. This is currently being employed as part of a major 3-year UK programme at London Heathrow airport (the Sensor Networks for Air Quality (SNAQ) Heathrow project). The main project outcome is the creation of a calibrated, high spatial and temporal resolution data set for O3, NO, NO2, SO2, CO, CO2, total VOCs, size-speciated PM, temperature, relative humidity, and wind speed and direction. The network uses existing GPRS infrastructure for real-time sending of data with low overheads in terms of cost, effort and installation. In this paper we present data from the SNAQ Heathrow project as well as previous deployments showing measurement capability at the ppb level for NO, NO2 and CO. We show that variability can be observed and measured quantitatively using these sensor networks over widely differing time scales: from individual emission events, through diurnal variability associated with traffic and meteorological conditions, to longer-term synoptic weather conditions and seasonal behaviour. This work demonstrates a widely applicable generic capability for urban areas, airports and other complex emission environments, making this sensor-system methodology valuable for scientific, policy and regulatory issues. We conclude that the low-cost, high-density network philosophy has the potential to provide a more complete assessment of the high-granularity air quality structure generally observed in the environment and, when appropriately deployed, to offer a new paradigm in air quality quantification and monitoring.

  8. Discrete State Change Model of Manufacturing Quality to Aid Assembly Process Design

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    This paper proposes a representation model of the quality state change in an assembly process that can be used in a computer-aided process design system. In order to formalize the state change of the manufacturing quality in the assembly process, the functions, operations, and quality changes in the assembly process are represented as a network model that can simulate discrete events. This paper also develops a design method for the assembly process. The design method calculates the space of quality state change and outputs a better assembly process (better operations and better sequences) that can be used to obtain the intended quality state of the final product. A computational redesigning algorithm of the assembly process that considers the manufacturing quality is developed. The proposed method can be used to design an improved manufacturing process by simulating the quality state change. A prototype system for planning an assembly process is implemented and applied to the design of an auto-breaker assembly process. The result of the design example indicates that the proposed assembly process planning method outputs a better manufacturing scenario based on the simulation of the quality state change.
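
    To make the idea of a discrete quality-state network concrete, here is a toy sketch in which assembly operations are directed transitions between quality states and operation orderings are enumerated to find sequences that reach the intended final state. The states and operations are invented for illustration and are not taken from the paper's model.

        # Toy sketch: quality states as nodes, operations as directed edges;
        # enumerate operation sequences and the quality state each reaches.
        from itertools import permutations

        TRANSITIONS = {  # (state, operation) -> next state; illustrative only
            ("raw", "fasten"): "fastened",
            ("fastened", "align"): "aligned",
            ("raw", "align"): "misaligned",      # aligning too early degrades quality
            ("misaligned", "fasten"): "fastened_bad",
            ("aligned", "inspect"): "ok",
        }

        def outcome(sequence, state="raw"):
            for op in sequence:
                state = TRANSITIONS.get((state, op), state)
            return state

        for seq in permutations(["fasten", "align", "inspect"]):
            print(seq, "->", outcome(seq))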

  9. Wireless Sensor Networks for Oceanographic Monitoring: A Systematic Review

    PubMed Central

    Albaladejo, Cristina; Sánchez, Pedro; Iborra, Andrés; Soto, Fulgencio; López, Juan A.; Torres, Roque

    2010-01-01

    Monitoring of the marine environment has come to be a field of scientific interest in the last ten years. The instruments used in this work have ranged from small-scale sensor networks to complex observation systems. Among small-scale networks, Wireless Sensor Networks (WSNs) are a highly attractive solution in that they are easy to deploy, operate and dismantle and are relatively inexpensive. The aim of this paper is to identify, appraise, select and synthesize all high quality research evidence relevant to the use of WSNs in oceanographic monitoring. The literature is systematically reviewed to offer an overview of the present state of this field of study and identify the principal resources that have been used to implement networks of this kind. Finally, this article details the challenges and difficulties that have to be overcome if these networks are to be successfully deployed. PMID:22163583

  10. Two years of LCOGT operations: the challenges of a global observatory

    NASA Astrophysics Data System (ADS)

    Volgenau, Nikolaus; Boroson, Todd

    2016-07-01

    With 18 telescopes distributed over 6 sites, and more telescopes being added in 2016, the Las Cumbres Observatory Global Telescope (LCOGT) Network is a unique resource for time-domain astronomy. The Network's continuous coverage of the night sky, and the optimization of the observing schedule over all sites simultaneously, have enabled LCOGT users to produce significant science results. However, practical challenges to maximizing the Network's science output remain. The Network began providing observations for members of its Science Collaboration and other partners in May 2014. In the two years since then, LCOGT has made a number of improvements to increase the Network's science yield. We also now have two years' experience monitoring observatory performance; effective monitoring of an observatory that spans the globe is a complex enterprise. Here, we describe some of LCOGT's efforts to monitor the Network, assess the quality of science data, and improve communication with our users.

  11. A neural network approach to job-shop scheduling.

    PubMed

    Zhou, D N; Cherkassky, V; Baldwin, T R; Olson, D E

    1991-01-01

    A novel analog computational network is presented for solving NP-complete constraint satisfaction problems, i.e., job-shop scheduling. In contrast to most neural approaches to combinatorial optimization, which are based on a quadratic energy cost function, the authors propose to use linear cost functions. As a result, the network complexity (the number of neurons and the number of resistive interconnections) grows only linearly with problem size, and large-scale implementations become possible. The proposed approach is related to the linear programming network described by D.W. Tank and J.J. Hopfield (1985), which also uses a linear cost function for a simple optimization problem. It is shown how to map a difficult constraint-satisfaction problem onto a simple neural net in which the number of neural processors equals the number of subjobs (operations) and the number of interconnections grows linearly with the total number of operations. Simulations show that the authors' approach produces better solutions than existing neural approaches to job-shop scheduling, i.e., the traveling-salesman-problem-type Hopfield approach and the integer linear programming approach of J.P.S. Foo and Y. Takefuji (1988), in terms of both the quality of the solution and the network complexity.

  12. Status report on the establishment of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) International Monitoring System (IMS) infrasound network

    NASA Astrophysics Data System (ADS)

    Vivas Veloso, J. A.; Christie, D. R.; Campus, P.; Bell, M.; Hoffmann, T. L.; Langlois, A.; Martysevich, P.; Demirovik, E.; Carvalho, J.; Kramer, A.

    2002-11-01

    The infrasound component of the International Monitoring System (IMS) for Comprehensive Nuclear-Test-Ban Treaty verification aims for global detection and localization of low-frequency sound waves originating from atmospheric nuclear explosions. The infrasound network will consist of 60 array stations, distributed as evenly as possible over the globe to assure at least two-station detection capability for 1-kton explosions at any point on earth. This network will be larger and more sensitive than any other previously operated infrasound network. As of today, 85% of the site surveys for IMS infrasound stations have been completed, 25% of the stations have been installed, and 8% of the installations have been certified and are transmitting high-quality continuous data to the International Data Center in Vienna. By the end of 2002, 20% of the infrasound network is expected to be certified and operating in post-certification mode. This presentation will discuss the current status and progress made in the site survey, installation, and certification programs for IMS infrasound stations. A review will be presented of the challenges and difficulties encountered in these programs, together with practical solutions to these problems.

  13. An Implementation of the Salt-Farm Monitoring System Using Wireless Sensor Network

    NASA Astrophysics Data System (ADS)

    Ju, Jonggil; Park, Ingon; Lee, Yongwoong; Cho, Jongsik; Cho, Hyunwook; Yoe, Hyun; Shin, Changsun

    In producing solar salt, natural environmental factors such as temperature, humidity, solar radiation, wind direction, wind speed and rain are essential elements that influence the productivity and quality of the salt. If these environmental elements can be managed efficiently, improved production of good-quality salt can be achieved. To monitor and manage the natural environment, this paper proposes the Salt-Farm Monitoring System (SFMS), which is powered by renewable energy. The system collects environmental factors directly from environmental measurement sensors and sensor nodes. To implement a stand-alone system, we applied a solar cell and a wind generator to power the system. Finally, we showed that the SFMS can monitor the salt-farm environment using wireless sensor nodes and operate correctly without an external power supply.

  14. Integrated Ocean Profile Data Delivery for Operations and Climate Research

    NASA Astrophysics Data System (ADS)

    Sun, C. L.; Soreide, N. N.

    2006-12-01

    An end-to-end data and information system for delivering integrated real-time and historical datasets is presented in this paper. The purposes of this paper are: (1) to illustrate the procedures for quality controlling and loading ocean profile data into the U.S. National Oceanographic Data Center (NODC) ocean database and (2) to facilitate the development and provision of a wide variety of useful data, analyses, and information products for operations and climate research. The NODC currently focuses on acquiring, processing, and distributing ocean profile data collected by two operational global ocean observing systems: the Argo Profiling Network and the Global Temperature-Salinity Profile Program (GTSPP). The two data streams contain upper-ocean temperature and salinity data mainly from profiling floats and expendable bathythermographs (XBTs), but also from conductivity-temperature-depth (CTD) instruments and bottles. Argo has used resources from some 23 countries to make unprecedented in-situ observations of the global ocean. All Argo data are publicly available in near real-time via the Global Telecommunications System (GTS) and, in scientifically quality-controlled form, with a few months' delay. The NODC operates the Global Argo Data Repository for long-term archiving of Argo data and serves the data in the NODC version of Argo netCDF and in tab-delimited spreadsheet text formats to the public through the NODC Web site at http://www.nodc.noaa.gov/argo/. The GTSPP is a cooperative international program. It maintains a global ocean T-S resource with data that are both up-to-date and of the highest quality possible. Both real-time data transmitted over the GTS and delayed-mode data received from contributing countries are acquired and quality controlled by the Marine Environmental Data Service, Canada, and are eventually incorporated into a continuously managed database maintained by the NODC. Information and data are made publicly available at http://www.nodc.noaa.gov/GTSPP/. Web-based tools have been developed to allow users on the Web to query and subset the data by parameter, location, time, and other attributes such as instrument types and quality flags. Desktop applications capable of exploring data from real-time data streams and integrating the data streams with archives across the Internet are available for users who have a high-bandwidth Internet connection. Alternatively, users without high-speed network access can order CD/DVD-ROMs from the NODC that contain the integrated dataset and then use software over a potentially low-bandwidth network connection to periodically update the CD/DVD-ROM-based archive with new data.

  15. Variable weight spectral amplitude coding for multiservice OCDMA networks

    NASA Astrophysics Data System (ADS)

    Seyedzadeh, Saleh; Rahimian, Farzad Pour; Glesk, Ivan; Kakaee, Majid H.

    2017-09-01

    The emergence of heterogeneous data traffic, such as voice over IP, video streaming and online gaming, has created demand for networks capable of supporting quality of service (QoS) at the physical layer with traffic prioritisation. This paper proposes a new variable-weight code based on spectral amplitude coding for optical code-division multiple-access (OCDMA) networks to support QoS differentiation. The proposed variable-weight multi-service (VW-MS) code relies on basic matrix construction. A mathematical model is developed for performance evaluation of VW-MS OCDMA networks. It is shown that the proposed code provides an optimal code length with a minimum cross-correlation value when compared to other codes. Numerical results are presented for a VW-MS OCDMA network designed for triple-play services operating at 0.622 Gb/s, 1.25 Gb/s and 2.5 Gb/s.

  16. Bringing simulation to engineers in the field: a Web 2.0 approach.

    PubMed

    Haines, Robert; Khan, Kashif; Brooke, John

    2009-07-13

    Field engineers working on water distribution systems have to implement day-to-day operational decisions. Since pipe networks are highly interconnected, the effects of such decisions are correlated with hydraulic and water quality conditions elsewhere in the network. This makes the provision of predictive decision support tools (DSTs) for field engineers critical to optimizing the engineering work on the network. We describe how we created DSTs to run on lightweight mobile devices by using the Web 2.0 technique known as Software as a Service. We designed our system following the architectural style of representational state transfer. The system not only displays static geographical information system data for pipe networks, but also dynamic information and prediction of network state, by invoking and displaying the results of simulations running on more powerful remote resources.
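
    A decision-support request in such a RESTful design might look like the sketch below. The endpoint path, query parameters and response fields are hypothetical stand-ins, not the authors' published API; the sketch only illustrates a lightweight client invoking a remote simulation resource over HTTP.

        import json
        from urllib import parse, request

        # Hypothetical client call: ask a remote simulation service to predict
        # the network state after an operational action on a named valve.
        def predict_network_state(base_url, valve_id, action):
            query = parse.urlencode({"valve": valve_id, "action": action})
            with request.urlopen(f"{base_url}/simulations?{query}") as resp:
                return json.load(resp)  # e.g. predicted pressures and chlorine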

  17. The GCOS Reference Upper-Air Network (GRUAN)

    NASA Astrophysics Data System (ADS)

    Vömel, H.; Berger, F. H.; Immler, F. J.; Seidel, D.; Thorne, P.

    2009-04-01

    While the global upper-air observing network has provided useful observations for operational weather forecasting for decades, its measurements lack the accuracy and long-term continuity needed for understanding climate change. Consequently, the scientific community faces uncertainty on such key issues as the trends of temperature in the upper troposphere and stratosphere or the variability and trends of stratospheric water vapour. To address these shortcomings, and to ensure that future climate records will be more useful than the records to date, the Global Climate Observing System (GCOS) program initiated the GCOS Reference Upper Air Network (GRUAN). GRUAN will be a network of about 30-40 observatories with a representative sampling of geographic regions and surface types. These stations will provide upper-air reference observations of the essential climate variables, i.e. temperature, geopotential, humidity, wind, radiation and cloud properties using specialized radiosondes and complementary remote sensing profiling instrumentation. Long-term stability, quality assurance / quality control, and a detailed assessment of measurement uncertainties will be the key aspects of GRUAN observations. The network will not be globally complete but will serve to constrain and adjust data from more spatially comprehensive global observing systems including satellites and the current radiosonde networks. This paper outlines the scientific rationale for GRUAN, its role in the Global Earth Observation System of Systems, network requirements and likely instrumentation, management structure, current status and future plans.

  18. PM2.5 Monitors in New England | Air Quality Planning Unit ...

    EPA Pesticide Factsheets

    2017-04-10

    The New England states are currently operating a network of 58 ambient PM2.5 air quality monitors that meet EPA's Federal Reference Method (FRM) for PM2.5, which is necessary in order for the resultant data to be used for attainment/non-attainment purposes. These monitors collect particles in the ambient air smaller than 2.5 microns in size on a filter, which is weighed prior and post sampling to produce a 24-hour sample concentration.

  19. 77 FR 23250 - HIT Standards Committee; Schedule for the Assessment of HIT Policy Committee Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-18

    ... quality, clinical operations, implementation, and privacy and security. Other groups are convened to address specific issues as needed, such as the Nationwide Health Information Network Power Team, the... appropriate workgroup or other special group to develop a report for the HIT Standards Committee, to the...

  20. Charter Management Organizations: An Emerging Approach to Scaling up What Works

    ERIC Educational Resources Information Center

    Farrell, Caitlin; Wohlstetter, Priscilla; Smith, Joanna

    2012-01-01

    Policymakers have shown increasing interest in replicating high-quality education models as a way to improve chronically underperforming schools. Charter management organizations (CMOs) have been touted as one organizational model poised to be such a vehicle for reform. CMOs are nonprofit organizations that operate a network of charter schools…

  1. Transmission of live laparoscopic surgery over the Internet2.

    PubMed

    Damore, L J; Johnson, J A; Dixon, R S; Iverson, M A; Ellison, E C; Melvin, W S

    1999-11-01

    Video broadcasting of surgical procedures is an important tool for education, training, and consultation. Current video conferencing systems are expensive and time-consuming and require preplanning. Real-time Internet video is known for its poor quality and relies on the equipment and the speed of the connection. The Internet2, a new high-speed (up to 2,048 Mbps), large bandwidth data network presently connects more than 100 universities and corporations. We have successfully used the Internet2 to broadcast the first real-time, high-quality audio/video program from a live laparoscopic operation to distant points. Video output of the laparoscopic camera and audio from a wireless microphone were broadcast to distant sites using a proprietary, PC-based implementation of H.320 video conferencing over a TCP/IP network connected to the Internet2. The receiving sites participated in two-way, real-time video and audio communications and graded the quality of the signal they received. On August 25, 1998, a laparoscopic Nissen fundoplication was transmitted to Internet2 stations in Colorado, Pennsylvania, and to an Internet station in New York. On September 28 and 29, 1998, we broadcast laparoscopic operations throughout both days to the Internet2 Fall Conference in San Francisco, California. Most recently, on February 24, 1999, we transmitted a laparoscopic Heller myotomy to the Abilene Network Launch Event in Washington, DC. The Internet2 is currently able to provide the bandwidth needed for a turn-key video conferencing system with high-resolution, real-time transmission. The system could be used for a variety of teaching and educational programs for experienced surgeons, residents, and medical students.

  2. Drinking water quality and formation of biofilms in an office building during its first year of operation, a full scale study.

    PubMed

    Inkinen, Jenni; Kaunisto, Tuija; Pursiainen, Anna; Miettinen, Ilkka T; Kusnetsov, Jaana; Riihinen, Kalle; Keinänen-Toivola, Minna M

    2014-02-01

    Complex interactions between water distribution system materials and water can cause a reduction in water quality and unwanted changes in the materials, their aging or corrosion, and the formation of biofilms on surfaces. Substances leaching from pipe materials and water fittings, as well as the microbiological quality of the water and the formation of biofilms, were evaluated by applying a Living Lab theme, i.e., research in a real-life setting using a full-scale system during its first year of operation. The study site was a real office building, with one part of the building lined with copper pipes and the other with cross-linked polyethylene (PEX) pipes, thus enabling a comparison of materials; differences between the cold and hot water systems were also analysed. It was found that operational conditions, such as flow conditions and temperature, affected the amounts of metals leaching from the pipe network. In particular, brass components were considered to be a source of leaching; e.g., the lead concentration was highest during the first few weeks after the commissioning of the pipe network, when the water was allowed to stagnate. Assimilable organic carbon (AOC) and microbially available phosphorus (MAP) were found to leach from the PEX pipelines, with minor effects on the biomass of the biofilm. Cultivable and viable biomass levels (heterotrophic plate count (HPC) and adenosine triphosphate (ATP)) in biofilms were higher in the cold than in the hot water system, whereas total microbial biomass (total cell count (DAPI)) was similar in both systems. The type of pipeline material was not found to greatly affect the microbial biomass or the Alpha-, Beta- and Gammaproteobacteria profiles (16S rRNA gene copies) after the first year of operation. The microbiological quality of the water was also found to deteriorate due to stagnation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Knapsack - TOPSIS Technique for Vertical Handover in Heterogeneous Wireless Network

    PubMed Central

    2015-01-01

    In a heterogeneous wireless network, handover techniques are designed to facilitate anywhere/anytime service continuity for mobile users. Consistent best-possible access to a network with widely varying network characteristics requires seamless mobility management techniques. Hence, the vertical handover process imposes important technical challenges. Handover decisions are triggered for continuous connectivity of mobile terminals. However, bad network selection and overload conditions in the chosen network can cause fallout in the form of handover failure. In order to maintain the required Quality of Service during the handover process, decision algorithms should incorporate intelligent techniques. In this paper, a new and efficient vertical handover mechanism is implemented using a dynamic programming method from the operations research discipline. This dynamic programming approach, which is integrated with the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method, provides the mobile user with the best handover decisions. Moreover, in this proposed handover algorithm a deterministic approach which divides the network into zones is incorporated into the network server in order to derive an optimal solution. The study revealed that this method achieves better performance and QoS support for users and greatly reduces handover failures when compared to the traditional TOPSIS method. The decision arrived at by the zone gateway using this operations-research analytical method (the dynamic programming knapsack approach together with TOPSIS) yields remarkably better results in terms of network performance measures such as throughput and delay. PMID:26237221

  4. Knapsack--TOPSIS Technique for Vertical Handover in Heterogeneous Wireless Network.

    PubMed

    Malathy, E M; Muthuswamy, Vijayalakshmi

    2015-01-01

    In a heterogeneous wireless network, handover techniques are designed to facilitate anywhere/anytime service continuity for mobile users. Consistent best-possible access to a network with widely varying network characteristics requires seamless mobility management techniques. Hence, the vertical handover process imposes important technical challenges. Handover decisions are triggered for continuous connectivity of mobile terminals. However, bad network selection and overload conditions in the chosen network can cause fallout in the form of handover failure. In order to maintain the required Quality of Service during the handover process, decision algorithms should incorporate intelligent techniques. In this paper, a new and efficient vertical handover mechanism is implemented using a dynamic programming method from the operations research discipline. This dynamic programming approach, which is integrated with the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method, provides the mobile user with the best handover decisions. Moreover, in this proposed handover algorithm a deterministic approach which divides the network into zones is incorporated into the network server in order to derive an optimal solution. The study revealed that this method achieves better performance and QoS support for users and greatly reduces handover failures when compared to the traditional TOPSIS method. The decision arrived at by the zone gateway using this operations-research analytical method (the dynamic programming knapsack approach together with TOPSIS) yields remarkably better results in terms of network performance measures such as throughput and delay.
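
    For reference, the ranking core of TOPSIS itself is compact. The NumPy sketch below ranks candidates by relative closeness to the ideal solution; the candidate networks, criteria, and weights are invented for illustration and are not taken from the paper.

        import numpy as np

        def topsis(matrix, weights, benefit):
            """matrix: candidates x criteria; benefit[j] True if larger is better."""
            m = np.asarray(matrix, dtype=float)
            norm = m / np.linalg.norm(m, axis=0)          # vector normalization
            v = norm * np.asarray(weights, dtype=float)
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_pos = np.linalg.norm(v - ideal, axis=1)
            d_neg = np.linalg.norm(v - anti, axis=1)
            return d_neg / (d_pos + d_neg)                # closeness: higher is better

        # Three candidate networks scored on bandwidth (benefit) and delay (cost).
        scores = topsis([[54, 40], [11, 15], [100, 60]], [0.6, 0.4], [True, False])
        print(scores.argmax(), scores)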

  5. FluxSuite: a New Scientific Tool for Advanced Network Management and Cross-Sharing of Next-Generation Flux Stations

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.

    2015-12-01

    Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and station infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to effectively and efficiently handle the entire process. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and a web service, was developed to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows:
    - Each next-generation station measures all parameters needed for flux computations
    - A field microcomputer calculates final fully-corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.
    - Final fluxes, radiation, weather and soil data are merged into a single quality-controlled file
    - Multiple flux stations are linked into an automated time-synchronized network
    - The flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts
    - The PI can assign rights, and allow or restrict access to stations and data: selected stations can be shared via rights-managed access internally or with external institutions
    - Researchers without stations can form "virtual networks" for specific projects by collaborating with PIs from different actual networks
    This presentation provides detailed examples of FluxSuite as currently utilized by two large flux networks in China (National Academy of Sciences and Agricultural Academy of Sciences), and by smaller networks with stations in the USA, Germany, Ireland, Malaysia and other locations around the globe.

  6. Adaptive Energy Forecasting and Information Diffusion for Smart Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Agarwal, Vaibhav; Aman, Saim

    2012-05-16

    Smart Power Grids exemplify an emerging class of cyber-physical applications that exhibit dynamic, distributed and data-intensive (D3) characteristics, along with an always-on paradigm to support operational needs. Smart Grids are an outcome of instrumentation, such as Phasor Measurement Units and Smart Power Meters, being deployed across the transmission and distribution networks of electric grids. These sensors provide utilities with improved situational awareness of near-real-time electricity usage by individual consumers, and of the power quality and stability of the transmission network.

  7. The school bus routing and scheduling problem with transfers

    PubMed Central

    Doerner, Karl F.; Parragh, Sophie N.

    2015-01-01

    In this article, we study the school bus routing and scheduling problem with transfers arising in the field of nonperiodic public transportation systems. It deals with the transportation of pupils from home to their school in the morning, taking into account the possibility that pupils may change buses. Allowing transfers has several consequences. On the one hand, it allows more flexibility in the bus network structure and can, therefore, help to reduce operating costs. On the other hand, transfers have an impact on the service level: the perceived service quality is lower due to the existence of transfers; however, at the same time, user ride times may be reduced and, thus, transfers may also have a positive impact on service quality. The main objective is the minimization of the total operating costs. We develop a heuristic solution framework to solve this problem and compare it with two solution concepts that do not consider transfers. The impact of transfers on the service level, in terms of time loss (or user ride time) and the number of transfers, is analyzed. Our results show that allowing transfers reduces total operating costs significantly while average and maximum user ride times remain comparable to solutions without transfers. © 2015 Wiley Periodicals, Inc. NETWORKS, Vol. 65(2), 180–203 2015 PMID:28163329

  8. Predicting PM10 concentration in Seoul metropolitan subway stations using artificial neural network (ANN).

    PubMed

    Park, Sechan; Kim, Minjeong; Kim, Minhae; Namgung, Hyeong-Gyu; Kim, Ki-Tae; Cho, Kyung Hwa; Kwon, Soon-Bark

    2018-01-05

    The indoor air quality of subway systems can significantly affect the health of passengers, since these systems are widely used for short-distance transit in metropolitan urban areas in many countries. The particles generated by abrasion during subway operations, and the vehicle-emitted pollutants flowing in from the street, particularly affect the air quality in underground subway stations. Thus, continuous monitoring of particulate matter (PM) in underground stations is important for evaluating passengers' exposure to PM. However, it is difficult to obtain indoor PM data, because the measurement systems are expensive and difficult to install and operate for significant periods of time in spaces crowded with people. In this study, we predicted the indoor PM concentration from information on outdoor PM, the number of subway trains running, and ventilation operation using an artificial neural network (ANN) model. We also investigated the relationship between the ANN's performance and the depth of the underground subway station. The ANN model showed a high correlation between predicted and measured values, and it was able to predict 67-80% of the PM at six subway stations. In addition, we found that platform shape and depth influenced the model performance. Copyright © 2017 Elsevier B.V. All rights reserved.
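
    A minimal sketch of this kind of regression model, assuming scikit-learn and synthetic stand-in data rather than the actual subway measurements, illustrates the three input variables named above:

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Synthetic stand-in data: outdoor PM10, trains per hour, and a
        # ventilation on/off flag; the study used measured values instead.
        rng = np.random.default_rng(0)
        X = np.column_stack([rng.uniform(20, 120, 500),   # outdoor PM10 (ug/m3)
                             rng.integers(0, 20, 500),    # trains per hour
                             rng.integers(0, 2, 500)])    # ventilation on/off
        y = 0.6 * X[:, 0] + 2.0 * X[:, 1] - 15.0 * X[:, 2] + rng.normal(0, 5, 500)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(16,),
                                           max_iter=2000, random_state=0))
        model.fit(X_tr, y_tr)
        print("held-out R^2:", model.score(X_te, y_te))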

  9. Long-running telemedicine networks delivering humanitarian services: experience, performance and scientific output

    PubMed Central

    Geissbuhler, Antoine; Jethwani, Kamal; Kovarik, Carrie; Person, Donald A; Vladzymyrskyy, Anton; Zanaboni, Paolo; Zolfo, Maria

    2012-01-01

    Objective: To summarize the experience, performance and scientific output of long-running telemedicine networks delivering humanitarian services. Methods: Nine long-running networks – those operating for five years or more – were identified, and seven provided detailed information about their activities, including performance and scientific output. Information was extracted from peer-reviewed papers describing the networks’ study design, effectiveness, quality, economics, provision of access to care and sustainability. The strength of the evidence was scored as none, poor, average or good. Findings: The seven networks had been operating for a median of 11 years (range: 5–15). All networks provided clinical tele-consultations for humanitarian purposes using store-and-forward methods, and five were also involved in some form of education. The smallest network had 15 experts and the largest had more than 500. The clinical caseload was 50 to 500 cases a year. A total of 59 papers had been published by the networks, and 44 were listed in Medline. Based on study design, the strength of the evidence was generally poor by conventional standards (e.g., 29 papers described non-controlled clinical series). Over half of the papers provided evidence of sustainability and improved access to care. Uncertain funding was a common risk factor. Conclusion: Improved collaboration between networks could help attenuate the lack of resources reported by some networks and improve sustainability. Although the evidence base is weak, the networks appear to offer sustainable and clinically useful services. These findings may interest decision-makers in developing countries considering starting, supporting or joining similar telemedicine networks. PMID:22589567

  10. Developing a risk stratification tool for audit of outcome after surgery for head and neck squamous cell carcinoma.

    PubMed

    Tighe, David F; Thomas, Alan J; Sassoon, Isabel; Kinsman, Robin; McGurk, Mark

    2017-07-01

    Patients treated surgically for head and neck squamous cell carcinoma (HNSCC) represent a heterogeneous group. Adjusting for patient case mix and complexity of surgery is essential if reported outcomes are to represent surgical performance and quality of care. A case-note audit totaling 1075 patients who received 1218 operations for HNSCC in 4 cancer networks was completed. Logistic regression, decision tree analysis, an artificial neural network, and a Naïve Bayes classifier were used to adjust for patient case mix using pertinent preoperative variables. Thirty-day complication rates varied widely (34%-51%; P < .015) between units. The predictive models allowed risk stratification. The artificial neural network demonstrated the best predictive performance (area under the curve [AUC] 0.85). Early postoperative complications are a measurable outcome that can be used to benchmark surgical performance and quality of care. Surgical outcome reporting in national clinical audits should take account of the patient case mix. © 2017 Wiley Periodicals, Inc.

  11. Joint Mobile Network Operations: Routing Design and Quality of Service Configuration

    DTIC Science & Technology

    2007-09-01

    EF service for the desktop VTC application, CU-SeeMe, which uses UDP packets on ports 7648 and 7649. We also might want to provide AF service to...between commanders. In this case, the example application used is CU-SeeMe, which operates through UDP on ports 7648, 7649, or 24032. The required...
        range 7648 7649
        access-list 101 permit udp any any eq 24032
    Matches all CU-SeeMe traffic from any host:
        access-list 102 permit udp 192.168.32.0

  12. A Review of Power Distribution Test Feeders in the United States and the Need for Synthetic Representative Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Postigo Marcos, Fernando E.; Domingo, Carlos Mateo; San Roman, Tomas Gomez

    Under the increasing penetration of distributed energy resources and new smart network technologies, distribution utilities face new challenges and opportunities to ensure reliable operations, manage service quality, and reduce operational and investment costs. Simultaneously, the research community is developing algorithms for advanced controls and distribution automation that can help to address some of these challenges. However, there is a shortage of realistic test systems that are publicly available for the development, testing, and evaluation of such new algorithms. Concerns around revealing critical infrastructure details and customer privacy have severely limited the number of actual networks that are published and available for testing. In recent decades, several distribution test feeders and US-featured representative networks have been published, but their scale, complexity, and control data vary widely. This paper presents a first-of-a-kind structured literature review of published distribution test networks, with a special emphasis on classifying their main characteristics and identifying the types of studies for which they have been used. As a result, this both aids researchers in choosing suitable test networks for their needs and highlights the opportunities and directions for further test system development. In particular, we highlight the need for building large-scale synthetic networks to overcome the identified drawbacks of current distribution test feeders.

  13. A Review of Power Distribution Test Feeders in the United States and the Need for Synthetic Representative Networks

    DOE PAGES

    Postigo Marcos, Fernando E.; Domingo, Carlos Mateo; San Roman, Tomas Gomez; ...

    2017-11-18

    Under the increasing penetration of distributed energy resources and new smart network technologies, distribution utilities face new challenges and opportunities to ensure reliable operations, manage service quality, and reduce operational and investment costs. Simultaneously, the research community is developing algorithms for advanced controls and distribution automation that can help to address some of these challenges. However, there is a shortage of realistic test systems that are publicly available for the development, testing, and evaluation of such new algorithms. Concerns around revealing critical infrastructure details and customer privacy have severely limited the number of actual networks that are published and available for testing. In recent decades, several distribution test feeders and US-featured representative networks have been published, but their scale, complexity, and control data vary widely. This paper presents a first-of-a-kind structured literature review of published distribution test networks, with a special emphasis on classifying their main characteristics and identifying the types of studies for which they have been used. As a result, this both aids researchers in choosing suitable test networks for their needs and highlights the opportunities and directions for further test system development. In particular, we highlight the need for building large-scale synthetic networks to overcome the identified drawbacks of current distribution test feeders.

  14. Demonstration of application-driven network slicing and orchestration in optical/packet domains: on-demand vDC expansion for Hadoop MapReduce optimization.

    PubMed

    Kong, Bingxin; Liu, Siqi; Yin, Jie; Li, Shengru; Zhu, Zuqing

    2018-05-28

    Nowadays, it is common for service providers (SPs) to leverage hybrid clouds to improve the quality-of-service (QoS) of their Big Data applications. However, for achieving guaranteed latency and/or bandwidth in its hybrid cloud, an SP might desire to have a virtual datacenter (vDC) network, in which it can manage and manipulate the network connections freely. To address this requirement, we design and implement a network slicing and orchestration (NSO) system that can create and expand vDCs across optical/packet domains on-demand. Considering Hadoop MapReduce (M/R) as the use-case, we describe the proposed architectures of the system's data, control and management planes, and present the operation procedures for creating, expanding, monitoring and managing a vDC for M/R optimization. The proposed NSO system is then realized in a small-scale network testbed that includes four optical/packet domains, and we conduct experiments in it to demonstrate the whole operations of the data, control and management planes. Our experimental results verify that application-driven on-demand vDC expansion across optical/packet domains can be achieved for M/R optimization, and after being provisioned with a vDC, the SP using the NSO system can fully control the vDC network and further optimize the M/R jobs in it with network orchestration.

  15. Planning of distributed generation in distribution network based on improved particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Li, Jinze; Qu, Zhi; He, Xiaoyang; Jin, Xiaoming; Li, Tie; Wang, Mingkai; Han, Qiu; Gao, Ziji; Jiang, Feng

    2018-02-01

    Large-scale access of distributed power can relieve current environmental pressure but, at the same time, increases the complexity and uncertainty of the overall distribution system. Rational planning of distributed power can effectively improve the system voltage level. To this end, the specific impact on distribution network power quality caused by the access of typical distributed power sources was analyzed, and an improved particle swarm optimization algorithm (IPSO) was proposed that improves the learning factors and the inertia weight, thereby enhancing the local and global search performance of the algorithm, in order to solve distributed generation planning for the distribution network. Results show that the proposed method can effectively reduce the system network loss and improve the economic performance of system operation with distributed generation.
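
    The general mechanism behind such improvements, a linearly decreasing inertia weight combined with time-varying learning factors, can be sketched as follows. The schedules below and the quadratic stand-in objective are illustrative assumptions, not the paper's exact formulation or its network-loss objective.

        import numpy as np

        # Sketch of PSO with a linearly decreasing inertia weight (0.9 -> 0.4)
        # and time-varying learning factors, minimizing a stand-in objective.
        def ipso(f, dim, n=30, iters=200, lo=-5.0, hi=5.0, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.uniform(lo, hi, (n, dim))
            v = np.zeros((n, dim))
            pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
            g = pbest[pval.argmin()].copy()
            for t in range(iters):
                w = 0.9 - 0.5 * t / iters      # inertia weight shrinks over time
                c1 = 2.5 - 2.0 * t / iters     # cognitive factor shrinks
                c2 = 0.5 + 2.0 * t / iters     # social factor grows
                r1, r2 = rng.random((n, dim)), rng.random((n, dim))
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                val = np.apply_along_axis(f, 1, x)
                better = val < pval
                pbest[better], pval[better] = x[better], val[better]
                g = pbest[pval.argmin()].copy()
            return g, pval.min()

        print(ipso(lambda z: np.sum(z ** 2), dim=4))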

  16. Elizabeth City State University: Elizabeth City, North Carolina (Data)

    DOE Data Explorer

    Stoffel, T.; Andreas, A.

    1985-09-25

    The Historically Black Colleges and Universities (HBCU) Solar Radiation Monitoring Network operated from July 1985 through December 1996. Funded by DOE, the six-station network provided 5-minute averaged measurements of direct normal, global, and diffuse horizontal solar irradiance. The data were processed at NREL to improve the assessment of the solar radiation resources in the southeastern United States. Historical HBCU data available online include quality assessed 5-min data, monthly reports, and plots. In January 1997 the HBCU sites became part of the CONFRRM solar monitoring network and data from the two remaining active stations, Bluefield State College and Elizabeth City State University, are collected by the NREL Measurement & Instrumentation Data Center (MIDC).

  17. Bluefield State College: Bluefield, West Virginia (Data)

    DOE Data Explorer

    Stoffel, T.; Andreas, A.

    1985-11-06

    The Historically Black Colleges and Universities (HBCU) Solar Radiation Monitoring Network operated from July 1985 through December 1996. Funded by DOE, the six-station network provided 5-minute averaged measurements of direct normal, global, and diffuse horizontal solar irradiance. The data were processed at NREL to improve the assessment of the solar radiation resources in the southeastern United States. Historical HBCU data available online include quality assessed 5-min data, monthly reports, and plots. In January 1997 the HBCU sites became part of the CONFRRM solar monitoring network and data from the two remaining active stations, Bluefield State College and Elizabeth City State University, are collected by the NREL Measurement & Instrumentation Data Center (MIDC).

  18. ERMHAN: A Context-Aware Service Platform to Support Continuous Care Networks for Home-Based Assistance

    PubMed Central

    Paganelli, Federica; Spinicci, Emilio; Giuli, Dino

    2008-01-01

    Continuous care models for chronic diseases pose several technology-oriented challenges for home-based continuous care, where assistance services rely on a close collaboration among different stakeholders, such as health operators, patient relatives, and social community members. Here we describe the Emilia Romagna Mobile Health Assistance Network (ERMHAN), a multichannel context-aware service platform designed to support care networks in cooperating and sharing information with the goal of improving patient quality of life. In order to meet extensibility and flexibility requirements, this platform has been developed through ontology-based context-aware computing and a service-oriented approach. We also provide some preliminary results of performance analysis and user survey activity. PMID:18695739

  19. Optimal Power Scheduling for a Medium Voltage AC/DC Hybrid Distribution Network

    DOE PAGES

    Zhu, Zhenshan; Liu, Dichen; Liao, Qingfen; ...

    2018-01-26

    With the great increase of renewable generation, as well as of DC loads, in the distribution network, DC distribution technology is receiving more attention, since a DC distribution network can improve operating efficiency and power quality by reducing the number of energy conversion stages. This paper presents a new architecture for the medium voltage AC/DC hybrid distribution network, where the AC and DC subgrids are looped by a normally closed AC soft open point (ACSOP) and a DC soft open point (DCSOP), respectively. The proposed AC/DC hybrid distribution systems contain renewable generation (i.e., wind power and photovoltaic (PV) generation), energy storage systems (ESSs), soft open points (SOPs), and both AC and DC flexible demands. An energy management strategy for the hybrid system is presented based on the dynamic optimal power flow (DOPF) method. The main objective of the proposed power scheduling strategy is to minimize the operating cost and reduce the curtailment of renewable generation while meeting operational and technical constraints. The proposed approach is verified in five scenarios: a pure AC system, a hybrid AC/DC system, a hybrid system with an interlinking converter, a hybrid system with DC flexible demand, and a hybrid system with SOPs. Results show that the proposed scheduling method can successfully dispatch the controllable elements, and that the presented architecture for the AC/DC hybrid distribution system is beneficial for reducing operating cost and renewable generation curtailment.

  1. Fuzzy-driven energy storage system for mitigating voltage unbalance factor on distribution network with photovoltaic system

    NASA Astrophysics Data System (ADS)

    Wong, Jianhui; Lim, Yun Seng; Morris, Stella; Morris, Ezra; Chua, Kein Huat

    2017-04-01

    The amount of small-scale renewable energy sources is anticipated to increase on low-voltage distribution networks for the improvement of energy efficiency and the reduction of greenhouse gas emissions. The growth of PV systems on low-voltage distribution networks can create voltage unbalance, voltage rise, and reverse power flow. Usually these issues occur with little fluctuation; however, they tend to fluctuate severely in Malaysia, a region with a low clear-sky index. Large amounts of cloud often pass over the country, making the solar irradiance highly scattered, and therefore the PV power output fluctuates substantially. These issues can lead to the malfunction of electronic equipment, reduction in network efficiency, and improper operation of the power protection system. In current practice, the amount of PV capacity installed on the distribution network is constrained by the utility company, which in turn limits the reduction of the carbon footprint. Therefore, an energy storage system is proposed as a solution to these power quality issues. To ensure effective operation of the distribution network with a PV system, a fuzzy control system is developed and implemented to govern the operation of the energy storage system. The fuzzy-driven energy storage system is able to mitigate the fluctuating voltage rise and voltage unbalance on the electrical grid by actively manipulating the flow of real power between the grid and the batteries. To verify the effectiveness of the proposed fuzzy-driven energy storage system, an experimental network integrated with a 7.2 kWp PV system was set up. Several case studies were performed to evaluate the ability of the proposed solution to mitigate voltage rise and voltage unbalance and to reduce the amount of reverse power flow under highly intermittent PV power output.
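
    A minimal sketch of the kind of fuzzy mapping such a controller might use, assuming triangular memberships over the voltage deviation and weighted-average defuzzification. The rule base, limits, and names are hypothetical and are not the authors' controller.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with feet a, c and peak b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def battery_setpoint(dv, p_max=3.6):
        """Map voltage deviation dv (V, 230 V phase-to-neutral base) to battery
        real power (kW). Positive = charge (absorb surplus PV), negative = discharge."""
        mu = {"neg": tri(dv, -15, -8, 0),   # voltage sag
              "zero": tri(dv, -6, 0, 6),    # near nominal
              "pos": tri(dv, 0, 8, 15)}     # voltage rise
        out = {"neg": -p_max, "zero": 0.0, "pos": p_max}   # rule consequents (kW)
        num = sum(mu[k] * out[k] for k in mu)
        den = sum(mu.values()) or 1.0
        return num / den                    # weighted-average defuzzification

    print(battery_setpoint(10.0))           # strong voltage rise -> charge near p_max
    ```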

  2. Using a Network of Boundary Layer Profilers to Characterize the Atmosphere at a Major Spaceport

    NASA Technical Reports Server (NTRS)

    Case, Jonathan L.; Lambert, Winifred; Merceret, Francis; Ward, Jennifer

    2006-01-01

    Space launch, landing, and ground operations at the Kennedy Space Center (KSC) and Cape Canaveral Air Force Station (CCAFS) in east-central Florida are highly sensitive to mesoscale weather conditions throughout the year. Due to the complex land-water interfaces and the important role of mesoscale circulations, a high-resolution network of five 915-MHz Doppler Radar Wind Profilers (DRWP) and 44 wind towers was installed over the KSC/CCAFS area. By using quality-controlled 915-MHz DRWP data along with the near-surface tower observations, the Applied Meteorology Unit and KSC Weather Office have studied the development and evolution of various mesoscale phenomena across KSC/CCAFS such as sea and land breezes, low-level jets, and frontal passages. This paper will present some examples of mesoscale phenomena that can impact space operations at KSC/CCAFS, focusing on the utility of the 915-MHz DRWP network in identifying important characteristics of sea/land breezes and low-level jets.

  3. Near-real-time processing of a ceilometer network assisted with sun-photometer data: monitoring a dust outbreak over the Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Cazorla, Alberto; Andrés Casquero-Vera, Juan; Román, Roberto; Guerrero-Rascado, Juan Luis; Toledano, Carlos; Cachorro, Victoria E.; Orza, José Antonio G.; Cancillo, María Luisa; Serrano, Antonio; Titos, Gloria; Pandolfi, Marco; Alastuey, Andres; Hanrieder, Natalie; Alados-Arboledas, Lucas

    2017-10-01

    The interest in the use of ceilometers for optical aerosol characterization has increased in the last few years. They operate continuously almost unattended and are also much less expensive than lidars; hence, they can be distributed in dense networks over large areas. However, due to the low signal-to-noise ratio it is not always possible to obtain particle backscatter coefficient profiles, and the vast amount of data generated requires an automated and unsupervised method that ensures the quality of the profile inversions. In this work we describe a method that uses aerosol optical depth (AOD) measurements from the AERONET network for the calibration and automated quality assurance of ceilometer profile inversions. The method is compared with independent inversions obtained from co-located multiwavelength lidar measurements. A difference smaller than 15 % in backscatter is found between the two instruments. This method is continuously and automatically applied to the Iberian Ceilometer Network (ICENET), and a case example during an unusually intense dust outbreak affecting the Iberian Peninsula between 20 and 24 February 2016 is shown. Results reveal that it is possible to obtain quantitative optical aerosol properties (particle backscatter coefficient) and discriminate the quality of these retrievals with ceilometers over large areas. This information has great potential for alert systems and model assimilation and evaluation.
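
    One simple way to picture an AOD-constrained calibration, assuming a fixed lidar ratio: scale the uncalibrated profile so that the column optical depth it implies matches the AERONET AOD. The sketch below is illustrative only; the operational method in the paper is more involved, and the lidar ratio and profile here are synthetic assumptions.

    ```python
    import numpy as np

    def calibrate_backscatter(beta_raw, z, aod_aeronet, lidar_ratio=45.0):
        """Scale an uncalibrated particle backscatter profile (arbitrary units)
        so that lidar_ratio * integral(beta dz) equals the AERONET AOD.
        The lidar ratio (sr) must be assumed; 45 sr is a plausible dust-like value."""
        integral = np.sum(0.5 * (beta_raw[1:] + beta_raw[:-1]) * np.diff(z))
        implied_aod = lidar_ratio * integral
        return beta_raw * (aod_aeronet / implied_aod)

    z = np.linspace(0.0, 5000.0, 256)       # height grid (m)
    beta_raw = np.exp(-z / 1500.0)          # synthetic uncalibrated profile
    beta_cal = calibrate_backscatter(beta_raw, z, aod_aeronet=0.35)
    ```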

  4. Generation and use of observational data patterns in the evaluation of data quality for AmeriFlux and FLUXNET

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Agarwal, D.; Poindexter, C.; Papale, D.; Trotta, C.; Ribeca, A.; Canfora, E.; Faybishenko, B.; Gunter, D.; Chu, H.

    2015-12-01

    The flux-measuring sites that are part of AmeriFlux are operated and maintained in a fairly independent fashion, both in terms of scientific goals and operational practices. This is also the case for most sites from other networks in FLUXNET. This independence leads to a degree of heterogeneity in the data sets collected at the sites, which is also reflected in data quality levels. The generation of derived data products and data synthesis efforts, two of the main goals of these networks, are directly affected by the heterogeneity in data quality. In a collaborative effort between AmeriFlux and ICOS, a series of quality checks is being conducted for the data sets before any network-level data processing and product generation take place. From these checks, a set of common data issues was identified, and these are being cataloged and classified into data quality patterns. These patterns are now being used as a basis for implementing automation for certain data quality checks, speeding up the process of applying the checks and evaluating the data. Currently, most data checks are performed individually on each data set, requiring visual inspection and input from a data curator. This manual process makes it difficult to scale the quality checks, creating a bottleneck for the data processing. One goal of the automated checks is to free up the time of data curators so they can focus on new or less common issues. As new issues are identified, they can also be cataloged and classified, extending the coverage of existing patterns or potentially generating new patterns, helping both improve existing automated checks and create new ones. This approach is helping make data quality evaluation faster, more systematic, and reproducible. Furthermore, these patterns are also helping with documenting common causes and solutions for data problems. This can help tower teams with diagnosing problems in data collection and processing, and also with correcting historical data sets. In this presentation, using AmeriFlux flux and micrometeorological data, we discuss our approach to creating observational data patterns and how we are using them to implement new automated checks. We also detail examples of these observational data patterns, illustrating how they are being used.
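
    A hedged sketch of the kind of automated pattern check described: a physical range test combined with a robust (median/MAD) spike test on a half-hourly series. The limits and window length are hypothetical, not network policy.

    ```python
    import numpy as np

    def qc_flags(x, lo=-50.0, hi=50.0, window=48, zmax=5.0):
        """Return boolean flags for out-of-range values and spikes in a flux
        series (48 half-hours = one day). Thresholds are illustrative only."""
        x = np.asarray(x, dtype=float)
        flags = (x < lo) | (x > hi)                 # physical range check
        for i in range(0, len(x), window):          # robust spike test per window
            w = x[i:i + window]
            med = np.nanmedian(w)
            mad = np.nanmedian(np.abs(w - med)) or 1e-9
            flags[i:i + window] |= np.abs(w - med) / (1.4826 * mad) > zmax
        return flags
    ```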

  5. Networking seismological data exchange in Europe

    NASA Astrophysics Data System (ADS)

    Sleeman, Reinoud; van Eck, Torild; van den Hazel, Gert-Jan; Trani, Luca; Spinuso, Alessandro

    2010-05-01

    The mission of the ORFEUS Data Centre (ODC) is to collect and archive high-quality seismic broadband waveform data from European-Mediterranean organizations and to provide open access to these data for monitoring and research purposes by the scientific community. The core activity of the ODC is to run an automatic, sustainable system to achieve this mission. Our four key operations are: data exchange protocols, quality control procedures, data management, and data services. All these activities at the ODC benefit from developments within the EC Infrastructure (I3) project NERIES (Network of Research Infrastructure for European Seismology). For data acquisition the ODC uses different standard, real-time data exchange protocols (e.g. Antelope, SeedLink, Scream) to ensure very high data availability from stations in the Virtual European Broadband Seismic Network (VEBSN), which currently consists of about 500 broadband (BB) stations. Within the data services a number of tools (e.g. Wilber II, NetDC, BreqFast, AutoDRM and webforms) are in place to serve the scientific community. These are currently being complemented by web services and an integrated portal. The data management part relies on a simple flat file structure and a MySQL data management system on which both ArcLink and the Generic Data Interface (GDI) operate. In this presentation we will give an overview of the different aspects concerning data acquisition, services, and management at the ODC.

  6. Complex adaptive systems: a tool for interpreting responses and behaviours.

    PubMed

    Ellis, Beverley

    2011-01-01

    Quality improvement is a priority for health services worldwide. There are many barriers to implementing change at the locality level, and misinterpreting responses and behaviours can effectively block change. Electronic health records will influence the means by which knowledge and information are generated and sustained among those operating quality improvement programmes. This study explains how complex adaptive system (CAS) theory provides a useful tool and new insight into the responses and behaviours that relate to quality improvement programmes in primary care enabled by informatics. Case studies were conducted in two English localities that participated in the implementation and development of quality improvement programmes. The research strategy included purposefully sampled case studies, conducted within a social constructionist ontological perspective. Responses and behaviours of quality improvement programmes in the two localities include both positive and negative influences associated with a networked model of governance. Pressures of time, resources, and workload are common issues, along with the need for education and training about capturing, coding, recording, and sharing information held within electronic health records to support various information requirements. Primary care informatics enables information symmetry among those operating quality improvement programmes by making some aspects of care explicit, allowing consensus about quality improvement priorities and implementable solutions.

  7. Technology acceptance perception for promotion of sustainable consumption.

    PubMed

    Biswas, Aindrila; Roy, Mousumi

    2018-03-01

    Economic growth in the past decades has resulted in changes in consumption patterns and the emergence of a tech-savvy generation with an unprecedented increase in the usage of social network technology. In this paper, the technology acceptance value gap, adapted from the technology acceptance model, has been applied as a tool supporting social network technology usage and the subsequent promotion of sustainable consumption. The data generated through the use of structured questionnaires have been analyzed using structural equation modeling. The validity of the model and path estimates signifies the robustness of the technology acceptance value gap in assessing the efficiency of social network technology usage in the augmentation of sustainable consumption and awareness. The results indicate that the subjective norm gap, ease-of-operation gap, and quality of green information gap have the most adverse impact on social network technology usage. Ultimately, social network technology usage has been identified as a significant antecedent of sustainable consumption.

  8. Measurements of Atmospheric NH3, NOy/NOx, and NO2 and Deposition of Total Nitrogen at the Beaufort, NC CASTNET Site (BFT142)

    EPA Science Inventory

    The Clean Air Status and Trends Network (CASTNET) is a long-term environmental monitoring program that measures trends in ambient air quality and atmospheric dry pollutant deposition across the United States. CASTNET has been operating since 1987 and currently consists of 89 moni...

  9. Air quality indices from ERTS-1 MSS information

    NASA Technical Reports Server (NTRS)

    Riley, E. L.; Stryker, S.; Ward, E. A.

    1973-01-01

    Comparison between ground based atmospheric turbidity network measurements and the average scene grayness from MSS Channel 4 data is in progress. Correlation between these two sources is promising. If continued correlation occurs for other ERTS-1 overflight dates and ground test sites, a new operational use of ERTS-1 useful to Federal, state, and international organizations will become available.

  10. The Caloosahatchee River Estuary: a monitoring partnership between Federal, State, and local governments, 2007-13

    USGS Publications Warehouse

    Patino, Eduardo

    2014-01-01

    From 2007 to 2013, the U.S. Geological Survey (USGS), in cooperation with the Florida Department of Environmental Protection (FDEP) and the South Florida Water Management District (SFWMD), operated a flow and salinity monitoring network at tributaries flowing into and at key locations within the tidal Caloosahatchee River. This network was designed to supplement existing long-term monitoring stations, such as W.P. Franklin Lock, also known as S–79, which are operated by the USGS in cooperation with the U.S. Army Corps of Engineers, Lee County, and the City of Cape Coral. Additionally, a monitoring station was operated on Sanibel Island from 2010 to 2013 as part of the USGS Greater Everglades Priority Ecosystem Science initiative and in partnership with U.S. Fish and Wildlife Service (J.N. Ding Darling National Wildlife Refuge). Moving boat water-quality surveys throughout the tidal Caloosahatchee River and downstream estuaries began in 2011 and are ongoing. Information generated by these monitoring networks has proved valuable to the FDEP for developing total maximum daily load criteria, and to the SFWMD for calibrating and verifying a hydrodynamic model. The information also supports the Caloosahatchee River Watershed Protection Plan.

  11. Establishing a China malaria diagnosis reference laboratory network for malaria elimination.

    PubMed

    Yin, Jian-hai; Yan, He; Huang, Fang; Li, Mei; Xiao, Hui-hui; Zhou, Shui-sen; Xia, Zhi-gui

    2015-01-28

    In China, the prevalence of malaria has reduced dramatically due to the elimination programme. The continued success of the programme will depend upon the accurate diagnosis of the disease in the laboratory. The basic requirements for this are a reliable malaria diagnosis laboratory network and a quality management system to support case verification and source tracking. The baseline information of provincial malaria laboratories in the China malaria diagnosis reference laboratory network was collected and analysed, and a quality-assurance activity was carried out to assess their accuracy in malaria diagnosis by microscopy, using WHO standards, and by PCR. By the end of 2013, nineteen of 24 provincial laboratories had been included in the network. In the study, a total of 168 staff were registered, and there was no bias in their age, gender, education level, and position. Generally, Plasmodium species were identified with great accuracy by microscopy and PCR. However, Plasmodium ovale was likely to be misdiagnosed as Plasmodium vivax by microscopy. China has established a laboratory network for primary malaria diagnosis which will cover a larger area. Currently, Plasmodium species can be identified fairly accurately by microscopy and PCR. However, laboratory staff need additional training on the accurate microscopic identification of P. ovale and on good performance of PCR operations.

  12. Quality of Surface Water in Missouri, Water Year 2008

    USGS Publications Warehouse

    Otero-Benitez, William; Davis, Jerri V.

    2009-01-01

    The U.S. Geological Survey, in cooperation with the Missouri Department of Natural Resources, designed and operates a series of monitoring stations on streams throughout Missouri known as the Ambient Water-Quality Monitoring Network. During the 2008 water year (October 1, 2007, through September 30, 2008), data were collected at 67 stations, including two U.S. Geological Survey National Stream Quality Accounting Network stations and one spring sampled in cooperation with the U.S. Forest Service. Dissolved oxygen, specific conductance, water temperature, suspended solids, suspended sediment, fecal coliform bacteria, Escherichia coli bacteria, dissolved nitrate plus nitrite, total phosphorus, dissolved and total recoverable lead and zinc, and selected pesticide data summaries are presented for 64 of these stations. The stations primarily have been classified into groups corresponding to the physiography of the State, primary land use, or unique station types. In addition, a summary of hydrologic conditions in the State including peak discharges, monthly mean discharges, and seven-day low flow is presented.

  13. A reliable low cost integrated wireless sensor network for water quality monitoring and level control system in UAE

    NASA Astrophysics Data System (ADS)

    Abou-Elnour, Ali; Khaleeq, Hyder; Abou-Elnour, Ahmad

    2016-04-01

    In the present work, a wireless sensor network and a real-time controlling and monitoring system are integrated for efficient water quality monitoring in environmental and domestic applications. The proposed system has three main components: (i) the sensor circuits, (ii) the wireless communication system, and (iii) the monitoring and controlling unit. LabVIEW software has been used in the implementation of the monitoring and controlling system, while ZigBee and myRIO wireless modules have been used to implement the wireless system. The water quality parameters are accurately measured by the present computer-based monitoring system, and the measurement results are instantaneously transmitted and published with minimum infrastructure costs and maximum flexibility in terms of distance and location. The mobility and durability of the proposed system are further enhanced by powering it fully via a photovoltaic system. The reliability and effectiveness of the system are evaluated under realistic operating conditions.

  14. The Assessment of Instruments for Detecting Surface Water Spills Associated with Oil and Gas Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Aubrey E.; Hopkinson, Leslie; Soeder, Daniel

    Surface water and groundwater risks associated with unconventional oil and gas development result from potential spills of the large volumes of chemicals stored on-site during drilling and hydraulic fracturing operations, and the return to the surface of significant quantities of saline water produced during oil or gas well production. To better identify and mitigate risks, watershed models and tools are needed to evaluate the dispersion of pollutants in possible spill scenarios. This information may be used to determine the placement of in-stream water-quality monitoring instruments and to develop early-warning systems and emergency plans. A chemical dispersion model has been used to estimate the contaminant signal for in-stream measurements. Spills associated with oil and gas operations were identified within the Susquehanna River Basin Commission’s Remote Water Quality Monitoring Network. The volume of some contaminants was found to be sufficient to affect the water quality of certain drainage areas. The most commonly spilled compounds and expected peak concentrations at monitoring stations were used in laboratory experiments to determine if a signal could be detected and positively identified using standard water-quality monitoring equipment. The results were compared to historical data and baseline observations of water quality parameters, and showed that the chemicals tested do commonly affect water quality parameters. This work is an effort to demonstrate that hydrologic and water quality models may be applied to improve the placement of in-stream water quality monitoring devices. This information may increase the capability of early-warning systems to alert community health and environmental agencies of surface water spills associated with unconventional oil and gas operations.
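
    For intuition about when a spill is detectable downstream, a fully mixed mass-balance screening estimate is often used. The sketch below, with entirely hypothetical numbers, is such a screening calculation, not the chemical dispersion model applied in the study.

    ```python
    def mean_concentration_mg_per_L(spill_mass_kg, streamflow_m3s, passage_hours):
        """Fully mixed estimate of the mean in-stream concentration as a spilled
        mass passes a monitoring station. Illustrative screening only."""
        diluting_volume_L = streamflow_m3s * passage_hours * 3600.0 * 1000.0
        return spill_mass_kg * 1e6 / diluting_volume_L   # kg -> mg, per litre

    # e.g., 500 kg of a brine constituent in a 5 m3/s stream passing over 2 hours
    print(mean_concentration_mg_per_L(500, 5.0, 2.0))    # ~13.9 mg/L
    ```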

  15. Re-engineering Nascom's network management architecture

    NASA Technical Reports Server (NTRS)

    Drake, Brian C.; Messent, David

    1994-01-01

    The development of Nascom systems for ground communications began in 1958 with Project Vanguard. The low-speed systems (rates less than 9.6 kbps) were developed following existing standards, but there were no comparable standards for high-speed systems. As a result, these systems were developed using custom protocols and custom hardware. Technology has made enormous strides since the ground support systems were implemented. Standards for computer equipment, software, and high-speed communications exist, and the performance of current workstations exceeds that of the mainframes used in the development of the ground systems. Nascom is in the process of upgrading its ground support systems and providing additional services. The Message Switching System (MSS), Communications Address Processor (CAP), and Multiplexer/Demultiplexer (MDM) Automated Control System (MACS) are all examples of Nascom systems developed using standards such as X Windows, Motif, and Simple Network Management Protocol (SNMP). Also, the Earth Observing System (EOS) Communications (Ecom) project is stressing standards as an integral part of its network. The move towards standards has produced a reduction in development, maintenance, and interoperability costs, while providing operational quality improvement. The Facility and Resource Manager (FARM) project has been established to integrate the Nascom networks and systems into a common network management architecture. The maximization of standards and the implementation of computer automation in the architecture will lead to continued cost reductions and increased operational efficiency. The first step has been to derive overall Nascom requirements and identify the functionality common to all the current management systems. The identification of these common functions will enable the reuse of processes in the management architecture and promote increased use of automation throughout the Nascom network. The MSS, CAP, MACS, and Ecom projects have indicated the potential value of commercial-off-the-shelf (COTS) products and standards through reduced cost and high quality. The FARM will allow the application of the lessons learned from these projects to all future Nascom systems.

  16. Classification of air quality using fuzzy synthetic multiplication.

    PubMed

    Abdullah, Lazim; Khalid, Noor Dalina

    2012-11-01

    Proper identification of an environment's air quality based on limited observations is an essential task in meeting the goals of environmental management. Various classification methods have been used to estimate the change of air quality status and health. However, discrepancies frequently arise from the lack of clear distinction between each air quality class, the uncertainty in the quality criteria employed, and the vagueness or fuzziness embedded in the decision-making output values. Owing to inherent imprecision, difficulties always exist in some conventional methodologies when describing integrated air quality conditions with respect to various pollutants. Therefore, this paper presents two fuzzy synthetic multiplication techniques to establish a classification of air quality. The fuzzy multiplication technique employs the max-min operations "or" and "and" in executing the fuzzy arithmetic operations. A set of air pollutant data (carbon monoxide, sulfur dioxide, nitrogen dioxide, ozone, and particulate matter (PM10)) collected from a network of 51 stations in the Klang Valley and in East Malaysia (Sabah and Sarawak) was utilized in this evaluation. The two fuzzy multiplication techniques consistently classified Malaysia's air quality as "good." The findings indicated that the techniques may have successfully harmonized inherent discrepancies and interpreted complex conditions. It was demonstrated that fuzzy synthetic multiplication techniques are quite appropriate for air quality management.
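
    The max-min composition mentioned in the abstract is the core of fuzzy synthetic evaluation: with a weight vector w and a membership matrix R, the grade memberships are b_j = max_i min(w_i, r_ij). A small illustration with hypothetical weights and memberships (not the paper's data):

    ```python
    import numpy as np

    # Membership matrix R: rows = pollutants (CO, SO2, NO2, O3, PM10),
    # columns = quality grades ("good", "moderate", "unhealthy"); values illustrative.
    R = np.array([[0.7, 0.30, 0.00],
                  [0.6, 0.40, 0.00],
                  [0.5, 0.40, 0.10],
                  [0.8, 0.20, 0.00],
                  [0.5, 0.25, 0.25]])
    w = np.array([0.15, 0.15, 0.2, 0.2, 0.3])   # pollutant weights, sum to 1

    # Max-min composition: b_j = max_i min(w_i, r_ij)
    b = np.max(np.minimum(w[:, None], R), axis=0)
    print(b, ["good", "moderate", "unhealthy"][int(np.argmax(b))])
    ```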

  17. External quality-assurance project report for the National Atmospheric Deposition Program/National Trends Network and Mercury Deposition Network, 2009-2010

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Martin, RoseAnn; Rhodes, Mark F.; Chesney, Tanya A.

    2014-01-01

    The U.S. Geological Survey operated six distinct programs to provide external quality-assurance monitoring for the National Atmospheric Deposition Program/National Trends Network (NTN) and Mercury Deposition Network (MDN) during 2009-2010. The field-audit program assessed the effects of onsite exposure, sample handling, and shipping on the chemistry of NTN samples; a system-blank program assessed the same effects for MDN. Two interlaboratory-comparison programs assessed the bias and variability of the chemical analysis data from the Central Analytical Laboratory (CAL) and Mercury (Hg) Analytical Laboratory (HAL). The blind-audit program was also implemented for the MDN to evaluate analytical bias in total Hg concentration data produced by the HAL. The co-located-sampler program was used to identify and quantify potential shifts in NADP data resulting from replacement of original network instrumentation with new electronic recording rain gages (E-gages) and precipitation collectors that use optical sensors. The results indicate that NADP data continue to be of sufficient quality for the analysis of spatial distributions and time trends of chemical constituents in wet deposition across the United States. Results also suggest that retrofit of the NADP networks with the new precipitation collectors could cause -8 to +14 percent shifts in NADP annual precipitation-weighted mean concentrations and total deposition values for ammonium, nitrate, sulfate, and hydrogen ion, and larger shifts (+13 to +74 percent) for calcium, magnesium, sodium, potassium, and chloride. The prototype N-CON Systems bucket collector is more efficient in the catch of precipitation in winter than the Aerochem Metrics Model 301 collector, especially for light snowfall.

  18. U.S. Geological Survey external quality-assurance project report for the National Atmospheric Deposition Program / National Trends Network and Mercury Deposition Network, 2011-2012

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Martin, RoseAnn

    2014-01-01

    The U.S. Geological Survey operated six distinct programs to provide external quality-assurance monitoring for the National Atmospheric Deposition Program (NADP) / National Trends Network (NTN) and Mercury Deposition Network (MDN) during 2011-2012. The field-audit program assessed the effects of onsite exposure, sample handling, and shipping on the chemistry of NTN samples; a system-blank program assessed the same effects for MDN. Two interlaboratory-comparison programs assessed the bias and variability of the chemical analysis data from the Central Analytical Laboratory and Mercury Analytical Laboratory (HAL). A blind-audit program was implemented for the MDN during 2011 to evaluate analytical bias in HAL total mercury concentration data. The co-located-sampler program was used to identify and quantify potential shifts in NADP data resulting from the replacement of original network instrumentation with new electronic recording rain gages and precipitation collectors that use optical precipitation sensors. The results indicate that NADP data continue to be of sufficient quality for the analysis of spatial distributions and time trends of chemical constituents in wet deposition across the United States. Co-located rain gage results indicate -3.7 to +6.5 percent bias in NADP precipitation-depth measurements. Co-located collector results suggest that the retrofit of the NADP networks with the new precipitation collectors could cause +10 to +36 percent shifts in NADP annual deposition values for ammonium, nitrate, and sulfate; -7.5 to +41 percent shifts for hydrogen-ion deposition; and larger shifts (-51 to +52 percent) for calcium, magnesium, sodium, potassium, and chloride. The prototype N-CON Systems bucket collector typically catches more precipitation than the NADP-approved Aerochem Metrics Model 301 collector.

  19. Research on scheme of applying ASON to current networks

    NASA Astrophysics Data System (ADS)

    Mao, Y. F.; Li, J. R.; Deng, L. J.

    2008-10-01

    Automatically Switched Optical Network (ASON) is currently a new and hot research subject in the world. It can provide high bandwidth, high assembly flexibility, and high network security and reliability, but with a low management cost. It is intended to meet the requirements for high-throughput optical access with stringent Quality of Service (QoS). But as a brand new technology, ASON cannot be supported by traditional protocol software and network equipment, and the approach of building a new ASON network by completely abandoning the traditional optical network facilities is not desirable, because it costs too much and wastes a lot of network resources that could still be used. So how to apply ASON to current networks and realize a smooth transition between the existing networks and ASON has become a serious problem for many network operators. In this research, the status quo of ASON is introduced first, and then the key problems that should be considered when applying ASON to current networks are discussed. Based on this, the strategies that should be followed to overcome these key problems are listed. Finally, an approach for applying ASON to current optical networks is proposed and analyzed.

  1. Dynamic Spectrum Access for Internet of Things Service in Cognitive Radio-Enabled LPWANs.

    PubMed

    Moon, Bongkyo

    2017-12-05

    In this paper, we focus on a dynamic spectrum access strategy for Internet of Things (IoT) applications in two types of radio systems: cellular networks and cognitive radio-enabled low power wide area networks (CR-LPWANs). The spectrum channel contention between the licensed cellular networks and the unlicensed CR-LPWANs, which work with them, only takes place within the cellular radio spectrum range. Our aim is to maximize the spectrum capacity for the unlicensed users while ensuring that they never interfere with the licensed network. Therefore, in this paper we propose a dynamic spectrum access strategy for CR-LPWANs operating in both licensed and unlicensed bands. A simulation and a numerical analysis of the strategy using a matrix-geometric approach are presented. Finally, we obtain the blocking probability of the licensed users, the mean dwell time of the unlicensed user, and the total carried traffic and combined service quality for the licensed and unlicensed users. The results show that the proposed strategy can maximize the spectrum capacity for the unlicensed users using IoT applications as well as keep the service quality of the licensed users independent of them.
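
    The paper's analysis is matrix-geometric; as a much simpler point of reference, the blocking probability of licensed users on a fixed pool of channels can be computed with the classic Erlang-B recursion, sketched below with hypothetical traffic values.

    ```python
    def erlang_b(offered_erlangs, channels):
        """Erlang-B blocking probability via the standard stable recursion:
        B(0) = 1;  B(n) = A*B(n-1) / (n + A*B(n-1))."""
        b = 1.0
        for n in range(1, channels + 1):
            b = offered_erlangs * b / (n + offered_erlangs * b)
        return b

    # e.g., licensed traffic of 8 Erlangs offered to 12 channels
    print(f"blocking = {erlang_b(8.0, 12):.4f}")
    ```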

  2. Joint Efforts Towards European HF Radar Integration

    NASA Astrophysics Data System (ADS)

    Rubio, A.; Mader, J.; Griffa, A.; Mantovani, C.; Corgnati, L.; Novellino, A.; Schulz-Stellenfleth, J.; Quentin, C.; Wyatt, L.; Ruiz, M. I.; Lorente, P.; Hartnett, M.; Gorringe, P.

    2016-12-01

    During the past two years, significant steps have been made in Europe towards achieving the accessibility to High Frequency Radar (HFR) data needed for pan-European use. Since 2015, EuroGOOS Ocean Observing Task Teams (TT), such as the HFR TT, are operational networks of observing platforms. The main goal is the harmonization of system requirements, system design, and data quality, and the improvement and proof of the readiness and standardization of HFR data access and tools. Particular attention is being paid by the HFR TT to converging from different projects and programs toward these common objectives. First, JERICO-NEXT (Joint European Research Infrastructure network for Coastal Observatory - Novel European eXpertise for coastal observaTories, H2020 2015 Programme) will contribute by describing the status of the European network, seeking harmonization through the exchange of best practices and standardization, developing and giving access to quality control procedures and new products, and finally demonstrating the use of such technology in the general scientific strategy pursued by the Coastal Observatory. Then, EMODnet (European Marine Observation and Data Network) Physics started to assemble HF radar metadata and data products within Europe in a uniform way. This long-term program provides a combined array of services and functionalities to users for obtaining, free of charge, data, metadata, and data products on the physical conditions of European sea basins and oceans. Additionally, the Copernicus Marine Environment Monitoring Service (CMEMS) has delivered since 2015 a core information service to any user related to four areas of benefit: Maritime Safety, Coastal and Marine Environment, Marine Resources, and Weather, Seasonal Forecasting and Climate activities. INCREASE (Innovation and Networking for the integration of Coastal Radars into EuropeAn marine SErvices - CMEMS Service Evolution 2016) will set out the necessary developments towards the integration of existing European HFR operational systems into CMEMS. Finally, these ongoing efforts will contribute to integrating HFR platforms as important operational components of EOOS, the European Ocean Observing System, designed to align and integrate Europe's ocean observing capacity for truly integrated end-to-end ocean observing in Europe.

  3. Extending the ARS Experimental Watersheds to Address Regional Issues

    NASA Astrophysics Data System (ADS)

    Marks, D.; Goodrich, D. C.; Winstral, A.; Bosch, D. D.; Pool, D.

    2001-12-01

    The USDA-Agricultural Research Service's (ARS) Watershed Research Program maintains and operates a diverse, geographically distributed, nested, multi-scale, national experimental watershed network. This network, much of which has been operational for more than 40 years (several parts for more than 60 years), constitutes one of the best networks of its kind in the world. The watershed network and its instrumentation were primarily established to assess the hydrologic impacts of watershed conservation and management practices. Through the development of long-term hydrologic data, it has evolved into a network of high-quality outdoor laboratories for addressing emerging science issues facing hydrologists and resource managers. While the value of the experimental watershed for investigating precipitation, climatic, and hydrologic processes is unquestioned, extending the results from these investigations to other sites and larger areas is more difficult. ARS experimental watersheds are a few hundred km2 or smaller, making it challenging to address regional-scale issues. To address this, the ARS watershed program is, with a suite of partners from universities and other federal agencies, enlarging its research focus to extend beyond the boundaries of the experimental watershed. In this poster we present several examples of this effort, with suggestions on how, using the experimental watershed as its core, a larger-scale hydrologic observatory could be developed and maintained.

  4. Failure probability analysis of optical grid

    NASA Astrophysics Data System (ADS)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical network based distributed computing systems extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing the application failure probability can be compared, so that the different requirements of different clients can be satisfied according to the application failure probability. In an optical grid, when an application described by a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the failure probability requirement and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
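
    A minimal sketch of task-based failure analysis for an application whose tasks all must succeed, assuming independent task failures and k-fold replication as the backup strategy. This illustrates the quantification idea only; it is not the paper's MDSA algorithm.

    ```python
    from math import prod

    def app_failure_probability(task_fail_probs, replicas=1):
        """An application with tasks in series fails if any task fails on all
        of its replicas (independence assumed). replicas=1 means no backup."""
        survive = prod(1.0 - p**replicas for p in task_fail_probs)
        return 1.0 - survive

    tasks = [0.02, 0.05, 0.01, 0.03]           # per-task failure probabilities
    print(app_failure_probability(tasks))       # no backup   -> ~0.106
    print(app_failure_probability(tasks, 2))    # duplicated  -> ~0.004
    ```

    Comparing the two outputs shows how a backup strategy trades extra resource usage for a much lower application failure probability, which is the compromise the scheduling algorithm negotiates.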

  5. A novel PON-based mobile distributed cluster of antennas approach to provide impartial and broadband services to end users

    NASA Astrophysics Data System (ADS)

    Sana, Ajaz; Saddawi, Samir; Moghaddassi, Jalil; Hussain, Shahab; Zaidi, Syed R.

    2010-01-01

    In this research paper we propose a novel Passive Optical Network (PON) based Mobile Worldwide Interoperability for Microwave Access (WiMAX) access network architecture to provide high-capacity, high-performance multimedia services to mobile WiMAX users. Passive Optical Networks do not require powered equipment; hence they cost less and need less network management. WiMAX technology emerges as a viable candidate for the last-mile solution. In conventional WiMAX access networks, the base stations and Multiple Input Multiple Output (MIMO) antennas are connected by point-to-point lines. In theory, the maximum WiMAX bandwidth is assumed to be 70 Mbit/s over 31 miles. In reality, WiMAX can only provide one or the other: when operating at maximum range, the bit error rate increases and a lower bit rate must be used, while lowering the range allows a device to operate at higher bit rates. Our focus in this research paper is to increase both range and bit rate by utilizing distributed clusters of MIMO antennas connected to WiMAX base stations with PON-based topologies. A novel quality of service (QoS) algorithm is also proposed to provide admission control and scheduling to serve classified traffic. The proposed architecture presents a flexible and scalable system design accommodating different performance requirements and complexity.

  6. Improving the Quality of Service and Security of Military Networks with a Network Tasking Order Process

    DTIC Science & Technology

    2010-09-01

    Improving the Quality of Service and Security of Military Networks with a Network Tasking Order Process. Air Force Institute of Technology report AFIT/DCS/ENG/10-09, September 2010. Approved for public release; distribution unlimited.

  7. A Statewide Private Microwave Wide Area Network for Real-time Natural Hazard Monitoring

    NASA Astrophysics Data System (ADS)

    Williams, M. C.; Kent, G.; Smith, K. D.; Plank, G.; Slater, D.; Torrisi, J.; Presser, R.; Straley, K.

    2013-12-01

    The Nevada Seismological Laboratory (NSL) at the University of Nevada, Reno, operates the Nevada Seismic Network, a collection of ground motion instruments installed throughout Nevada and California, for the purposes of detecting, locating, and notifying the public of earthquakes in the state. To perform these tasks effectively, NSL has designed and built a statewide wireless microwave wide-area network (WAN) in order to receive ground motion data in near real-time. This network consists of radio access points, backhauls, and backbone communication sites transmitting time-series, images, and datalogger diagnostics to our data center servers in Reno. This privately managed communication network greatly reduces the dependence on third-party infrastructure (e.g. commercial cellular networks), and is vital for emergency management response and system uptime. Any individual seismograph or data collection device is networked through a wireless point-to-multipoint connection to a remote access point (AP) using a low-cost radio/routerboard combination. Additional point-to-point connections from APs to radio backhauls and/or mountaintop backbone sites allow the Data Center in Reno to communicate with and receive data directly from each datalogger. Dataloggers, radios, and routers can be configured using tablets on-site, or via desktop computers at the Data Center. Redundant mountaintop links can be added to the network and facilitate the re-routing of data (similar to a meshed network) in the event of a faulty, failing, or noisy communication site. All routers, radios, and servers, including those at the Data Center, have redundant power and can operate independently in the event of a grid power or public Internet outage. A managed server room at the Data Center processes earthquake data for notifications and acts as a data source for remote users. Consisting of about 500 hosts, and spanning hundreds of miles, this WAN provides network operators access to each router and datalogger in our seismic network not only for data collection, but also for maintenance and quality control. This has resulted in several partnerships with other agencies. In addition to our seismic station network for earthquake monitoring, we currently manage ~400 more channels of data (many running at 500 Hz) for the National Center for Nuclear Security (NCNS) Source Physics Experiments, a series of chemical explosions at the Nevada National Security Site. Some of our mountaintop stations have been experimentally equipped with near-infrared high-definition fire cameras for wildfire monitoring, and have recently recorded the Bison and Pedlar fires in northwest Nevada. Data for the Nevada EPSCoR climate program also utilize the NSL WAN. Real-time access to data for these experiments greatly reduces the effort required for data archival, quality control, and monitoring equipment failures. Future plans include increasing the density of stations in urban areas such as Reno and Las Vegas, and expanding coverage to Tahoe and eastern Nevada.

  8. Hybrid architecture for building secure sensor networks

    NASA Astrophysics Data System (ADS)

    Owens, Ken R., Jr.; Watkins, Steve E.

    2012-04-01

    Sensor networks have various communication and security architectural concerns. Three approaches are defined to address these concerns. The first area is the utilization of new computing architectures that leverage embedded virtualization software on the sensor. Deploying a small, embedded virtualization operating system on the sensor nodes that is designed to communicate with low-cost cloud computing infrastructure in the network is the foundation for delivering low-cost, secure sensor networks. The second area focuses on securing the sensor. Sensor security components include developing an identification scheme and leveraging authentication algorithms and protocols that address security assurance within the physical, communication network, and application layers. This function will primarily be accomplished by encrypting the communication channel and integrating sensor network firewall and intrusion detection/prevention components into the sensor network architecture. Hence, sensor networks will be able to maintain high levels of security. The third area addresses the real-time and high-priority nature of the data that sensor networks collect. This function requires that a quality-of-service (QoS) definition and algorithm be developed for delivering the right data at the right time. A hybrid architecture is proposed that combines software and hardware features to handle network traffic with diverse QoS requirements.

  9. The PBO Nucleus: Integration of the Existing Continuous GPS Networks in the Western U.S.

    NASA Astrophysics Data System (ADS)

    Blume, F.; Anderson, G.; Freymueller, J. T.; Herring, T. A.; Melbourne, T. I.; Murray, M. H.; Prescott, W. H.; Smith, R. B.; Wernicke, B.

    2004-12-01

    Tectonic and earthquake research in the US has experienced a quiet revolution over the last decade, precipitated by the recognition that slow-motion faulting events can both trigger and be triggered by regular earthquakes. Transient motion has now been found in essentially all tectonic environments, and the detection and analysis of such events is the first-order science target of the EarthScope Project. Because of this, and a host of other fundamental tectonics questions that can be answered only with long-duration geodetic time series, the incipient 1400-station EarthScope Plate Boundary Observatory (PBO) network has been designed to leverage 432 existing continuous GPS stations whose measurements extend back over a decade. The irreplaceable recording history of these stations will accelerate EarthScope scientific return by providing the highest possible resolution. This resolution will be used to detect and understand transients, to determine the three-dimensional velocity field (particularly vertical motion), and to improve measurement precision by understanding the complex noise sources inherent in GPS. The PBO Nucleus Project is designed to operate, maintain, and upgrade a subset of six western U.S. geodetic networks: the Alaska Deformation Array (AKDA), the Bay Area Regional Deformation network (BARD), the Basin and Range Geodetic Network (BARGEN), the Eastern Basin and Range/Yellowstone network (EBRY), the Pacific Northwest Geodetic Array (PANGA), and the Southern California Integrated Geodetic Network (SCIGN), until they are subsumed by PBO in 2008. Uninterrupted data flow from these stations will effectively double the time-series length of PBO over the expected life of EarthScope and create, for the first time, a single GPS-based geodetic network in the US. Other existing sites will remain in operation under support from non-NSF sources (e.g. the USGS), and EarthScope will benefit from their continued operation. On the grounds of relevance to EarthScope science goals, geographic distribution, and data quality, 209 of the 432 existing stations have been selected as the nucleus upon which to build PBO. We have begun converting these stations to a PBO-compatible mode of operation; data now flow directly to PBO archives and processing centers, while maintenance, operations, and metadata practices are being upgraded to PBO standards.

  10. Field testing of a remote controlled robotic tele-echo system in an ambulance using broadband mobile communication technology.

    PubMed

    Takeuchi, Ryohei; Harada, Hiroshi; Masuda, Kohji; Ota, Gen-ichiro; Yokoi, Masaki; Teramura, Nobuyasu; Saito, Tomoyuki

    2008-06-01

    We report the testing of a mobile robotic tele-echo system that was placed in an ambulance and successfully transmitted clear real-time echo imaging of a patient's abdomen to the destination hospital, from where the device was remotely operated. Two-way communication between the paramedics in the vehicle and a doctor standing by at the hospital was undertaken. The robot was equipped with an ultrasound probe which was remotely controlled by the clinician at the hospital, and ultrasound images of the patient were transmitted wirelessly. The quality of the ultrasound images transmitted over the public mobile telephone networks was compared with that of images transmitted over the Multimedia Wireless Access Network (a private network). The transmission rates over the public networks and the private network were approximately 256 kbit/s and 3 Mbit/s, respectively. Our results indicate that ultrasound images of far higher definition could be obtained through the private network.

  11. Performance study of the application of Artificial Neural Networks to the completion and prediction of data retrieved by underwater sensors.

    PubMed

    Baladrón, Carlos; Aguiar, Javier M; Calavia, Lorena; Carro, Belén; Sánchez-Esguevillas, Antonio; Hernández, Luis

    2012-01-01

    This paper presents a proposal for an Artificial Neural Network (ANN)-based architecture for the completion and prediction of data retrieved by underwater sensors. Due to the specific conditions under which these sensors operate, it is not uncommon for them to fail, and maintenance operations are difficult and costly. Therefore, completion and prediction of the missing data can greatly improve the quality of underwater datasets. A performance study using real data is presented to validate the approach, concluding that the proposed architecture is able to provide very low errors. The results also show that the solution is especially suitable for cases where large portions of data are missing, while in situations where the missing values are isolated the improvement over other simple interpolation methods is limited.
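
    A hedged sketch of ANN-based completion on synthetic data, assuming co-located sensor series are available as predictors for the failed sensor; the network size and data here are hypothetical and not the paper's architecture.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    t = np.linspace(0, 20, 2000)
    neighbors = np.column_stack([np.sin(t), np.cos(0.5 * t)])    # co-located sensors
    target = (0.8 * np.sin(t) + 0.3 * np.cos(0.5 * t)
              + 0.05 * rng.standard_normal(t.size))              # sensor to complete

    missing = np.zeros(t.size, bool)
    missing[800:1000] = True                                     # simulate a long outage

    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    model.fit(neighbors[~missing], target[~missing])             # train on valid samples
    filled = target.copy()
    filled[missing] = model.predict(neighbors[missing])          # complete the gap
    ```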

  12. A Low-Complexity Subgroup Formation with QoS-Aware for Enhancing Multicast Services in LTE Networks

    NASA Astrophysics Data System (ADS)

    Algharem, M.; Omar, M. H.; Rahmat, R. F.; Budiarto, R.

    2018-03-01

    The high demand for multimedia services in Long Term Evolution (LTE) and beyond networks forces network operators to find a solution that can handle the huge traffic. To this end, subgroup formation techniques have been introduced to overcome the limitations of the Conventional Multicast Scheme (CMS) by splitting the multicast users into several subgroups based on the users' channel quality signals. However, finding the best subgroup configuration with low complexity needs more investigation. In this paper, efficient and simple subgroup formation mechanisms are proposed. The proposed mechanisms take the transmitter MAC queue into account. The effectiveness of the proposed mechanisms is evaluated and compared with CMS in terms of throughput, fairness, delay, and Block Error Rate (BLER).
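
    For intuition, a brute-force single-threshold subgroup split (ignoring the MAC-queue awareness that the proposed mechanisms add): each subgroup is served at its worst member's achievable rate, so splitting can raise aggregate throughput over CMS. The per-user rates below are hypothetical.

    ```python
    def best_two_subgroup_throughput(user_rates):
        """Best single-threshold split of multicast users into two subgroups.
        Each subgroup is served at its worst member's achievable rate."""
        rates = sorted(user_rates)
        n = len(rates)
        best = rates[0] * n                    # CMS baseline: one group, worst-user rate
        for k in range(1, n):                  # candidate split: rates[:k] | rates[k:]
            best = max(best, rates[0] * k + rates[k] * (n - k))
        return best

    rates = [1.2, 1.5, 4.8, 5.0, 5.4]          # Mbit/s per user (illustrative, CQI-derived)
    print(best_two_subgroup_throughput(rates)) # 16.8, versus CMS at 1.2 * 5 = 6.0
    ```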

  13. Bias and precision of selected analytes reported by the National Atmospheric Deposition Program and National Trends Network, 1984

    USGS Publications Warehouse

    Brooks, M.H.; Schroder, L.J.; Willoughby, T.C.

    1987-01-01

    The U.S. Geological Survey operated a blind audit sample program during 1984 to test the effects of the sample handling and shipping procedures used by the National Atmospheric Deposition Program and National Trends Network on the quality of wet deposition data produced by the combined networks. Blind audit samples, which were dilutions of standard reference water samples, were submitted by network site operators to the central analytical laboratory disguised as actual wet deposition samples. Results from the analyses of blind audit samples were used to calculate estimates of analyte bias associated with all network wet deposition samples analyzed in 1984 and to estimate analyte precision. Concentration differences between double-blind samples that were submitted to the central analytical laboratory and separate analyses of aliquots of those blind audit samples that had not undergone network sample handling and shipping were used to calculate analyte masses that apparently were added to each blind audit sample by routine network handling and shipping procedures. These calculated masses indicated statistically significant biases for magnesium, sodium, potassium, chloride, and sulfate. Median calculated masses were 41.4 micrograms (ug) for calcium, 14.9 ug for magnesium, 23.3 ug for sodium, 0.7 ug for potassium, 16.5 ug for chloride, and 55.3 ug for sulfate. Analyte precision was estimated using two different sets of replicate measures performed by the central analytical laboratory. Estimated standard deviations were similar to those previously reported. (Author's abstract)
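
    The "calculated mass" in such a program follows from the paired concentration difference times the sample volume. The sketch below uses hypothetical concentration pairs and an assumed nominal volume, not the network's actual values.

    ```python
    import statistics

    def added_masses_ug(paired_conc_mg_L, sample_volume_L=0.25):
        """Mass apparently added in handling/shipping, per blind-audit pair:
        (blind concentration - reference concentration) * sample volume.
        The 0.25 L volume is an assumed nominal value, not the network's."""
        return [(blind - ref) * sample_volume_L * 1000.0    # mg -> ug
                for blind, ref in paired_conc_mg_L]

    pairs = [(0.62, 0.55), (0.48, 0.46), (0.71, 0.60)]      # (blind, reference) mg/L
    print(statistics.median(added_masses_ug(pairs)))        # median added mass, ug
    ```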

  14. Standard operating procedures for serum and plasma collection: early detection research network consensus statement standard operating procedure integration working group.

    PubMed

    Tuck, Melissa K; Chan, Daniel W; Chia, David; Godwin, Andrew K; Grizzle, William E; Krueger, Karl E; Rom, William; Sanda, Martin; Sorbara, Lynn; Stass, Sanford; Wang, Wendy; Brenner, Dean E

    2009-01-01

    Specimen collection is an integral component of clinical research. Specimens from subjects with various stages of cancers or other conditions, as well as those without disease, are critical tools in the hunt for biomarkers, predictors, or tests that will detect serious diseases earlier or more readily than currently possible. Analytic methodologies evolve quickly. Access to high-quality specimens, collected and handled in standardized ways that minimize potential bias or confounding factors, is key to the "bench to bedside" aim of translational research. It is essential that standard operating procedures, "the how" of creating the repositories, be defined prospectively when designing clinical trials. Small differences in the processing or handling of a specimen can have dramatic effects on analytical reliability and reproducibility, especially when multiplex methods are used. A representative working group, the Standard Operating Procedures Internal Working Group (SOPIWG), comprising members from across the Early Detection Research Network (EDRN), was formed to develop standard operating procedures (SOPs) for various types of specimens collected and managed for our biomarker discovery and validation work. This report presents our consensus on SOPs for the collection, processing, handling, and storage of serum and plasma for biomarker discovery and validation.

  15. Quantifying the quality of precipitation data from different sources

    NASA Astrophysics Data System (ADS)

    Leijnse, Hidde; Wauben, Wiel; Overeem, Aart; de Haij, Marijn

    2015-04-01

    There is an increasing demand for high-resolution rainfall data. The current manual and automatic networks of climate and meteorological stations provide high-quality rainfall data, but they cannot provide the high spatial and temporal resolution required for many applications. This can only partly be solved by using remotely sensed data. It is therefore necessary to consider third-party data, such as rain gauges operated by amateurs and rainfall intensities from commercial cellular communication links. The quality of such third-party data is highly variable and generally lower than that of dedicated networks, and quantitative quality information for third-party sources is generally either not available or very limited. In order to be able to use data from various sources, it is vital that quantitative knowledge of the data quality is available. This holds for all data sources, including the rain gauges in the reference networks of climate and meteorological stations: for most dedicated networks, quality information is only available for the sensor under laboratory conditions. In many cases, however, a significant part of the measurement errors and uncertainties is determined by the siting and maintenance of the sensor, for which generally only qualitative information is available. Furthermore, sensors may have limitations under specific conditions. We aim to quantify data quality for different data sources by performing analyses on collocated data sets. Here we present an intercomparison of two years of precipitation data from six different sources (manual rain gauge, automatic rain gauge, present weather sensor, weather radar, commercial cellular communication links, and Meteosat) at three different locations in the Netherlands. We use auxiliary meteorological data to determine whether the quality is influenced by other variables (e.g., the temperature influencing evaporation from the rain gauge). We use three techniques to compare the data sets: 1) direct comparison; 2) triple collocation (see Stoffelen, 1998); and 3) comparison of statistics. Stoffelen, A. (1998). Toward the true near-surface wind speed: Error modeling and calibration using triple collocation. Journal of Geophysical Research: Oceans (1978-2012), 103(C4), 7755-7766.
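
    Of the three comparison techniques, triple collocation is the least self-explanatory. The sketch below shows the basic estimator for three collocated, bias-corrected measurement systems with mutually independent zero-mean errors; it is a minimal illustration under those assumptions, not the authors' implementation, and in practice a calibration/rescaling step usually precedes it:

      import numpy as np

      # Basic triple collocation (Stoffelen, 1998): estimate the error
      # variance of each of three collocated rainfall estimates, assuming
      # bias-corrected data and mutually independent zero-mean errors.

      def triple_collocation_error_var(x, y, z):
          """Return estimated error variances (var_x, var_y, var_z)."""
          x, y, z = map(np.asarray, (x, y, z))
          var_x = np.mean((x - y) * (x - z)) - np.mean(x - y) * np.mean(x - z)
          var_y = np.mean((y - x) * (y - z)) - np.mean(y - x) * np.mean(y - z)
          var_z = np.mean((z - x) * (z - y)) - np.mean(z - x) * np.mean(z - y)
          return var_x, var_y, var_z

      # Hypothetical usage with collocated gauge/radar/link rain rates:
      rng = np.random.default_rng(0)
      truth = rng.gamma(2.0, 1.5, 10000)
      gauge = truth + rng.normal(0, 0.2, truth.size)
      radar = truth + rng.normal(0, 0.5, truth.size)
      link = truth + rng.normal(0, 0.8, truth.size)
      print(triple_collocation_error_var(gauge, radar, link))
      # -> approximately (0.04, 0.25, 0.64)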

  16. Quadrennial Defense Review Report

    DTIC Science & Technology

    2010-02-01

    medicine, and computer network operations. While we continue to employ a mix of programs and incentives to recruit quality personnel, we are also...

  17. Technology Infusion of CodeSonar into the Space Network Ground Segment (RII07)

    NASA Technical Reports Server (NTRS)

    Benson, Markland

    2008-01-01

    The NASA Software Assurance Research Program (in part) performs studies as to the feasibility of technologies for improving the safety, quality, reliability, cost, and performance of NASA software. This study considers the application of commercial automated source code analysis tools to mission critical ground software that is in the operations and sustainment portion of the product lifecycle.

  18. Expert systems and advanced automation for space missions operations

    NASA Technical Reports Server (NTRS)

    Durrani, Sajjad H.; Perkins, Dorothy C.; Carlton, P. Douglas

    1990-01-01

    Increased complexity of space missions during the 1980s led to the introduction of expert systems and advanced automation techniques in mission operations. This paper describes several technologies in operational use or under development at the National Aeronautics and Space Administration's Goddard Space Flight Center. Several expert systems are described that diagnose faults, analyze spacecraft operations and onboard subsystem performance (in conjunction with neural networks), and perform data quality and data accounting functions. The design of customized user interfaces is discussed, with examples of their application to space missions. Displays, which allow mission operators to see the spacecraft position, orientation, and configuration under a variety of operating conditions, are described. Automated systems for scheduling are discussed, and a testbed that allows tests and demonstrations of the associated architectures, interface protocols, and operations concepts is described. Lessons learned are summarized.

  19. Implementability of two-qubit unitary operations over the butterfly network and the ladder network with free classical communication

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akibue, Seiseki; Murao, Mio

    2014-12-04

    We investigate distributed implementation of two-qubit unitary operations over two primitive networks, the butterfly network and the ladder network, as a first step to apply network coding to quantum computation. By classifying two-qubit unitary operations in terms of the Kraus-Cirac number, the number of non-zero parameters describing the global part of two-qubit unitary operations, we analyze which classes of two-qubit unitary operations are implementable over these networks with free classical communication. For the butterfly network, we show that two classes of two-qubit unitary operations, which contain all Clifford, controlled-unitary, and matchgate operations, are implementable over the network. For the ladder network, we show that two-qubit unitary operations are implementable over the network if and only if their Kraus-Cirac number does not exceed the number of bridges of the ladder.
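
    For reference, the Kraus-Cirac number can be read off the canonical decomposition of a two-qubit unitary, sketched here in standard notation (a textbook form, not quoted from the paper):

      U = (u_1 \otimes u_2)\,
          \exp\!\left[\, i \left( \theta_x\, \sigma_x \otimes \sigma_x
                                + \theta_y\, \sigma_y \otimes \sigma_y
                                + \theta_z\, \sigma_z \otimes \sigma_z \right) \right]
          (u_3 \otimes u_4),

    where the u_k are single-qubit (local) unitaries. The Kraus-Cirac number is the count of non-zero parameters among \theta_x, \theta_y, \theta_z describing the non-local (global) part, so it ranges from 0 (purely local operations) to 3.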

  20. Latest developments in advanced network management and cross-sharing of next-generation flux stations

    NASA Astrophysics Data System (ADS)

    Burba, George; Johnson, Dave; Velgersdyk, Michael; Begashaw, Israel; Allyn, Douglas

    2016-04-01

    In recent years, spatial and temporal flux data coverage improved significantly and on multiple scales, from a single station to continental networks, due to standardization, automation, and management of the data collection, and better handling of the extensive amounts of generated data. However, operating budgets for flux research items, such as labor, travel, and hardware, are becoming more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are required to handle the entire process effectively and efficiently, including sharing data among collaborative groups. On one hand, such tools can maximize time dedicated to publications answering research questions, and minimize time and expenses spent on data acquisition, processing, quality control, and overall station management. On the other hand, cross-sharing the stations with external collaborators may help leverage available funding, and promote data analyses and publications. A new low-cost, advanced system, FluxSuite, utilizes a combination of hardware, software, and web services to address these specific demands. It automates key stages of the flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows: (i) the system can be easily incorporated into a new flux station, or as an upgrade to many presently operating flux stations, via a weatherized, remotely accessible microcomputer, SmartFlux 2, with fully digital inputs; (ii) each next-generation station will measure all parameters needed for flux computations in a digital and PTP time-synchronized mode, accepting digital signals from a number of anemometers and data loggers; (iii) the field microcomputer will calculate final fully processed flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.; (iv) final fluxes, radiation, weather, and soil data will be merged into a single quality-control file; (v) multiple flux stations can be linked into an automated time-synchronized network; (vi) flux network managers, or PIs, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts; (vii) PIs can assign rights, and allow or restrict access to stations and data: selected stations can be shared via rights-managed access internally or with external institutions; (viii) researchers without stations can form "virtual networks" for specific projects by collaborating with PIs from different actual networks. This presentation provides detailed examples of FluxSuite as currently utilized to manage two large flux networks in China (National Academy of Sciences and Agricultural Academy of Sciences), and smaller networks with stations in the USA, Germany, Ireland, Malaysia, and other locations around the globe. The latest 2016 developments and expanded functionality are also discussed.

  1. The EarthScope Plate Boundary Observatory and allied networks, the makings of nascent Earthquake and Tsunami Early Warning System in Western North America.

    NASA Astrophysics Data System (ADS)

    Mattioli, Glen; Mencin, David; Hodgkinson, Kathleen; Meertens, Charles; Phillips, David; Blume, Fredrick; Berglund, Henry; Fox, Otina; Feaux, Karl

    2016-04-01

    The NSF-funded GAGE Facility, managed by UNAVCO, operates approximately 1,300 GNSS stations distributed across North and Central America and in the circum-Caribbean. Following community input starting in 2011 from several workshops and associated reports, UNAVCO has been exploring ways to increase the capability and utility of the geodetic resources under its management to improve our understanding in diverse areas of geophysics, including properties of seismic, volcanic, magmatic, and tsunami deformation sources. Networks operated by UNAVCO for the NSF have the potential to profoundly transform our ability to rapidly characterize events, provide rapid characterization and warning, and improve hazard mitigation and response. Specific applications currently under development include earthquake early warning, tsunami early warning, and tropospheric modeling with university, commercial, non-profit, and government partners on national and international scales. In the case of tsunami early warning, for example, an RT-GNSS network can provide multiple inputs to an operational system, starting with rapid assessment of earthquake sources and associated deformation, which leads to the initial model of ocean forcing and tsunami generation. In addition, terrestrial GNSS can provide direct measurements of tsunamis through the associated traveling ionospheric disturbances from several hundreds of kilometers away as they approach the shoreline, which can be used to refine tsunami inundation models. Any operational system like this has multiple communities that rely on a pan-Pacific real-time open data set. Other scientific and operational applications for high-rate GPS include glacier and ice sheet motions, tropospheric modeling, and better constraints on the dynamics of space weather. Combining existing data sets and user communities, for example seismic data and tide gauge observations, with GNSS and Met data products has proven complicated because of issues related to metadata, appropriate data formats, data quality assessment in real time, and other issues related to using these products in operational forecasting. While progress has been made toward more open and free data access across national borders and toward more cooperation among cognizant government-sanctioned "early warning" agencies, some impediments remain, making a truly operational system a work in progress. Accordingly, UNAVCO has embarked on significant improvements to the original infrastructure and scope of the PBO. We anticipate that PBO and related networks will form a backbone for these disparate efforts, providing high-quality, low-latency raw and processed GNSS data. This requires substantial upgrades to the entire system, from the basic GNSS receiver, through robust data collection, archiving, and open distribution mechanisms, to efficient data-processing strategies. UNAVCO is currently in a partnership with commercial and scientific stakeholders to define, develop, and deploy all segments of this improved geodetic network. We present the overarching goals, and the current and planned future state of this international resource.

  2. The Rondonia Lightning Detection Network: Network Description, Science Objectives, Data Processing Archival/Methodology, and Results

    NASA Technical Reports Server (NTRS)

    Blakeslee, R. J.; Bailey, J. C.; Pinto, O.; Athayde, A.; Renno, N.; Weidman, C. D.

    2003-01-01

    A four-station Advanced Lightning Direction Finder (ALDF) network was established in the state of Rondonia in western Brazil in 1999 through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed, and archived at a central site in Brasilia and at the NASA/Marshall Space Flight Center in Huntsville, Alabama. Initial, non-quality-assured quick-look results are made available in near real time over the Internet. The network, which is still operational, was deployed to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite that was launched in November 1997. The measurements are also being used to investigate the relationship between the electrical, microphysical, and kinematic properties of tropical convection. In addition, the long time-series observations produced by this network will help establish a regional lightning climatological database, supplementing other databases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at the NASA/Marshall Space Flight Center have been applied to the Rondonian ALDF lightning observations to obtain site-error corrections and improved location retrievals. The data will also be corrected for the network detection efficiency. The processing methodology and the results from the analysis of four years of network operations will be presented.
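
    To make the time-of-arrival part of the retrieval concrete, the sketch below solves a flat-earth TOA location by nonlinear least squares; it is purely illustrative (hypothetical station positions, no magnetic direction finding or site-error terms, both of which the actual ALDF processing uses):

      import numpy as np
      from scipy.optimize import least_squares

      C = 3.0e5  # signal propagation speed, km/s

      def toa_residuals(params, stations_km, arrival_times_s):
          """Observed minus predicted arrival times for a trial source."""
          x, y, t0 = params
          dist = np.hypot(stations_km[:, 0] - x, stations_km[:, 1] - y)
          return (arrival_times_s - t0) - dist / C

      # Four hypothetical sensor positions (km, local flat-earth frame):
      stations = np.array([[0.0, 0.0], [120.0, 10.0],
                           [60.0, 140.0], [-30.0, 90.0]])
      true_xy, true_t0 = np.array([45.0, 55.0]), 0.01
      times = true_t0 + np.hypot(*(stations - true_xy).T) / C

      fit = least_squares(toa_residuals, x0=[0.0, 0.0, 0.0],
                          args=(stations, times))
      print(fit.x)  # -> approximately [45.0, 55.0, 0.01]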

  3. The Central and Eastern U.S. Seismic Network: Legacy of USArray

    NASA Astrophysics Data System (ADS)

    Eakins, J. A.; Astiz, L.; Benz, H.; Busby, R. W.; Hafner, K.; Reyes, J. C.; Sharer, G.; Vernon, F.; Woodward, R.

    2014-12-01

    As the USArray Transportable Array entered the central and eastern United States, several Federal agencies (National Science Foundation, U.S. Geological Survey, U.S. Nuclear Regulatory Commission, and Department of Energy) recognized the unique opportunity to retain TA stations beyond the original timeline. The mission of the CEUSN is to produce data that enable researchers and Federal agencies alike to better understand basic geologic questions, background earthquake rates and distribution, seismic hazard potential, and associated societal risks in this region. The selected long-term sub-array of Transportable Array (TA) stations includes nearly 200 sites, complemented by 100 broadband stations from the existing regional seismic networks, to form the Central and Eastern United States Network (CEUSN). Multiple criteria for site selection were weighed by an inter-agency TA Station Selection (TASS) Working Group: seismic noise characteristics, data availability in real time, proximity to nuclear power plants, and homogeneous distribution throughout the region. The Array Network Facility (ANF) began collecting data for CEUSN network stations in late 2013, with all stations collected since May 2014. Regional seismic data streams are collected in real time from the IRIS Data Management Center (DMC). TA stations selected to be part of CEUSN retain the broadband sensor, to which a 100-sps channel is added, along with the infrasound and environmental channels; at some stations, accelerometers are deployed. The upgraded sites become part of the N4 network, for which ANF provides metadata and can issue remote commands to the station equipment. Stations still operated by TA, but planned for CEUSN, are included in the virtual network, so all stations are already available. By the end of 2015, the remaining TA stations will be upgraded. Data quality control procedures developed for TA stations at ANF and at the DMC are currently performed on N4 data. However, teleseismic and regional events are only picked a few times a month to fulfill data quality checks. The assembled CEUSN data sets can be requested from the DMC with the _CEUSN virtual network code. Acknowledgments to Seismic Regional Network Operators: C. Ammon, J. Ebel, D. Doser, R. Hermann, A. Holland, W-Y. Kim, C. Langston, T. Owens, and M. Withers.

  4. Australia's TERN: Building, Sustaining and Advancing Collaborative Long Term Ecosystem Research Networks

    NASA Astrophysics Data System (ADS)

    Held, A. A.; Phinn, S. R.

    2012-12-01

    TERN, Australia's Terrestrial Ecosystem Research Network (www.tern.org.au), is one of several environmental data collection, storage, and sharing projects developed through the government's research infrastructure programs 2008-2014. It includes terrestrial and coastal ecosystem data collection infrastructure across multiple disciplines, along with the hardware, software, and processes used to store, analyse, and integrate data sets. TERN's overall objective is to build the collaborations, infrastructure, and programs needed to meet the needs of ecosystem science communities in Australia in the long term, through the institutional frameworks necessary to establish a national terrestrial ecosystem site and observational network; coordinated networks enabling cooperation and operational experience; public access to quality-assured and appropriately licensed data; and support for the terrestrial ecosystem research community to define and sustain the terrestrial observing paradigm into the longer term. This paper explains how TERN was originally established and now operates, along with plans to sustain itself in the future. TERN is implemented through discipline/technical groups referred to as "TERN Facilities". Combined, the facilities provide observations of surface mass and energy fluxes over key ecosystems, biophysical remote sensing data, ecological survey plots, soils information, and coastal ecosystems and associated water quality variables across Australia. Additional integrative facilities cover elements of ecoinformatics, data scaling and modelling, and linking science to management. A central coordination and portal facility provides metadata storage, data identification, and legal and licensing support. Data access, uploading, metadata generation, DOI attachment, and licensing are handled at each facility's own portal level. TERN also acts as the open-data repository of choice for Australian scientists required to publish their data. Several key lessons we have learned will also be presented during the talk.

  5. Data quality control and tools in passive seismic experiments exemplified on the Czech broadband seismic pool MOBNET in the AlpArray collaborative project

    NASA Astrophysics Data System (ADS)

    Vecsey, Luděk; Plomerová, Jaroslava; Jedlička, Petr; Munzarová, Helena; Babuška, Vladislav; AlpArray Working Group

    2017-12-01

    This paper focuses on major issues related to the data reliability and network performance of 20 broadband (BB) stations of the Czech (CZ) MOBNET (MOBile NETwork) seismic pool within the AlpArray seismic experiments. Currently used high-resolution seismological applications require high-quality data recorded for a sufficiently long time interval at seismological observatories and during the entire time of operation of the temporary stations. In this paper we present new hardware and software tools we have been developing during the last two decades while analysing data from several international passive experiments. The new tools help to assure the high-quality standard of broadband seismic data and eliminate potential errors before supplying data to seismological centres. Special attention is paid to crucial issues like the detection of sensor misorientation, timing problems, interchange of record components and/or their polarity reversal, sensor mass centring, or anomalous channel amplitudes due to, for example, imperfect gain. Thorough data quality control should represent an integral constituent of seismic data recording, preprocessing, and archiving, especially for data from temporary stations in passive seismic experiments. Large international seismic experiments require enormous efforts from scientists from different countries and institutions to gather hundreds of stations to be deployed in the field during a limited time period. In this paper, we demonstrate the beneficial effects of the procedures we have developed for acquiring a reliable large set of high-quality data from each group participating in field experiments. The presented tools can be applied manually or automatically on data from any seismic network.
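
    One of the checks named above, polarity reversal, lends itself to a compact illustration: cross-correlate a band-limited arrival with the same arrival at a trusted reference station and inspect the sign of the dominant correlation peak. This is a minimal stand-in for that single test (function name and thresholding are mine), not the MOBNET tool suite itself:

      import numpy as np

      def polarity_flag(trace, ref_trace):
          """Return +1 (polarity consistent with reference) or -1 (likely reversed)."""
          trace = (trace - trace.mean()) / trace.std()
          ref = (ref_trace - ref_trace.mean()) / ref_trace.std()
          xcorr = np.correlate(trace, ref, mode="full")
          # Sign of the strongest correlation peak indicates relative polarity.
          return 1 if xcorr[np.abs(xcorr).argmax()] > 0 else -1

      # Hypothetical usage: a reversed channel flips the correlation sign.
      t = np.linspace(0, 10, 1000)
      ref = np.sin(2 * np.pi * 0.5 * t) * np.exp(-0.3 * t)
      print(polarity_flag(ref, ref))   # -> +1
      print(polarity_flag(-ref, ref))  # -> -1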

  6. A conceptual ground-water-quality monitoring network for San Fernando Valley, California

    USGS Publications Warehouse

    Setmire, J.G.

    1985-01-01

    A conceptual groundwater-quality monitoring network was developed for San Fernando Valley to provide the California State Water Resources Control Board with an integrated, basinwide control system to monitor the quality of groundwater. The geology, occurrence and movement of groundwater, land use, background water quality, and potential sources of pollution were described and then considered in designing the conceptual monitoring network. The network was designed to monitor major known and potential point and nonpoint sources of groundwater contamination over time. The network is composed of 291 sites where wells are needed to define the groundwater quality. The ideal network includes four specific-purpose networks to monitor (1) ambient water quality, (2) nonpoint sources of pollution, (3) point sources of pollution, and (4) line sources of pollution. (USGS)

  7. Microseismic Monitoring Using Sparse Surface Network of Broadband Instruments: Western Canada Shale Play Case Study

    NASA Astrophysics Data System (ADS)

    Yenier, E.; Baturan, D.; Karimi, S.

    2016-12-01

    Monitoring of seismicity related to oil and gas operations is routinely performed nowadays using a number of different surface and downhole seismic array configurations and technologies. Here, we provide a hydraulic fracture (HF) monitoring case study that compares the data set generated by a sparse local surface network of broadband seismometers to a data set generated by a single downhole geophone string. Our data were collected during a 5-day single-well HF operation by a temporary surface network consisting of 10 stations deployed within 5 km of the production well. The downhole data were recorded by a 20-geophone string deployed in an observation well located 15 m from the production well. Surface network data processing included standard STA/LTA event triggering enhanced by template-matching subspace detection, grid-search locations improved using the double-difference relocation technique, and Richter (ML) and moment (Mw) magnitude computations for all detected events. In addition, moment tensors were computed from first-motion polarities and amplitudes for the subset of highest-SNR events. The resulting surface event catalog shows a very weak spatio-temporal correlation to HF operations, with only 43% of recorded seismicity occurring during HF stage times. This, along with the source mechanisms, shows that the surface-recorded seismicity delineates the activation of several pre-existing structures striking NNE-SSW, consistent with regional stress conditions as indicated by the orientation of SHmax. Comparison of the sparse-surface and single-downhole-string datasets allows us to perform a cost-benefit analysis of the two monitoring methods. Our findings show that although the downhole array recorded ten times as many events, the surface network provides a more coherent delineation of the underlying structure and more accurate magnitudes for larger-magnitude events. We attribute this to the enhanced focal coverage provided by the surface network and the use of broadband instrumentation. The results indicate that sparse surface networks of high-quality instruments can provide rich and reliable datasets for evaluation of the impact and effectiveness of hydraulic fracture operations in regions with favorable surface noise, local stress, and attenuation characteristics.
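
    The STA/LTA trigger mentioned above is a classic energy-ratio detector: a short-term average of signal energy divided by a long-term average, with an event declared when the ratio crosses a threshold. A minimal sketch (window lengths and threshold are illustrative, not the study's values):

      import numpy as np

      def sta_lta_trigger(data, fs, sta_s=0.5, lta_s=10.0, threshold=4.0):
          """Return sample indices where the STA/LTA ratio first exceeds threshold."""
          nsta, nlta = int(sta_s * fs), int(lta_s * fs)
          energy = np.asarray(data, dtype=float) ** 2
          csum = np.concatenate(([0.0], np.cumsum(energy)))
          sta = (csum[nsta:] - csum[:-nsta]) / nsta  # short-term average
          lta = (csum[nlta:] - csum[:-nlta]) / nlta  # long-term average
          # Align STA and LTA windows so both end at the same sample.
          ratio = sta[nlta - nsta:] / np.maximum(lta, 1e-12)
          onsets = np.flatnonzero((ratio[1:] >= threshold) & (ratio[:-1] < threshold))
          return onsets + nlta  # convert back to sample indices in `data`

      # Hypothetical usage: 60 s of noise with a synthetic burst at sample 3000.
      fs = 100.0
      rng = np.random.default_rng(5)
      trace = rng.normal(0, 1, int(60 * fs))
      trace[3000:3100] += 8.0 * rng.normal(0, 1, 100)
      print(sta_lta_trigger(trace, fs))  # -> onset near sample 3000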

  8. A near-optimum procedure for selecting stations in a streamgaging network

    USGS Publications Warehouse

    Lanfear, Kenneth J.

    2005-01-01

    Two questions are fundamental to Federal government goals for a network of streamgages operated by the U.S. Geological Survey: (1) how well does the present network of streamgaging stations meet defined Federal goals, and (2) what is the optimum set of stations to add or reactivate to support remaining goals? The solution involves an incremental-stepping procedure based on Basic Feasible Incremental Solutions (BFISs), where each BFIS satisfies at least one Federal streamgaging goal. A set of minimum Federal goals for streamgaging is defined to include water measurements for legal compacts and decrees, flooding, water budgets, regionalization of streamflow characteristics, and water quality. Fully satisfying all these goals under the assumptions outlined in this paper would require adding 887 new streamgaging stations to the U.S. Geological Survey network and reactivating an additional 857 stations that are currently inactive.
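
    Incremental-stepping selection of this kind is in the same family as the greedy set-cover heuristic: repeatedly add the candidate station that satisfies the most still-unmet goals per unit cost. The sketch below shows only that generic idea (data structures and costs are hypothetical); the actual USGS procedure differs in its details:

      def select_stations(candidates, goals):
          """candidates: {station: (cost, set_of_goals_met)}; goals: set of goal names."""
          unmet, chosen = set(goals), []
          while unmet and candidates:
              # Best remaining candidate: most unmet goals covered per unit cost.
              station, (cost, met) = max(
                  candidates.items(),
                  key=lambda kv: len(kv[1][1] & unmet) / kv[1][0])
              if not met & unmet:
                  break  # remaining goals cannot be satisfied by any candidate
              chosen.append(station)
              unmet -= met
              del candidates[station]
          return chosen, unmet

      stations, leftover = select_stations(
          {"A": (1.0, {"flood", "compact"}),
           "B": (2.0, {"water_quality", "flood", "regionalization"}),
           "C": (1.0, {"water_budget"})},
          {"flood", "compact", "water_quality", "water_budget", "regionalization"})
      print(stations, leftover)  # -> ['A', 'B', 'C'] set()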

  9. Enabling Optical Network Test Bed for 5G Tests

    NASA Astrophysics Data System (ADS)

    Giuntini, Marco; Grazioso, Paolo; Matera, Francesco; Valenti, Alessandro; Attanasio, Vincenzo; Di Bartolo, Silvia; Nastri, Emanuele

    2017-03-01

    In this work, we show some experimental approaches concerning optical network design dedicated to 5G infrastructures. In particular, we show some implementations of network slicing based on Carrier Ethernet forwarding, which will be very suitable in the context of 5G heterogeneous networks, especially looking at services for vertical enterprises. We also show how to adopt a central unit (orchestrator) to automatically manage such logical paths according to quality-of-service requirements, which can be monitored at the user location. We also illustrate how novel all-optical processes, such as the ones based on all-optical wavelength conversion, can be used for multicasting, enabling development of TV broadcasting based on 4G-5G terminals. These managing and forwarding techniques, operating on optical links, are tested in a wireless environment on Wi-Fi cells and emulating LTE and WiMAX systems by means of the NS-3 code.

  10. Development of a protocol to optimize electric power consumption and life cycle environmental impacts for operation of wastewater treatment plant.

    PubMed

    Piao, Wenhua; Kim, Changwon; Cho, Sunja; Kim, Hyosoo; Kim, Minsoo; Kim, Yejin

    2016-12-01

    In wastewater treatment plants (WWTPs), the portion of operating costs related to electric power consumption is increasing. Simply reducing electric power consumption, however, makes it difficult to comply with effluent water quality requirements. In this study, a protocol was proposed to minimize environmental impacts and optimize electric power consumption while still meeting effluent water quality standards. The protocol comprises a six-phase procedure and was tested using operating data from S-WWTP to prove its applicability. The 11 major operating variables were categorized into three groups using principal component analysis and K-means cluster analysis. Life cycle assessment (LCA) was conducted for each group to deduce the optimal operating conditions for each operating state. Then, employing mathematical modeling, six improvement plans to reduce electric power consumption were deduced. The electric power consumption of each suggested plan was estimated using an artificial neural network. This was followed by a second round of LCA conducted on the plans. As a result, a set of optimized improvement plans was derived for each group, able to optimize electric power consumption and life cycle environmental impact at the same time. Based on these test results, the WWTP operating management protocol presented in this study is deemed able to suggest optimal operating conditions under which power consumption can be optimized with minimal life cycle environmental impact, while allowing the plant to meet water quality requirements.
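
    The grouping step (PCA followed by K-means on the principal-component scores) is standard; a minimal sketch with hypothetical data sizes and variable counts, not the study's actual dataset:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import PCA

      # One hypothetical year of daily records of 11 operating variables.
      rng = np.random.default_rng(1)
      X = rng.normal(size=(365, 11))

      # Reduce the 11 variables to a few principal components, then
      # cluster the records into three operating states.
      scores = PCA(n_components=3).fit_transform(X)
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
      print(np.bincount(labels))  # records assigned to each operating state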

  11. A time-varying subjective quality model for mobile streaming videos with stalling events

    NASA Astrophysics Data System (ADS)

    Ghadiyaram, Deepti; Pan, Janice; Bovik, Alan C.

    2015-09-01

    Over-the-top mobile video streaming is invariably influenced by volatile network conditions which cause playback interruptions (stalling events), thereby impairing users' quality of experience (QoE). Developing models that can accurately predict users' QoE could enable the more efficient design of quality-control protocols for video streaming networks that reduce network operational costs while still delivering high-quality video content to the customers. Existing objective models that predict QoE are based on global video features, such as the number of stall events and their lengths, and are trained and validated on a small pool of ad hoc video datasets, most of which are not publicly available. The model we propose in this work goes beyond previous models as it also accounts for the fundamental effect that a viewer's recent level of satisfaction or dissatisfaction has on their overall viewing experience. In other words, the proposed model accounts for and adapts to the recency, or hysteresis effect caused by a stall event in addition to accounting for the lengths, frequency of occurrence, and the positions of stall events - factors that interact in a complex way to affect a user's QoE. On the recently introduced LIVE-Avvasi Mobile Video Database, which consists of 180 distorted videos of varied content that are afflicted solely with over 25 unique realistic stalling events, we trained and validated our model to accurately predict the QoE, attaining standout QoE prediction performance.
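
    The recency (hysteresis) effect described above can be made concrete with a toy predictor in which each stall causes an instantaneous QoE drop that decays over time, so recent stalls are penalized more heavily than old ones. This is purely an illustration of the idea with made-up constants, not the authors' fitted model:

      import numpy as np

      def predict_qoe(duration_s, stall_starts_s, stall_lengths_s,
                      base=90.0, drop_per_s=8.0, recovery_tau_s=20.0):
          """Per-second QoE trace: stall impact scales with length, decays with time."""
          t = np.arange(0.0, duration_s, 1.0)
          qoe = np.full_like(t, base)
          for start, length in zip(stall_starts_s, stall_lengths_s):
              impact = drop_per_s * length * np.exp(-(t - start) / recovery_tau_s)
              qoe -= np.where(t >= start, impact, 0.0)  # only after the stall
          return np.clip(qoe, 0.0, 100.0)

      qoe = predict_qoe(120, stall_starts_s=[30, 80], stall_lengths_s=[2.0, 4.0])
      print(qoe[29], qoe[31], qoe[119])  # before, just after, long after a stall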

  12. On the definition of adapted audio/video profiles for high-quality video calling services over LTE/4G

    NASA Astrophysics Data System (ADS)

    Ndiaye, Maty; Quinquis, Catherine; Larabi, Mohamed Chaker; Le Lay, Gwenael; Saadane, Hakim; Perrine, Clency

    2014-01-01

    During the last decade, the important advances and widespread availability of mobile technology (operating systems, GPUs, terminal resolution, and so on) have encouraged fast development of voice and video services such as video calling. While multimedia services have grown substantially on mobile devices, the resulting increase in data consumption is leading to the saturation of mobile networks. In order to provide data at high bit-rates and maintain performance as close as possible to traditional networks, the 3GPP (3rd Generation Partnership Project) worked on a high-performance standard for mobile called Long Term Evolution (LTE). In this paper, we aim to express recommendations related to audio and video media profiles (selection of audio and video codecs, bit-rates, frame-rates, audio and video formats) for a typical video-calling service held over LTE/4G mobile networks. These profiles are defined according to targeted devices (smartphones, tablets), so as to ensure the best possible quality of experience (QoE). Obtained results indicate that for the CIF format (352 x 288 pixels), which is usually used for smartphones, the VP8 codec provides better image quality than the H.264 codec at low bitrates (from 128 to 384 kbps). For sequences with high motion, however, H.264 in slow mode is preferred. Regarding audio, better results are globally achieved using wideband codecs, with the exception of the Opus codec at 12.2 kbps.

  13. [Strategies and development of quality assurance and control in the ELSA-Brasil].

    PubMed

    Schmidt, Maria Inês; Griep, Rosane Härter; Passos, Valéria Maria; Luft, Vivian Cristine; Goulart, Alessandra Carvalho; Menezes, Greice Maria de Souza; Molina, Maria del Carmen Bisi; Vigo, Alvaro; Nunes, Maria Angélica

    2013-06-01

    The ELSA-Brasil (Estudo Longitudinal de Saúde do Adulto - Brazilian Longitudinal Study for Adult Health) is a cohort study composed of 15,105 adults followed up in order to assess the development of chronic diseases, especially diabetes and cardiovascular disease. Its size, multicenter nature and the diversity of measurements required effective and efficient mechanisms of quality assurance and control. The main quality assurance activities (those developed before data collection) were: careful selection of research instruments, centralized training and certification, pretesting and pilot studies, and preparation of operation manuals for the procedures. Quality control activities (developed during data collection and processing) were performed more intensively at the beginning, when routines had not been established yet. The main quality control activities were: periodic observation of technicians, test-retest studies, data monitoring, network of supervisors, and cross visits. Data that estimate the reliability of the obtained information attest that the quality goals have been achieved.

  14. Profile and Remote Sensing Observation Datasets (Trace Gases and Aerosols) for Regional- Scale Model Evaluation under the Air Quality Model Evaluation International Initiative (AQMEII)- North American and European Perspectives

    EPA Science Inventory

    While the vast majority of operational air-pollution networks across the world are designed to measure relevant metrics at the surface, the air pollution problem is a three-dimensional phenomenon. The lack of adequate observations aloft to routinely characterize the nature of ai...

  15. Status of NGS CORS Network and Its Contribution to the GGOS Infrastructure

    NASA Astrophysics Data System (ADS)

    Choi, K. K.; Haw, D.; Sun, L.

    2017-12-01

    Recent advancement of Satellite Geodesy techniques can now contribute to the global frame realization needed to improve worldwide accuracies. These techniques rely on coordinates computed using continuously observed GPS data and corresponding satellite orbits. The GPS-based reference system continues to depend on the physical stability of a ground-based network of points as the primary foundation for these observations. NOAA's National Geodetic Survey (NGS) has been operating Continuously Operating Reference Stations (CORS) to provide direct access to the National Spatial Reference System (NSRS). By virtue of NGS' scientific reputation and leadership in national and international geospatial issues, NGS has determined to increase its participation in the maintenance of the U.S. component of the global GPS tracking network in order to realize a long-term stable national terrestrial reference frame. NGS can do so by leveraging its national leadership role coupled with NGS' scientific expertise, in designating and upgrading a subset of the current tracking network for this purpose. This subset of stations must have the highest operational standards to serve the dual functions: being the U.S. contribution to the international frame, along with providing the link to the national datum. These stations deserve special attention to ensure that the highest possible levels of quality and stability are maintained. To meet this need, NGS is working with the international scientific groups to add and designate these reference stations based on scientific merit such as: colocation with other geodetic techniques, geographic area, and monumentation stability.

  16. Ultra-low power wireless sensing for long-term structural health monitoring

    NASA Astrophysics Data System (ADS)

    Bilbao, Argenis; Hoover, Davis; Rice, Jennifer; Chapman, Jamie

    2011-04-01

    Researchers have made significant progress in recent years towards realizing long-term structural health monitoring (SHM) utilizing wireless smart sensor networks (WSSNs). These efforts have focused on improving the performance and robustness of such networks to achieve high quality data acquisition and in-network processing. One of the primary challenges still facing the use of smart sensors for long-term monitoring deployments is their limited power resources. Periodically accessing the sensor nodes to change batteries is not feasible or economical in many deployment cases. While energy harvesting techniques show promise for prolonging unattended network life, low-power design and operation are still critically important. This research presents a new, fully integrated ultra-low power wireless smart sensor node and a flexible base station, both designed for long-term SHM applications. The power consumption of the sensor nodes and base station has been minimized through careful hardware selection and the implementation of power-aware network software, without sacrificing flexibility and functionality.

  17. Implementation of Cyber-Physical Production Systems for Quality Prediction and Operation Control in Metal Casting.

    PubMed

    Lee, JuneHyuck; Noh, Sang Do; Kim, Hyun-Jung; Kang, Yong-Shin

    2018-05-04

    The prediction of internal defects of metal castings immediately after the casting process saves unnecessary time and money by reducing the amount of input into the next stage, such as the machining process, and enables flexible scheduling. Cyber-physical production systems (CPPS) perfectly fulfill the aforementioned requirements. This study deals with the implementation of CPPS in a real factory for metal-casting quality prediction and operation control. First, a CPPS architecture framework for quality prediction and operation control in metal-casting production was designed. The framework describes collaboration among the internet of things (IoT), artificial intelligence, simulations, manufacturing execution systems, and advanced planning and scheduling systems. Subsequently, the implementation of the CPPS in actual plants is described. Temperature is a major factor that affects casting quality, and thus temperature sensors and IoT communication devices were attached to the casting machines. The well-known NoSQL database HBase and the high-speed processing/analysis tool Spark are used for the IoT repository and data pre-processing, respectively. Several machine learning algorithms, such as decision tree, random forest, artificial neural network, and support vector machine, were used for quality prediction and compared using the R software. Finally, the operation of the entire system is demonstrated through a CPPS dashboard. In an era in which most CPPS-related studies are conducted on high-level abstract models, this study describes more specific architectural frameworks, use cases, usable software, and analytical methodologies. In addition, this study verifies the usefulness of CPPS by estimating quantitative effects. This is expected to contribute to the proliferation of CPPS in industry.
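
    The model-comparison step can be sketched generically: train each of the four named classifier families on process features and compare cross-validated accuracy. The sketch below (in Python with scikit-learn rather than the R software the study used) uses entirely hypothetical temperature features and a toy defect label:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score
      from sklearn.neural_network import MLPClassifier
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.tree import DecisionTreeClassifier

      # Hypothetical zone temperatures and a toy pass/fail casting label.
      rng = np.random.default_rng(2)
      X = rng.normal(loc=1450.0, scale=25.0, size=(500, 6))
      y = (X[:, :3].mean(axis=1) > 1455.0).astype(int)
      X = StandardScaler().fit_transform(X)  # scale for the ANN and SVM

      for name, model in [("decision tree", DecisionTreeClassifier(random_state=0)),
                          ("random forest", RandomForestClassifier(random_state=0)),
                          ("neural network", MLPClassifier(max_iter=2000, random_state=0)),
                          ("SVM", SVC())]:
          print(name, cross_val_score(model, X, y, cv=5).mean())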

  18. Corelli: a peer-to-peer dynamic replication service for supporting latency-dependent content in community networks

    NASA Astrophysics Data System (ADS)

    Tyson, Gareth; Mauthe, Andreas U.; Kaune, Sebastian; Mu, Mu; Plagemann, Thomas

    2009-01-01

    The quality of service for latency dependent content, such as video streaming, largely depends on the distance and available bandwidth between the consumer and the content. Poor provision of these qualities results in reduced user experience and increased overhead. To alleviate this, many systems operate caching and replication, utilising dedicated resources to move the content closer to the consumer. Latency-dependent content creates particular issues for community networks, which often display the property of strong internal connectivity yet poor external connectivity. However, unlike traditional networks, communities often cannot deploy dedicated infrastructure for both monetary and practical reasons. To address these issues, this paper proposes Corelli, a peer-to-peer replication infrastructure designed for use in community networks. In Corelli, high capacity peers in communities autonomously build a distributed cache to dynamically pre-fetch content early on in its popularity lifecycle. By exploiting the natural proximity of peers in the community, users can gain extremely low latency access to content whilst reducing egress utilisation. Through simulation, it is shown that Corelli considerably increases accessibility and improves performance for latency dependent content. Further, Corelli is shown to offer adaptive and resilient mechanisms that ensure that it can respond to variations in churn, demand and popularity.

  19. Reliable Geographical Forwarding in Cognitive Radio Sensor Networks Using Virtual Clusters

    PubMed Central

    Zubair, Suleiman; Fisal, Norsheila

    2014-01-01

    The need for implementing reliable data transfer in resource-constrained cognitive radio ad hoc networks is still an open issue in the research community. Although geographical forwarding schemes are characterized by their low overhead and efficiency in reliable data transfer in traditional wireless sensor networks, this potential has yet to be utilized for viable routing options in resource-constrained cognitive radio ad hoc networks in the presence of lossy links. In this paper, a novel geographical forwarding technique that does not restrict the choice of the next hop to the nodes in the selected route is presented. This is achieved by the creation of virtual clusters based on spectrum correlation, from which the next-hop choice is made based on link quality. The design maximizes the use of idle listening and receiver contention prioritization for energy efficiency, avoidance of routing hot spots, and stability. The validation result, which closely follows the simulation result, shows that the developed scheme makes more advancement toward the sink than the usual route-selection decisions of the relevant ad hoc on-demand distance vector operations, while ensuring channel quality. Further simulation results show the enhanced reliability, lower latency, and energy efficiency of the presented scheme. PMID:24854362
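
    The core forwarding rule, as described, combines geographic progress with link quality within a spectrum-correlated virtual cluster. A minimal sketch of that decision (all structures, fields, and values are hypothetical; the paper's scheme includes receiver contention and energy handling not shown here):

      import math

      def next_hop(node, sink, neighbors):
          """Pick the best-link neighbor in the same virtual cluster
          that makes geographic progress toward the sink."""
          my_dist = math.dist(node["pos"], sink)
          candidates = [n for n in neighbors
                        if n["cluster"] == node["cluster"]
                        and math.dist(n["pos"], sink) < my_dist]
          return max(candidates, key=lambda n: n["link_quality"], default=None)

      node = {"pos": (0, 0), "cluster": 7}
      sink = (100, 0)
      nbrs = [{"pos": (20, 5), "cluster": 7, "link_quality": 0.6},
              {"pos": (15, -3), "cluster": 7, "link_quality": 0.9},
              {"pos": (30, 0), "cluster": 2, "link_quality": 0.95}]
      print(next_hop(node, sink, nbrs))  # picks the 0.9-quality link in cluster 7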

  20. On the Optimization of a Probabilistic Data Aggregation Framework for Energy Efficiency in Wireless Sensor Networks.

    PubMed

    Kafetzoglou, Stella; Aristomenopoulos, Giorgos; Papavassiliou, Symeon

    2015-08-11

    Among the key aspects of the Internet of Things (IoT) is the integration of heterogeneous sensors in a distributed system that performs actions on the physical world based on environmental information gathered by sensors and on application-related constraints and requirements. Numerous applications of Wireless Sensor Networks (WSNs) have appeared in various fields, from environmental monitoring, to tactical fields, and healthcare at home, promising to change our quality of life and facilitating the vision of sensor-network-enabled smart cities. Given the enormous requirements that emerge in such a setting, both in terms of data and energy, data aggregation appears to be a key element in reducing the amount of traffic in wireless sensor networks and achieving energy conservation. Probabilistic frameworks have been introduced as operationally efficient and performance-effective solutions for data aggregation in distributed sensor networks. In this work, we introduce an overall optimization approach that improves and complements such frameworks by identifying the optimal probability for a node to aggregate packets, as well as the optimal aggregation period that a node should wait before performing aggregation, so as to minimize the overall energy consumption while satisfying certain imposed delay constraints. Primal-dual decomposition is employed to solve the corresponding optimization problem, while simulation results demonstrate the operational efficiency of the proposed approach under different traffic and topology scenarios.
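
    In generic form, an optimization of the kind described above can be written as follows (notation mine, not the paper's; p is the per-node aggregation probability, tau the aggregation period, E the energy cost, D the induced delay):

      \min_{p,\,\tau}\; E(p,\tau)
      \quad \text{s.t.} \quad D(p,\tau) \le D_{\max},
      \qquad 0 \le p \le 1,\; \tau \ge 0.

    A standard primal-dual treatment relaxes the delay constraint with a multiplier \lambda and alternates primal minimization with a projected dual ascent step of size \alpha:

      L(p,\tau,\lambda) = E(p,\tau) + \lambda \big( D(p,\tau) - D_{\max} \big),
      \qquad
      \lambda^{(k+1)} = \Big[ \lambda^{(k)} + \alpha \big( D(p^{(k)},\tau^{(k)}) - D_{\max} \big) \Big]_{+}.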

  1. Faulty node detection in wireless sensor networks using a recurrent neural network

    NASA Astrophysics Data System (ADS)

    Atiga, Jamila; Mbarki, Nour Elhouda; Ejbali, Ridha; Zaied, Mourad

    2018-04-01

    Wireless sensor networks (WSNs) consist of sets of sensors that are increasingly used in large-scale surveillance applications in different areas: military, environment, health, etc. Despite their miniaturization and reduced manufacturing costs, these sensors often operate in places that are difficult to access, without the possibility of recharging the battery, and they generally have limited resources in terms of transmission power, processing capacity, data storage, and energy. The sensors may be used in hostile environments, such as, for example, on a battlefield, or in the presence of fires, floods, or earthquakes. In these environments the sensors can fail, even under normal operation. It is therefore necessary to develop fault-tolerant algorithms and node-fault detection for wireless sensor networks, since undetected sensor faults can reduce the quality of the surveillance. The values measured by the sensors are used to estimate the state of the monitored area. We used the Nonlinear AutoRegressive with eXogenous inputs (NARX) recurrent neural network architecture to predict the state of a sensor node from previous values described as time series. Experimental results verified that state prediction is enhanced by the proposed model.
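
    A NARX model predicts the next reading from lagged readings (the autoregressive part) and lagged exogenous inputs. The sketch below builds the tapped-delay features and, purely to stay short, fits a linear model in place of the recurrent neural network the paper uses; large one-step residuals on a node would flag a possible fault:

      import numpy as np
      from sklearn.linear_model import Ridge

      def narx_features(y, u, ny=3, nu=3):
          """Rows of [y(t-1..t-ny), u(t-1..t-nu)] for predicting y(t)."""
          start = max(ny, nu)
          rows = [np.concatenate([y[t - ny:t][::-1], u[t - nu:t][::-1]])
                  for t in range(start, len(y))]
          return np.array(rows), y[start:]

      # Hypothetical sensor series driven by an exogenous input with lag 2.
      t = np.arange(1000)
      u = np.sin(0.05 * t)
      y = 0.7 * np.roll(u, 2) + 0.1 * np.random.default_rng(3).normal(size=t.size)

      X, target = narx_features(y, u)
      model = Ridge(alpha=1e-3).fit(X[:800], target[:800])
      resid = target[800:] - model.predict(X[800:])
      print(resid.std())  # abnormally large residuals suggest a faulty node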

  2. Architecture of a Biomedical Informatics Research Data Management Pipeline.

    PubMed

    Bauer, Christian R; Umbach, Nadine; Baum, Benjamin; Buckow, Karoline; Franke, Thomas; Grütz, Romanus; Gusky, Linda; Nussbeck, Sara Yasemin; Quade, Matthias; Rey, Sabine; Rottmann, Thorsten; Rienhoff, Otto; Sax, Ulrich

    2016-01-01

    In University Medical Centers, heterogeneous data are generated that cannot always be clearly attributed to patient care or biomedical research. Each data set has to adhere to distinct intrinsic and operational quality standards. An infrastructure can be sustainable, however, only if high-quality data, tools to work with the data, and, most importantly, guidelines and rules for how to work with the data are addressed adequately. Here, we present the IT Research Architecture of the University Medical Center Göttingen and describe our ten years' experience and lessons learned with infrastructures in networked medical research.

  3. Improving the energy efficiency of telecommunication networks

    NASA Astrophysics Data System (ADS)

    Lange, Christoph; Gladisch, Andreas

    2011-05-01

    The energy consumption of telecommunication networks has gained increasing interest throughout the recent past: besides its environmental implications, it has been identified as a major contributor to the operational expenditures of network operators. In targeting sustainable telecommunication networks, it is therefore important to find appropriate strategies for improving their energy efficiency against the background of rapidly increasing traffic volumes. Besides the obvious benefit of increasing the energy efficiency of network elements by leveraging technology progress, load-adaptive network operation is a very promising option, i.e., using network resources only to the extent and for the time they are actually needed. In contrast, current network operation takes almost no advantage of the strongly time-variant behaviour of the network traffic load. Mechanisms for energy-aware load-adaptive network operation can be subdivided into techniques based on local autonomous or per-link decisions and techniques relying on coordinated decisions incorporating information from several links. For the transformation from current network structures and operation paradigms towards energy-efficient and sustainable networks, it will be essential to use energy-optimized network elements, to include the overall energy consumption in the network design and planning phases, and to operate the network in an energy-aware, load-adaptive manner. In load-adaptive operation it will be important to establish the optimum balance between local and overarching power management concepts in telecommunication networks.

  4. Optical Mass Displacement Tracking: A simplified field calibration method for the electro-mechanical seismometer.

    NASA Astrophysics Data System (ADS)

    Burk, D. R.; Mackey, K. G.; Hartse, H. E.

    2016-12-01

    We have developed a simplified field calibration method for use in seismic networks that still employ the classical electro-mechanical seismometer. Smaller networks may not always have the financial capability to purchase and operate modern, state of the art equipment. Therefore these networks generally operate a modern, low-cost digitizer that is paired to an existing electro-mechanical seismometer. These systems are typically poorly calibrated. Calibration of the station is difficult to estimate because coil loading, digitizer input impedance, and amplifier gain differences vary by station and digitizer model. Therefore, it is necessary to calibrate the station channel as a complete system to take into account all components from instrument, to amplifier, to even the digitizer. Routine calibrations at the smaller networks are not always consistent, because existing calibration techniques require either specialized equipment or significant technical expertise. To improve station data quality at the small network, we developed a calibration method that utilizes open source software and a commonly available laser position sensor. Using a signal generator and a small excitation coil, we force the mass of the instrument to oscillate at various frequencies across its operating range. We then compare the channel voltage output to the laser-measured mass displacement to determine the instrument voltage sensitivity at each frequency point. Using the standard equations of forced motion, a representation of the calibration curve as a function of voltage per unit of ground velocity is calculated. A computer algorithm optimizes the curve and then translates the instrument response into a Seismic Analysis Code (SAC) poles & zeros format. Results have been demonstrated to fall within a few percent of a standard laboratory calibration. This method is an effective and affordable option for networks that employ electro-mechanical seismometers, and it is currently being deployed in regional networks throughout Russia and in Central Asia.
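
    The per-frequency sensitivity computation described above reduces to comparing the recorded voltage amplitude with the laser-measured mass displacement: for sinusoidal motion at frequency f with displacement amplitude X, the mass velocity amplitude is 2*pi*f*X, so the velocity sensitivity is V / (2*pi*f*X). A minimal sketch with hypothetical measurements (converting mass motion to ground motion via the forced-motion transfer function, as the method does, is omitted here):

      import numpy as np

      # Hypothetical drive frequencies, recorded voltage amplitudes, and
      # laser-measured mass displacement amplitudes.
      freqs_hz = np.array([0.2, 0.5, 1.0, 2.0, 5.0, 10.0])
      v_out_volts = np.array([0.02, 0.31, 1.10, 2.45, 2.60, 2.58])
      x_mass_m = np.array([1.2e-4, 1.5e-4, 1.3e-4, 1.4e-4, 6.0e-5, 3.0e-5])

      velocity_amp = 2.0 * np.pi * freqs_hz * x_mass_m  # mass velocity, m/s
      sensitivity = v_out_volts / velocity_amp          # V per (m/s)
      for f, s in zip(freqs_hz, sensitivity):
          print(f"{f:5.1f} Hz : {s:8.1f} V/(m/s)")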

  5. A New Quality Control Method based on IRMCD for Wind Profiler Observation towards Future Assimilation Application

    NASA Astrophysics Data System (ADS)

    Chen, Min; Zhang, Yu

    2017-04-01

    By July 2015, a wind profiler network with a total of 65 profiling radars was operated by the MOC/CMA in China. In this study, a quality control procedure is constructed to incorporate the profiler data from the wind-profiling network into the local data assimilation and forecasting system (BJRUC). The procedure applies a blacklisting check that removes stations with gross errors and an outlier check that rejects data with large deviations from the background. Instead of the bi-weighting method, which has been commonly implemented in outlier elimination for one-dimensional scalar observations, an outlier elimination method is developed based on the iterated reweighted minimum covariance determinant (IRMCD) for multivariate observations such as wind profiler data. A quality control experiment was performed separately for subsets of profiler data tagged with and without rain flags at every 00 UTC/12 UTC from 20 June to 30 September 2015. From the results, we find that with the quality control, the frequency distributions of the differences between the observations and the model background become more Gaussian-like and meet the requirements of a Gaussian distribution for data assimilation. Further intensive assessment of each quality control step reveals that the stations rejected by blacklisting contain poor data quality, and the IRMCD rejects outliers in a robust and physically reasonable manner.
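
    The MCD idea behind IRMCD is to estimate a robust multivariate location and scatter, then reject observations whose robust Mahalanobis distance exceeds a chi-square cutoff, iterating until the kept set stabilizes. A minimal sketch in that spirit (cutoff, iteration rule, and data are illustrative, not the paper's exact scheme):

      import numpy as np
      from scipy.stats import chi2
      from sklearn.covariance import MinCovDet

      def irmcd_like_filter(X, alpha=0.975, max_iter=5):
          """Boolean mask of observations kept after iterated MCD screening."""
          keep = np.ones(len(X), dtype=bool)
          cutoff = chi2.ppf(alpha, df=X.shape[1])
          for _ in range(max_iter):
              mcd = MinCovDet(random_state=0).fit(X[keep])
              d2 = mcd.mahalanobis(X)  # squared robust distances
              new_keep = d2 <= cutoff
              if np.array_equal(new_keep, keep):
                  break
              keep = new_keep
          return keep

      # Hypothetical (u, v) wind innovations with a few gross errors:
      rng = np.random.default_rng(4)
      X = rng.normal(0, 1.5, size=(500, 2))
      X[:10] += 15.0                          # contaminated observations
      print((~irmcd_like_filter(X))[:12])     # the gross errors are flagged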

  6. Water quality monitoring protocol for wadeable streams and rivers in the Northern Great Plains Network

    USGS Publications Warehouse

    Wilson, Marcia H.; Rowe, Barbara L.; Gitzen, Robert A.; Wilson, Stephen K.; Paintner-Green, Kara J.

    2014-01-01

    As recommended by Oakley et al. (2003), this protocol provides a narrative and the rationale for selection of streams and rivers within the NGPN that will be measured for water quality, including dissolved oxygen, pH, specific conductivity, and temperature. Standard operating procedures (SOPs) that detail the steps to collect, manage, and disseminate the NGPN water quality data are in an accompanying document. The sampling design documented in this protocol may be updated as monitoring information is collected and interpreted, and as refinement of methodologies develop through time. In addition, evaluation of data and refinement of the program may necessitate potential changes of program objectives. Changes to the NGPN water quality protocols and SOPs will be carefully documented in a revision history log.

  7. Real-time visual communication to aid disaster recovery in a multi-segment hybrid wireless networking system

    NASA Astrophysics Data System (ADS)

    Al Hadhrami, Tawfik; Wang, Qi; Grecos, Christos

    2012-06-01

    When natural disasters or other large-scale incidents occur, obtaining accurate and timely information on the developing situation is vital to effective disaster recovery operations. High-quality video streams and high-resolution images, if available in real time, would provide an invaluable source of current situation reports to the incident management team. Meanwhile, a disaster often causes significant damage to the communications infrastructure. Therefore, another essential requirement for disaster management is the ability to rapidly deploy a flexible incident area communication network. Such a network would facilitate the transmission of real-time video streams and still images from the disrupted area to remote command and control locations. In this paper, a comprehensive end-to-end video/image transmission system between an incident area and a remote control centre is proposed and implemented, and its performance is experimentally investigated. In this study a hybrid multi-segment communication network is designed that seamlessly integrates terrestrial wireless mesh networks (WMNs), distributed wireless visual sensor networks, an airborne platform with video camera balloons, and a Digital Video Broadcasting- Satellite (DVB-S) system. By carefully integrating all of these rapidly deployable, interworking and collaborative networking technologies, we can fully exploit the joint benefits provided by WMNs, WSNs, balloon camera networks and DVB-S for real-time video streaming and image delivery in emergency situations among the disaster hit area, the remote control centre and the rescue teams in the field. The whole proposed system is implemented in a proven simulator. Through extensive simulations, the real-time visual communication performance of this integrated system has been numerically evaluated, towards a more in-depth understanding in supporting high-quality visual communications in such a demanding context.

  8. In-service communication channel sensing based on reflectometry for TWDM-PON systems

    NASA Astrophysics Data System (ADS)

    Iida, Daisuke; Kuwano, Shigeru; Terada, Jun

    2014-05-01

    Many base stations are accommodated in TWDM-PON-based mobile backhaul and fronthaul networks for future radio access, and failed connections in an optical network unit (ONU) wavelength channel severely degrade system performance. A cost-effective in-service ONU wavelength channel monitor is essential to ensure proper system operation without failed connections. To address this issue we propose a reflectometry-based remote sensing method that provides wavelength channel information together with the optical line terminal (OLT)-to-ONU distance. The method realizes real-time monitoring of ONU wavelength channels without signal quality degradation. Experimental results show that it achieves wavelength channel distinction with high distance resolution.
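
    The distance side of such a reflectometry measurement reduces to converting a round-trip echo delay into a fiber length. A minimal sketch of that conversion follows; the group index is an assumed typical value for standard single-mode fiber, and the delay figure is purely illustrative.

        # Convert a reflection's round-trip delay into OLT-to-ONU distance.
        C = 299_792_458.0      # speed of light in vacuum, m/s
        N_GROUP = 1.468        # assumed group index of single-mode fiber

        def reflection_distance_m(round_trip_s: float) -> float:
            """Distance to the reflection point, halving the round trip."""
            return C * round_trip_s / (2.0 * N_GROUP)

        # A 98.0 microsecond echo corresponds to roughly 10 km of fiber.
        print(f"{reflection_distance_m(98.0e-6) / 1000:.2f} km")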

  9. [The Academy of Trauma Surgery (AUC). Service provider and management organization of the DGU].

    PubMed

    Sturm, J A; Hoffmann, R

    2016-02-01

    At the beginning of this century the German Trauma Society (DGU) became extensively active with an initiative on quality promotion, development of quality assurance and transparency regarding treatment of the severely injured. A white book on "Medical care of the severely injured" was published, focusing on the requirements on structural quality and especially procedural quality. The impact of the white book was immense and a trauma network with approved trauma centers, structured and graded for their individual trauma care performance, was developed. In order to monitor and document the required quality of care, a registry was needed. Furthermore, for cooperation within the trauma networks innovative methods for digital transfer of radiological images and patient documents became necessary. Finally, the auditing criteria for trauma centers had and still have to be completed with advanced medical education and training programs. In order to realize the implementation of such a broad spectrum of economically relevant and increasingly complex activities the Academy of Trauma Surgery (AUC) was established as a subsidiary of the DGU in 2004. The AUC currently has four divisions: 1) networks and health care structures, 2) registries and research management, 3) telemedicine, 4) medical education and training, all of which serve the goal of the initiative. The AUC is a full service provider and management organization in compliance with the statutes of the DGU. According to these statutes the business operations of the AUC also cover projects for numerous groups of patients, projects for the joint society the German Society for Orthopedics and Trauma (DGOU) as well as other medical institutions. This article describes the success stories of the trauma network (TraumaNetzwerk DGU®), the TraumaRegister DGU®, the telecooperation platform TKmed®, the new and fast-growing orthogeriatric center initiative (AltersTraumaZentrum DGU®) and the division of medical education and training, e.g. advanced trauma life support (ATLS®) and other training programs including the innovative interpersonal competence (IC) course.

  10. Analysis and Research on the effect of the Operation of Small Hydropower in the Regional Power Grid

    NASA Astrophysics Data System (ADS)

    Ang, Fu; Guangde, Dong; Xiaojun, Zhu; Ruimiao, Wang; Shengyi, Zhu

    2018-03-01

    The analysis of reactive power balance and voltage in a power network affects not only system voltage quality but also the economic operation of the grid. Past reactive power balance and voltage analyses have focused on the problems of reactive power deficits and low system voltage. During the wet season, however, when small hydropower output is high and load is low, the system instead experiences a reactive power surplus and high voltages. This paper analyzes that situation and shows that if the capability of small hydropower units to operate in leading-phase (reactive-absorbing) mode is considered, the voltage problem at key points on the high-voltage side of the system can be effectively mitigated.

  11. A Community Network of 100 Black Carbon Sensors

    NASA Astrophysics Data System (ADS)

    Preble, C.; Kirchstetter, T.; Caubel, J.; Cados, T.; Keeling, C.; Chang, S.

    2017-12-01

    We developed a low-cost black carbon sensor, field tested its performance, and then built and deployed a network of 100 sensors in West Oakland, California. We operated the network for 100 days beginning in mid-May 2017 to measure spatially resolved black carbon concentrations throughout the community. West Oakland is a San Francisco Bay Area mixed residential and industrial community that is adjacent to regional port and rail yard facilities and surrounded by major freeways. As such, the community is affected by diesel particulate matter emissions from heavy-duty diesel trucks, locomotives, and ships associated with freight movement. In partnership with the Environmental Defense Fund, the Bay Area Air Quality Management District (BAAQMD), and the West Oakland Environmental Indicators Project, we deployed the black carbon monitoring network outside of residences and businesses, along truck routes and arterial streets, and at upwind locations. The sensor employs the filter-based light transmission method to measure black carbon and shows good precision and correspondence with current commercial black carbon instruments. Throughout the 100-day period, each of the 100 sensors transmitted data via a cellular network. A MySQL database was built to receive and manage the data in real time. The database included diagnostic features to monitor each sensor's operational status and facilitate the maintenance of the network. Spatial and temporal patterns in black carbon concentrations will be presented, including patterns around industrial facilities, freeways, and truck routes, as well as the relationship between neighborhood concentrations and the BAAQMD's monitoring site. Lessons learned during this first-of-its-kind black carbon monitoring network will also be shared.
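
    The filter-based light-transmission principle the sensor uses can be sketched as follows: black carbon loading is inferred from the growth of optical attenuation on the filter spot between successive readings. This is a hedged illustration of the general method, not the authors' firmware; the spot area, flow rate and mass attenuation cross-section below are all assumed values.

        # Infer black carbon concentration from the attenuation increment
        # between two successive filter-transmission readings.
        import numpy as np

        SPOT_AREA_M2 = 5.0e-5        # filter spot area, ~0.5 cm2 (assumed)
        FLOW_M3_S = 5.0e-3 / 60.0    # sample flow, 5 L/min (assumed)
        SIGMA_ATN_M2_G = 12.5        # mass attenuation cross-section (assumed)

        def black_carbon_g_m3(i_prev, i_now, dt_s):
            """BC concentration from two transmission readings dt_s apart."""
            d_atn = np.log(i_prev / i_now)   # attenuation increment
            return SPOT_AREA_M2 * d_atn / (SIGMA_ATN_M2_G * FLOW_M3_S * dt_s)

        # A 0.1% transmission drop over 5 minutes, reported in ug/m3:
        bc = black_carbon_g_m3(1.000, 0.999, 300.0)
        print(f"{bc * 1e6:.2f} ug/m3")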

  12. Groundwater quality data from the National Water-Quality Assessment Project, May 2012 through December 2013

    USGS Publications Warehouse

    Arnold, Terri L.; Desimone, Leslie A.; Bexfield, Laura M.; Lindsey, Bruce D.; Barlow, Jeannie R.; Kulongoski, Justin T.; Musgrove, MaryLynn; Kingsbury, James A.; Belitz, Kenneth

    2016-06-20

    Groundwater-quality data were collected from 748 wells as part of the National Water-Quality Assessment Project of the U.S. Geological Survey National Water-Quality Program from May 2012 through December 2013. The data were collected from four types of well networks: principal aquifer study networks, which assess the quality of groundwater used for public water supply; land-use study networks, which assess land-use effects on shallow groundwater quality; major aquifer study networks, which assess the quality of groundwater used for domestic supply; and enhanced trends networks, which evaluate the time scales during which groundwater quality changes. Groundwater samples were analyzed for a large number of water-quality indicators and constituents, including major ions, nutrients, trace elements, volatile organic compounds, pesticides, and radionuclides. These groundwater quality data are tabulated in this report. Quality-control samples also were collected; data from blank and replicate quality-control samples are included in this report.

  13. Adaptation of Collaborative Applications for Network Quality Variation

    DTIC Science & Technology

    2004-06-01

    collaborative application. 1.1 Quality of Service: Quality of Service (QoS) is generally regarded as an end-to-end network application ... using the Cloud WAN Emulator [14]. We used qtcp to measure end-to-end network service quality between the signal sender and the signal receiver. The ... application must be aware of current resource ... (Footnote: Qtcp measures end-to-end network integrity and service quality for QoS verification. Qtcp sends a ...)

  14. Socially Aware Heterogeneous Wireless Networks

    PubMed Central

    Kosmides, Pavlos; Adamopoulou, Evgenia; Demestichas, Konstantinos; Theologou, Michael; Anagnostou, Miltiades; Rouskas, Angelos

    2015-01-01

    The development of smart cities has been the epicentre of many researchers' efforts during the past decade. One of the key requirements for smart city networks is mobility, and this is the reason stable, reliable and high-quality wireless communications are needed in order to connect people and devices. Most research efforts so far have used different kinds of wireless and sensor networks, making interoperability rather difficult to accomplish in smart cities. One common solution proposed in the recent literature is the use of software defined networks (SDNs), in order to enhance interoperability among the various heterogeneous wireless networks. In addition, SDNs can take advantage of the data retrieved from available sensors and use them in the intelligent decision-making process conducted during the resource allocation procedure. In this paper, we propose an architecture combining heterogeneous wireless networks with social networks using SDNs. Specifically, we exploit the information retrieved from location-based social networks regarding users' locations, and we attempt to predict areas that will be crowded by using specially designed machine-learning techniques. By recognizing possibly crowded areas, we can provide mobile operators with recommendations about areas requiring datacell activation or deactivation. PMID:26110402

  15. Optimizing Cluster Heads for Energy Efficiency in Large-Scale Heterogeneous Wireless Sensor Networks

    DOE PAGES

    Gu, Yi; Wu, Qishi; Rao, Nageswara S. V.

    2010-01-01

    Many complex sensor network applications require deploying a large number of inexpensive and small sensors in a vast geographical region to achieve quality through quantity. Hierarchical clustering is generally considered an efficient and scalable way to facilitate the management and operation of such large-scale networks and minimize the total energy consumption for prolonged lifetime. Judicious selection of cluster heads for data integration and communication is critical to the success of applications based on hierarchical sensor networks organized as layered clusters. We investigate the problem of selecting sensor nodes in a predeployed sensor network to be the cluster heads to minimize the total energy needed for data gathering. We rigorously derive an analytical formula to optimize the number of cluster heads in sensor networks under uniform node distribution, and propose a Distance-based Crowdedness Clustering algorithm to determine the cluster heads in sensor networks under general node distribution. The results from an extensive set of experiments on a large number of simulated sensor networks illustrate the performance superiority of the proposed solution over clustering schemes based on the k-means algorithm.
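
    For concreteness, the k-means-based comparison scheme the abstract mentions can be sketched as below: cluster the node positions, then pick as head the node nearest each centroid. This is an illustration of the baseline only, not the authors' Distance-based Crowdedness Clustering algorithm; the field size and node count are arbitrary.

        # k-means baseline for cluster-head selection in a sensor field.
        import numpy as np
        from sklearn.cluster import KMeans

        def kmeans_cluster_heads(positions, k, seed=0):
            """Cluster node positions; return labels and head node indices."""
            km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(positions)
            heads = [int(np.argmin(np.linalg.norm(positions - c, axis=1)))
                     for c in km.cluster_centers_]   # node nearest each centroid
            return km.labels_, heads

        rng = np.random.default_rng(0)
        nodes = rng.uniform(0.0, 100.0, size=(200, 2))   # uniform deployment
        labels, heads = kmeans_cluster_heads(nodes, k=8)
        print("cluster heads at node indices:", heads)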

  16. Experimental demonstration of building and operating QoS-aware survivable vSD-EONs with transparent resiliency.

    PubMed

    Yin, Jie; Guo, Jiannan; Kong, Bingxin; Yin, Heqing; Zhu, Zuqing

    2017-06-26

    Software-defined elastic optical networks (SD-EONs) provide operators more flexibility to customize their optical infrastructures dynamically. By leveraging infrastructure-as-a-service (IaaS), virtual SD-EONs (vSD-EONs) can be realized to further enhance the adaptivity of SD-EONs and shorten the time-to-market of new services. In this paper, we design and demonstrate the building and operation of quality-of-service (QoS) aware survivable vSD-EONs that are equipped with transparent data plane (DP) resiliency. Specifically, when slicing a vSD-EON, our network hypervisor (NHV) chooses to use "1:1" virtual link (VL) protection or on-demand VL remapping as the DP restoration scheme, according to the service-level agreement (SLA) between the vSD-EON's operator and the infrastructure provider (InP). Then, during an actual substrate link (SL) failure, the NHV realizes automatic DP restoration that is transparent to the controllers of the vSD-EONs. We build a network testbed to demonstrate the creation of QoS-aware survivable vSD-EONs, the activation of lightpaths in the vSD-EONs to support upper-layer applications, and automatic and simultaneous QoS-aware DP restorations during an SL failure. The experimental results indicate that our vSD-EON slicing system can build QoS-aware survivable vSD-EONs on demand, operate them to set up lightpaths carrying real application traffic, and facilitate differentiated DP restorations during SL failures to recover the vSD-EONs' services according to their SLAs.

  17. Comparison of Data Quality of NOAA's ISIS and SURFRAD Networks to NREL's SRRL-BMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderberg, M.; Sengupta, M.

    2014-11-01

    This report provides analyses of broadband solar radiometric data quality for the National Oceanic and Atmospheric Administration's Integrated Surface Irradiance Study and Surface Radiation Budget Network (SURFRAD) solar measurement networks. The data quality of these networks is compared to that of the National Renewable Energy Laboratory's Solar Radiation Research Laboratory Baseline Measurement System (SRRL-BMS), using both native data resolutions and hourly averages of the data from 2002 through 2013. This report describes the solar radiometric data quality testing and flagging procedures and the method used to determine and tabulate data quality statistics. Monthly data quality statistics for each network were plotted by year against the statistics for the SRRL-BMS. Some of the plots are presented in the body of the report, but most are in the appendix. These plots indicate that the overall solar radiometric data quality of the SURFRAD network is superior to that of the Integrated Surface Irradiance Study network and can be comparable to that of the SRRL-BMS.

  18. Perceptual tools for quality-aware video networks

    NASA Astrophysics Data System (ADS)

    Bovik, A. C.

    2014-01-01

    Monitoring and controlling the quality of the viewing experience of videos transmitted over increasingly congested networks (especially wireless networks) is a pressing problem, owing to rapid advances in video-centric mobile communication and display devices that are straining the capacity of the network infrastructure. New developments in automatic perceptual video quality models offer tools that have the potential to be used to perceptually optimize wireless video, leading to more efficient video data delivery and better received quality. In this talk I will review key perceptual principles that are, or could be, used to create effective video quality prediction models, along with leading quality prediction models that utilize these principles. The goal is to be able to monitor and perceptually optimize video networks by making them "quality-aware."

  19. GCOS reference upper air network (GRUAN): Steps towards assuring future climate records

    NASA Astrophysics Data System (ADS)

    Thorne, P. W.; Vömel, H.; Bodeker, G.; Sommer, M.; Apituley, A.; Berger, F.; Bojinski, S.; Braathen, G.; Calpini, B.; Demoz, B.; Diamond, H. J.; Dykema, J.; Fassò, A.; Fujiwara, M.; Gardiner, T.; Hurst, D.; Leblanc, T.; Madonna, F.; Merlone, A.; Mikalsen, A.; Miller, C. D.; Reale, T.; Rannat, K.; Richter, C.; Seidel, D. J.; Shiotani, M.; Sisterson, D.; Tan, D. G. H.; Vose, R. S.; Voyles, J.; Wang, J.; Whiteman, D. N.; Williams, S.

    2013-09-01

    The observational climate record is a cornerstone of our scientific understanding of climate changes and their potential causes. Existing observing networks have been designed largely in support of operational weather forecasting and continue to be run in this mode. Coverage and timeliness are often higher priorities than absolute traceability and accuracy. Changes in the instrumentation used in the observing system, as well as in operating procedures, are frequent, rarely adequately documented, and their impacts poorly quantified. For monitoring changes in upper-air climate, which is achieved through in-situ soundings and more recently satellites and ground-based remote sensing, the net result has been trend uncertainties as large as, or larger than, the expected emergent signals of climate change. This is more than simply academic: the tropospheric temperature trends issue has been the subject of intense debate, two international assessment reports and several US congressional hearings. For more than a decade the international climate science community has been calling for the instigation of a network of reference-quality measurements to reduce uncertainty in our climate monitoring capabilities. This paper provides a brief history of GRUAN developments to date and outlines future plans. Such reference networks can only be achieved and maintained with strong continuing input from the global metrological community.

  20. Application of neural networks to software quality modeling of a very large telecommunications system.

    PubMed

    Khoshgoftaar, T M; Allen, E B; Hudepohl, J P; Aud, S J

    1997-01-01

    Society relies on telecommunications to such an extent that telecommunications software must have high reliability. Enhanced measurement for early risk assessment of latent defects (EMERALD) is a joint project of Nortel and Bell Canada for improving the reliability of telecommunications software products. This paper reports a case study of neural-network modeling techniques developed for the EMERALD system. The resulting neural network is currently in the prototype testing phase at Nortel. Neural-network models can be used to identify fault-prone modules for extra attention early in development, and thus reduce the risk of operational problems with those modules. We modeled a subset of modules representing over seven million lines of code from a very large telecommunications software system. The set consisted of those modules reused with changes from the previous release. The dependent variable was membership in the class of fault-prone modules. The independent variables were principal components of nine measures of software design attributes. We compared the neural-network model with a nonparametric discriminant model and found the neural-network model had better predictive accuracy.
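
    The modeling pipeline the abstract describes can be sketched with standard tools: principal components of nine software design measures feed a small neural network that predicts membership in the fault-prone class. The synthetic data, component count, layer sizes and training settings below are illustrative assumptions, not the EMERALD models.

        # PCA-plus-neural-network classifier for fault-prone module prediction.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(1000, 9))      # nine design measures per module
        # Synthetic "fault-prone" label loosely driven by a few measures:
        y = (X[:, :3].sum(axis=1) + rng.normal(size=1000)) > 1.5

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = make_pipeline(
            StandardScaler(),
            PCA(n_components=5),            # principal components of measures
            MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
        )
        model.fit(X_tr, y_tr)
        print(f"held-out accuracy: {model.score(X_te, y_te):.3f}")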

  1. European project RETAIN: new approach for IBC in teleradiology and PACS based on full ATM network

    NASA Astrophysics Data System (ADS)

    Cordonnier, Emmanuel; Jensch, Peter F.; Piqueras, Joachim; Gandon, Yves

    1995-05-01

    This paper describes the RETAIN project (radiological examination transfer on ATM Integrated Network), which is supported by the European Community, in the frame of the TEN-IBC program (trans-European networks integrated broad band communication). It links together three European sites in France (Rennes), Spain (Barcelona), and Germany (Oldenburg) and involves a partnership between the public national operators France Telecom, Telefonica, and Telekom. One important reason to explicitly consider asynchronous transfer mode (ATM) for medical imaging is that multimedia applications on such networks allow integration of digital data and person-to-person communication. The RETAIN project includes trials of teleworking sessions between radiologists of Rennes and Barcelona within a clinical and/or scientific context based on ATM equipments performing DICOM transfer on examination, digital remote manipulation within a comprehensive dialogue, and high quality visiophony on ATM adaptation layer (AAL) type 1. The project includes also visiophony trials with Oldenburg and preparation of harmonized regional experimentation within an emergency context. The network used is a full 10 Mbits/s ATM network directly connected to local PACSs.

  2. Reference hydrologic networks I. The status and potential future directions of national reference hydrologic networks for detecting trends

    USGS Publications Warehouse

    Whitfield, Paul H.; Burn, Donald H.; Hannaford, Jamie; Higgins, Hélène; Hodgkins, Glenn A.; Marsh, Terry; Looser, Ulrich

    2012-01-01

    Identifying climate-driven trends in river flows on a global basis is hampered by a lack of long, quality time series data for rivers with relatively undisturbed regimes. This is a global problem compounded by the lack of support for essential long-term monitoring. Experience demonstrates that, with clear strategic objectives, and the support of sponsoring organizations, reference hydrologic networks can constitute an exceptionally valuable data source to effectively identify, quantify and interpret hydrological change—the speed and magnitude of which is expected to a be a primary driver of water management and flood alleviation strategies through the future—and for additional applications. Reference hydrologic networks have been developed in many countries in the past few decades. These collections of streamflow gauging stations, that are maintained and operated with the intention of observing how the hydrology of watersheds responds to variations in climate, are described. The status of networks under development is summarized. We suggest a plan of actions to make more effective use of this collection of networks.

  3. Status report on the USGS component of the Global Seismographic Network

    NASA Astrophysics Data System (ADS)

    Gee, L. S.; Bolton, H. F.; Derr, J.; Ford, D.; Gyure, G.; Hutt, C. R.; Ringler, A.; Storm, T.; Wilson, D.

    2010-12-01

    As recently as four years ago, the average age of a datalogger in the portion of the Global Seismographic Network (GSN) operated by the United States Geological Survey (USGS) was 16 years - an eternity in the lifetime of computers. The selection of the Q330HR in 2006 as the “next generation” datalogger by an Incorporated Research Institutions for Seismology (IRIS) selection committee opened the door for upgrading the GSN. As part of the “next generation” upgrades, the USGS is replacing a single Q680 system with two Q330HRs and a field processor to provide the same capability. The functionality includes digitizing, timing, event detection, conversion into miniSEED records, archival of miniSEED data on the ASP and telemetry of the miniSEED data using International Deployment of Accelerometers (IDA) Authenticated Disk Protocol (IACP). At many sites, Quanterra Balers are also being deployed. The Q330HRs feature very low power consumption (which will increase reliability) and higher resolution than the Q680 systems. Furthermore, this network-wide upgrade provides the opportunity to correct known station problems, standardize the installation of secondary sensors and accelerometers, replace the feedback electronics of STS-1 sensors, and perform checks of absolute system sensitivity and sensor orientation. The USGS upgrades began with ANMO in May, 2008. Although we deployed Q330s at KNTN and WAKE in the fall of 2007 (and in the installation of the Caribbean network), these deployments did not include the final software configuration for the GSN upgrades. Following this start, the USGS installed six additional sites in FY08. With funding from the American Recovery and Reinvestment Act and the USGS GSN program, 14 stations were upgraded in FY09. Twenty-one stations are expected to be upgraded in FY10. These systematic network-wide upgrades will improve the reliability and data quality of the GSN, with the end goal of providing the Earth science community high quality seismic data with global coverage. The Global Seismographic Network is operated as a partnership among the National Science Foundation, IRIS, IDA, and the USGS.

  4. Development and Applications of a New, High-Resolution, Operational MISR Aerosol Product

    NASA Astrophysics Data System (ADS)

    Garay, M. J.; Diner, D. J.; Kalashnikova, O.

    2014-12-01

    Since early 2000, the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite has been providing aerosol optical depth (AOD) and particle property retrievals at 17.6 km spatial resolution. Capitalizing on the capabilities provided by multi-angle viewing, the operational MISR algorithm performs well, with about 75% of MISR AOD retrievals falling within 0.05 or 20% × AOD of the paired validation data from the ground-based Aerosol Robotic Network (AERONET), and is able to distinguish aerosol particles by size and sphericity, over both land and water. These attributes enable a variety of applications, including aerosol transport model validation and global air quality assessment. Motivated by the adverse impacts of aerosols on human health at the local level, and taking advantage of computational speed advances that have occurred since the launch of Terra, we have implemented an operational MISR aerosol product with 4.4 km spatial resolution that maintains, and sometimes improves upon, the quality of the 17.6 km resolution product. We will describe the performance of this product relative to the heritage 17.6 km product, the global AERONET validation network, and high spatial density AERONET-DRAGON sites. Other changes that simplify product content, and make working with the data much easier for users, will also be discussed. Examples of how the new product demonstrates finer spatial variability of aerosol fields than previously retrieved, and ways this new dataset can be used for studies of local aerosol effects, will be shown.
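
    The validation statistic quoted above (the fraction of retrievals falling within 0.05 or 20% × AOD of the paired AERONET value) is straightforward to compute for matched pairs. The sketch below reads the criterion as whichever bound is larger; the matched arrays are synthetic placeholders, not MISR or AERONET data.

        # Fraction of satellite AOD retrievals inside the validation envelope.
        import numpy as np

        def fraction_within_envelope(aod_sat, aod_ref):
            aod_sat, aod_ref = np.asarray(aod_sat), np.asarray(aod_ref)
            envelope = np.maximum(0.05, 0.20 * aod_ref)   # larger of the bounds
            return np.mean(np.abs(aod_sat - aod_ref) <= envelope)

        rng = np.random.default_rng(2)
        truth = rng.uniform(0.02, 0.8, size=1000)         # "AERONET" values
        retrieved = truth + rng.normal(0.0, 0.04, size=1000)
        print(f"{fraction_within_envelope(retrieved, truth):.1%} within envelope")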

  5. A proposal for an SDN-based SIEPON architecture

    NASA Astrophysics Data System (ADS)

    Khalili, Hamzeh; Sallent, Sebastià; Piney, José Ramón; Rincón, David

    2017-11-01

    Passive Optical Network (PON) elements such as Optical Line Terminal (OLT) and Optical Network Units (ONUs) are currently managed by inflexible legacy network management systems. Software-Defined Networking (SDN) is a new networking paradigm that improves the operation and management of networks. In this paper, we propose a novel architecture, based on the SDN concept, for Ethernet Passive Optical Networks (EPON) that includes the Service Interoperability standard (SIEPON). In our proposal, the OLT is partially virtualized and some of its functionalities are allocated to the core network management system, while the OLT itself is replaced by an OpenFlow (OF) switch. A new MultiPoint MAC Control (MPMC) sublayer extension based on the OpenFlow protocol is presented. This would allow the SDN controller to manage and enhance the resource utilization, flow monitoring, bandwidth assignment, quality-of-service (QoS) guarantees, and energy management of the optical network access, to name a few possibilities. The OpenFlow switch is extended with synchronous ports to retain the time-critical nature of the EPON network. OpenFlow messages are also extended with new functionalities to implement the concept of EPON Service Paths (ESPs). Our simulation-based results demonstrate the effectiveness of the new architecture, while retaining a similar (or improved) performance in terms of delay and throughput when compared to legacy PONs.

  6. Assessment of Gas Potential in the Niobrara Formation, Rosebud Reservation, South Dakota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, Aubrey E.; Hopkinson, Leslie; Soeder, Daniel

    2016-01-23

    Surface water and groundwater risks associated with unconventional oil and gas development result from potential spills of the large volumes of chemicals stored on-site during drilling and hydraulic fracturing operations, and the return to the surface of significant quantities of saline water produced during oil or gas well production. To better identify and mitigate risks, watershed models and tools are needed to evaluate the dispersion of pollutants in possible spill scenarios. This information may be used to determine the placement of in-stream water-quality monitoring instruments and to develop early-warning systems and emergency plans. A chemical dispersion model has been used to estimate the contaminant signal for in-stream measurements. Spills associated with oil and gas operations were identified within the Susquehanna River Basin Commission's Remote Water Quality Monitoring Network. The volume of some contaminants was found to be sufficient to affect the water quality of certain drainage areas. The most commonly spilled compounds and expected peak concentrations at monitoring stations were used in laboratory experiments to determine if a signal could be detected and positively identified using standard water-quality monitoring equipment. The results were compared to historical data and baseline observations of water quality parameters, and showed that the chemicals tested do commonly affect water quality parameters. This work is an effort to demonstrate that hydrologic and water quality models may be applied to improve the placement of in-stream water quality monitoring devices. This information may increase the capability of early-warning systems to alert community health and environmental agencies of surface water spills associated with unconventional oil and gas operations.

  7. Mobile phone collection, reuse and recycling in the UK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ongondo, F.O.; Williams, I.D., E-mail: idw@soton.ac.uk

    Highlights:
    - We characterized the key features of the voluntary UK mobile phone takeback network via a survey.
    - We identified 3 flows: information; product (handsets and accessories); and incentives.
    - There has been a significant rise in the number of UK takeback schemes since 1997.
    - Most returned handsets are low quality; little data exists on quantities of mobile phones collected.
    - Takeback schemes increasingly divert EoL mobile phones from landfill and enable reuse/recycling.
    Abstract: Mobile phones are the most ubiquitous electronic product on the globe. They have relatively short lifecycles and, because of their (perceived) in-built obsolescence, discarded mobile phones represent a significant and growing problem with respect to waste electrical and electronic equipment (WEEE). An emerging and increasingly important issue for industry is the shortage of key metals, especially the types of metals found in mobile phones, and hence the primary aim of this timely study was to assess and evaluate the voluntary mobile phone takeback network in the UK. The study has characterised the information, product and incentives flows in the voluntary UK mobile phone takeback network and reviewed the merits and demerits of the incentives offered. A survey of the activities of the voluntary mobile phone takeback schemes was undertaken in 2008 to: identify and evaluate the takeback schemes operating in the UK; determine the target groups from whom handsets are collected; and assess the collection, promotion and advertising methods used by the schemes. In addition, the survey sought to identify and critically evaluate the incentives offered by the takeback schemes, evaluate their ease and convenience of use, and determine the types, qualities and quantities of mobile phones they collect. The study has established that the UK voluntary mobile phone takeback network can be characterised as three distinctive flows: information flow; product flow (handsets and related accessories); and incentives flow. Over 100 voluntary schemes offering online takeback of mobile phone handsets were identified. The schemes are operated by manufacturers, retailers, mobile phone network service operators, charities and by mobile phone reuse, recycling and refurbishing companies. The latter two scheme categories offer the highest level of convenience and ease of use to their customers. Approximately 83% of the schemes are either for-profit/commercial-oriented and/or operate to raise funds for charities. The voluntary schemes use various methods to collect mobile phones from consumers, including postal services, courier and in-store collection. The majority of schemes utilise and finance pre-paid postage to collect handsets. Incentives offered by the takeback schemes include monetary payments, donations to charity and entry into prize draws. Consumers from whom handsets and related equipment are collected include individuals, businesses, schools, colleges, universities, charities and clubs, with some schemes specialising in collecting handsets from one target group. The majority (84.3%) of voluntary schemes did not provide information on their websites about the quantities of mobile phones they collect. The operations of UK takeback schemes are decentralised in nature. Comparisons are made between the UK's decentralised collection system and Australia's centralised network for the collection of mobile phones.
The significant principal conclusions from the study are: there has been a significant rise in the number of takeback schemes operating in the UK since the initial scheme was launched in 1997; the majority of returned handsets seem to be of low quality; and there is very little available information on the quantities of mobile phones collected by the various schemes. Irrespective of their financial motives, UK takeback schemes increasingly play an important role in sustainable waste management by diverting EoL mobile phones from landfills and encouraging reuse and recycling. Recommendations for future actions to improve the management of end-of-life mobile phone handsets and related accessories are made.

  8. Worldwide differential GPS for Space Shuttle landing operations

    NASA Technical Reports Server (NTRS)

    Loomis, Peter V. W.; Denaro, Robert P.; Saunders, Penny

    1990-01-01

    Worldwide differential Global Positioning System (WWDGPS) is viewed as an effective method of offering continuous high-quality navigation worldwide. The concept utilizes a network with as few as 33 ground stations to observe most of the error sources of GPS and provide error corrections to users on a worldwide basis. The WWDGPS real-time GPS tracking concept promises a threefold or fourfold improvement in accuracy for authorized dual-frequency users, and in addition maintains an accurate and current ionosphere model for single-frequency users. A real-time global tracking network also has the potential to reverse declarations of poor health on marginal satellites, increasing the number of satellites in the constellation and lessening the probability of GPS navigation outage. For Space Shuttle operations, the use of WWDGPS-aided P-code equipment promises performance equal to or better than other current landing guidance systems in terms of accuracy and reliability. This performance comes at significantly less cost to NASA, which will participate as a customer in a system designed as a commercial operation serving the global civil navigation community.

  9. a Modified Genetic Algorithm for Finding Fuzzy Shortest Paths in Uncertain Networks

    NASA Astrophysics Data System (ADS)

    Heidari, A. A.; Delavar, M. R.

    2016-06-01

    In realistic network analysis, there are several uncertainties in the measurement and computation of arcs and vertices. These uncertainties should also be considered when solving the shortest path problem (SPP), due to the inherent fuzziness in the body of experts' knowledge. In this paper, we investigated the SPP under uncertainty to evaluate our modified genetic strategy. We improved the performance of the genetic algorithm (GA) to investigate a class of shortest path problems on networks with vague arc weights. The solutions of the uncertain SPP with fuzzy path lengths are examined and compared in detail. As a robust metaheuristic, the GA is modified and evaluated to tackle the fuzzy SPP (FSPP) with uncertain arcs. For this purpose, first, a dynamic operation is implemented to enrich the exploration/exploitation patterns of the conventional procedure and mitigate the premature convergence of the GA. Then, the modified GA (MGA) strategy is used to solve the FSPP. The results of the proposed strategy are compared to those of the standard GA with regard to cost, path quality and CPU time. Numerical instances are provided to demonstrate the success of the proposed MGA-FSPP strategy in comparison with the GA. The simulations affirm that the proposed technique not only outperforms the standard GA, but also effectively improves the quality of the resulting paths, and that it can efficiently handle the FSPP in uncertain networks.
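
    To make the fuzzy shortest-path setting concrete, the following sketch evolves paths on a toy graph whose arcs carry triangular fuzzy weights (a, b, c); path fitness is the centroid (a + b + c) / 3 of the summed fuzzy length. This is an illustrative plain GA, not the authors' MGA with its dynamic operation; the graph, operators and parameters are assumptions.

        # Toy GA for a fuzzy shortest path with triangular arc weights.
        import random

        GRAPH = {  # node -> {neighbor: (a, b, c) triangular fuzzy arc weight}
            "s": {"a": (2, 3, 4), "b": (1, 2, 6)},
            "a": {"c": (2, 2, 3), "t": (5, 6, 7)},
            "b": {"c": (1, 3, 4), "t": (6, 7, 9)},
            "c": {"t": (1, 2, 3)},
        }

        def random_path(start, goal, rng):
            """Random simple path via randomized depth-first walking."""
            path, seen = [start], {start}
            while path[-1] != goal:
                nxt = [n for n in GRAPH.get(path[-1], {}) if n not in seen]
                if not nxt:
                    return None
                path.append(rng.choice(nxt))
                seen.add(path[-1])
            return path

        def fitness(path):
            """Centroid defuzzification of the summed triangular lengths."""
            total = (0.0, 0.0, 0.0)
            for u, v in zip(path, path[1:]):
                total = tuple(x + y for x, y in zip(total, GRAPH[u][v]))
            return sum(total) / 3.0

        def crossover(p1, p2, rng):
            """Splice two parents at a shared intermediate node."""
            common = [n for n in p1[1:-1] if n in p2[1:-1]]
            if not common:
                return p1
            node = rng.choice(common)
            child = p1[:p1.index(node)] + p2[p2.index(node):]
            return child if len(set(child)) == len(child) else p1

        def evolve(pop_size=20, generations=30, seed=3):
            rng = random.Random(seed)
            pop = [p for p in (random_path("s", "t", rng)
                               for _ in range(pop_size)) if p]
            for _ in range(generations):
                pop.sort(key=fitness)
                elite = pop[: max(2, pop_size // 4)]
                children = [crossover(rng.choice(elite), rng.choice(elite), rng)
                            for _ in range(pop_size - len(elite))]
                # Mutation: occasionally replace a child with a fresh path
                children = [random_path("s", "t", rng) if rng.random() < 0.2
                            else c for c in children]
                pop = elite + [c for c in children if c]
            return min(pop, key=fitness)

        best = evolve()
        print("best path:", best, "centroid length:", fitness(best))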

  10. 47 CFR 36.353 - Network operations expenses-Account 6530 (Class B telephone companies); Accounts 6531, 6532, 6533...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Network operations expenses-Account 6530 (Class... Expenses and Taxes Network Operations Expenses § 36.353 Network operations expenses—Account 6530 (Class B... account includes the expenses associated with the provisions of power, network administration, testing...

  11. 47 CFR 36.353 - Network operations expenses-Account 6530 (Class B telephone companies); Accounts 6531, 6532, 6533...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Network operations expenses-Account 6530 (Class... Expenses and Taxes Network Operations Expenses § 36.353 Network operations expenses—Account 6530 (Class B... account includes the expenses associated with the provisions of power, network administration, testing...

  12. Methods of practice and guidelines for using survey-grade global navigation satellite systems (GNSS) to establish vertical datum in the United States Geological Survey

    USGS Publications Warehouse

    Rydlund, Jr., Paul H.; Densmore, Brenda K.

    2012-01-01

    Geodetic surveys have evolved through the years to the use of survey-grade (centimeter-level) global positioning to perpetuate and post-process vertical datum. The U.S. Geological Survey (USGS) uses Global Navigation Satellite System (GNSS) technology to monitor natural hazards, ensure geospatial control for climate and land use change, and gather data necessary for investigative studies related to water, the environment, energy, and ecosystems. Vertical datum is fundamental to a variety of these integrated earth sciences. Essentially, GNSS surveys provide a three-dimensional position x, y, and z as a function of the North American Datum of 1983 ellipsoid and the most current hybrid geoid model. A GNSS survey may be approached with post-processed positioning for static observations related to a single point or network, or involve real-time corrections to provide positioning "on the fly." Field equipment required to facilitate GNSS surveys ranges from a single receiver with a power source, for static positioning, to an additional receiver or network communicating by radio or cellular link for real-time positioning. A real-time approach in its most common form may be described as a roving receiver augmented by a single base-station receiver, known as a single-base real-time (RT) survey. More efficient real-time methods involving a Real-Time Network (RTN) permit the use of only one roving receiver that is augmented by a network of fixed receivers commonly known as Continuously Operating Reference Stations (CORS). A post-processed approach in its most common form involves static data collection at a single point. Data are most commonly post-processed through a universally accepted utility maintained by the National Geodetic Survey (NGS), known as the Online Positioning User Service (OPUS). More complex post-processed methods involve static observations among a network of additional receivers collecting static data at known benchmarks. Both classifications provide users flexibility regarding the efficiency and quality of data collection. Quality assurance of survey-grade global positioning is often overlooked or not understood, and perceived uncertainties can be misleading. GNSS users can benefit from a blueprint of data collection standards used to ensure consistency among USGS mission areas. A classification of GNSS survey qualities provides the user with the ability to choose from the highest quality survey, used to establish objective points with low uncertainties and identified as Level I, to a GNSS survey for general topographic control without quality assurance, identified as Level IV. A Level I survey is strictly limited to post-processed methods, whereas Level II, Level III, and Level IV surveys integrate variations of an RT approach. Among these classifications, techniques involving blunder checks and redundancy are important, and planning that involves assessment of the overall satellite configuration, as well as terrestrial and space weather, is necessary to ensure an efficient and quality campaign. Although quality indicators and uncertainties are identified in post-processed methods using CORS, the accuracy of a GNSS survey is most effectively expressed as a comparison to a local benchmark that has a high degree of confidence. Real-time and post-processed methods should incorporate these "trusted" benchmarks as a check during any campaign. Global positioning surveys are expected to change rapidly in the future.
The expansion of continuously operating reference stations, combined with newly available satellite signals, and enhancements to the conterminous geoid, are all sufficient indicators for substantial growth in real-time positioning and quality thereof.

  13. Noise Characteristics of EarthScope Transportable Array Posthole Sensor Emplacements in Alaska and Canada

    NASA Astrophysics Data System (ADS)

    Aderhold, K.; Frassetto, A.; Busby, R. W.; Enders, M.; Bierma, R. M.; Miner, J.; Woodward, R.

    2016-12-01

    From 2011 to 2015, IRIS built or upgraded 67 broadband seismic stations in Alaska and western Canada as part of the EarthScope Transportable Array (TA) program. An additional 72 stations will be completed by the fall of 2016. Nearly all use new posthole seismometers, emplaced at 3 m depth in cased holes within fractured bedrock outcrops, permafrost, or soil. Based on initial tests in Alaska, New Mexico, and California, this emplacement technique was chosen to streamline logistics in challenging, remote conditions as well as optimize station performance. A versatile drill capable of operating with a hammer bit or auger was developed specifically for the TA and is light enough to be transported by helicopter in a single load. The drilling system is ideal for TA deployment logistics in Alaska, but could be adapted to many regional or permanent network operations because it is easily transported on a flatbed truck and maneuvered into tight working locations. The TA will complete another 73 installations in 2017 and operate the full network of 268 real-time stations through at least 2019. The removal of some TA stations is planned for 2020, but upgrades to existing stations are permanent contributions to these networks. The TA stations are a proof of concept for a new approach to emplacement of seismometers across a large network and will enable high-quality scientific research as well as advances in hazard monitoring. To evaluate the new and upgraded stations, we use probability density functions of hourly power spectral density computed by the IRIS DMC MUSTANG metric service for the continuous data recorded through 2016. Our results show that the noise performance of TA postholes in Alaska and Canada shows significant improvement over the tank vaults of the lower-48 TA. With an ideal posthole drilled into bedrock or permafrost, noise levels can approach the quality of GSN stations, particularly on the horizontal channels at long periods (>70 seconds). Stations also display a strong but expected regional and seasonal variation. We provide notable examples of station performance, focusing on regional trends as well as the performance of stations upgraded from surface vault to posthole configuration.
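
    The noise-evaluation step described above can be sketched in a few lines: estimate hourly power spectral densities from continuous data and accumulate them into a distribution per period band. This is an illustrative stand-in for the MUSTANG metric computation, with random noise in place of real miniSEED waveforms and an assumed sample rate.

        # Build a PDF of hourly PSD levels near a long-period band.
        import numpy as np
        from scipy.signal import welch

        FS = 40.0                                 # sample rate, Hz (assumed)
        rng = np.random.default_rng(5)
        hours = [rng.normal(size=int(3600 * FS)) for _ in range(24)]

        psds = []
        for trace in hours:
            freq, pxx = welch(trace, fs=FS, nperseg=4096)
            psds.append(10.0 * np.log10(pxx[1:]))   # to dB, drop the DC bin
        psds = np.array(psds)

        # PDF of PSD levels at the period closest to 70 s
        periods = 1.0 / freq[1:]
        band = np.argmin(np.abs(periods - 70.0))
        hist, edges = np.histogram(psds[:, band], bins=20, density=True)
        print("mode of PSD distribution near 70 s:",
              f"{edges[np.argmax(hist)]:.1f} dB")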

  14. Network design for telemedicine--e-health using satellite technology.

    PubMed

    Graschew, Georgi; Roelofs, Theo A; Rakowsky, Stefan; Schlag, Peter M

    2008-01-01

    Over the last decade various international Information and Communications Technology (ICT) networks have been created for global access to high-level medical care. OP 2000 has designed and validated the high-end interactive video communication system WinVicos especially for telemedical applications, training of physicians in a distributed environment, teleconsultation and second opinion. WinVicos is operated on a workstation (WoTeSa) using standard hardware components and offers superior image quality at a moderate transmission bandwidth of up to 2 Mbps. WoTeSa/WinVicos have been applied for IP-based communication in different satellite-based telemedical networks. In the DELTASS project, a disaster scenario was analysed and an appropriate telecommunication system for effective rescue measures for the victims was set up and evaluated. In the MEDASHIP project, an integrated system for telemedical services (teleconsultation, tele-electrocardiography, telesonography) on board cruise ships and ferries was set up. EMISPHER offers equal access for most of the countries of the Euro-Mediterranean area to on-line services for health care at the required quality of service. E-learning applications, real-time telemedicine and shared management of medical assistance have been realized. The innovative developments in ICT aimed at realizing ubiquitous access to medical resources for everyone, at any time and anywhere (u-Health), bear the risk of creating and amplifying a digital divide in the world. We have therefore analyzed how the objective needs of the heterogeneous partners can be joined, with the result that there is a need for real integration of the various platforms and services. A virtual combination of applications serves as the basic idea for the Virtual Hospital. The development of virtual hospitals and digital medicine helps to bridge the digital divide between different regions of the world and enables equal access to high-level medical care. Pre-operative planning, intra-operative navigation and minimally invasive surgery require a digital and virtual environment supporting the perception of the physician. As data and computing resources in a virtual hospital are distributed over many sites, the concept of the Grid should be integrated with other communication networks and platforms.

  15. Wavelength dependent light absorption as a cost effective, real-time surrogate for ambient concentrations of polycyclic aromatic hydrocarbons

    NASA Astrophysics Data System (ADS)

    Brown, Richard J. C.; Butterfield, David M.; Goddard, Sharon L.; Hussain, Delwar; Quincey, Paul G.; Fuller, Gary W.

    2016-02-01

    Many monitoring stations used to assess ambient air concentrations of pollutants regulated by European air quality directives suffer from being expensive to establish and operate, and from their locations being based on the results of macro-scale modelling exercises rather than measurement assessments at candidate locations. To address these issues for the monitoring of polycyclic aromatic hydrocarbons (PAHs), this study has used data from a combination of the ultraviolet and infrared channels of aethalometers (referred to as UV BC), operated as part of the UK Black Carbon Network, as a surrogate measurement. This has established a relationship between concentrations of the PAH regulated in Europe, benzo[a]pyrene (B[a]P), and the UV BC signal at locations where these measurements have been made together from 2008 to 2014. This relationship was observed to be non-linear. Relationships for individual site types were used to predict measured concentrations with, on average, 1.5% accuracy across all annual averages, and with only 1 in 36 of the predicted annual averages deviating from the measured annual average by more than the B[a]P data quality objective for uncertainty of 50% (at -65%; excluding this value, the range was +38% to -37%). These relationships were then used to predict B[a]P concentrations at stations where UV BC measurements are made but PAH measurements are not. This process produced results that reflected expectations based on knowledge of the pollution climate at these stations, gained from the measurements of other air quality networks or from nearby stations. The influence of domestic solid fuel heating was clear using this approach, which highlighted Strabane in Northern Ireland as a station likely to be in excess of the air quality directive target value for B[a]P.
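
    Fitting a non-linear surrogate relationship of this kind is a standard regression task. The sketch below fits an assumed power-law form between the UV BC signal and B[a]P and then predicts B[a]P at a station with no PAH measurements; the functional form, coefficients and synthetic numbers are illustrative assumptions, not the study's fitted relationships.

        # Fit a non-linear UV BC -> B[a]P surrogate and predict at a new site.
        import numpy as np
        from scipy.optimize import curve_fit

        def power_law(uv_bc, k, p):
            return k * np.power(uv_bc, p)

        rng = np.random.default_rng(4)
        uv_bc = rng.uniform(0.2, 3.0, size=40)            # surrogate signal
        bap = 0.35 * uv_bc**1.4 * rng.lognormal(0.0, 0.15, size=40)

        (k, p), _ = curve_fit(power_law, uv_bc, bap, p0=(0.5, 1.0))
        print(f"B[a]P ~ {k:.2f} * UV_BC^{p:.2f}")
        # Predict B[a]P where only UV BC is measured:
        print(f"predicted B[a]P at UV_BC=1.8: {power_law(1.8, k, p):.2f} ng/m3")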

  16. Air quality impact assessment of multiple open pit coal mines in northern Colombia.

    PubMed

    Huertas, José I; Huertas, María E; Izquierdo, Sebastián; González, Enrique D

    2012-01-01

    The coal mining region in northern Colombia is one of the largest open pit mining regions of the world. In 2009, there were 8 mining companies in operation with an approximate coal production of ∼70 Mtons/year. Since 2007, the Colombian air quality monitoring network has reported readings that exceed the daily and annual air quality standards for total suspended particulate (TSP) matter and particles with an equivalent aerodynamic diameter smaller than 10 μm (PM₁₀) in nearby villages. This paper describes work carried out in order to establish an appropriate clean air program for this region, based on the Colombian national environmental authority requirement for modeling of TSP and PM₁₀ dispersion. A TSP and PM₁₀ emission inventory was initially developed, and topographic and meteorological information for the region was collected and analyzed. Using this information, the dispersion of TSP was modeled in ISC3 and AERMOD using meteorological data collected by 3 local stations during 2008 and 2009. The results obtained were compared to actual values measured by the air quality monitoring network. High correlation coefficients (>0.73) were obtained, indicating that the models accurately described the main factors affecting particle dispersion in the region. The models were then used to forecast concentrations of particulate matter for 2010. Based on results from the models, areas within the modeling region were classified as highly, fairly, moderately or marginally polluted according to local regulations. Additionally, the contribution of particulate matter to the pollution at each village was estimated. Using these predicted values, the Colombian environmental authority imposed new decontamination measures on the mining companies operating in the region. These measures included the relocation of three villages, financed by the mining companies, based on forecasted pollution levels. Copyright © 2011. Published by Elsevier Ltd.
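
    The model-evaluation step mentioned above amounts to correlating paired model predictions and network measurements. A minimal sketch follows; the paired TSP series are placeholder values, not the study's data.

        # Compare dispersion-model predictions with monitored concentrations.
        import numpy as np

        observed = np.array([95.0, 120.0, 88.0, 150.0, 60.0, 132.0])   # ug/m3
        modelled = np.array([90.0, 110.0, 95.0, 140.0, 70.0, 125.0])   # ug/m3

        r = np.corrcoef(observed, modelled)[0, 1]
        print(f"correlation coefficient: {r:.2f}")   # study reports r > 0.73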

  17. Case management for high-intensity service users: towards a relational approach to care co-ordination.

    PubMed

    McEvoy, Phil; Escott, Diane; Bee, Penny

    2011-01-01

    This study is based on a formative evaluation of a case management service for high-intensity service users in Northern England. The evaluation had three main purposes: (i) to assess the quality of the organisational infrastructure; (ii) to obtain a better understanding of the key influences that played a role in shaping the development of the service; and (iii) to identify potential changes in practice that may help to improve the quality of service provision. The evaluation was informed by Gittell's relational co-ordination theory, which focuses upon cross-boundary working practices that facilitate task integration. The Assessment of Chronic Illness Care Survey was used to assess the organisational infrastructure and qualitative interviews with front line staff were conducted to explore the key influences that shaped the development of the service. A high level of strategic commitment and political support for integrated working was identified. However, the quality of care co-ordination was variable. The most prominent operational factor that appeared to influence the scope and quality of care co-ordination was the pattern of interaction between the case managers and their co-workers. The co-ordination of patient care was much more effective in integrated co-ordination networks. Key features included clearly defined, task focussed, relational workspaces with interactive forums where case managers could engage with co-workers in discussions about the management of interdependent care activities. In dispersed co-ordination networks with fewer relational workspaces, the case managers struggled to work as effectively. The evaluation concluded that the creation of flexible and efficient task focused relational workspaces that are systemically managed and adequately resourced could help to improve the quality of care co-ordination, particularly in dispersed networks. © 2010 Blackwell Publishing Ltd.

  18. Real-time product attribute control to manufacture antibodies with defined N-linked glycan levels.

    PubMed

    Zupke, Craig; Brady, Lowell J; Slade, Peter G; Clark, Philip; Caspary, R Guy; Livingston, Brittney; Taylor, Lisa; Bigham, Kyle; Morris, Arvia E; Bailey, Robert W

    2015-01-01

    Pressures for cost-effective new therapies and an increased emphasis on emerging markets require technological advancements and a flexible future manufacturing network for the production of biologic medicines. The safety and efficacy of a product is crucial, and consistent product quality is an essential feature of any therapeutic manufacturing process. The active control of product quality in a typical biologic process is challenging because of measurement lags and nonlinearities present in the system. The current study uses nonlinear model predictive control to maintain a critical product quality attribute at a predetermined value during pilot scale manufacturing operations. This approach to product quality control ensures a more consistent product for patients, enables greater manufacturing efficiency, and eliminates the need for extensive process characterization by providing direct measures of critical product quality attributes for real time release of drug product. © 2015 American Institute of Chemical Engineers.
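
    The control idea described above can be sketched as a toy receding-horizon loop: at each step an input sequence is optimized against a process model to steer the quality attribute toward its target, and only the first move is applied. The one-state nonlinear model, bounds and weights below are illustrative assumptions, not the authors' process model.

        # Toy nonlinear model predictive control of a quality attribute.
        import numpy as np
        from scipy.optimize import minimize

        def step(x, u):
            """Assumed nonlinear plant: attribute responds with saturation."""
            return x + 0.5 * (u / (1.0 + u) - 0.05 * x)

        def nmpc_action(x0, target, horizon=5):
            """First move of the input sequence minimizing tracking error."""
            def cost(u_seq):
                x, c = x0, 0.0
                for u in u_seq:
                    x = step(x, u)
                    c += (x - target) ** 2 + 1e-3 * u ** 2   # track + effort
                return c
            res = minimize(cost, x0=np.ones(horizon),
                           bounds=[(0.0, 5.0)] * horizon, method="L-BFGS-B")
            return res.x[0]

        x, target = 2.0, 6.0
        for _ in range(20):                  # receding-horizon loop
            u = nmpc_action(x, target)
            x = step(x, u)                   # apply only the first move
        print(f"attribute after 20 steps: {x:.2f} (target {target})")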

  19. A Data Quality Filter for PMU Measurements: Description, Experience, and Examples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Follum, James D.; Amidan, Brett G.

    Networks of phasor measurement units (PMUs) continue to grow, and along with them, the amount of data available for analysis. With so much data, it is impractical to identify and remove poor quality data manually. The data quality filter described in this paper was developed for use with the Data Integrity and Situation Awareness Tool (DISAT), which analyzes PMU data to identify anomalous system behavior. The filter operates based only on the information included in the data files, without supervisory control and data acquisition (SCADA) data, state estimator values, or system topology information. Measurements are compared to preselected thresholds to determine if they are reliable. Along with the filter's description, examples of data quality issues from application of the filter to nine months of archived PMU data are provided. The paper is intended to aid the reader in recognizing and properly addressing data quality issues in PMU data.
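
    The threshold screening the filter performs can be sketched in a few lines: each sample is checked against preselected limits and flagged when it falls outside them or is not a finite number. The limits below are plausible illustrative values for a 60 Hz system, not DISAT's actual thresholds.

        # Threshold-based reliability flags for PMU frequency/voltage samples.
        import numpy as np

        FREQ_LIMITS_HZ = (59.0, 61.0)     # assumed screening limits
        VOLTAGE_LIMITS_PU = (0.8, 1.2)    # assumed screening limits

        def quality_flags(freq_hz, volt_pu):
            """Return True where a sample is judged unreliable."""
            freq_hz, volt_pu = np.asarray(freq_hz), np.asarray(volt_pu)
            bad_freq = (freq_hz < FREQ_LIMITS_HZ[0]) | (freq_hz > FREQ_LIMITS_HZ[1])
            bad_volt = (volt_pu < VOLTAGE_LIMITS_PU[0]) | (volt_pu > VOLTAGE_LIMITS_PU[1])
            bad_value = ~np.isfinite(freq_hz) | ~np.isfinite(volt_pu)
            return bad_freq | bad_volt | bad_value

        freq = np.array([60.01, 59.98, 0.0, 60.02, np.nan])
        volt = np.array([1.01, 1.00, 1.02, 0.10, 1.00])
        print(quality_flags(freq, volt))  # flags the dropout, the dip, the NaN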

  20. An assessment of technology-based service encounters & network security on the e-health care systems of medical centers in Taiwan

    PubMed Central

    Chang, Hsin Hsin; Chang, Ching Sheng

    2008-01-01

    Background Enhancing service efficiency and quality has always been one of the most important factors to heighten competitiveness in the health care service industry. Thus, how to utilize information technology to reduce work load for staff and expeditiously improve work efficiency and healthcare service quality is presently the top priority for every healthcare institution. In this fast changing modern society, e-health care systems are currently the best possible way to achieve enhanced service efficiency and quality under the restraint of healthcare cost control. The electronic medical record system and the online appointment system are the core features in employing e-health care systems in the technology-based service encounters. Methods This study implemented the Service Encounters Evaluation Model, the European Customer Satisfaction Index, the Attribute Model and the Overall Affect Model for model inference. A total of 700 copies of questionnaires from two authoritative southern Taiwan medical centers providing the electronic medical record system and the online appointment system service were distributed, among which 590 valid copies were retrieved with a response rate of 84.3%. We then used SPSS 11.0 and the Linear Structural Relationship Model (LISREL 8.54) to analyze and evaluate the data. Results The findings are as follows: (1) Technology-based service encounters have a positive impact on service quality, but not patient satisfaction; (2) After experiencing technology-based service encounters, the cognition of the service quality has a positive effect on patient satisfaction; and (3) Network security contributes a positive moderating effect on service quality and patient satisfaction. Conclusion It revealed that the impact of electronic workflow (online appointment system service) on service quality was greater than electronic facilities (electronic medical record systems) in technology-based service encounters. Convenience and credibility are the most important factors of service quality in technology-based service encounters that patients demand. Due to the openness of networks, patients worry that transaction information could be intercepted; also, the credibility of the hospital involved is even a bigger concern, as patients have a strong sense of distrust. Therefore, in the operation of technology-based service encounters, along with providing network security, it is essential to build an atmosphere of psychological trust. PMID:18419820

  1. An assessment of technology-based service encounters & network security on the e-health care systems of medical centers in Taiwan.

    PubMed

    Chang, Hsin Hsin; Chang, Ching Sheng

    2008-04-17

    Enhancing service efficiency and quality has always been one of the most important factors to heighten competitiveness in the health care service industry. Thus, how to utilize information technology to reduce workload for staff and expeditiously improve work efficiency and healthcare service quality is presently the top priority for every healthcare institution. In this fast changing modern society, e-health care systems are currently the best possible way to achieve enhanced service efficiency and quality under the restraint of healthcare cost control. The electronic medical record system and the online appointment system are the core features in employing e-health care systems in the technology-based service encounters. This study implemented the Service Encounters Evaluation Model, the European Customer Satisfaction Index, the Attribute Model and the Overall Affect Model for model inference. A total of 700 copies of questionnaires from two authoritative southern Taiwan medical centers providing the electronic medical record system and the online appointment system service were distributed, among which 590 valid copies were retrieved with a response rate of 84.3%. We then used SPSS 11.0 and the Linear Structural Relationship Model (LISREL 8.54) to analyze and evaluate the data. The findings are as follows: (1) Technology-based service encounters have a positive impact on service quality, but not patient satisfaction; (2) After experiencing technology-based service encounters, the cognition of the service quality has a positive effect on patient satisfaction; and (3) Network security contributes a positive moderating effect on service quality and patient satisfaction. The results revealed that the impact of electronic workflow (the online appointment system service) on service quality was greater than that of electronic facilities (electronic medical record systems) in technology-based service encounters. Convenience and credibility are the most important factors of service quality in technology-based service encounters that patients demand. Due to the openness of networks, patients worry that transaction information could be intercepted; also, the credibility of the hospital involved is an even bigger concern, as patients have a strong sense of distrust. Therefore, in the operation of technology-based service encounters, along with providing network security, it is essential to build an atmosphere of psychological trust.

  2. Social networks and mental health among people living with human immunodeficiency virus (HIV) in Johannesburg, South Africa.

    PubMed

    Odek, Willis Omondi

    2014-01-01

    People living with human immunodeficiency virus (PLHIV) in developing countries can live longer due to improved treatment access, and a deeper understanding of determinants of their quality of life is critical. This study assessed the link between social capital, operationally defined in terms of social networks (group-based and personal social networks) and access to network resources (access to material and non-material resources and social support) and health-related quality of life (HRQoL) among 554 (55% female) adults on HIV treatment through South Africa's public health system. Female study participants were involved with more group-based social networks but had fewer personal social networks in comparison to males. Access to network resources was higher among females and those from larger households but lower among older study participants. Experience of social support significantly increased with household economic status and duration at current residence. Social capital indicators were unrelated to HIV disease status indicators, including duration since diagnosis, CD4 count and viral load. Only a minority (13%) of study participants took part in groups formed by and for predominantly PLHIV (HIV support groups), and participation in such groups was unrelated to their mental or physical health. Personal rather than group-linked social networks and access to network resources were significantly associated with mental but not physical health, after controlling for sociodemographic characteristics. The findings of limited participation in HIV support groups and that the participation in such groups was not significantly associated with physical or mental health may suggest efforts among PLHIV in South Africa to normalise HIV as a chronic illness through broad-based rather than HIV-status bounded social participation, as a strategy for deflecting stigma. Further research is required to examine the effects of HIV treatment on social networking and participation among PLHIV within both rural and other urban settings of South Africa.

  3. Networking of Icelandic Earth Infrastructures - Natural laboratories and Volcano Supersites

    NASA Astrophysics Data System (ADS)

    Vogfjörd, K. S.; Sigmundsson, F.; Hjaltadóttir, S.; Björnsson, H.; Arason, Ø.; Hreinsdóttir, S.; Kjartansson, E.; Sigbjörnsson, R.; Halldórsson, B.; Valsson, G.

    2012-04-01

    The backbone of Icelandic geoscientific research infrastructure is the country's permanent monitoring networks, which have been built up to monitor seismic and volcanic hazard and deformation of the Earth's surface. The networks are mainly focussed around the plate boundary in Iceland, particularly the two seismic zones, where earthquakes of up to M7.3 have occurred in centuries past, and the rift zones with over 30 active volcanic systems where a large number of powerful eruptions have occurred, including highly explosive ones. The main observational systems are seismic, strong motion, GPS and bore-hole strain networks, with the addition of more recent systems like hydrological stations, permanent and portable radars, ash-particle counters and gas monitoring systems. Most of the networks are owned by a handful of Icelandic institutions, but some are operated in collaboration with international institutions and universities. The networks have been in operation for years to decades and have recorded large volumes of research-quality data. The main Icelandic infrastructures will be networked in the European Plate Observing System (EPOS). The plate boundary in the South Iceland seismic zone (SISZ), with its book-shelf tectonics and repeating major earthquake sequences of up to M7 events, has the potential to be defined as a natural laboratory within EPOS. Work towards integrating multidisciplinary data and technologies from the monitoring infrastructures in the SISZ with other fault regions has started in the FP7 project NERA, under the heading of Networking of Near-Fault Observatories. The purpose is to make research-quality data from near-fault observatories available to the research community, as well as to promote transfer of knowledge and technical know-how between the different observatories of Europe, in order to create a network of fault-monitoring networks. The seismic and strong-motion systems in the SISZ are also, to some degree, being networked nationally to strengthen their early warning capabilities. In response to the far-reaching dispersion of ash from the 2010 Eyjafjallajökull eruption and the subsequent disturbance to European air-space, the instrumentation of the Icelandic volcano observatory was greatly improved in number and capability to better monitor sub-surface volcanic processes as well as the air-borne products of eruptions. This infrastructure will also be networked with other European volcano observatories in EPOS. Finally, the Icelandic EPOS team, together with other European collaborators, has responded to an FP7 call for the establishment of an Icelandic volcano supersite, where land- and space-based data will be made available to researchers and hazard managers, in line with the implementation plan of the GEO. The focus of the Icelandic volcano supersite is the active volcanoes in Iceland's Eastern volcanic zone.

  4. Groundwater-quality data from the National Water-Quality Assessment Project, January through December 2014 and select quality-control data from May 2012 through December 2014

    USGS Publications Warehouse

    Arnold, Terri L.; Bexfield, Laura M.; Musgrove, MaryLynn; Lindsey, Bruce D.; Stackelberg, Paul E.; Barlow, Jeannie R.; Desimone, Leslie A.; Kulongoski, Justin T.; Kingsbury, James A.; Ayotte, Joseph D.; Fleming, Brandon J.; Belitz, Kenneth

    2017-10-05

    Groundwater-quality data were collected from 559 wells as part of the National Water-Quality Assessment Project of the U.S. Geological Survey National Water-Quality Program from January through December 2014. The data were collected from four types of well networks: principal aquifer study networks, which are used to assess the quality of groundwater used for public water supply; land-use study networks, which are used to assess land-use effects on shallow groundwater quality; major aquifer study networks, which are used to assess the quality of groundwater used for domestic supply; and enhanced trends networks, which are used to evaluate the time scales during which groundwater quality changes. Groundwater samples were analyzed for a large number of water-quality indicators and constituents, including major ions, nutrients, trace elements, volatile organic compounds, pesticides, radionuclides, and some constituents of special interest (arsenic speciation, chromium [VI] and perchlorate). These groundwater-quality data, along with data from quality-control samples, are tabulated in this report and in an associated data release.

  5. External quality assurance project report for the National Atmospheric Deposition Program’s National Trends Network and Mercury Deposition Network, 2015–16

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Martin, RoseAnn

    2018-06-29

    The U.S. Geological Survey Precipitation Chemistry Quality Assurance project operated five distinct programs to provide external quality assurance monitoring for the National Atmospheric Deposition Program's (NADP) National Trends Network and Mercury Deposition Network during 2015–16. The National Trends Network programs include (1) a field audit program to evaluate sample contamination and stability, (2) an interlaboratory comparison program to evaluate analytical laboratory performance, and (3) a colocated sampler program to evaluate bias and variability attributed to automated precipitation samplers. The Mercury Deposition Network programs include (4) a system blank program and (5) an interlaboratory comparison program. The results indicate that NADP data continue to be of sufficient quality for the analysis of spatial distributions and time trends for chemical constituents in wet deposition. The field audit program results indicate increased sample contamination for calcium, magnesium, and potassium relative to 2010 levels, and slight fluctuation in sodium contamination. Nitrate contamination levels dropped slightly during 2014–16, and chloride contamination leveled off between 2007 and 2016. Sulfate contamination is similar to the 2000 level. Hydrogen ion contamination has steadily decreased since 2012. Losses of ammonium and nitrate resulting from potential sample instability were negligible. The NADP Central Analytical Laboratory produced interlaboratory comparison results with low bias and variability compared to other domestic and international laboratories that support atmospheric deposition monitoring. Significant absolute bias above the magnitudes of the detection limits was observed for nitrate and sulfate concentrations, but no analyte determinations exceeded the detection limits for blanks. Colocated sampler program results from dissimilar colocated collectors indicate that the retrofit of the National Trends Network with N-CON Systems Company, Inc. precipitation collectors could cause substantial shifts in NADP annual deposition (concentration multiplied by depth) values. Median weekly relative percent differences for analyte concentrations ranged from -4 to +76 percent for cations, from 5 to 6 percent for ammonium, from +14 to +25 percent for anions, and from -21 to +8 percent for hydrogen ion. By comparison, weekly absolute concentration differences for paired identical N-CON Systems Company, Inc., collectors ranged from 4–22 percent for cations; 2–9 percent for anions; 4–5 percent for ammonium; and 13–14 percent for hydrogen ion. The N-CON Systems Company, Inc. collector caught more precipitation than the Aerochem Metrics Model 301 collector (ACM) at the WA99/99WA sites, but it typically caught slightly less precipitation than the ACM at ND11/11ND, sites which receive more wind and snow than WA99/99WA. Paired, identical OTT Pluvio-2 and ETI Noah IV precipitation gages were operated at the same sites. Median absolute percent differences for daily measured precipitation depths ranged from 0 to 7 percent. Annual absolute differences ranged from 0.08 percent (ETI Noah IV precipitation gages) to 11 percent (OTT Pluvio-2 precipitation gages). System blank results indicate that maximum total mercury contamination concentrations in samples were less than the third percentile of all Mercury Deposition Network sample concentrations (1.098 nanograms per liter; ng/L). The Mercury Analytical Laboratory produced chemical concentration results with low bias and variability compared with other domestic and international laboratories that support atmospheric-deposition monitoring. The laboratory's performance results indicate a +1-ng/L shift in bias between 2015 (-0.4 ng/L) and 2016 (+0.5 ng/L).
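
    The colocated-sampler comparisons above rest on the standard relative percent difference: the signed difference between paired measurements divided by their mean. A small worked example follows; the concentrations are illustrative, not NADP data.

      def relative_percent_difference(a, b):
          """Signed difference over the pair mean, expressed in percent."""
          mean = (a + b) / 2.0
          if mean == 0:
              raise ValueError("RPD is undefined when the pair mean is zero")
          return 100.0 * (a - b) / mean

      # Hypothetical weekly concentrations (mg/L) from two colocated collectors.
      print(round(relative_percent_difference(0.21, 0.18), 1))  # 15.4 percent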

  6. Investigation of environmental indices from the Earth Resources Technology Satellite

    NASA Technical Reports Server (NTRS)

    Greeley, R. S. (Principal Investigator); Riley, E. L.; Stryker, S.; Ward, E. A.

    1973-01-01

    The author has identified the following significant results. Land use, water quality, and air quality trends are being deduced from both ERTS-1 MSS imagery and computer-compatible tapes. The data analysis plan and the preliminary data analysis phase were conducted in January 1973. Results from these two phases are: (1) the method of analysis has been selected and checked out; (2) land use for two dates has been generated for one test site; (3) water quality for one date has been partially produced; (4) air quality for three dates has been produced and compared with ground truth; and (5) one of the two DCP stations is in operation, and the second station will be installed in March 1973. Land use classification exceeds pre-launch expectations. Water quality (turbidity) is not progressing as expected. Finally, mesoscale air quality results have shown correlation with the NOAA/EPA turbidity network. If air quality correlations continue to show favorable results, a rapid means of global turbidity monitoring may be available from ERTS-1 MSS observations.

  7. Design of a ground-water-quality monitoring network for the Salinas River basin, California

    USGS Publications Warehouse

    Showalter, P.K.; Akers, J.P.; Swain, L.A.

    1984-01-01

    A regional ground-water quality monitoring network for the entire Salinas River drainage basin was designed to meet the needs of the California State Water Resources Control Board. The project included phase 1--identifying monitoring networks that exist in the region; phase 2--collecting information about the wells in each network; and phase 3--studying the factors--such as geology, land use, hydrology, and geohydrology--that influence the ground-water quality, and designing a regional network. This report is the major product of phase 3. Based on the authors' understanding of the ground-water-quality monitoring system and input from local offices, an ideal network was designed. The proposed network includes 317 wells and 8 stream-gaging stations. Because limited funds are available to implement the monitoring network, the proposed network is designed to correspond to the ideal network insofar as practicable, and is composed mainly of 214 wells that are already being monitored by a local agency. In areas where network wells are not available, arrangements will be made to add wells to local networks. The data collected by this network will be used to assess the ground-water quality of the entire Salinas River drainage basin. After 2 years of data are collected, the network will be evaluated to test whether it is meeting the network objectives. Subsequent network evaluations will be done every 5 years. (USGS)

  8. Coexistence issues for a 2.4 GHz wireless audio streaming in presence of bluetooth paging and WLAN

    NASA Astrophysics Data System (ADS)

    Pfeiffer, F.; Rashwan, M.; Biebl, E.; Napholz, B.

    2015-11-01

    Nowadays, customers expect to integrate their mobile electronic devices (smartphones and laptops) in a vehicle to form a wireless network. Typically, IEEE 802.11 is used to provide a high-speed wireless local area network (WLAN) and Bluetooth is used for cable replacement applications in a wireless personal area network (PAN). In addition, Daimler uses KLEER as a third wireless technology in the unlicensed (UL) 2.4 GHz ISM band to transmit full CD-quality digital audio. As Bluetooth, IEEE 802.11 and KLEER operate in the same frequency band, it has to be ensured that all three technologies can be used simultaneously without interference. In this paper, we focus on the impact of Bluetooth and IEEE 802.11 as interferers in the presence of a KLEER audio transmission.

  9. Real-time transmission of full-motion echocardiography over a high-speed data network: impact of data rate and network quality of service.

    PubMed

    Main, M L; Foltz, D; Firstenberg, M S; Bobinsky, E; Bailey, D; Frantz, B; Pleva, D; Baldizzi, M; Meyers, D P; Jones, K; Spence, M C; Freeman, K; Morehead, A; Thomas, J D

    2000-08-01

    With high-resolution network transmission required for telemedicine, education, and guided-image acquisition, the impact of errors and transmission rates on image quality needs evaluation. We transmitted clinical echocardiograms from 2 National Aeronautics and Space Administration (NASA) research centers with the use of Motion Picture Expert Group-2 (MPEG-2) encoding and asynchronous transmission mode (ATM) network protocol over the NASA Research and Education Network. Data rates and network quality (cell losses [CLR], errors [CER], and delay variability [CVD]) were altered and image quality was judged. At speeds of 3 to 5 megabits per second (Mbps), digital images were superior to those on videotape; at 2 Mbps, images were equivalent. Increasing CLR caused occasional, brief pauses. Extreme CER and CDV increases still yielded high-quality images. Real-time echocardiographic acquisition, guidance, and transmission is feasible with the use of MPEG-2 and ATM with broadcast quality seen above 3 Mbps, even with severe network quality degradation. These techniques can be applied to telemedicine and used for planned echocardiography aboard the International Space Station.

  10. Real-time transmission of full-motion echocardiography over a high-speed data network: impact of data rate and network quality of service

    NASA Technical Reports Server (NTRS)

    Main, M. L.; Foltz, D.; Firstenberg, M. S.; Bobinsky, E.; Bailey, D.; Frantz, B.; Pleva, D.; Baldizzi, M.; Meyers, D. P.; Jones, K.; et al.

    2000-01-01

    With high-resolution network transmission required for telemedicine, education, and guided-image acquisition, the impact of errors and transmission rates on image quality needs evaluation. METHODS: We transmitted clinical echocardiograms from 2 National Aeronautics and Space Administration (NASA) research centers with the use of Motion Picture Expert Group-2 (MPEG-2) encoding and asynchronous transmission mode (ATM) network protocol over the NASA Research and Education Network. Data rates and network quality (cell losses [CLR], errors [CER], and delay variability [CVD]) were altered and image quality was judged. RESULTS: At speeds of 3 to 5 megabits per second (Mbps), digital images were superior to those on videotape; at 2 Mbps, images were equivalent. Increasing CLR caused occasional, brief pauses. Extreme CER and CDV increases still yielded high-quality images. CONCLUSIONS: Real-time echocardiographic acquisition, guidance, and transmission is feasible with the use of MPEG-2 and ATM with broadcast quality seen above 3 Mbps, even with severe network quality degradation. These techniques can be applied to telemedicine and used for planned echocardiography aboard the International Space Station.

  11. Innovation in Indigenous Health and Medical Education: The Leaders in Indigenous Medical Education (LIME) Network as a Community of Practice.

    PubMed

    Mazel, Odette; Ewen, Shaun

    2015-01-01

    The Leaders in Indigenous Medical Education (LIME) Network aims to improve the quality and effectiveness of Indigenous health in medical education as well as best practice in the recruitment, retention, and graduation of Indigenous medical students. In this article we explore the utility of Etienne Wenger's "communities of practice" (CoP) concept in providing a theoretical framework to better understand the LIME Network as a form of social infrastructure to further knowledge and innovation in this important area of health care education reform. The Network operates across all medical schools in Australia and New Zealand. Utilizing a model of evaluation of communities of practice developed by Fung-Kee-Fung et al., we seek to analyze the outcomes of the LIME Network as a CoP and assess its approach and contribution to improving the implementation of Indigenous health in the medical curriculum and the graduation of Indigenous medical students. By reflecting on the Network through a community of practice lens, this article highlights the synthesis between the LIME Network and Wenger's theory and provides a framework with which to measure Network outputs. It also posits an opportunity to better capture the impact of Network activities into the future to ensure that it remains a relevant and sustainable entity.

  12. Data Quality Assessment Methods for the Eastern Range 915 MHz Wind Profiler Network

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred C.; Taylor, Gregory E.

    1998-01-01

    The Eastern Range installed a network of five 915 MHz Doppler Radar Wind Profilers with Radio Acoustic Sounding Systems in the Cape Canaveral Air Station/Kennedy Space Center area to provide three-dimensional wind speed and direction and virtual temperature estimates in the boundary layer. The Applied Meteorology Unit, staffed by ENSCO, Inc., was tasked by the 45th Weather Squadron, the Spaceflight Meteorology Group, and the National Weather Service in Melbourne, Florida to investigate methods that will help forecasters assess profiler network data quality when developing forecasts and warnings for critical ground, launch and landing operations. Four routines were evaluated in this study: a consensus time period check, a precipitation contamination check, a median filter, and the Weber-Wuertz (WW) algorithm. No routine was able to effectively flag suspect data when used by itself. Therefore, the routines were used in different combinations. An evaluation of all possible combinations revealed two that provided the best results. The precipitation contamination and consensus time routines were used in both combinations. The median filter or WW was used as the final routine in the combinations to flag all other suspect data points.
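
    Of the four routines, the median filter is the simplest to illustrate: points are flagged when they deviate too far from a running median. The sketch below assumes a one-dimensional wind series and an illustrative deviation threshold; it is not the AMU implementation.

      import numpy as np
      from scipy.signal import medfilt

      def flag_outliers(series, kernel=5, max_dev=3.0):
          """Flag points deviating from a running median by more than max_dev."""
          baseline = medfilt(series, kernel_size=kernel)
          return np.abs(series - baseline) > max_dev

      winds = np.array([5.1, 5.3, 5.2, 25.0, 5.4, 5.2, 5.3])  # m/s, one spurious spike
      print(flag_outliers(winds))  # only the 25.0 m/s point is flagged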

  13. INCREASE: Innovation and Networking for the integration of Coastal Radars into European mArine SErvices

    NASA Astrophysics Data System (ADS)

    Mader, Julien; Rubio, Anna; Asensio Igoa, Jose Luis; Corgnati, Lorenzo; Mantovani, Carlo; Griffa, Annalisa; Gorringe, Patrick; Alba, Marco; Novellino, Antonio

    2017-04-01

    High Frequency radar (HFR) is a land-based remote sensing instrument offering a unique insight into coastal ocean variability, by providing synoptic, high frequency and high resolution data at the ocean-atmosphere interface. HFRs have become invaluable tools in the field of operational oceanography for measuring surface currents, waves and winds, with direct applications in different sectors and an unprecedented potential for the integrated management of the coastal zone. Furthering the use of HFRs within the Copernicus Marine Environment Monitoring Service (CMEMS) is becoming crucial to ensure the improved management of several related key issues such as Marine Safety, Marine Resources, Coastal & Marine Environment, Weather, Climate & Seasonal Forecast. In this context, the INCREASE (Innovation and Networking for the integration of Coastal Radars into European mArine SErvices) project aims to set out the developments necessary for the integration of the existing European HFR operational systems into the CMEMS, following five main objectives: (i) define and implement a common data and metadata model for HFR real-time data; (ii) provide HFR quality-controlled real-time surface currents and key derived products; (iii) set the basis for the management of historical data and methodologies for advanced delayed-mode quality-control techniques; (iv) advance the use of HFR data for improving CMEMS numerical modelling systems; and (v) enable an HFR European operational node to ensure the link with operational CMEMS. In cooperation with other ongoing initiatives (like the EuroGOOS HFR Task Team and the European project JERICO_NEXT), INCREASE has already set up the data management infrastructure to manage, and make discoverable and accessible, near-real-time data from 30 systems in Europe. This paper presents the achieved results and the available products and features.

  14. Spacelab data processing facility (SLDPF) quality assurance (QA)/data accounting (DA) expert systems - Transition from prototypes to operational systems

    NASA Technical Reports Server (NTRS)

    Basile, Lisa

    1988-01-01

    The SLDPF is responsible for the capture, quality monitoring, processing, accounting, and shipment of Spacelab and/or Attached Shuttle Payloads (ASP) telemetry data to various user facilities. Expert systems will aid in the performance of the quality assurance and data accounting functions of the two SLDPF functional elements: the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). Prototypes were developed for each as independent efforts. The SIPS Knowledge System Prototype (KSP) used the commercial shell OPS5+ on an IBM PC/AT; the SOPS Expert System Prototype used the expert system shell CLIPS implemented on a Macintosh personal computer. Both prototypes emulate the duties of the respective QA/DA analysts based upon analyst input and predetermined mission criteria parameters, and recommend instructions and decisions governing the reprocessing, release, or holding for further analysis of data. These prototypes demonstrated feasibility and high potential for operational systems. Increase in productivity, decrease of tedium, consistency, concise historical records, and a training tool for new analysts were the principal advantages. An operational configuration, taking advantage of the SLDPF network capabilities, is under development with the expert systems being installed on SUN workstations. This new configuration in conjunction with the potential of the expert systems will enhance the efficiency, in both time and quality, of the SLDPF's release of Spacelab/ASP data products.

  15. Spacelab data processing facility (SLDPF) Quality Assurance (QA)/Data Accounting (DA) expert systems: Transition from prototypes to operational systems

    NASA Technical Reports Server (NTRS)

    Basile, Lisa

    1988-01-01

    The SLDPF is responsible for the capture, quality monitoring, processing, accounting, and shipment of Spacelab and/or Attached Shuttle Payloads (ASP) telemetry data to various user facilities. Expert systems will aid in the performance of the quality assurance and data accounting functions of the two SLDPF functional elements: the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). Prototypes were developed for each as independent efforts. The SIPS Knowledge System Prototype (KSP) used the commercial shell OPS5+ on an IBM PC/AT; the SOPS Expert System Prototype used the expert system shell CLIPS implemented on a Macintosh personal computer. Both prototypes emulate the duties of the respective QA/DA analysts based upon analyst input and predetermined mission criteria parameters, and recommend instructions and decisions governing the reprocessing, release, or holding for further analysis of data. These prototypes demonstrated feasibility and high potential for operational systems. Increase in productivity, decrease of tedium, consistency, concise historical records, and a training tool for new analysts were the principal advantages. An operational configuration, taking advantage of the SLDPF network capabilities, is under development with the expert systems being installed on SUN workstations. This new configuration in conjunction with the potential of the expert systems will enhance the efficiency, in both time and quality, of the SLDPF's release of Spacelab/ASP data products.

  16. Geohydrology of the Antelope Valley Area, California and design for a ground-water-quality monitoring network

    USGS Publications Warehouse

    Duell, L.F.

    1987-01-01

    A basinwide ideal network and an actual network were designed to identify ambient groundwater quality, trends in groundwater quality, and the degree of threat from potential pollution sources in Antelope Valley, California. In general, groundwater quality throughout the valley has remained unchanged, and no specific trends are apparent. The main source of groundwater for the valley is generally suitable for domestic, irrigation, and most industrial uses. Water quality data for selected constituents of some network wells and surface-water sites are presented. The ideal network of 77 sites was selected on the basis of site-specific criteria, geohydrology, and current land use (agricultural, residential, and industrial). These sites were used as a guide in the design of the actual network consisting of 44 existing wells. Because of budgetary constraints, wells already being monitored were selected whenever possible. Of the remaining ideal sites, 20 have existing wells not part of a current water quality network, and 13 are locations where no wells exist. The methodology used for the selection of sites, constituents monitored, and frequency of analysis will enable network users to make appropriate future changes to the monitoring network. (USGS)

  17. Implementation of Cyber-Physical Production Systems for Quality Prediction and Operation Control in Metal Casting

    PubMed Central

    Lee, JuneHyuck; Noh, Sang Do; Kim, Hyun-Jung; Kang, Yong-Shin

    2018-01-01

    The prediction of internal defects of metal casting immediately after the casting process saves unnecessary time and money by reducing the amount of inputs into the next stage, such as the machining process, and enables flexible scheduling. Cyber-physical production systems (CPPS) perfectly fulfill the aforementioned requirements. This study deals with the implementation of CPPS in a real factory for metal-casting quality prediction and operation control. First, a CPPS architecture framework for quality prediction and operation control in metal-casting production was designed. The framework describes collaboration among the internet of things (IoT), artificial intelligence, simulations, manufacturing execution systems, and advanced planning and scheduling systems. Subsequently, the implementation of the CPPS in actual plants is described. Temperature is a major factor that affects casting quality, and thus temperature sensors and IoT communication devices were attached to casting machines. The well-known NoSQL database HBase and the high-speed processing/analysis tool Spark are used for the IoT repository and data pre-processing, respectively. Several machine learning algorithms, such as decision tree, random forest, artificial neural network, and support vector machine, were used for quality prediction, and their results were compared using the R software. Finally, the operation of the entire system is demonstrated through a CPPS dashboard. In an era in which most CPPS-related studies are conducted on high-level abstract models, this study describes more specific architectural frameworks, use cases, usable software, and analytical methodologies. In addition, this study verifies the usefulness of CPPS by estimating quantitative effects. This is expected to contribute to the proliferation of CPPS in the industry. PMID:29734699
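
    As a flavor of the quality-prediction step, the sketch below trains a random forest on synthetic per-cast temperature features; the feature set, defect rule, and data are invented stand-ins, not the paper's plant data or model.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      # Synthetic stand-ins for per-cast features: mean temp, peak temp, cooling rate.
      X = rng.normal(loc=[700.0, 750.0, 5.0], scale=[15.0, 20.0, 1.0], size=(500, 3))
      # Assumed rule: casts with a hot peak and fast cooling tend to be defective.
      y = ((X[:, 1] > 765) & (X[:, 2] > 5.5)).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
      print(f"holdout accuracy: {model.score(X_te, y_te):.2f}")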

  18. Social cognitive theory: an agentic perspective.

    PubMed

    Bandura, A

    2001-01-01

    The capacity to exercise control over the nature and quality of one's life is the essence of humanness. Human agency is characterized by a number of core features that operate through phenomenal and functional consciousness. These include the temporal extension of agency through intentionality and forethought, self-regulation by self-reactive influence, and self-reflectiveness about one's capabilities, quality of functioning, and the meaning and purpose of one's life pursuits. Personal agency operates within a broad network of sociostructural influences. In these agentic transactions, people are producers as well as products of social systems. Social cognitive theory distinguishes among three modes of agency: direct personal agency, proxy agency that relies on others to act at one's behest to secure desired outcomes, and collective agency exercised through socially coordinative and interdependent effort. Growing transnational embeddedness and interdependence are placing a premium on collective efficacy to exercise control over personal destinies and national life.

  19. PEP725 Pan European Phenological Database

    NASA Astrophysics Data System (ADS)

    Koch, E.; Lipa, W.; Ungersböck, M.; Zach-Hermann, S.

    2012-04-01

    PEP725 is a five-year project with the main objective of promoting and facilitating phenological research by delivering a pan-European phenological database with open, unrestricted data access for science, research and education. PEP725 is funded by EUMETNET (the network of European meteorological services), ZAMG and the Austrian ministry for science & research bm:w_f. So far, 16 European national meteorological services and 7 partners from different national phenological network operators have joined PEP725. Data access is very easy via the homepage www.pep725.eu. Having accepted the PEP725 data policy and registered, users can download data by different criteria, for instance the selection of a specific plant or all data from one country. At present more than 300 000 new records are available in the PEP725 database, coming from 31 European countries and from 8150 stations. For a further 154 stations, metadata (location and data holder) are provided. Links to the network operators and data owners are also on the webpage in case you have more sophisticated questions about the data. Another objective of PEP725 is to bring together network operators and scientists by organizing workshops. In April 2012 the second of these workshops will take place on the premises of ZAMG. Invited speakers will give presentations spanning the whole study area of phenology, from observations to modelling. Quality checking is also a big issue; at the moment we are studying the literature to find appropriate methods.

  20. Social Support Networks and Quality of Life of Rural Men in a Context of Marriage Squeeze in China.

    PubMed

    Wang, Sasa; Yang, Xueyan; Attané, Isabelle

    2018-07-01

    A significant number of rural Chinese men are facing difficulties in finding a spouse and may fail to ever marry due to a relative scarcity of women in the adult population. Research has indicated that marriage squeeze is a stressful event which is harmful to men's quality of life, and also weakens their social support networks. Using data collected in rural Chaohu city, Anhui, China, this study explores the effects of social support networks on the quality of life of rural men who experience a marriage squeeze. The results indicate that the size of social contact networks is directly and positively associated with the quality of life of marriage-squeezed men, and moderates the negative effect of age on quality of life. Having no or only a limited instrumental support network and social contact network is a double-edged sword, with direct negative associations with the quality of life of marriage-squeezed men and moderating effects on the relationship between marriage squeeze and quality of life.

  1. Increasing research capacity and changing the culture of primary care towards reflective inquiring practice: the experience of the West London Research Network (WeLReN).

    PubMed

    Thomas, P; While, A

    2001-05-01

    A number of primary care research networks were set up throughout England in 1998 in order to (1) improve the quality of primary care research, (2) increase the research capacity of primary care, and (3) change the culture of primary care towards reflective inquiring practice (NHSE, 2000b). It is not clear how best to operate a network to achieve these diverse aims. This paper describes the first 30 months of a network that adopted a whole system approach in the belief that this would offer the best chance of simultaneously achieving the three aims. A cycle of activity was designed to facilitate the formation of multidisciplinary coalitions of interest for research, with complementary 'top down' and 'bottom up' programmes of work co-existing. At least 330 people participated in the generation of research questions, of whom one third (33%) were general practitioners, 16% community nurses, 6% practice managers and other primary care practitioners. Over two fifths (43%) were 'key allies'--academics, health authority staff, community workers and project workers. One fifth (110) of all practices (500) in the WeLReN area have collaborated in at least one research project. The ratio of doctor:nurse participation in the 24 research project teams was markedly different in the supported coalitions (2:1) compared to projects devised and led by more experienced researchers (6:1). The evidence suggests that it is possible to operate a primary care research network in a way that develops coalitions of interest from different parts of the health care system, as well as both 'top down' and 'bottom up' led projects. It is too early to tell if the approach will be able to achieve its aims in the long term, but the activity data are encouraging. There is a need for more research on the theoretical basis of network operation.

  2. RESIF national datacentre : new features and forthcoming evolutions

    NASA Astrophysics Data System (ADS)

    Pequegnat, C.; Volcke, P.; le Tanou, J.; Wolyniec, D.; Lecointre, A.; Guéguen, P.

    2013-12-01

    RESIF is a nationwide French project aimed at building a high-quality system to observe and understand the solid Earth. The goal is to create a network throughout mainland France comprising 750 seismometers and geodetic measurement instruments, 250 of which will be mobile, to enable the observation network to be focussed on specific investigation subjects and geographic locations. The RESIF data distribution centre, which is a part of the global project, is operated by the Université Joseph Fourier (Grenoble, France) and has been under implementation for two years. Data from the French broadband permanent network, the strong-motion permanent network, and mobile seismological antennas are freely accessible as real-time streams and continuous validated data, along with instrumental metadata, delivered using widely known formats and request tools. New features of the datacentre are: - new, modern distribution tools: two FDSN web services have been implemented and deliver data and metadata. - new data and datasets: the number of permanent stations rose by over 40 percent in one year, and the RESIF archive now includes past data (down to 1995) and data from new networks. Moreover, data from mobile experiments prior to 2011 is progressively being released, and data from new mobile experiments in the Alps and in the Pyrenean mountains is progressively being integrated. - new infrastructures: (1) the RESIF databank is about to be connected to the grid storage of the University High Performance Computing (HPC) centre. As a scientific use case of this datacentre facility, a focus is made on intensive exploitation of combined data from permanent and mobile networks. (2) the RESIF databank will be progressively hosted on a new shared storage facility operated by the Université Joseph Fourier. This infrastructure offers high-availability data storage (in both block and file modes) as well as backup and long-term archival capabilities, and will be fully operational at the beginning of 2014.

  3. AllAboard: Visual Exploration of Cellphone Mobility Data to Optimise Public Transport.

    PubMed

    Di Lorenzo, G; Sbodio, M; Calabrese, F; Berlingerio, M; Pinelli, F; Nair, R

    2016-02-01

    The deep penetration of mobile phones offers cities the ability to opportunistically monitor citizens' mobility and use data-driven insights to better plan and manage services. With large-scale data on mobility patterns, operators can move away from the costly, mostly survey-based, transportation planning processes to a more data-centric view that places the instrumented user at the center of development. In this framework, using mobile phone data to perform transit analysis and optimization represents a new frontier with significant societal impact, especially in developing countries. In this paper we present AllAboard, an intelligent tool that analyses cellphone data to help city authorities in visually exploring urban mobility and optimizing public transport. This is performed within a self-contained tool, as opposed to the current solutions which rely on a combination of several distinct tools for analysis, reporting, optimisation and planning. An interactive user interface allows transit operators to visually explore the travel demand in both space and time, correlate it with the transit network, and evaluate the quality of service that a transit network provides to the citizens at very fine grain. Operators can visually test scenarios for transit network improvements and compare the expected impact on the travellers' experience. The system has been tested using real telecommunication data for the city of Abidjan, Ivory Coast, and evaluated from a data mining, optimisation and user perspective.
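
    One common way to derive travel demand from anonymized call records, in the spirit of what AllAboard does, is to count transitions between consecutive cell towers per user. The column names and records below are invented for illustration and do not reflect the AllAboard pipeline.

      import pandas as pd

      # Hypothetical anonymized call-detail records: user, timestamp, serving tower.
      cdr = pd.DataFrame({
          "user": ["u1", "u1", "u2", "u2", "u2"],
          "ts": pd.to_datetime(["2016-01-04 08:00", "2016-01-04 09:00",
                                "2016-01-04 08:30", "2016-01-04 09:10",
                                "2016-01-04 18:00"]),
          "tower": ["A", "B", "A", "B", "A"],
      })

      cdr = cdr.sort_values(["user", "ts"])
      cdr["next_tower"] = cdr.groupby("user")["tower"].shift(-1)
      trips = cdr.dropna(subset=["next_tower"])
      trips = trips[trips["tower"] != trips["next_tower"]]  # drop stays at one tower
      od_matrix = trips.groupby(["tower", "next_tower"]).size().unstack(fill_value=0)
      print(od_matrix)  # origin-destination counts, e.g. A -> B occurs twice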

  4. Experiences with a Decade of Wireless Sensor Networks in Mountain Cryosphere Research

    NASA Astrophysics Data System (ADS)

    Beutel, Jan

    2017-04-01

    Research in geoscience depends on high-quality measurements over long periods of time in order to understand processes and to create and validate models. The promise of wireless sensor networks to monitor autonomously at unprecedented spatial and temporal scale motivated the use of this novel technology for studying mountain permafrost in the mid 2000s. Starting from a first experimental deployment to investigate the thermal properties of steep bedrock permafrost in 2006 on the Jungfraujoch, Switzerland at 3500 m asl using prototype wireless sensors the PermaSense project has evolved into a multi-site and multi-discipline initiative. We develop, deploy and operate wireless sensing systems customized for long-term autonomous operation in high-mountain environments. Around this central element, we develop concepts, methods and tools to investigate and to quantify the connection between climate, cryosphere (permafrost, glaciers, snow) and geomorphodynamics. In this presentation, we describe the concepts and system architecture used both for the wireless sensor network as well as for data management and processing. Furthermore, we will discuss the experience gained in over a decade of planning, installing and operating large deployments on field sites spread across a large part of the Swiss and French Alps and applications ranging from academic, experimental research campaigns, long-term monitoring and natural hazard warning in collaboration with government authorities and local industry partners. Reference http://www.permasense.ch Online Open Data Access http://data.permasense.ch

  5. Challenges for Wireless Mesh Networks to provide reliable carrier-grade services

    NASA Astrophysics Data System (ADS)

    von Hugo, D.; Bayer, N.

    2011-08-01

    Provision of mobile and wireless services today, within a competitive environment and driven by a huge number of steadily emerging new services and applications, is both a challenge and an opportunity for radio network operators. Deployment and operation of an infrastructure for mobile and wireless broadband connectivity generally requires planning effort and large investments. A promising approach to reduce expenses for radio access networking is offered by Wireless Mesh Networks (WMNs). Here, traditional dedicated backhaul connections to each access point are replaced by wireless multi-hop links between neighbouring access nodes and a few gateways to the backbone, employing standard radio technology. Such a solution provides at the same time high flexibility in both deployment and the amount of offered capacity, and shall reduce overall expenses. On the other hand, currently available mesh solutions do not provide carrier-grade service quality and reliability and often fail to cope with high traffic load. The EU project CARMEN (CARrier grade MEsh Networks) was initiated to incorporate different heterogeneous technologies and new protocols to allow for reliable transmission over "best effort" radio channels, to support reliable mobility and network management, self-configuration and dynamic resource usage, and thus to offer permanent or temporary broadband access at high cost efficiency. The contribution provides an overview of preliminary project results, with a focus on the main technical challenges from a research and implementation point of view. In particular, the impact of mesh topology on the overall system performance in terms of throughput and connection reliability, and aspects of a dedicated hybrid mobility management solution, will be discussed.

  6. NSI operations center

    NASA Technical Reports Server (NTRS)

    Zanley, Nancy L.

    1991-01-01

    The NASA Science Internet (NSI) Network Operations Staff is responsible for providing reliable communication connectivity for the NASA science community. As the NSI user community expands, so does the demand for greater interoperability with users and resources on other networks (e.g., NSFnet, ESnet), both nationally and internationally. Coupled with the science community's demand for greater access to other resources is the demand for more reliable communication connectivity. Recognizing this, the NASA Science Internet Project Office (NSIPO) expanded its Operations activities. By January 1990, Network Operations was equipped with a telephone hotline, and its staff was expanded to six Network Operations Analysts. These six analysts provide 24-hour-a-day, 7-day-a-week coverage to assist site managers with problem determination and resolution. The NSI Operations staff monitors network circuits and their associated routers. In most instances, NSI Operations diagnoses and reports problems before users realize a problem exists. Monitoring of the NSI TCP/IP Network is currently being done with Proteon's Overview monitoring system. The Overview monitoring system displays a map of the NSI network, utilizing various colors to indicate the conditions of the components being monitored. Each node or site is polled via the Simple Network Management Protocol (SNMP). If a circuit goes down, Overview alerts the Network Operations staff with an audible alarm and changes the color of the component. When an alert is received, Network Operations personnel immediately verify and diagnose the problem, coordinate repair with other networking service groups, track problems, and document the problem and resolution in a trouble-ticket database. NSI Operations offers the NSI science community reliable connectivity by exercising prompt assessment and resolution of network problems.
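
    SNMP polling of the kind described can be sketched with the pysnmp library; the host, community string, and alert handling below are illustrative assumptions, not the Overview system.

      # Minimal SNMP reachability poll (requires the pysnmp package).
      from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                                ContextData, ObjectType, ObjectIdentity, getCmd)

      SYS_UPTIME_OID = "1.3.6.1.2.1.1.3.0"  # standard MIB-II sysUpTime

      def poll(host):
          err_indication, err_status, _, var_binds = next(getCmd(
              SnmpEngine(), CommunityData("public"),
              UdpTransportTarget((host, 161), timeout=2, retries=1),
              ContextData(), ObjectType(ObjectIdentity(SYS_UPTIME_OID))))
          if err_indication or err_status:
              # An analyst would verify the fault and open a trouble ticket here.
              print(f"ALERT: {host} unreachable or returned an error")
          else:
              for name, value in var_binds:
                  print(f"{host} {name} = {value}")

      poll("192.0.2.1")  # documentation-range address; substitute a real router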

  7. Optimal Stabilization of Social Welfare under Small Variation of Operating Condition with Bifurcation Analysis

    NASA Astrophysics Data System (ADS)

    Chanda, Sandip; De, Abhinandan

    2016-12-01

    A social welfare optimization technique is proposed in this paper, with a developed state-space-based model and bifurcation analysis, to offer a substantial stability margin even in the most unforeseen states of power system networks. The restoration of the power market's dynamic price equilibrium is negotiated by forming the Jacobian of the sensitivity matrix to regulate the state variables, so as to standardize the quality of the solution in the worst possible contingencies of the network, even with the co-option of intermittent renewable energy sources. The model has been tested on the IEEE 30-bus system, and the well-known particle swarm optimization algorithm has assisted the fusion of the proposed model and methodology.
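
    Particle swarm optimization itself is easy to sketch. The snippet below runs a basic PSO on a toy concave stand-in for a welfare objective; it is not the authors' power-market model, and the objective, inertia, and acceleration constants are illustrative choices.

      import numpy as np

      rng = np.random.default_rng(1)

      def welfare(x):
          """Toy stand-in for a social-welfare objective (maximized at (2, -1))."""
          return -(x[..., 0] - 2.0) ** 2 - (x[..., 1] + 1.0) ** 2

      n, dims, iters = 30, 2, 100
      pos = rng.uniform(-5, 5, (n, dims))
      vel = np.zeros((n, dims))
      pbest, pbest_val = pos.copy(), welfare(pos)
      gbest = pbest[pbest_val.argmax()].copy()

      for _ in range(iters):
          r1, r2 = rng.random((n, dims)), rng.random((n, dims))
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos += vel
          val = welfare(pos)
          improved = val > pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], val[improved]
          gbest = pbest[pbest_val.argmax()].copy()

      print(f"best position found: {gbest.round(3)}")  # close to (2, -1)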

  8. International GPS Service for Geodynamics

    NASA Technical Reports Server (NTRS)

    Zumberge, J. F. (Editor); Urban, M. P. (Editor); Liu, R. (Editor); Neilan, R. E. (Editor)

    1996-01-01

    This 1995 annual report of the IGS - the International GPS (Global Positioning System) Service for Geodynamics - describes the second operational year of the service. It provides the many IGS contributing agencies and the rapidly growing user community with essential information on current organizational and technical matters promoting the IGS standards and products (including the organizational framework, data processing strategies, and statistics showing the remarkable expansion of the GPS monitoring network, the improvement of IGS performance, and product quality). It also introduces important practical concepts for network densification by integration of regional stations and the combination of station coordinate solutions. There are groups of articles describing general aspects of the IGS, the Associate Analysis Centers (AACs), Data Centers, and IGS stations.

  9. Environments for online maritime simulators with cloud computing capabilities

    NASA Astrophysics Data System (ADS)

    Raicu, Gabriel; Raicu, Alexandra

    2016-12-01

    This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near-real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts, coupled with the latest achievements in virtual and augmented reality, will enhance the overall experience, leading to new developments and innovations. We address a multiprocessing situation using advanced technologies and distributed applications, including remote-ship scenarios and the automation of ship operations.

  10. Research on the performance evaluation of agricultural products supply chain integrated operation

    NASA Astrophysics Data System (ADS)

    Jiang, Jiake; Wang, Xifu; Liu, Yang

    2017-04-01

    The integrated operation of the agricultural products supply chain can ensure the quality and efficiency of agricultural products and achieve the optimal goal of low cost and high service. This paper establishes a performance evaluation index system for integrated agricultural products supply chain operation based on the development status of agricultural products and the SCOR, BSC and KPI models. We then construct a comprehensive evaluation model combining rough set theory and a BP neural network with the aid of the Rosetta and MATLAB tools; the case study concerns the development of the integrated agricultural products supply chain in the Jing-Jin-Ji region. Finally, we obtain the corresponding performance results and give some improvement measures and management recommendations to managers.
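
    The BP (back-propagation) scoring step can be sketched with a small multilayer perceptron; the indicator names, weights, and data below are synthetic assumptions standing in for the paper's Rosetta/MATLAB pipeline.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(42)
      # Synthetic indicators after a rough-set style reduction, e.g. order
      # fulfilment rate, freshness loss, logistics cost index (all in [0, 1]).
      X = rng.uniform(0, 1, (200, 3))
      # Assumed ground-truth performance score: weighted indicators plus noise.
      y = 0.5 * X[:, 0] + 0.3 * (1 - X[:, 1]) + 0.2 * (1 - X[:, 2]) + rng.normal(0, 0.02, 200)

      model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)
      print(f"R^2 on training data: {model.score(X, y):.3f}")
      print(f"score for a strong performer: {model.predict([[0.9, 0.1, 0.2]])[0]:.3f}")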

  11. Mine Winder Drives in Integrated Copper Complex

    NASA Astrophysics Data System (ADS)

    Dey, Pranab Kumar

    2018-04-01

    This paper describes the various features that need to be evaluated before selecting mine winder drives. In such a project, the selection of proper equipment is necessary at the initial design stage of planning, and the paper discusses how the electrical system design considers all aspects needed to protect the grid from unwarranted influence of the connected loads and to minimize the generation of harmonics due to the adopted network configurations, keeping them within the limits stipulated by the supply authorities. The design should cover all aspects of providing quality power, with an effective braking system as required by the mining statute for operational safety. It also emphasizes the requirement of quality maintenance.

  12. Globalizing Lessons Learned from Regional-scale Observatories

    NASA Astrophysics Data System (ADS)

    Glenn, S. M.

    2016-02-01

    The Mid Atlantic Regional Association Coastal Ocean Observing System (MARACOOS) has accumulated a decade of experience designing, building and operating a Regional Coastal Ocean Observing System for the U.S. Integrated Ocean Observing System (IOOS). MARACOOS serves societal goals and supports scientific discovery at the scale of a Large Marine Ecosystem (LME). Societal themes include maritime safety, ecosystem decision support, coastal inundation, water quality and offshore energy. Scientific results that feed back on societal goals with better products include improved understanding of seasonal transport pathways and their impact on phytoplankton blooms and hypoxia, seasonal evolution of the subsurface Mid Atlantic Cold Pool and its impact on fisheries, biogeochemical transformations in coastal plumes, coastal ocean evolution and impact on hurricane intensities, and storm sediment transport pathways. As the global ocean observing requirements grow to support additional societal needs for information on fisheries and aquaculture, ocean acidification and deoxygenation, water quality and offshore development, global observing will necessarily evolve to include more coastal observations and forecast models at the scale of the world's many LMEs. Here we describe our efforts to share lessons learned between the observatory operators at the regional scale of the LMEs. Current collaborators are spread across Europe, and also include Korea, Indonesia, Australia, Brazil and South Africa. Specific examples include the development of a world standard QA/QC approach for HF Radar data that will foster the sharing of data between countries, basin-scale underwater glider missions between internationally-distributed glider ports to develop a shared understanding of operations and an ongoing evaluation of the global ocean models in which the regional models for the LME will be nested, and joint training programs to develop the distributed teams of scientists and technicians required to support the global network. Globalization includes the development of international networks to coordinate activities, such as the Global HF Radar network supported by GEO, the global Everyone's Glider Organization supported by WMO and IOC, and the need for professional training supported by MTS.

  13. Streamflow and water-quality conditions, Wilsons Creek and James River, Springfield area, Missouri

    USGS Publications Warehouse

    Berkas, Wayne R.

    1982-01-01

    A network of water-quality-monitoring stations was established upstream and downstream from the Southwest Wastewater-Treatment Plant on Wilsons Creek to monitor the effects of sewage effluent on water quality. Data indicate that 82 percent of the time the flow in Wilsons Creek upstream from the wastewater-treatment plant is less than the effluent discharged from the plant. On October 15, 1977, an advanced wastewater-treatment facility was put into operation. Of the four water-quality indicators measured at the monitoring stations (specific conductance, dissolved oxygen, pH, and water temperature), only dissolved oxygen showed improvement downstream from the plant. During urban runoff, the specific conductance momentarily increased and dissolved-oxygen concentration momentarily decreased in Wilsons Creek upstream from the plant. Urban runoff was found to have no long-term effects on specific conductance and dissolved oxygen downstream from the plant before or after the addition of the advanced wastewater-treatment facility. Data collected monthly from the James River showed that the dissolved-oxygen concentrations and the total nitrite plus nitrate nitrogen concentrations increased, whereas the dissolved-manganese concentrations decreased after the advanced wastewater-treatment facility became operational.

  14. SUNRISE: A SpaceFibre Router

    NASA Astrophysics Data System (ADS)

    Parkes, Steve; McClements, Chris; McLaren, David; Florit, Albert Ferrer; Gonzalez Villafranca, Alberto

    2016-08-01

    SpaceFibre is a new generation of SpaceWire technology which is able to support the very high data-rates required by sensors like SAR and multi-spectral imagers. Data rates of between 1 and 16 Gbits/s are required to support several sensors currently being planned. In addition, a mass-memory unit requires high performance networking to interconnect many memory modules. SpaceFibre runs over both electrical and fibre-optic media and adds quality of service and fault detection, isolation and recovery technology to the network. SpaceFibre is compatible with the widely used SpaceWire protocol at the network level, allowing existing SpaceWire devices to be readily incorporated into a SpaceFibre network. SpaceFibre provides 2 to 5 Gbits/s links (2.5 to 6.25 Gbits/s data signalling rate) which can be operated in parallel (multi-laning) to give higher data rates. STAR-Dundee with the University of Dundee has designed and tested several SpaceFibre interface devices. The SUNRISE project is a UK Space Agency, Centre for Earth Observation and Space Technology (CEOI-ST) project in which STAR-Dundee and the University of Dundee will design and prototype critical SpaceFibre router technology necessary for future on-board data-handling systems. This will lay a vital foundation for future very high data-rate sensor and telecommunications systems. This paper gives a brief introduction to SpaceFibre, explains the operation of a SpaceFibre network, and then describes the SUNRISE SpaceFibre Router. The initial results of the SUNRISE project are described.
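
    The quoted link rates permit a small worked example. A sketch of the implied arithmetic follows: the stated pairs (2.0/2.5 and 5.0/6.25 Gbit/s) imply a 0.8 data-to-signalling ratio, consistent with an 8b/10b-style line code, and multi-laning scales the usable rate linearly. This is our reading of the figures in the abstract, not material from the SpaceFibre specification.

```python
def spacefibre_data_rate(signalling_gbps: float, lanes: int = 1) -> float:
    """Usable data rate in Gbit/s implied by the 0.8 ratio quoted above,
    scaled linearly across parallel lanes (multi-laning)."""
    return 0.8 * signalling_gbps * lanes

# Example: four 6.25 Gbit/s lanes -> 20 Gbit/s of user data.
print(spacefibre_data_rate(6.25, lanes=4))  # 20.0
```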

  15. Sub-Network Access Control Technology Demonstrator: Software Design of the Network Management System

    DTIC Science & Technology

    2002-08-01

    Canadian Operational Fleet. Requirements The proposed network management solution must provide the normal monitoring and configuration mechanisms generally...Joint Warrior Interoperability Demonstrations (JWID) and the Communication System Network Interoperability (CSNI) Navy Network Trials. In short...management functional area normally includes two main functions: fault isolation and diagnosis, and restoration of the system. In short, an operator

  16. Heterogeneous sharpness for cross-spectral face recognition

    NASA Astrophysics Data System (ADS)

    Cao, Zhicheng; Schmid, Natalia A.

    2017-05-01

    Matching images acquired in different electromagnetic bands remains a challenging problem. An example of this type of comparison is matching active or passive infrared (IR) images against a gallery of visible face images, known as cross-spectral face recognition. Among many unsolved issues is that of the quality disparity of the heterogeneous images. Images acquired in different spectral bands are of unequal image quality due to distinct imaging mechanisms, standoff distances, and imaging environments. To reduce the effect of quality disparity on recognition performance, one can manipulate images to either improve the quality of poor-quality images or degrade the high-quality images to the quality level of their heterogeneous counterparts. To estimate the level of discrepancy in the quality of two heterogeneous images, a quality metric such as image sharpness is needed; it provides guidance on how much quality improvement or degradation is appropriate. In this work we consider sharpness as a relative measure of heterogeneous image quality. We propose a generalized definition of sharpness by first achieving image quality parity and then finding and building a relationship between the image quality of two heterogeneous images; the new sharpness metric is therefore named heterogeneous sharpness. Image quality parity is achieved by experimentally finding the optimal cross-spectral face recognition performance where the quality of the heterogeneous images is varied using a Gaussian smoothing function with different standard deviations. The relationship is established using two models, one involving regression and the other a neural network. To train, test and validate the models, we use composite operators developed in our lab to extract features from heterogeneous face images and use the sharpness metric to evaluate face image quality within each band. Images from three spectral bands (visible light, near infrared, and short-wave infrared) are considered in this work. Both the error of the regression model and the validation error of the neural network are analyzed.
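
    The quality-parity step can be illustrated with a minimal sketch: degrade the higher-quality image with Gaussian smoothing until a simple sharpness measure matches that of its heterogeneous counterpart. The gradient-based sharpness proxy and the sigma search grid below are our illustrative choices, not the authors' heterogeneous-sharpness metric.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sharpness(img: np.ndarray) -> float:
    """Mean gradient magnitude as a crude sharpness proxy for a greyscale image."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(np.hypot(gx, gy)))

def matching_sigma(high_q: np.ndarray, low_q: np.ndarray) -> float:
    """Find the Gaussian standard deviation that degrades the higher-quality
    image to the sharpness level of its lower-quality counterpart."""
    target = sharpness(low_q)
    sigmas = np.linspace(0.0, 5.0, 51)  # illustrative search grid
    return float(min(sigmas,
                     key=lambda s: abs(sharpness(gaussian_filter(high_q, s)) - target)))
```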

  17. Instrumental measurement of odour nuisance in city agglomeration using electronic nose

    NASA Astrophysics Data System (ADS)

    Szulczyński, Bartosz; Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek

    2018-01-01

    The paper describes the operating principle of an odour nuisance monitoring network in a city agglomeration. Moreover, it presents the results of an investigation of ambient air quality with respect to odour, obtained over a six-month period. The investigation was carried out using a network comprised of six electronic nose prototypes, with Nasal Ranger field olfactometers employed as a reference method. The monitoring network consisted of two measurement stations located in the vicinity of a crude oil processing plant and four stations located near the main emitters of volatile odorous compounds, such as a sewage treatment plant, a municipal landfill, and a phosphatic fertilizer production plant. The electronic nose prototype was equipped with a set of six semiconductor sensors by FIGARO Co. and one PID-type sensor. The field olfactometers were utilized to determine the mean concentration of odorants and to calibrate the electronic nose prototypes in order to ensure their proper operation. Mean monthly values of odour concentration depended on the measurement site and on meteorological parameters; they were within the 0-6.0 ou/m3 range. The investigations demonstrated the potential of the electronic nose as a tool for monitoring odour nuisance.

  18. Predicting subcontractor performance using web-based Evolutionary Fuzzy Neural Networks.

    PubMed

    Ko, Chien-Ho

    2013-01-01

    Subcontractor performance directly affects project success. The use of inappropriate subcontractors may result in individual work delays, cost overruns, and quality defects throughout the project. This study develops web-based Evolutionary Fuzzy Neural Networks (EFNNs) to predict subcontractor performance. EFNNs are a fusion of Genetic Algorithms (GAs), Fuzzy Logic (FL), and Neural Networks (NNs). FL is primarily used to mimic high-level decision-making processes and deal with uncertainty in the construction industry. NNs are used to identify the association between previous performance and future status when predicting subcontractor performance. GAs optimize the parameters required by FL and NNs. EFNNs encode FL and NNs using floating-point numbers to shorten the length of a string. A multi-cut-point crossover operator is used to explore the parameter space and retain solution legality. Finally, the applicability of the proposed EFNNs is validated using real subcontractors. The EFNNs are evolved using 22 historical patterns and tested using 12 unseen cases. Application results show that the proposed EFNNs surpass FL and NNs in predicting subcontractor performance. The proposed approach improves prediction accuracy and reduces the effort required to predict subcontractor performance, providing field operators with web-based remote access to a reliable, scientific prediction mechanism.
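
    A multi-cut-point crossover on floating-point chromosomes can be sketched as follows, in the spirit of the operator described above; the segment-swap details and the legality repair (clamping genes to a valid range) are our assumptions, not the paper's exact formulation.

```python
import random

def multi_cut_crossover(parent_a, parent_b, n_cuts=3, bounds=(0.0, 1.0)):
    """Exchange alternating segments between two equal-length floating-point
    chromosomes at n_cuts random cut points, then clamp genes to stay legal."""
    assert len(parent_a) == len(parent_b) and n_cuts < len(parent_a)
    cuts = sorted(random.sample(range(1, len(parent_a)), n_cuts)) + [len(parent_a)]
    child_a, child_b = list(parent_a), list(parent_b)
    swap, prev = False, 0
    for cut in cuts:
        if swap:  # swap every second segment between the two children
            child_a[prev:cut], child_b[prev:cut] = child_b[prev:cut], child_a[prev:cut]
        swap, prev = not swap, cut
    lo, hi = bounds
    clamp = lambda chromo: [min(max(g, lo), hi) for g in chromo]
    return clamp(child_a), clamp(child_b)
```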

  19. Availability and End-to-end Reliability in Low Duty Cycle Multihop Wireless Sensor Networks.

    PubMed

    Suhonen, Jukka; Hämäläinen, Timo D; Hännikäinen, Marko

    2009-01-01

    A wireless sensor network (WSN) is an ad-hoc technology that may consist of even thousands of nodes, which necessitates autonomic, self-organizing and multihop operation. A typical WSN node is battery powered, which makes network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation; however, this alone is not enough. WSNs are deployed for different uses, each requiring an acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, legacy QoS metrics are not directly applicable. We give a new definition for measuring and implementing QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling on achieving availability and reliability. The results are obtained by simulations with ZigBee and the proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource-constrained WSNs that guarantees end-to-end reliability while adding only a small overhead proportional to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER.
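
    Why the retransmission overhead scales with PER can be seen with generic link arithmetic; the following is a back-of-the-envelope model, not the TUTWSN forwarding algorithm itself.

```python
def expected_transmissions(per: float, max_attempts: int = 10) -> float:
    """Expected hop-level transmissions per packet when each attempt fails
    independently with probability `per` (attempt count capped)."""
    return sum(per ** k for k in range(max_attempts))

def end_to_end_success(per: float, hops: int, max_attempts: int) -> float:
    """Probability a packet crosses `hops` links given per-hop retries."""
    return (1.0 - per ** max_attempts) ** hops

# Example: at the 30% PER limit quoted above, the per-hop cost is only
# about 1.43 transmissions, i.e. overhead grows gently with PER.
print(round(expected_transmissions(0.30), 2))  # 1.43
```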

  20. Predicting Subcontractor Performance Using Web-Based Evolutionary Fuzzy Neural Networks

    PubMed Central

    2013-01-01

    Subcontractor performance directly affects project success. The use of inappropriate subcontractors may result in individual work delays, cost overruns, and quality defects throughout the project. This study develops web-based Evolutionary Fuzzy Neural Networks (EFNNs) to predict subcontractor performance. EFNNs are a fusion of Genetic Algorithms (GAs), Fuzzy Logic (FL), and Neural Networks (NNs). FL is primarily used to mimic high-level decision-making processes and deal with uncertainty in the construction industry. NNs are used to identify the association between previous performance and future status when predicting subcontractor performance. GAs optimize the parameters required by FL and NNs. EFNNs encode FL and NNs using floating-point numbers to shorten the length of a string. A multi-cut-point crossover operator is used to explore the parameter space and retain solution legality. Finally, the applicability of the proposed EFNNs is validated using real subcontractors. The EFNNs are evolved using 22 historical patterns and tested using 12 unseen cases. Application results show that the proposed EFNNs surpass FL and NNs in predicting subcontractor performance. The proposed approach improves prediction accuracy and reduces the effort required to predict subcontractor performance, providing field operators with web-based remote access to a reliable, scientific prediction mechanism. PMID:23864830

  1. From Many to Many More: Instant Interoperability Through the Integrated Ocean Observing System Data Assembly Center

    NASA Astrophysics Data System (ADS)

    Burnett, W.; Bouchard, R.; Hervey, R.; Crout, R.; Luke, R.

    2008-12-01

    As the Integrated Ocean Observing System (IOOS) Data Assembly Center (DAC), NOAA's National Data Buoy Center (NDBC) collects data from many ocean observing systems, quality controls the data, and distributes them nationally and internationally. The DAC capabilities provide instant interoperability of any ocean observatory with the national and international agencies responsible for critical forecasts and warnings and with the national media. This interoperability is an important milestone in an observing system's designation as an operational system. Data collection begins with NDBC's own observing systems - Meteorological and Oceanographic Buoys and Coastal Stations, the Tropical Atmosphere Ocean Array, and the NOAA tsunameter network. Leveraging the data management functions that support NDBC systems, the DAC can support data partners including ocean observations from IOOS Regional Observing Systems, meteorological observations from the National Water Level Observing Network, meteorological and oceanographic observations from the National Estuarine Research Reserve System and the Integrated Coral Observing Network, merchant ship observations from the Voluntary Observing Ship program, and ocean current measurements from oil and gas platforms in the Gulf of Mexico and from coastal HF Radars. The DAC monitors and quality controls IOOS Partner data, alerting the data provider to outages and quality discrepancies. After performing automated and manual quality control procedures, the DAC prepares the observations for distribution. The primary means of data distribution is in standard World Meteorological Organization alphanumeric coded messages distributed via the Global Telecommunications System, NOAAPort, and Family of Services. Observing systems provide their data via ftp to an NDBC server using a simple XML format. The DAC also posts data in real time to the NDBC webpages in columnar text format and data plots that maritime interests (e.g., surfing, fishing, boating) widely use. The webpage text feeds the Dial-A-Buoy capability, which reads the latest data from the webpages and the latest NWS forecast for the station to a user via telephone. The DAC also operates a DODS/OPeNDAP server to provide data in netCDF. Recently the DAC implemented the NOAA IOOS Data Integration Framework, which facilitates the exchange of data between IOOS Regional Observing Systems by standardizing data exchange formats and incorporating the metadata needed for correct application of the data. The DAC has become an OceanSITES Global Data Assembly Center - part of the Initial Global Observing System for Climate. Supported by the NOAA IOOS Program, the DAC provides round-the-clock monitoring, quality control, and data distribution to ensure that its IOOS Partners can conduct operations that meet the NOAA definition of: sustained, systematic, reliable, and robust mission activities with an institutional commitment to deliver appropriate, cost-effective products and services.

  2. Research on the effects of wind power grid to the distribution network of Henan province

    NASA Astrophysics Data System (ADS)

    Liu, Yunfeng; Zhang, Jian

    2018-04-01

    As traditional energy sources are depleted, regions across the nation are implementing policies to develop new energy for electricity generation, encouraged by favorable national policy. Wind power is pollution-free and renewable, among other advantages, and has become the most popular of the new energy generation technologies. The development of wind power in Henan province started relatively late, but it is proceeding quickly and has broad prospects. Wind power is characterized by volatility and randomness, so its connection to the grid affects the stability and power quality of the distribution network, and some areas have experienced wind curtailment. Studying the grid integration of wind power and identifying improvement measures is therefore urgent. Energy storage, which can shift energy in time and space, can stabilize the operation of the power grid and improve power quality.

  3. Parameter space for the collective laser coupling in the laser fusion driver based on the concept of fiber amplification network.

    PubMed

    Huang, Zhihua; Lin, Honghuan; Xu, Dangpeng; Li, Mingzhong; Wang, Jianjun; Deng, Ying; Zhang, Rui; Zhang, Yongliang; Tian, Xiaocheng; Wei, Xiaofeng

    2013-07-15

    Collective laser coupling of the fiber array in an inertial confinement fusion (ICF) laser driver based on the concept of a fiber amplification network (FAN) is investigated. The feasible parameter space is given for laser coupling of the fundamental, second and third harmonic waves, neglecting the influence of frequency conversion on beam quality under the assumption of beam quality factor conservation. Third harmonic laser coupling is preferred due to its lower output energy requirement from a single fiber amplifier. For a coplanar fiber array, the energy requirement is around 0.4 J with an effective mode field diameter of around 500 μm while maintaining fundamental mode operation, which is more than one order of magnitude beyond what can be achieved with state-of-the-art technology. Novel waveguide structures need to be developed to enlarge the fundamental mode size while mitigating the catastrophic self-focusing effect.

  4. Dispatching packets on a global combining network of a parallel computer

    DOEpatents

    Almasi, Gheorghe [Ardsley, NY; Archer, Charles J [Rochester, MN

    2011-07-19

    Methods, apparatus, and products are disclosed for dispatching packets on a global combining network of a parallel computer comprising a plurality of nodes connected for data communications using the network capable of performing collective operations and point-to-point operations that include: receiving, by an origin system messaging module on an origin node from an origin application messaging module on the origin node, a storage identifier and an operation identifier, the storage identifier specifying storage containing an application message for transmission to a target node, and the operation identifier specifying a message passing operation; packetizing, by the origin system messaging module, the application message into network packets for transmission to the target node, each network packet specifying the operation identifier and an operation type for the message passing operation specified by the operation identifier; and transmitting, by the origin system messaging module, the network packets to the target node.
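
    The dispatch flow in the abstract can be rendered schematically as follows; the field names and fixed MTU are invented for illustration, and this sketch shows only the idea of tagging every packet with the operation identifier and type, not the patented implementation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class NetworkPacket:
    operation_id: int    # which message passing operation the packet belongs to
    operation_type: int  # e.g. collective vs. point-to-point (assumed encoding)
    payload: bytes

def packetize(message: bytes, operation_id: int, operation_type: int,
              mtu: int = 256) -> List[NetworkPacket]:
    """Split an application message into packets that each carry the
    operation identifier, as the origin system messaging module is
    described as doing before transmitting to the target node."""
    return [NetworkPacket(operation_id, operation_type, message[i:i + mtu])
            for i in range(0, len(message), mtu)]
```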

  5. Mobile Videoconferencing Apps for Telemedicine

    PubMed Central

    Liu, Wei-Li; Locatis, Craig; Ackerman, Michael

    2016-01-01

    Introduction: The quality and performance of several videoconferencing applications (apps) tested on iOS (Apple, Cupertino, CA) and Android™ (Google, Mountain View, CA) mobile platforms using Wi-Fi (802.11), third-generation (3G), and fourth-generation (4G) cellular networks are described. Materials and Methods: The tests were done to determine how well apps perform compared with videoconferencing software installed on computers or with more traditional videoconferencing using dedicated hardware. The rationale for app assessment and the testing methodology are described. Results: Findings are discussed in relation to operating system platform (iOS or Android) for which the apps were designed and the type of network (Wi-Fi, 3G, or 4G) used. The platform, network, and apps interact, and it is impossible to discuss videoconferencing experienced on mobile devices in relation to one of these factors without referencing the others. Conclusions: Apps for mobile devices can vary significantly from other videoconferencing software or hardware. App performance increased over the testing period due to improvements in network infrastructure and how apps manage bandwidth. PMID:26204322

  6. Mobile Videoconferencing Apps for Telemedicine.

    PubMed

    Zhang, Kai; Liu, Wei-Li; Locatis, Craig; Ackerman, Michael

    2016-01-01

    The quality and performance of several videoconferencing applications (apps) tested on iOS (Apple, Cupertino, CA) and Android (Google, Mountain View, CA) mobile platforms using Wi-Fi (802.11), third-generation (3G), and fourth-generation (4G) cellular networks are described. The tests were done to determine how well apps perform compared with videoconferencing software installed on computers or with more traditional videoconferencing using dedicated hardware. The rationale for app assessment and the testing methodology are described. Findings are discussed in relation to operating system platform (iOS or Android) for which the apps were designed and the type of network (Wi-Fi, 3G, or 4G) used. The platform, network, and apps interact, and it is impossible to discuss videoconferencing experienced on mobile devices in relation to one of these factors without referencing the others. Apps for mobile devices can vary significantly from other videoconferencing software or hardware. App performance increased over the testing period due to improvements in network infrastructure and how apps manage bandwidth.

  7. Providing the full DDF link protection for bus-connected SIEPON based system architecture

    NASA Astrophysics Data System (ADS)

    Hwang, I.-Shyan; Pakpahan, Andrew Fernando; Liem, Andrew Tanny; Nikoukar, AliAkbar

    2016-09-01

    A massive amount of traffic is now delivered through EPON systems, one of the prominent access network technologies for delivering the next generation network. It is therefore vital to keep the EPON optical distribution network (ODN) working by providing the necessary protection mechanisms in the deployed devices; otherwise, failures will cause great losses for both network operators and business customers. In this paper, we propose a bus-connected architecture to protect against and recover from distribution drop fiber (DDF) link faults or transceiver failures at ONU(s) in a SIEPON system. The proposed architecture is cost-effective and delivers high fault tolerance in handling multiple DDF faults, while also providing flexibility in choosing the backup ONU assignments. Simulation results show that the proposed architecture provides reliability and maintains quality of service (QoS) performance in terms of mean packet delay, system throughput, packet loss and EF jitter when DDF link failures occur.

  8. Citizen Science Seismic Stations for Monitoring Regional and Local Events

    NASA Astrophysics Data System (ADS)

    Zucca, J. J.; Myers, S.; Srikrishna, D.

    2016-12-01

    The earth has tens of thousands of seismometers installed on its surface or in boreholes that are operated by many organizations for many purposes including the study of earthquakes, volcanos, and nuclear explosions. Although global networks such as the Global Seismic Network and the International Monitoring System do an excellent job of monitoring nuclear test explosions and other seismic events, their thresholds could be lowered with the addition of more stations. In recent years there has been interest in citizen-science approaches to augment government-sponsored monitoring networks (see, for example, Stubbs and Drell, 2013). A modestly-priced seismic station that could be purchased by citizen scientists could enhance regional and local coverage of the GSN, IMS, and other networks if those stations are of high enough quality and distributed optimally. In this paper we present a minimum set of hardware and software specifications that a citizen seismograph station would need in order to add value to global networks. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  9. ATM over hybrid fiber-coaxial cable networks: practical issues in deploying residential ATM services

    NASA Astrophysics Data System (ADS)

    Laubach, Mark

    1996-11-01

    Residential broadband access network technology based on asynchronous transfer mode (ATM) will soon reach commercial availability. The capabilities provided by ATM access networks promise integrated services bandwidth in excess of that provided by traditional twisted-pair copper wire public telephone networks. ATM to the side of the home places quality of service capability closest to the subscriber, allowing immediate support for Internet services and traditional voice telephony. Other services, such as desktop video teleconferencing and enhanced server-based application support, can be added as part of the future evolution of the network. Additionally, advanced subscriber home networks can be supported easily. This paper presents an updated summary of the standardization efforts for the ATM over HFC definition work currently taking place in the ATM Forum's residential broadband working group and the standards progress in the IEEE 802.14 cable TV media access control and physical protocol working group. This update is fundamental for establishing the foundation for delivering ATM-based integrated services via a cable TV network. An economic model for deploying multi-tiered services is presented, showing that a single-tier service is insufficient for a viable cable operator business. Finally, an ATM-based system lends itself well to various deployment scenarios of synchronous optical networks (SONET).

  10. Wireless in-situ Sensor Network for Agriculture and Water Monitoring on a River Basin Scale in Southern Finland: Evaluation from a Data User’s Perspective

    PubMed Central

    Kotamäki, Niina; Thessler, Sirpa; Koskiaho, Jari; Hannukkala, Asko O.; Huitu, Hanna; Huttula, Timo; Havento, Jukka; Järvenpää, Markku

    2009-01-01

    Sensor networks are increasingly being implemented for environmental monitoring and agriculture to provide spatially accurate and continuous environmental information and (near) real-time applications. These networks provide a large amount of data, which poses challenges for ensuring data quality and extracting relevant information. In the present paper we describe a river basin scale wireless sensor network for agriculture and water monitoring. The network, called SoilWeather, is unique and the first of this type in Finland. The performance of the network is assessed from the user and maintainer perspectives, concentrating on data quality, network maintenance and applications. The results showed that the SoilWeather network has been functioning in a relatively reliable way, but also that maintenance and data quality assurance by automatic algorithms and calibration samples require a lot of effort, especially in continuous water monitoring over large areas. We see great benefit in sensor networks enabling continuous, real-time monitoring, while the data quality control and maintenance efforts highlight the need for tight collaboration between sensor and sensor network owners to decrease costs and increase the quality of the sensor data in large scale applications. PMID:22574050

  11. International VLBI Service for Geodesy and Astrometry. Delivering high-quality products and embarking on observations of the next generation

    NASA Astrophysics Data System (ADS)

    Nothnagel, A.; Artz, T.; Behrend, D.; Malkin, Z.

    2017-07-01

    The International VLBI Service for Geodesy and Astrometry (IVS) regularly produces high-quality Earth orientation parameters from observing sessions employing extensive networks or individual baselines. The master schedule is designed according to the telescope days committed by the stations and by the need for dense sampling of the Earth orientation parameters (EOP). In the pre-2011 era, the network constellations with their number of telescopes participating were limited by the playback and baseline capabilities of the hardware (Mark4) correlators. This limitation was overcome by the advent of software correlators, which can now accommodate many more playback units in a flexible configuration. In this paper, we describe the current operations of the IVS with special emphasis on the quality of the polar motion results since these are the only EOP components which can be validated against independent benchmarks. The polar motion results provided by the IVS have improved continuously over the years, now providing an agreement with IGS results at the level of 20-25 μas in a WRMS sense. At the end of the paper, an outlook is given for the realization of the VLBI Global Observing System.

  12. Solar Energy Meteorological Research and Training Site: Region 5. Annual report, 30 September 1977-29 September 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, C.R.N.; Hewson, E.W.

    The primary facility which is to be a benchmark site for the acquisition of research quality solar radiation and solar energy related meteorological data has been set up and will be fully operational in the near future. The training program has been established with the introduction of two, two-quarter courses on solar radiation and meteorological measurements and on atmospheric radiative processes. Also, as part of the training program, a week-long workshop on solar energy measurement and instrumentation was conducted during the summer of '78 and a series of seminars on solar energy related topics, catering to both professionals and non-professionals, was arranged during the 1977-78 academic year. A meeting of solar radiation scientists from the five states of the region was held in Corvallis (August '78) to explore the feasibility of setting up a regional network of stations to acquire research quality solar radiation and meteorological data. Useful global irradiance measurements have been made at the five sites, making up the general quality network in Oregon, over the greater part of the year.

  13. Data Quality Control: Challenges, Methods, and Solutions from an Eco-Hydrologic Instrumentation Network

    NASA Astrophysics Data System (ADS)

    Eiriksson, D.; Jones, A. S.; Horsburgh, J. S.; Cox, C.; Dastrup, D.

    2017-12-01

    Over the past few decades, advances in electronic dataloggers and in situ sensor technology have revolutionized our ability to monitor air, soil, and water to address questions in the environmental sciences. The increased spatial and temporal resolution of in situ data is alluring. However, an often overlooked aspect of these advances is the challenge data managers and technicians face in performing quality control on millions of data points collected every year. While there is general agreement that high quantities of data offer little value unless the data are of high quality, it is commonly understood that despite efforts toward quality assurance, environmental data collection occasionally goes wrong. After identifying erroneous data, data managers and technicians must determine whether to flag, delete, leave unaltered, or retroactively correct suspect data. While individual instrumentation networks often develop their own QA/QC procedures, there is a scarcity of consensus and literature regarding specific solutions and methods for correcting data. This may be because back-correction efforts are time-consuming, so suspect data are often simply abandoned. Correction techniques are also rarely reported in the literature, likely because corrections are performed by technicians rather than by the researchers who write the scientific papers. Details of correction procedures are often glossed over as a minor component of data collection and processing. To help address this disconnect, we present case studies of quality control challenges, solutions, and lessons learned from a large-scale, multi-watershed environmental observatory in Northern Utah that monitors Gradients Along Mountain to Urban Transitions (GAMUT). The GAMUT network consists of over 40 individual climate, water quality, and storm drain monitoring stations that have collected more than 200 million unique data points in four years of operation. In all of our examples, we emphasize that scientists should remain skeptical and seek independent verification of sensor data, even for sensors purchased from trusted manufacturers.
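
    A minimal example of the kind of automated range and spike checks such networks run before deciding whether to flag, delete, or correct data is sketched below; the thresholds and flag codes are hypothetical, not GAMUT's actual procedures.

```python
import numpy as np

def qc_flags(values: np.ndarray, valid_min: float, valid_max: float,
             max_step: float) -> np.ndarray:
    """Return one flag per observation: 0 = pass, 1 = out of range,
    2 = spike (jump from the previous point exceeds max_step)."""
    flags = np.zeros(values.shape, dtype=int)
    flags[(values < valid_min) | (values > valid_max)] = 1
    steps = np.abs(np.diff(values, prepend=values[:1]))
    flags[(flags == 0) & (steps > max_step)] = 2
    return flags

# Example: a water temperature series with one spike. Note that a naive
# step check flags both the jump up and the jump back down.
print(qc_flags(np.array([12.1, 12.3, 12.2, 18.0, 12.4]), 0.0, 35.0, 3.0))
# -> [0 0 0 2 2]
```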

  14. Optimal design and operation of booster chlorination stations layout in water distribution systems.

    PubMed

    Ohar, Ziv; Ostfeld, Avi

    2014-07-01

    This study describes a new methodology for the disinfection booster design, placement, and operation problem in water distribution systems. Disinfectant residuals, which are in most cases chlorine residuals, are assumed to be sufficient to prevent growth of pathogenic bacteria, yet low enough to avoid taste and odor problems. Commonly, large quantities of disinfectants are released at the source outlets to preserve minimum residual disinfectant concentrations throughout the network. Such an approach can cause taste and odor problems near the disinfectant injection locations but, more importantly, hazardous excess disinfection by-product (DBP) formation at the far ends of the network, some of which may be carcinogenic. To cope with these deficiencies, booster chlorination stations have been suggested for placement within the distribution system itself and not just at the sources, motivating considerable research in recent years on the placement, design, and operation of booster chlorination stations in water distribution systems. The model formulated and solved herein sets the required chlorination dose of the boosters for delivering water at acceptable residual chlorine and TTHM concentrations while minimizing the overall cost of booster placement, construction, and operation under extended period hydraulic simulation conditions, utilizing a multi-species approach. The developed methodology links a genetic algorithm with EPANET-MSX, and is demonstrated through base runs and sensitivity analyses on an example network application. Two approaches are suggested for dealing with water quality initial conditions and species periodicity: (1) repetitive cyclical simulation (RCS), and (2) cyclical constrained species (CCS). RCS was found to be more robust but requires longer computational time.
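
    The RCS idea can be sketched under stated assumptions: `simulate_day` stands in for one 24 h EPANET-MSX water-quality run and returns end-of-day species concentrations, which are fed back as the next day's initial conditions until the profile becomes periodic. The function, tolerance, and loop structure are illustrative, not the authors' code.

```python
def repetitive_cyclical_simulation(simulate_day, init_conc, tol=1e-4, max_cycles=50):
    """Iterate a daily water-quality simulation until the species
    concentrations reach a periodic steady state (RCS sketch)."""
    state = list(init_conc)
    for _ in range(max_cycles):
        new_state = simulate_day(state)  # hypothetical 24 h simulation wrapper
        if max(abs(a - b) for a, b in zip(new_state, state)) < tol:
            return new_state  # day-to-day change below tolerance: periodic
        state = list(new_state)
    raise RuntimeError("no periodic steady state within max_cycles")
```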

  15. Ultra-low power high precision magnetotelluric receiver array based customized computer and wireless sensor network

    NASA Astrophysics Data System (ADS)

    Chen, R.; Xi, X.; Zhao, X.; He, L.; Yao, H.; Shen, R.

    2016-12-01

    Dense 3D magnetotelluric (MT) data acquisition has the benefit of suppressing static shift and topography effects and can achieve high-precision, high-resolution inversion of underground structure. This method may play an important role in mineral exploration, geothermal resources exploration, and hydrocarbon exploration. For large-scale 3D MT data acquisition, it is necessary to greatly reduce the power consumption of the MT signal receiver while using a sensor network to monitor the data quality of deployed MT receivers. We adopted a series of technologies to realize this goal. First, we designed a low-power embedded computer that couples tightly with the other parts of the MT receiver and supports a wireless sensor network; its power consumption is less than 1 watt. Then we designed a 4-channel data acquisition subsystem that supports 24-bit analog-to-digital conversion, GPS synchronization, and real-time digital signal processing. Furthermore, we developed the power supply and power management subsystem for the MT receiver. Finally, we developed software supporting data acquisition, calibration, the wireless sensor network, and testing. The software running on a personal computer can monitor and control over 100 MT receivers in the field for data acquisition and quality control. The total power consumption of the receiver is about 2 watts at full operation; the standby power consumption is less than 0.1 watt. Our testing showed that the MT receiver can acquire good quality data at ground with an electric dipole length of 3 m. Over 100 MT receivers were made and used for large-scale geothermal exploration in China with great success.

  16. Application of ion-sensitive sensors in water quality monitoring.

    PubMed

    Winkler, S; Rieger, L; Saracevic, E; Pressl, A; Gruber, G

    2004-01-01

    In recent years a trend towards in-situ monitoring has been observed: most new sensors for water quality monitoring are designed for direct installation in the medium, are compact in size, and use measurement principles which minimise maintenance demand. Ion-sensitive sensors (ion-sensitive electrodes, ISE) are based on a well known measurement principle, and recently some manufacturers have released probe types which are specially adapted for application in water quality monitoring. The functional principle of ISE sensors, their advantages, limitations and the different methods for sensor calibration are described. Experiences with ISE sensors from applications in sewer networks, at different sampling points within wastewater treatment plants and for surface water monitoring are reported. An estimate of investment and operation costs in comparison to other sensor types is given.

  17. The relationship between context, structure, and processes with outcomes of 6 regional diabetes networks in Europe.

    PubMed

    Mahdavi, Mahdi; Vissers, Jan; Elkhuizen, Sylvia; van Dijk, Mattees; Vanhala, Antero; Karampli, Eleftheria; Faubel, Raquel; Forte, Paul; Coroian, Elena; van de Klundert, Joris

    2018-01-01

    While health service provisioning for the chronic condition Type 2 Diabetes (T2D) often involves a network of organisations and professionals, most evidence on the relationships between the structures and processes of service provisioning and the outcomes considers single organisations or solo practitioners. Extending Donabedian's Structure-Process-Outcome (SPO) model, we investigate how differences in quality of life, effective coverage of diabetes, and service satisfaction are associated with differences in the structures, processes, and context of T2D services in six regions in Finland, Germany, Greece, the Netherlands, Spain, and the UK. Data collection consisted of: (a) systematic modelling of provider networks' structures and processes, and (b) a cross-sectional survey of patient-reported outcomes and other information. The survey resulted in data from 1459 T2D patients, during 2011-2012. Stepwise linear regression models were used to identify the cumulative proportion of variance in quality of life and service satisfaction explained by differences in context, structure, and process. The selected context, structure, and process variables are based on Donabedian's SPO model, a service quality research instrument (SERVQUAL), and previous organisation- and professional-level evidence. Additional analysis explores the possible bidirectional relation between outcomes and processes. The regression models explain 44% of the variance in service satisfaction, mostly through structure and process variables (such as human resource use and the SERVQUAL dimensions). The models explained 23% of the variance in quality of life between the networks, much of which is related to contextual variables. Our results suggest that the effectiveness of A1c control is negatively correlated with process variables such as total hours of care provided per year and cost of services per year. While the selected structure and process variables explain much of the variance in service satisfaction, this is less the case for quality of life. Moreover, it appears that the effect of the clinical outcome A1c control on processes is stronger than the other way around, as poorer control seems to relate to more service use and higher cost. The standardized operational models used in this research provide a basis for expanding the network-level evidence base for effective T2D service provisioning.

  18. Unmanned Aerial Systems, Moored Balloons, and the U.S. Department of Energy ARM Facilities in Alaska

    NASA Astrophysics Data System (ADS)

    Ivey, Mark; Verlinde, Johannes

    2014-05-01

    The U.S. Department of Energy (DOE), through its scientific user facility, the Atmospheric Radiation Measurement (ARM) Climate Research Facility, provides scientific infrastructure and data to the international Arctic research community via its research sites located on the North Slope of Alaska. Facilities and infrastructure to support operations of unmanned aerial systems for science missions in the Arctic and North Slope of Alaska were established at Oliktok Point Alaska in 2013. Tethered instrumented balloons will be used in the near future to make measurements of clouds in the boundary layer including mixed-phase clouds. The DOE ARM Program has operated an atmospheric measurement facility in Barrow, Alaska, since 1998. Major upgrades to this facility, including scanning radars, were added in 2010. Arctic Observing Networks are essential to meet growing policy, social, commercial, and scientific needs. Calibrated, high-quality arctic geophysical datasets that span ten years or longer are especially important for climate studies, climate model initializations and validations, and for related climate policy activities. For example, atmospheric data and derived atmospheric forcing estimates are critical for sea-ice simulations. International requirements for well-coordinated, long-term, and sustained Arctic Observing Networks and easily-accessible data sets collected by those networks have been recognized by many high-level workshops and reports (Arctic Council Meetings and workshops, National Research Council reports, NSF workshops and others). The recent Sustaining Arctic Observation Network (SAON) initiative sponsored a series of workshops to "develop a set of recommendations on how to achieve long-term Arctic-wide observing activities that provide free, open, and timely access to high-quality data that will realize pan-Arctic and global value-added services and provide societal benefits." This poster will present information on opportunities for members of the arctic research community to make atmospheric measurements using unmanned aerial systems or tethered balloons.

  19. Computer network defense system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urias, Vincent; Stout, William M. S.; Loverro, Caleb

    A method and apparatus for protecting virtual machines. A computer system creates a copy of a group of the virtual machines in an operating network in a deception network to form a group of cloned virtual machines in the deception network when the group of the virtual machines is accessed by an adversary. The computer system creates an emulation of components from the operating network in the deception network. The components are accessible by the group of the cloned virtual machines as if the group of the cloned virtual machines was in the operating network. The computer system moves network connections for the group of the virtual machines in the operating network used by the adversary from the group of the virtual machines in the operating network to the group of the cloned virtual machines, enabling protection of the group of the virtual machines from actions performed by the adversary.

  20. User Access Management Based on Network Pricing for Social Network Applications

    PubMed Central

    Ma, Xingmin; Gu, Qing

    2018-01-01

    Social applications play a very important role in people's lives, as users communicate with each other through social networks on a daily basis. This presents a challenge: how does one receive high-quality service from social networks at a low cost? Users can access different kinds of wireless networks from various locations. This paper proposes a user access management strategy based on network pricing such that networks can increase their income and improve service quality. Firstly, network price is treated as an optimization parameter for access selection, and an unascertained membership algorithm is used to make pricing decisions. Secondly, network price is adjusted dynamically in real time according to network load. Finally, network selection is managed and controlled through market-economy principles. Simulation results show that the proposed scheme can effectively balance network load, reduce network congestion, better meet users' quality of service (QoS) requirements, and increase the networks' income. PMID:29495252
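
    A toy illustration of load-driven price adjustment follows; the linear rule and its constants are our assumptions and stand in for the paper's unascertained-membership pricing decisions.

```python
def adjust_price(base_price: float, load: float,
                 target_load: float = 0.6, k: float = 1.5) -> float:
    """Raise the access price when utilisation exceeds the target and
    lower it when capacity sits idle, steering new users toward
    under-used networks (illustrative linear rule)."""
    return max(0.0, base_price * (1.0 + k * (load - target_load)))

# Example: a congested network (90% load) prices itself above an idle one (20%).
print(adjust_price(10.0, 0.9), adjust_price(10.0, 0.2))  # 14.5 4.0
```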

  1. Automated Video Quality Assessment for Deep-Sea Video

    NASA Astrophysics Data System (ADS)

    Pirenne, B.; Hoeberechts, M.; Kalmbach, A.; Sadhu, T.; Branzan Albu, A.; Glotin, H.; Jeffries, M. A.; Bui, A. O. V.

    2015-12-01

    Video provides a rich source of data for geophysical analysis, often supplying detailed information about the environment when other instruments may not. This is especially true of deep-sea environments, where direct visual observations cannot be made. As computer vision techniques improve and volumes of video data increase, automated video analysis is emerging as a practical alternative to labor-intensive manual analysis. Automated techniques can be much more sensitive to video quality than their manual counterparts, so performing quality assessment before doing full analysis is critical to producing valid results. Ocean Networks Canada (ONC), an initiative of the University of Victoria, operates cabled ocean observatories that supply continuous power and Internet connectivity to a broad suite of subsea instruments from the coast to the deep sea, including video and still cameras. This network of ocean observatories has produced almost 20,000 hours of video (about 38 hours are recorded each day) and an additional 8,000 hours of logs from remotely operated vehicle (ROV) dives. We begin by surveying some ways in which deep-sea video poses challenges for automated analysis, including: 1. Non-uniform lighting: single, directional light sources produce uneven luminance distributions and shadows; remotely operated lighting equipment is also susceptible to technical failures. 2. Particulate noise: turbidity and marine snow are often present in underwater video; particles in the water column can have sharper focus and higher contrast than the objects of interest due to their proximity to the light source and can also influence the camera's autofocus and auto white-balance routines. 3. Color distortion (low contrast): the rate of absorption of light in water varies by wavelength, and is higher overall than in air, altering apparent colors and lowering the contrast of objects at a distance. We also describe measures under development at ONC for detecting and mitigating these effects. These steps include filtering out unusable data, color and luminance balancing, and choosing the most appropriate image descriptors. We apply these techniques to generate automated quality assessment of video data and illustrate their utility with an example application where we perform vision-based substrate classification.
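
    A hedged sketch of a pre-filtering step like the "filtering out unusable data" stage mentioned above: reject greyscale frames with low global contrast or strongly one-sided illumination. The thresholds and the left/right luminance-skew proxy are our illustrative choices, not ONC's pipeline.

```python
import numpy as np

def usable_frame(gray: np.ndarray, min_contrast: float = 0.08,
                 max_luma_skew: float = 0.35) -> bool:
    """gray: 2-D array with values in [0, 1]. Returns False for frames
    flattened by colour absorption or unevenly lit by a single source."""
    if gray.std() < min_contrast:   # low global contrast
        return False
    half = gray.shape[1] // 2       # compare left/right mean luminance
    skew = abs(float(gray[:, :half].mean()) - float(gray[:, half:].mean()))
    return skew <= max_luma_skew
```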

  2. ACTRIS non-methane hydrocarbon intercomparison experiment in Europe to support WMO-GAW and EMEP observation networks

    NASA Astrophysics Data System (ADS)

    Hoerger, C. C.; Werner, A.; Plass-Duelmer, C.; Reimann, S.; Eckart, E.; Steinbrecher, R.; Aalto, J.; Arduini, J.; Bonnaire, N.; Cape, J. N.; Colomb, A.; Connolly, R.; Diskova, J.; Dumitrean, P.; Ehlers, C.; Gros, V.; Hakola, H.; Hill, M.; Hopkins, J. R.; Jäger, J.; Junek, R.; Kajos, M. K.; Klemp, D.; Leuchner, M.; Lewis, A. C.; Locoge, N.; Maione, M.; Martin, D.; Michl, K.; Nemitz, E.; O'Doherty, S.; Pérez Ballesta, P.; Ruuskanen, T. M.; Sauvage, S.; Schmidbauer, N.; Spain, T. G.; Straube, E.; Vana, M.; Vollmer, M. K.; Wegener, R.; Wenger, A.

    2014-10-01

    The performance of 20 European laboratories involved in long-term non-methane hydrocarbon (NMHC) measurements within the framework of the Global Atmosphere Watch (GAW) and the European Monitoring and Evaluation Programme (EMEP) was assessed with respect to the ACTRIS (Aerosols, Clouds, and Trace gases Research InfraStructure Network) and GAW data quality objectives (DQOs). Compared to previous intercomparisons, the DQOs of ACTRIS are much more demanding, with deviations from a reference value of less than 5% and repeatability better than 2% for mole fractions above 0.1 nmol mol-1. The participants were asked to measure both a 30-component NMHC mixture in nitrogen (NMHC_N2) at approximately 1 nmol mol-1 and whole air (NMHC_air), following a standardised operating procedure including zero- and calibration-gas measurements. Furthermore, they had to report details on their instruments and were asked to assess measurement uncertainties. The NMHCs were analysed either by gas chromatography-flame ionisation detection or by gas chromatography-mass spectrometry. Most systems performed well for the NMHC_N2 measurements (88% of the reported values were within the GAW DQOs and 58% even within the ACTRIS DQOs). For NMHC_air, generally more frequent and larger deviations from the assigned values were observed than for NMHC_N2 (77% of the reported values were within the GAW DQOs, but only 48% within the ACTRIS DQOs). Important contributors to the poorer performance for NMHC_air compared to NMHC_N2 were a more complex matrix and a larger span of NMHC mole fractions (0.03-2.5 nmol mol-1). Issues affecting both NMHC mixtures were the use of direct vs. two-step calibration, breakthrough of C2-C3 hydrocarbons, blank values in zero-gas measurements (especially for systems using a Nafion® dryer), adsorptive losses of aromatic compounds, and insufficient chromatographic resolution. Essential for high-quality results are experienced operators, comprehensive quality assurance and quality control, well characterised systems, and sufficient manpower to operate the systems and evaluate the data.

  3. Updates from the AmeriFlux Management Project Tech Team

    NASA Astrophysics Data System (ADS)

    Biraud, S.; Chan, S.; Dengel, S.; Polonik, P.; Hanson, C. V.; Billesbach, D. P.; Torn, M. S.

    2017-12-01

    The goal of AmeriFlux is to develop a network of long-term flux sites for quantifying and understanding the role of the terrestrial biosphere in global climate and environmental change. The AmeriFlux Management Project (AMP) Tech Team at LBNL strengthens the AmeriFlux Network by (1) standardizing operational practices, (2) developing calibration and maintenance routines, and (3) setting clear data quality goals. In this poster we present results and recent progress in three areas: (1) an IRGA intercomparison experiment in cooperation with UC Davis and the main manufacturers of sensors used in the AmeriFlux network (LI-COR, Picarro, and Campbell Scientific); (2) characterization of Gill sonic anemometers, in collaboration with John Frank and Bill Massman (US Forest Service), following the discovery of a significant firmware problem in a commonly used Gill sonic anemometer; and (3) unmanned aerial systems (UAS) and sensors used systematically at AmeriFlux sites to improve site characterization.

  4. Effects of Social Networks on the Quality of Life in an Elder and Middle-Aged Deaf Community Sample

    ERIC Educational Resources Information Center

    Gerich, Joachim; Fellinger, Johannes

    2012-01-01

    This article endeavors to investigate the role of social networks in contributing to the quality of life of an elder and middle-aged Deaf population. In particular, it poses the question of whether a certain network composition (deaf and hearing network persons) provides positive resources to improve quality of life and attempts to identify…

  5. Data quality assessment for comparative effectiveness research in distributed data networks

    PubMed Central

    Brown, Jeffrey; Kahn, Michael; Toh, Sengwee

    2015-01-01

    Background Electronic health information routinely collected during healthcare delivery and reimbursement can help address the need for evidence about the real-world effectiveness, safety, and quality of medical care. Often, distributed networks that combine information from multiple sources are needed to generate this real-world evidence. Objective We provide a set of field-tested best practices and a set of recommendations for data quality checking for comparative effectiveness research (CER) in distributed data networks. Methods We explore the requirements for data quality checking and describe the data quality approaches undertaken by several existing multi-site networks. Results There are no established standards regarding how to evaluate the quality of electronic health data for CER within distributed networks. Data checks of increasing complexity are often employed, ranging from consistency with syntactic rules to evaluation of semantics and consistency within and across sites. Temporal trends within and across sites are widely used, as are checks of each data refresh or update. Rates of specific events and exposures by age group, sex, and month are also common. Discussion Secondary use of electronic health data for CER holds promise but is complex, especially in distributed data networks that incorporate periodic data refreshes. The viability of a learning health system is dependent on a robust understanding of the quality, validity, and optimal secondary uses of routinely collected electronic health data within distributed health data networks. Robust data quality checking can strengthen confidence in findings based on distributed data networks. PMID:23793049
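
    One concrete instance of the temporal-trend checks discussed above is flagging site-months whose event rate departs from that site's own history; the table schema (site, month, event) and the z-score cut-off below are hypothetical, not a standard from the reviewed networks.

```python
import pandas as pd

def monthly_rate_outliers(df: pd.DataFrame, z: float = 3.0) -> pd.DataFrame:
    """Return site-months whose event rate deviates more than `z` standard
    deviations from that site's historical mean rate."""
    rates = (df.groupby(["site", "month"])["event"]
               .mean().rename("rate").reset_index())
    stats = rates.groupby("site")["rate"].agg(["mean", "std"]).reset_index()
    merged = rates.merge(stats, on="site")
    return merged[(merged["rate"] - merged["mean"]).abs() > z * merged["std"]]
```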

  6. 47 CFR 51.311 - Nondiscriminatory access to unbundled network elements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 3 2011-10-01 2011-10-01 false Nondiscriminatory access to unbundled network... § 51.311 Nondiscriminatory access to unbundled network elements. (a) The quality of an unbundled network element, as well as the quality of the access to the unbundled network element, that an incumbent...

  7. 47 CFR 51.311 - Nondiscriminatory access to unbundled network elements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 3 2010-10-01 2010-10-01 false Nondiscriminatory access to unbundled network... § 51.311 Nondiscriminatory access to unbundled network elements. (a) The quality of an unbundled network element, as well as the quality of the access to the unbundled network element, that an incumbent...

  8. An IEEE 1451.1 Architecture for ISHM Applications

    NASA Technical Reports Server (NTRS)

    Morris, Jon A.; Turowski, Mark; Schmalzel, John L.; Figueroa, Jorge F.

    2007-01-01

    The IEEE 1451.1 Standard for a Smart Transducer Interface defines a common network information model for connecting and managing smart elements in control and data acquisition networks using network-capable application processors (NCAPs). The Standard is a network-neutral design model that is easily ported across operating systems and physical networks for implementing complex acquisition and control applications by simply plugging in the appropriate network-level drivers. To simplify configuration and tracking of transducer and actuator details, the family of 1451 standards defines a Transducer Electronic Data Sheet (TEDS) that is associated with each physical element. The TEDS contains all of the pertinent information about the physical operation of a transducer (such as operating regions, calibration tables, and manufacturer information), which the NCAP uses to configure the system to support a specific transducer. The Integrated Systems Health Management (ISHM) group at NASA's John C. Stennis Space Center (SSC) has been developing an ISHM architecture that utilizes IEEE 1451.1 as the primary configuration and data acquisition mechanism for managing and collecting information from a network of distributed intelligent sensing elements. This work has involved collaboration with other NASA centers, universities and aerospace industries to develop IEEE 1451.1 compliant sensors and interfaces tailored to support health assessment of complex systems. This paper and presentation describe the development and implementation of an interface for the configuration, management and communication of data, information and knowledge generated by a distributed system of IEEE 1451.1 intelligent elements monitoring a rocket engine test system. In this context, an intelligent element is defined as one incorporating support for the IEEE 1451.x standards and additional ISHM functions. Our implementation supports real-time collection of both measurement data (raw ADC counts and converted engineering units) and health statistics produced by each intelligent element. The handling of configuration, calibration and health information is automated by using the TEDS in combination with other electronic data sheet extensions to convey health parameters. By integrating the IEEE 1451.1 Standard for a Smart Transducer Interface with ISHM technologies, each element within a complex system becomes a highly flexible computation engine capable of self-validation and of performing other measures of the quality of the information it is producing.
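
    A minimal, illustrative subset of TEDS-style metadata and the raw-to-engineering conversion an NCAP could perform is sketched below; the field names and the piecewise-linear calibration table are our assumptions, not the IEEE 1451 binary layout.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TransducerTEDS:
    """Hypothetical TEDS-like record: identity, units, calibration pairs."""
    manufacturer: str
    model: str
    units: str
    calibration: List[Tuple[float, float]] = field(default_factory=list)  # (raw, engineering)

def to_engineering(teds: TransducerTEDS, raw: float) -> float:
    """Interpolate a raw ADC count through the calibration table."""
    pts = sorted(teds.calibration)
    for (r0, e0), (r1, e1) in zip(pts, pts[1:]):
        if r0 <= raw <= r1:
            return e0 + (e1 - e0) * (raw - r0) / (r1 - r0)
    raise ValueError("raw value outside calibration range")

# Example: a pressure sensor mapping counts 0..4095 to 0..150 psi.
teds = TransducerTEDS("Acme", "P-100", "psi", [(0.0, 0.0), (4095.0, 150.0)])
print(round(to_engineering(teds, 2048.0), 2))  # ~75.02
```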

  9. Air concentrations of volatile compounds near oil and gas production: a community-based exploratory study.

    PubMed

    Macey, Gregg P; Breech, Ruth; Chernaik, Mark; Cox, Caroline; Larson, Denny; Thomas, Deb; Carpenter, David O

    2014-10-30

    Horizontal drilling, hydraulic fracturing, and other drilling and well stimulation technologies are now used widely in the United States and increasingly in other countries. They enable increases in oil and gas production, but there has been inadequate attention to human health impacts. Air quality near oil and gas operations is an underexplored human health concern for five reasons: (1) prior focus on threats to water quality; (2) an evolving understanding of contributions of certain oil and gas production processes to air quality; (3) limited state air quality monitoring networks; (4) significant variability in air emissions and concentrations; and (5) air quality research that misses impacts important to residents. Preliminary research suggests that volatile compounds, including hazardous air pollutants, are of potential concern. This study differs from prior research in its use of a community-based process to identify sampling locations. Through this approach, we determine concentrations of volatile compounds in air near operations that reflect community concerns and point to the need for more fine-grained and frequent monitoring at points along the production life cycle. Grab and passive air samples were collected by trained volunteers at locations identified through systematic observation of industrial operations and air impacts over the course of resident daily routines. A total of 75 volatile organics were measured using EPA Method TO-15 or TO-3 by gas chromatography/mass spectrometry. Formaldehyde levels were determined using UMEx 100 Passive Samplers. Levels of eight volatile chemicals exceeded federal guidelines under several operational circumstances. Benzene, formaldehyde, and hydrogen sulfide were the most common compounds to exceed acute and other health-based risk levels. Air concentrations of potentially dangerous compounds and chemical mixtures are frequently present near oil and gas production sites. Community-based research can provide an important supplement to state air quality monitoring programs.

  10. Successful integration efforts in water quality from the Integrated Ocean Observing System Regional Associations and the National Water Quality Monitoring Network

    USGS Publications Warehouse

    Ragsdale, R.; Vowinkel, E.; Porter, D.; Hamilton, P.; Morrison, R.; Kohut, J.; Connell, B.; Kelsey, H.; Trowbridge, P.

    2011-01-01

    The Integrated Ocean Observing System (IOOS®) Regional Associations and Interagency Partners hosted a water quality workshop in January 2010 to discuss issues of nutrient enrichment and dissolved oxygen depletion (hypoxia), harmful algal blooms (HABs), and beach water quality. In 2007, the National Water Quality Monitoring Council piloted demonstration projects as part of the National Water Quality Monitoring Network (Network) for U.S. Coastal Waters and their Tributaries in three IOOS Regional Associations, and these projects are ongoing. Examples of integrated science-based solutions to water quality issues of major concern from the IOOS regions and Network demonstration projects are explored in this article. These examples illustrate instances where management decisions have benefited from decision-support tools that make use of interoperable data. Gaps, challenges, and outcomes are identified, and a proposal is made for future work toward a multiregional water quality project for beach water quality.

  11. Water quality success stories: Integrated assessments from the IOOS regional associations and national water quality monitoring network

    USGS Publications Warehouse

    Ragsdale, Rob; Vowinkel, Eric; Porter, Dwayne; Hamilton, Pixie; Morrison, Ru; Kohut, Josh; Connell, Bob; Kelsey, Heath; Trowbridge, Phil

    2011-01-01

    The Integrated Ocean Observing System (IOOS®) Regional Associations and Interagency Partners hosted a water quality workshop in January 2010 to discuss issues of nutrient enrichment and dissolved oxygen depletion (hypoxia), harmful algal blooms (HABs), and beach water quality. In 2007, the National Water Quality Monitoring Council piloted demonstration projects as part of the National Water Quality Monitoring Network (Network) for U.S. Coastal Waters and their Tributaries in three IOOS Regional Associations, and these projects are ongoing. Examples of integrated science-based solutions to water quality issues of major concern from the IOOS regions and Network demonstration projects are explored in this article. These examples illustrate instances where management decisions have benefited from decision-support tools that make use of interoperable data. Gaps, challenges, and outcomes are identified, and a proposal is made for future work toward a multiregional water quality project for beach water quality.

  12. CCSDS Time-Critical Onboard Networking Service

    NASA Technical Reports Server (NTRS)

    Parkes, Steve; Schnurr, Rick; Marquart, Jane; Menke, Greg; Ciccone, Massimiliano

    2006-01-01

    The Consultative Committee for Space Data Systems (CCSDS) is developing recommendations for communication services onboard spacecraft. Today many different communication buses are used on spacecraft, requiring software with the same basic functionality to be rewritten for each type of bus. This impacts the application software, resulting in custom software for almost every new mission. The Spacecraft Onboard Interface Services (SOIS) working group aims to provide a consistent interface to various onboard buses and sub-networks, enabling a common interface to the application software. The eventual goal is reusable software that can be easily ported to new missions and run on a range of onboard buses without substantial modification. The system engineer will then be able to select a bus based on its performance, power, etc., and be confident that a particular choice of bus will not place excessive demands on software development. This paper describes the SOIS Intra-Networking Service, which is designed to enable data transfer and multiplexing of a variety of internetworking protocols with a range of quality of service support, over underlying heterogeneous data links. The Intra-network service interface provides users with a common Quality of Service interface when transporting data across a variety of underlying data links. Supported Quality of Service (QoS) elements include: Priority, Resource Reservation and Retry/Redundancy. These three QoS elements combine and map into four TCONS services for onboard data communications: Best Effort, Assured, Reserved, and Guaranteed. Data to be transported is passed to the Intra-network service with a requested QoS. The requested QoS includes the type of service, priority and, where appropriate, a channel identifier. The data is de-multiplexed, prioritized, and the required resources for transport are allocated. The data is then passed to the appropriate data link for transfer across the bus. The SOIS-supported data links may inherently provide the quality of service support requested by the intra-network layer. In the case where the data link does not have the required level of support, the missing functionality is added by SOIS. As a result of this architecture, re-usable software applications can be designed and used across missions, thereby promoting common mission operations. In addition, the protocol multiplexing function enables the blending of multiple onboard networks. This paper starts by giving an overview of the SOIS architecture in section II, illustrating where the TCONS services fit into the overall architecture. It then describes the quality of service approach adopted in section III. The ongoing prototyping efforts are introduced in section IV. Finally, in section V, the current status of the CCSDS recommendations is summarized.
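    The combination of the three QoS elements into the four TCONS service classes can be sketched as follows. The mapping shown is one plausible reading of the abstract (reservation plus retry yielding Guaranteed, and so on), not the normative SOIS/TCONS definition.

    ```python
    # A minimal sketch of the QoS-element-to-TCONS-service mapping described in
    # the abstract. The exact combination rules are defined by the SOIS/TCONS
    # recommendations; the mapping below is one plausible reading, not the
    # normative one.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class QosRequest:
        priority: int     # orders traffic within a class; not used to pick the class here
        reserved: bool    # resource reservation requested
        retry: bool       # retry/redundancy requested

    def tcons_service(req: QosRequest) -> str:
        if req.reserved and req.retry:
            return "Guaranteed"   # reserved resources plus retransmission
        if req.reserved:
            return "Reserved"     # reserved resources, no retransmission
        if req.retry:
            return "Assured"      # retransmission on shared resources
        return "Best Effort"      # neither reservation nor retransmission

    print(tcons_service(QosRequest(priority=3, reserved=True, retry=True)))  # Guaranteed
    ```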

  13. Record-high specific conductance and water temperature in San Francisco Bay during water year 2015

    USGS Publications Warehouse

    Work, Paul A.; Downing-Kunz, Maureen; Livsey, Daniel N.

    2017-02-22

    The San Francisco estuary is commonly defined to include San Francisco Bay (bay) and the adjacent Sacramento–San Joaquin River Delta (delta). The U.S. Geological Survey (USGS) has operated a high-frequency (15-minute sampling interval) water-quality monitoring network in San Francisco Bay since the late 1980s (Buchanan and others, 2014). This network includes 19 stations at which sustained measurements have been made in the bay; currently, 8 stations are in operation (fig. 1). All eight stations are equipped with specific conductance (which can be related to salinity) and water-temperature sensors. Water quality in the bay constantly changes as ocean tides force seawater in and out of the bay, and river inflows—the most significant coming from the delta—vary on time scales ranging from those associated with storms to multiyear droughts. This monitoring network was designed to observe and characterize some of these changes in the bay across space and over time. The data demonstrate a high degree of variability in both specific conductance and temperature at time scales from tidal to annual and also reveal longer-term changes that are likely to influence overall environmental health in the bay. In water year (WY) 2015 (October 1, 2014, through September 30, 2015), as in the preceding water year (Downing-Kunz and others, 2015), the high-frequency measurements revealed record-high values of specific conductance and water temperature at several stations during a period of reduced freshwater inflow from the delta and other tributaries because of persistent, severe drought conditions in California. This report briefly summarizes observations for WY 2015 and compares them to previous years that had different levels of freshwater inflow.
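    The record-high comparison described above amounts to binning high-frequency observations by water year and taking per-year extremes. A minimal sketch with synthetic data (the USGS water year n runs from October 1 of year n-1 through September 30 of year n):

    ```python
    # A small sketch (synthetic data) of how 15-minute sensor records can be
    # binned into USGS water years (October 1 through September 30) to find
    # record-high values like those reported for WY 2015.
    import pandas as pd

    def water_year(ts: pd.Timestamp) -> int:
        return ts.year + 1 if ts.month >= 10 else ts.year

    idx = pd.date_range("2013-10-01", "2015-09-30 23:45", freq="15min")
    data = pd.DataFrame({"specific_conductance": 30000 + idx.dayofyear * 10}, index=idx)

    annual_max = data.groupby(data.index.map(water_year)).max()
    print(annual_max)  # per-water-year maxima; WY 2015 covers Oct 2014 - Sep 2015
    ```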

  14. Management of Paediatric Testicular Torsion - Are we adhering to Royal College of Surgeons (RCS) recommendations?

    PubMed

    Thakkar, H S; Yardley, I; Kufeji, D

    2018-05-01

    Introduction In 2015, the Royal College of Surgeons of England (RCS) commissioned the East Midlands Clinical Network to develop a set of guidelines for the management of paediatric torsion. Two quality measures identified were the provision of surgery locally where possible and 100% of explorations within three hours. We sought to assess the adherence to these quality measures within our referral network. Materials and methods Retrospective data were collected for all paediatric scrotal explorations performed within our centre between January 2014 and July 2016. Patient demographics, sources of referral, transfer times, time to surgery and operative findings were obtained. Results A total of 100 patients underwent a scrotal exploration. Median age at presentation was 11 years (range 4 months to 15 years). Fifty-three per cent of referrals were from network hospitals. The median duration of symptoms was 25 hours (range 1-210 hours). The median transfer time from local centres was 120 minutes (range 45-540 minutes). The median time to theatre from the decision being made to operate was 60 minutes (range 30-600 minutes). Eighty-seven per cent of cases were explored within three hours. There were 13 cases of torsion with one orchidectomy. When taking into account the transfer time for external patients aged over five years without precluding comorbidities, exploration within three hours dropped to 18 of 46 (39%). Conclusion The RCS guidelines recognise the need for specialist input in very young patients. A large proportion of explorations are, however, currently taking place in older patients with unacceptably long transfer times. We propose an extension of this review nationally to work towards the local provision of care for suitable patients.

  15. MATIN: a random network coding based framework for high quality peer-to-peer live video streaming.

    PubMed

    Barekatain, Behrang; Khezrimotlagh, Dariush; Aizaini Maarof, Mohd; Ghaeini, Hamid Reza; Salleh, Shaharuddin; Quintana, Alfonso Ariza; Akbari, Behzad; Cabrera, Alicia Triviño

    2013-01-01

    In recent years, Random Network Coding (RNC) has emerged as a promising solution for efficient Peer-to-Peer (P2P) video multicasting over the Internet. This is largely because RNC noticeably increases the error resiliency and throughput of the network. However, the high transmission overhead arising from sending a large coefficients vector as a header has been the most important challenge of RNC. Moreover, because the Gauss-Jordan elimination method is employed, considerable computational complexity can be imposed on peers when decoding the encoded blocks and checking linear dependency among the coefficients vectors. In order to address these challenges, this study introduces MATIN, a random network coding based framework for efficient P2P video streaming. MATIN includes a novel coefficients matrix generation method that guarantees there is no linear dependency in the generated coefficients matrix. Using the proposed framework, each peer encapsulates one coefficients entry instead of n into the generated encoded packet, which results in very low transmission overhead. It is also possible to obtain the inverted coefficients matrix using a small number of simple arithmetic operations; peers therefore sustain very low computational complexity. As a result, MATIN permits random network coding to be more efficient in P2P video streaming systems. The results obtained from simulation using OMNET++ show that it substantially outperforms RNC based on the Gauss-Jordan elimination method by providing better video quality on peers in terms of four important performance metrics: video distortion, dependency distortion, end-to-end delay and initial startup delay.
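    For orientation, the sketch below illustrates conventional random network coding over GF(2) with Gauss-Jordan decoding, which is the baseline scheme MATIN improves on. MATIN's own contribution (a coefficients matrix that is invertible by construction and a one-entry header) is not reproduced here; block sizes and the retry-until-invertible loop are illustrative.

    ```python
    # Minimal illustration of conventional random network coding over GF(2):
    # encode n source blocks with a random coefficient matrix, then decode by
    # Gauss-Jordan elimination. This is the baseline the paper improves on.
    import numpy as np

    rng = np.random.default_rng(7)
    n, block_len = 4, 8
    source = rng.integers(0, 2, size=(n, block_len), dtype=np.uint8)

    # Keep drawing random GF(2) coefficient matrices until one is invertible,
    # mirroring the linear-dependency problem the abstract mentions.
    while True:
        C = rng.integers(0, 2, size=(n, n), dtype=np.uint8)
        if round(np.linalg.det(C.astype(float))) % 2 == 1:  # odd det = invertible mod 2
            break
    encoded = (C @ source) % 2  # each row is one encoded block (header: a row of C)

    # Gauss-Jordan elimination on [C | encoded] over GF(2) recovers the sources.
    aug = np.concatenate([C, encoded], axis=1).astype(np.uint8)
    for col in range(n):
        pivot = next(r for r in range(col, n) if aug[r, col])
        aug[[col, pivot]] = aug[[pivot, col]]   # swap pivot row into place
        for r in range(n):
            if r != col and aug[r, col]:
                aug[r] ^= aug[col]              # XOR = addition in GF(2)
    assert np.array_equal(aug[:, n:], source)
    print("decoded OK")
    ```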

  16. Overview of the new National Near-Road Air Quality Monitoring Network

    EPA Science Inventory

    In 2010, EPA promulgated new National Ambient Air Quality Standards (NAAQS) for nitrogen dioxide (NO2). As part of this new NAAQS, EPA required the establishment of a national near-road air quality monitoring network. This network will consist of one NO2 near-road monitoring st...

  17. An Internet Protocol-Based Software System for Real-Time, Closed-Loop, Multi-Spacecraft Mission Simulation Applications

    NASA Technical Reports Server (NTRS)

    Burns, Richard D.; Davis, George; Cary, Everett; Higinbotham, John; Hogie, Keith

    2003-01-01

    A mission simulation prototype for Distributed Space Systems has been constructed using existing developmental hardware and software testbeds at NASA's Goddard Space Flight Center. A locally distributed ensemble of testbeds, connected through the local area network, operates in real time and demonstrates the potential to assess the impact of subsystem-level modifications on system-level performance and, ultimately, on the quality and quantity of the end-product science data.

  18. Allergy medical care network: a new model of care for specialties.

    PubMed

    Ferré-Ybarz, L; Salinas Argente, R; Nevot Falcó, S; Gómez Galán, C; Franquesa Rabat, J; Trapé Pujol, J; Oliveras Alsina, P; Pons Serra, M; Corbella Virós, X

    2015-01-01

    In 2005, the Althaia Foundation Allergy Department carried out its daily activity in the Hospital Sant Joan de Déu of Manresa. Given the increasing demand for allergy care, the department's performance was analysed and a strategic plan (SP) for 2005-2010 was designed. The main objective of the study was to assess the impact of the application of the SP on the department's operations and organisation in terms of profitability, productivity and quality of care. This was a descriptive, retrospective study evaluating the operation of the allergy department. The baseline situation was analysed and the SP was designed. Indicators were set to perform a comparative analysis after application of the SP. The indicators showed an increase in medical care activity (first visits, 34%; successive visits, 29%; day hospital treatments, 51%), high resolution rates, and reduced waiting lists. Economic analysis indicated an increase in direct costs justified by increased activity and territory attended. Cost optimisation was explained by improved patient accessibility, minimised absenteeism in the workplace and improved cost per visit. After application of the SP, a networking system was established for the allergy speciality that has expanded the territory for which it provides care, increased total activity and resolution capacity, optimised human resources, improved quality of care and streamlined medical costs. Copyright © 2013 SEICAP. Published by Elsevier España. All rights reserved.

  19. Non-invasive system for monitoring of the manufacturing equipment

    NASA Astrophysics Data System (ADS)

    Mazăre, A. G.; Belu, N.; Ionescu, L. M.; Rachieru, N.; Misztal, A.

    2017-08-01

    The automotive industry is one of the most important industries in the world, affecting both the economy and world culture. High demand has increased the pressure on production lines. Consequently, more careful monitoring of production equipment is required, not only for maintenance but also for staff safety and for increasing production quality. In this paper, we propose a solution for non-invasive monitoring of industrial equipment operation by measuring the current consumption on the energy supply lines. From these measurements, the utilization schedule and operating mode of the equipment are determined, and an activity report for the equipment is built and made available to the quality management and maintenance teams. The solution consists of current-measuring equipment with energy-harvesting capabilities and a radio transceiver, together with an embedded system running a server. The current-measuring equipment transmits consumption data for each energy supply line where industrial equipment is placed, forming an internal radio measurement network. The embedded system collects the data for each piece of equipment, stores it in a local database, and provides it via an intranet application. The entire system requires no supplementary energy supply and no intervention in the factory infrastructure. It was tested in a company in the automotive industry.
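    The paper does not spell out its detection algorithm, but the core idea of inferring an equipment utilization schedule from supply-line current can be sketched as a simple threshold test. The threshold and sampling rate below are illustrative assumptions, not the authors' parameters.

    ```python
    # Illustrative sketch (not the paper's algorithm): infer on/off utilization
    # intervals of a machine from current samples on its supply line by
    # thresholding. Threshold and sample period are assumed values.
    from typing import List, Tuple

    def utilization_intervals(current_a: List[float],
                              sample_period_s: float = 1.0,
                              on_threshold_a: float = 0.5) -> List[Tuple[float, float]]:
        """Return (start_s, end_s) intervals during which the machine drew current."""
        intervals, start = [], None
        for i, amps in enumerate(current_a):
            if amps >= on_threshold_a and start is None:
                start = i * sample_period_s            # machine switched on
            elif amps < on_threshold_a and start is not None:
                intervals.append((start, i * sample_period_s))
                start = None                           # machine switched off
        if start is not None:
            intervals.append((start, len(current_a) * sample_period_s))
        return intervals

    samples = [0.1, 0.1, 2.3, 2.4, 2.2, 0.1, 0.1, 3.0, 2.9]
    print(utilization_intervals(samples))  # [(2.0, 5.0), (7.0, 9.0)]
    ```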

  20. Development of a networked four-million-pixel pathological and radiological digital image presentation system and its application to medical conferences

    NASA Astrophysics Data System (ADS)

    Sakano, Toshikazu; Furukawa, Isao; Okumura, Akira; Yamaguchi, Takahiro; Fujii, Tetsuro; Ono, Sadayasu; Suzuki, Junji; Matsuya, Shoji; Ishihara, Teruo

    2001-08-01

    The widespread adoption of digital technology in the medical field has led to demand for a high-quality, high-speed, and user-friendly digital image presentation system for daily medical conferences. To fulfill this demand, we developed a presentation system for radiological and pathological images. It is composed of a super-high-definition (SHD) imaging system, a radiological image database (R-DB), a pathological image database (P-DB), and the network interconnecting these three. The R-DB consists of a 270GB RAID, a database server workstation, and a film digitizer. The P-DB includes an optical microscope, a four-million-pixel digital camera, a 90GB RAID, and a database server workstation. A 100Mbps Ethernet LAN interconnects all the sub-systems. Web-based system operation software was developed for easy operation. We installed the whole system in NTT East Kanto Hospital to evaluate it in weekly case conferences. The SHD system could display digital full-color images of 2048 x 2048 pixels on a 28-inch CRT monitor. The doctors evaluated the image quality and size and found them applicable to actual medical diagnosis. They also appreciated the short image-switching time, which contributed to smooth presentation. Thus, we confirmed that the system's characteristics met the requirements.

  1. Scoring sensor observations to facilitate the exchange of space surveillance data

    NASA Astrophysics Data System (ADS)

    Weigel, M.; Fiedler, H.; Schildknecht, T.

    2017-08-01

    In this paper, a scoring metric for space surveillance sensor observations is introduced. A scoring metric allows for a direct comparison of data quantity and data quality, and makes transparent the effort made by different sensor operators. The concept might be applied to various sensor types, such as tracking and surveillance radar, active optical laser tracking, or passive optical telescopes, as well as combinations of different measurement types. For each measurement type, a polynomial least-squares fit is performed on the measurement values contained in the track. The track score is the average of the polynomial coefficient uncertainties, scaled by a reference measurement accuracy. Based on the newly developed scoring metric, an accounting model and a rating model are introduced. Both models facilitate the exchange of observation data within a network of space surveillance sensor operators. In this paper, optical observations are taken as an example for analysis purposes, but both models can also be utilized for any other type of observations. The rating model has the capability to distinguish between network participants with major and minor data contributions to the network. The level of sanction on data reception is defined by the participants themselves, enabling high flexibility. The more elaborate accounting model translates the track score into credit points earned for data provision and spent for data reception. In this model, data reception is automatically limited for participants with low contributions to the network. The introduced method for observation scoring is first applied to transparent data exchange within the Small Aperture Robotic Telescope Network (SMARTnet). To this end, a detailed mathematical description is presented for line-of-sight measurements from optical telescopes, as well as numerical simulations for different network setups.
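    A minimal sketch of the scoring idea as described: fit a polynomial to one measurement type in a track, take the one-sigma uncertainties of the fitted coefficients from the fit covariance, average them, and scale by a reference accuracy. The polynomial order, noise level and reference accuracy below are illustrative assumptions, not the paper's values.

    ```python
    # Sketch of the described track score, under stated assumptions: the score
    # is the mean 1-sigma coefficient uncertainty of a polynomial least-squares
    # fit, scaled by a reference measurement accuracy.
    import numpy as np

    def track_score(t, values, order=2, reference_accuracy=1.0):
        coeffs, cov = np.polyfit(t, values, deg=order, cov=True)
        sigmas = np.sqrt(np.diag(cov))   # 1-sigma uncertainty per coefficient
        return float(np.mean(sigmas)) / reference_accuracy

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 60.0, 30)                        # 30 measurements over 60 s
    ra = 0.01 * t**2 + 0.5 * t + rng.normal(0, 0.3, 30)   # noisy angle-like track
    print(track_score(t, ra, order=2, reference_accuracy=0.3))
    ```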

  2. Dynamic Online Bandwidth Adjustment Scheme Based on Kalai-Smorodinsky Bargaining Solution

    NASA Astrophysics Data System (ADS)

    Kim, Sungwook

    Virtual Private Network (VPN) is a cost-effective method to provide integrated multimedia services. Usually, heterogeneous multimedia data can be categorized into different types according to the required Quality of Service (QoS); therefore, a VPN should support prioritization among different services. In order to support multiple types of services with different QoS requirements, efficient bandwidth management algorithms are an important issue. In this paper, I employ the Kalai-Smorodinsky Bargaining Solution (KSBS) to develop an adaptive bandwidth adjustment algorithm. In addition, to effectively manage the bandwidth in VPNs, the proposed control paradigm is realized in a dynamic online approach, which is practical for real network operations. The simulations show that the proposed scheme can significantly improve system performance.
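    As background on the bargaining solution named above, the toy sketch below computes a KSBS bandwidth split for the simple case of linear utilities, a disagreement point of zero, and each class's ideal utility equal to its demand: every class then receives the same fraction of its ideal. This is not the paper's full dynamic online scheme, and the class names and numbers are illustrative.

    ```python
    # Toy Kalai-Smorodinsky split (not the paper's full algorithm): with
    # disagreement point 0 and ideal utility = demand, the KSBS gives every
    # class the same fraction k of its demand, limited by link capacity.
    def ksbs_allocation(demands, capacity):
        k = min(1.0, capacity / sum(demands))  # equal relative satisfaction
        return [k * d for d in demands]

    demands = {"voice": 2.0, "video": 10.0, "data": 8.0}   # Mbps requested
    alloc = ksbs_allocation(list(demands.values()), capacity=10.0)
    for name, a in zip(demands, alloc):
        print(f"{name}: {a:.2f} Mbps")   # each class gets 50% of its demand
    ```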

  3. Security of Quantum Repeater Network Operation

    DTIC Science & Technology

    2016-10-03

    Final report AFRL-AFOSR-JP-TR-2016-0079 (Distribution A), Rodney Van Meter, Keio University; reporting period 29 May 2014 to 28 May 2016; report date 10/03/2016. Only abstract fragments survive extraction: "…readily in quantum networks than in classical networks. Our presentation at the SENT workshop attracted the attention of computer and network researchers…"

  4. Impact of weak social ties and networks on poor sleep quality: A case study of Iranian employees.

    PubMed

    Masoudnia, Ebrahim

    2015-12-01

    Poor sleep quality is one of the major risk factors for somatic, psychiatric and social disorders, as well as a major predictor of the quality of employees' performance. Previous studies in Iran had neglected the impact of social factors, including social networks and ties, on adults' sleep quality. Thus, the aim of the current research was to determine the relationship between social networks and adult employees' sleep quality. This study was conducted with a correlational and descriptive design. Data were collected from 360 participants (183 males and 177 females) who were employed in Yazd public organizations in June and July of 2014. The sample was selected by random sampling. The measuring tools were the Pittsburgh Sleep Quality Index (PSQI) and the Social Relations Inventory (SRI). Based on the results, the prevalence rate of sleep disorder among Iranian adult employees was 63.1% (total PSQI>5). After controlling for socio-demographic variables, there was a significant difference between individuals with strong and poor social networks and ties in terms of overall sleep quality (p<.01), subjective sleep quality (p<.01), habitual sleep efficiency (p<.05), and daytime dysfunction (p<.01). The results also revealed that employees with strong social networks and ties had better overall sleep quality, higher habitual sleep efficiency, and less daytime dysfunction than employees with poor social networks and ties. This implies that weak social networks and ties serve as a risk factor for sleep disorders and poor sleep quality among adult employees. Therefore, social and behavioral interventions seem essential to improve adults' sleep quality. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Network operating system

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Long-term and short-term objectives for the development of a network operating system for the Space Station are stated. The short-term objective is to develop a prototype network operating system for a 100 megabit/second fiber optic data bus. The long-term objective is to establish guidelines for writing a detailed specification for a Space Station network operating system. Major milestones are noted. Information is given in outline form.

  6. Quality assurance in transnational higher education: a case study of the tropEd network

    PubMed Central

    2013-01-01

    Introduction Transnational or cross-border higher education has expanded rapidly since the 1980s. With that expansion, issues of quality assurance came to the forefront. This article aims to identify key issues regarding quality assurance of transnational higher education and discusses the quality assurance of the tropEd Network for International Health in Higher Education in relation to these key issues. Methods Literature review and review of documents. Results From the literature, the following key issues regarding transnational quality assurance were identified and explored: comparability of quality assurance frameworks, true collaboration versus erosion of national education sovereignty, accreditation agencies, and transparency. The tropEd network developed a transnational quality assurance framework for the network. The network accredits modules through a rigorous process that has been accepted by major stakeholders. This was a participatory learning process that at the same time had a positive effect on relations between the institutions. Discussion The development of the quality assurance framework and the process itself provide a potential example for others. PMID:23537108

  7. Further development of the EUMETNET Composite Observing System (EUCOS)

    NASA Astrophysics Data System (ADS)

    Klink, S.; Dibbern, J.

    2009-09-01

    EUCOS, which stands for EUMETNET Composite Observing System, is a EUMETNET programme whose main objective is the central management of surface-based operational observations on a European-wide scale, serving the needs of regional-scale NWP. EUMETNET is a consortium of currently 26 national meteorological services in Europe that provides a framework for different operational and developmental co-operative programmes between the services. The work content of the EUCOS Programme includes the management of the operational observing networks through the E-AMDAR, E-ASAP, E-SURFMAR and E-WINPROF programmes. The coordination of the NMSs' own territorial networks (e.g. radiosonde stations and synoptic stations), data quality monitoring, fault reporting and recovery, a studies programme for the evolution of the observing networks, and liaison with other organisations such as WMO are among the tasks of the programme. The current period of the EUCOS programme has a five-year duration (2007-2011), and a two-stage approach was proposed in the programme definition. During the transition phase 2007-2008, no new programmatic objectives were set because, among other reasons, the Space-Terrestrial (S-T) study, which investigated the relative contributions of selected space-based and ground-based observing systems to the forecast skill of global and regional NWP models, had to be finalised first. Based on the findings of this study, EUCOS is currently preparing a redesign of its upper-air network. The original EUCOS upper-air network design was prepared in 2000 in order to define a set of stations serving the common general NWP requirement. Additional considerations were to make it possible to supply a common set of performance standards across the territory of EUMETNET Members and to ensure that the radiosonde network interleaved with AMDAR airports. The EUCOS upper-air network now requires a redesign for several reasons. There is a need to take into account the significant evolution of the AMDAR network. Member states were not able to install the proposed EUCOS radiosonde network design with 4 ascents per day at most of the sites. The results from the S-T study are available, with recommendations for the network design. Data assimilation in NWP models has improved significantly, with an advanced capability to make use of high-time-resolution data. The guidelines for the redesign of the EUCOS upper-air network will be derived from a study that is currently organised by EUCOS and conducted by ECMWF and several national meteorological services, which contribute by running OSEs for different observation network setups with their model suites. The S-T study has shown that, despite all the additional new satellite observations, degrading the current terrestrial observing system to a basic (GUAN+GSN) network would have a significant negative impact on forecast skill. The expected result from the envisaged OSEs is to find an optimum setting of upper-air measurements in space and time which maintains forecast skill. Throughout the second phase of the programme (2009-2011), the revised EUCOS design will be implemented. In the field of observation targeting, EUCOS supported the PREVIEW Data Targeting System (DTS) project. The main goal of this project was to develop and assess the feasibility of operational adaptive control of the observing system. The DTS project was led by the Met Office and co-funded by EUCOS and the European Commission (within the PREVIEW project).
The main software, an interactive web-based tool, was developed by ECMWF and ran on their computer system during the trial phase, which lasted from February until December 2008. During the trial, the focus was on improving short-range (1-3 day) forecasts of potentially high-impact and/or high-uncertainty weather events in Europe. Forecasters from all EUMETNET members had the chance to submit sensitive-area prediction requests on a daily basis. The DTS then displayed the sensitive areas calculated by ECMWF, Météo-France and the Met Office, and the lead user (an experienced forecaster) could use the system to issue requests for additional, unscheduled observations. The trial showed that a data targeting system can be routinely used. Targeted observations were successfully deployed from E-ASAP units, by the E-AMDAR programme, and in 21 countries; 88% of the additionally requested radiosondes from land stations were launched. Furthermore, the DTS was used to support research field campaigns such as THORPEX-IPY, THORPEX-PARC and MEDEX. During the envisaged MEDEX Phase 2 campaign in autumn 2009, the DTS will be used as an operational tool to aid research. Further tasks for EUCOS will be the proposal and implementation of a new E-programme responsible for running a central data hub and centralised monitoring, the setting of new objectives for the programme components E-ASAP, E-AMDAR, E-SURFMAR and E-WINPROF, and an extension of quality monitoring activities. An example of new programme objectives is the introduction of a humidity sensor on commercial aircraft within the E-AMDAR programme.

  8. National High Frequency Radar Network (hfrnet) and Pacific Research Efforts

    NASA Astrophysics Data System (ADS)

    Hazard, L.; Terrill, E. J.; Cook, T.; de Paolo, T.; Otero, M. P.; Rogowski, P.; Schramek, T. A.

    2016-12-01

    The U.S. High Frequency Radar Network (HFRNet) has been in operation for over ten years with representation from 31 organizations spanning academic institutions, state and local government agencies, and private organizations. HFRNet currently holds a collection from over 130 radar installations totaling over 10 million records of surface ocean velocity measurements. HFRNet is a primary example of inter-agency and inter-institutional partnerships for improving oceanographic research and operations. HF radar derived surface currents have been used in several societal applications including coastal search and rescue, oil spill response, water quality monitoring and marine navigation. Central to the operational success of the large scale network is an efficient data management, storage, access, and delivery system. The networking of surface current mapping systems is characterized by a tiered structure that extends from the individual field installations to local regional operations maintaining multiple sites and on to centralized locations aggregating data from all regions. The data system development effort focuses on building robust data communications from remote field locations (sites) for ingestion into the data system via data on-ramps (Portals or Site Aggregators) to centralized data repositories (Nodes). Centralized surface current data enables the aggregation of national surface current grids and allows for ingestion into displays, management tools, and models. The Coastal Observing Research and Development Center has been involved in international relationships and research in the Philippines, Palau, and Vietnam. CORDC extends this IT architecture of surface current mapping data systems leveraging existing developments and furthering standardization of data services for seamless integration of higher level applications. Collaborations include the Philippine Atmospheric Geophysical and Astronomical Services Administration (PAGASA), The Coral Reef Research Foundation (CRRF), and the Center for Oceanography/Vietnamese Administration of Seas and Islands (CFO/VASI). These collaborations and data sharing improve our abilities to respond to regional, national, and global environmental and management issues.

  9. Assessment of satellite communications quality study. Addendum 1: Impact of propagation delay on data transmission

    NASA Technical Reports Server (NTRS)

    Campanella, S. J.; Chitre, D. M.

    1988-01-01

    The single factor that irrevocably distinguishes geostationary satellite telephony transmission from terrestrial transmission is the greater propagation delay over satellite links. This difference has always provoked vigorous debate over the impact of delay on subscribers using services incorporating satellite links. The issue is addressed from a variety of directions, including human factors studies, laboratory subjective tests that evaluate delay with and without echo, and field tests that obtain data on the opinion of subscribers regarding the quality of service of operational circuits in both national U.S. domestic and international trans-Atlantic networks. The tests involved the use of both echo suppressors and echo cancellers.
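    The magnitude of the delay at issue follows directly from geometry. A worked sketch, assuming a geostationary altitude of 35,786 km and the best-case path directly beneath the satellite:

    ```python
    # Worked numbers behind the delay issue: one-way and round-trip propagation
    # delay over a geostationary link, assuming the sub-satellite (best case)
    # path of 35,786 km per hop and signal speed c.
    C_KM_S = 299_792.458          # speed of light, km/s
    GEO_ALT_KM = 35_786.0         # geostationary altitude above the equator

    one_way = 2 * GEO_ALT_KM / C_KM_S       # up to the satellite and back down
    round_trip = 2 * one_way                # talker -> listener -> talker
    print(f"one-way:    {one_way * 1000:.0f} ms")    # ~239 ms
    print(f"round-trip: {round_trip * 1000:.0f} ms") # ~477 ms
    ```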

  10. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal

    PubMed Central

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M. Juliana; Hural, John

    2014-01-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control, and sample quality data is uploaded directly to the database by the central laboratory. Four-year cumulative data covering 23,477 blood draws reveal an average fresh PBMC yield of 1.45×10⁶ (±0.48) cells per milliliter of usable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8–3.2×10⁶ cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day-2 thawed viability of 83.1% and recovery of 67.5%. Since then, four-year cumulative data covering 3,338 specimens used in immunologic assays show that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean, 91.46% ± 4.5%), and 96.2% had acceptable recoveries (50%–130%), with a mean recovery of 85.8% ± 19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides for laboratory assessment, training, and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trial networks. PMID:24709391

  11. Development of quality control and instrumentation performance metrics for diffuse optical spectroscopic imaging instruments in the multi-center clinical environment

    NASA Astrophysics Data System (ADS)

    Keene, Samuel T.; Cerussi, Albert E.; Warren, Robert V.; Hill, Brian; Roblyer, Darren; Leproux, Anaïs; Durkin, Amanda F.; O'Sullivan, Thomas D.; Haghany, Hosain; Mantulin, William W.; Tromberg, Bruce J.

    2013-03-01

    Instrument equivalence and quality control are critical elements of multi-center clinical trials. We currently have five identical Diffuse Optical Spectroscopic Imaging (DOSI) instruments enrolled in the American College of Radiology Imaging Network (ACRIN, #6691) trial located at five academic clinical research sites in the US. The goal of the study is to predict the response of breast tumors to neoadjuvant chemotherapy in 60 patients. In order to reliably compare DOSI measurements across different instruments, operators and sites, we must be confident that the data quality is comparable. We require objective and reliable methods for identifying, correcting, and rejecting low quality data. To achieve this goal, we developed and tested an automated quality control algorithm that rejects data points below the instrument noise floor, improves tissue optical property recovery, and outputs a detailed data quality report. Using a new protocol for obtaining dark-noise data, we applied the algorithm to ACRIN patient data and successfully improved the quality of recovered physiological data in some cases.
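    The ACRIN algorithm itself is not reproduced here, but the general idea of rejecting data below an instrument noise floor estimated from dark-noise acquisitions can be sketched as follows. The margin factor k and all numbers are illustrative assumptions.

    ```python
    # Minimal sketch of the general idea (not the ACRIN algorithm itself):
    # estimate a noise floor from dark-noise acquisitions and flag measurement
    # points whose amplitude does not clear it by a chosen margin k.
    import numpy as np

    def flag_below_noise_floor(amplitudes, dark_noise_samples, k=3.0):
        """Return a boolean mask: True where the point should be rejected."""
        floor = dark_noise_samples.mean() + k * dark_noise_samples.std()
        return amplitudes < floor

    rng = np.random.default_rng(1)
    dark = rng.normal(0.02, 0.005, 200)                 # dark-noise acquisitions
    meas = np.array([0.5, 0.2, 0.031, 0.01, 0.9])       # measured amplitudes
    print(flag_below_noise_floor(meas, dark))           # rejects the weakest points
    ```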

  12. Normal streamflows and water levels continue—Summary of hydrologic conditions in Georgia, 2014

    USGS Publications Warehouse

    Knaak, Andrew E.; Ankcorn, Paul D.; Peck, Michael F.

    2016-03-31

    The U.S. Geological Survey (USGS) South Atlantic Water Science Center (SAWSC) Georgia office, in cooperation with local, State, and other Federal agencies, maintains a long-term hydrologic monitoring network of more than 350 real-time, continuous-record, streamflow-gaging stations (streamgages). The network includes 14 real-time lake-level monitoring stations, 72 real-time surface-water-quality monitors, and several water-quality sampling programs. Additionally, the SAWSC Georgia office operates more than 204 groundwater monitoring wells, 39 of which are real-time. The wide-ranging coverage of streamflow, reservoir, and groundwater monitoring sites allows for a comprehensive view of hydrologic conditions across the State. One of the many benefits this monitoring network provides is a spatially distributed overview of the hydrologic conditions of creeks, rivers, reservoirs, and aquifers in Georgia. Streamflow and groundwater data are verified throughout the year by USGS hydrographers and made available to water-resource managers, recreationists, and Federal, State, and local agencies. Hydrologic conditions are determined by comparing the statistical analyses of data collected during the current water year to historical data. Changing hydrologic conditions underscore the need for accurate, timely data to allow informed decisions about the management and conservation of Georgia’s water resources for agricultural, recreational, ecological, and water-supply needs and for protecting life and property.

  13. The Korean Neonatal Network: An Overview

    PubMed Central

    Chang, Yun Sil; Park, Hyun-Young

    2015-01-01

    Currently, in the Republic of Korea, despite the very low overall birth rate, the number and rate of preterm births are markedly increasing. Neonatal deaths and major complications mostly occur in premature infants, especially very-low-birth-weight infants (VLBWIs). VLBWIs weigh less than 1,500 g at birth and require intensive treatment in a neonatal intensive care unit (NICU). The operation of the Korean Neonatal Network (KNN) officially started on April 15, 2013, by the Korean Society of Neonatology with support from the Korea Centers for Disease Control and Prevention. The KNN is a national multicenter neonatal network based on a prospective web-based registry for VLBWIs. About 2,000 VLBWIs from 60 participating hospital NICUs are registered annually in the KNN. The KNN has built unique systems such as a web-based real-time data display on the web site and a site-visit monitoring system for data quality surveillance. The KNN should be maintained and developed further in order to generate appropriate, population-based, data-driven, health-care policies; facilitate active multicenter neonatal research, including quality improvement of neonatal care; and ultimately lead to improvement in the prognosis of high-risk newborns and a subsequent reduction in health-care costs through the development of evidence-based neonatal medicine in Korea. PMID:26566355

  14. Improved Hourly and Sub-Hourly Gauge Data for Assessing Precipitation Extremes in the U.S.

    NASA Astrophysics Data System (ADS)

    Lawrimore, J. H.; Wuertz, D.; Palecki, M. A.; Kim, D.; Stevens, S. E.; Leeper, R.; Korzeniewski, B.

    2017-12-01

    The NOAA/National Weather Service (NWS) Fischer-Porter (F&P) weighing bucket precipitation gauge network consists of approximately 2000 stations that comprise a subset of the NWS Cooperative Observers Program network. This network has operated since the mid-20th century, providing one of the longest records of hourly and 15-minute precipitation observations in the U.S. The lengthy record of this dataset, combined with its relatively high spatial density, provides an important source of data for many hydrological applications, including understanding trends and variability in the frequency and intensity of extreme precipitation events. In recent years, NOAA's National Centers for Environmental Information initiated an upgrade of its end-to-end processing and quality control system for these data. This involved a change from a largely manual review and edit process to a fully automated system that removes the subjectivity that was previously a necessary part of dataset quality control and processing. An overview of improvements to this dataset is provided, along with the results of an analysis of observed variability and trends in U.S. precipitation extremes since the mid-20th century. Multi-decadal trends in many parts of the nation are consistent with model projections of an increase in the frequency and intensity of heavy precipitation in a warming world.

  15. The Korean Neonatal Network: An Overview.

    PubMed

    Chang, Yun Sil; Park, Hyun-Young; Park, Won Soon

    2015-10-01

    Currently, in the Republic of Korea, despite the very low overall birth rate, the number and rate of preterm births are markedly increasing. Neonatal deaths and major complications mostly occur in premature infants, especially very-low-birth-weight infants (VLBWIs). VLBWIs weigh less than 1,500 g at birth and require intensive treatment in a neonatal intensive care unit (NICU). The operation of the Korean Neonatal Network (KNN) officially started on April 15, 2013, by the Korean Society of Neonatology with support from the Korea Centers for Disease Control and Prevention. The KNN is a national multicenter neonatal network based on a prospective web-based registry for VLBWIs. About 2,000 VLBWIs from 60 participating hospital NICUs are registered annually in the KNN. The KNN has built unique systems such as a web-based real-time data display on the web site and a site-visit monitoring system for data quality surveillance. The KNN should be maintained and developed further in order to generate appropriate, population-based, data-driven, health-care policies; facilitate active multicenter neonatal research, including quality improvement of neonatal care; and ultimately lead to improvement in the prognosis of high-risk newborns and a subsequent reduction in health-care costs through the development of evidence-based neonatal medicine in Korea.

  16. Quality of Information Approach to Improving Source Selection in Tactical Networks

    DTIC Science & Technology

    2017-02-01

    …consider the performance of this process based on metrics relating to quality of information: accuracy, timeliness, completeness and reliability. … indicators that the network is meeting these quality requirements. We study effective data rate, social distance, link integrity and the … utility of information as metrics within a multi-genre network to determine the quality of information of its available sources. This paper proposes a …

  17. A Network Approach to Curriculum Quality Assessment

    ERIC Educational Resources Information Center

    Jordens, J. Zoe; Zepke, Nick

    2009-01-01

    This paper argues for an alternative approach to quality assurance in New Zealand universities that locates evaluation not with external auditors but with members of the teaching team. In the process, aspects of network theories are introduced as the basis for an approach to quality assurance. From this, the concept of networks is extended to…

  18. The 'INMARSAT' international maritime satellite communication system

    NASA Astrophysics Data System (ADS)

    Atserov, Iu. S.

    1982-12-01

    The history, design, operating characteristics, achievements, and prospects of INMARSAT are discussed. More than 1300 ships are presently equipped to operate within the system, and this number is expected to rise to about 5000 by 1986. The principle of operation involves single coordinating earth stations allocating telephone channels in their zones between other earth stations. The messages reach a common signalling channel with which all ship stations keep in touch. The ship stations are connected to the international telex network. The INMARSAT system enables ships in the automated mode of operation to establish telephone and telegraph communication with any subscriber on the shore of any country. The quality of the communication is practically independent of the distance between ship and shore at any time of year and under any meteorological conditions. Estimates indicate that the use of satellite communication with ships reduces losses from accidents by 10 percent per year.

  19. Physical impairment aware transparent optical networks

    NASA Astrophysics Data System (ADS)

    Antona, Jean-Christophe; Morea, Annalisa; Zami, Thierry; Leplingard, Florence

    2009-11-01

    As illustrated by optical fiber and optical amplification, optical telecommunications have appeared over the last ten years as one of the most promising candidates for increasing transmission capacities. More recently, the concept of optical transparency has been investigated and introduced: it consists of the optical routing of Wavelength Division Multiplexed (WDM) channels without systematic optoelectronic processing at nodes, as long as propagation impairments remain acceptable [1]. This allows less power-consuming, more scalable and flexible networks, and today partial optical transparency has become a reality in deployed systems. However, because of the evolution of traffic features, optical networks are facing new challenges such as demand for higher transmitted capacity, further upgradeability, and more automation. Making all these evolutions compatible with the current network infrastructure, with a minimum of upgrades, is one of the main issues for equipment vendors and operators. Hence, automatic and efficient management of the network needs a control plane aware of the expected Quality of Transmission (QoT) of the connections to be set up, with respect to numerous parameters such as: the services demanded by the customers in terms of protection/restoration; the modulation rate and format of the connection under test and also of its adjacent WDM channels; and the engineering rules of the network elements traversed, with an accurate knowledge of the associated physical impairments. Whatever the method and/or the technology used to collect this information, the issue of its accuracy is one of the main concerns of network system vendors, because inaccurate knowledge could yield sub-optimal dimensioning and thus additional costs when installing the network in the field. Previous studies [1], [2] illustrated the impact of this knowledge accuracy on the ability to predict connection feasibility. After describing the usual methods used to build performance estimators, this paper reports on this impact at the global network level, quantifying the importance of accounting for these uncertainties from the early network planning step; it also proposes an improvement in the accuracy of the QoT estimator to reduce the increase in planned resources due to these uncertainties.
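    The planning effect discussed above can be made concrete with a schematic feasibility check: a lightpath is accepted only if its estimated QoT clears the required threshold by a design margin that covers estimator uncertainty, so larger uncertainty means fewer connections deemed feasible and more spare resources planned. The numbers below are illustrative, not from the paper.

    ```python
    # Schematic sketch of the planning effect: larger estimator uncertainty ->
    # larger design margin -> fewer lightpaths deemed feasible -> more
    # regenerators or spare resources planned. Numbers are illustrative.
    def qot_feasible(estimated_osnr_db: float,
                     required_osnr_db: float,
                     estimator_uncertainty_db: float) -> bool:
        margin = estimator_uncertainty_db          # worst-case design margin
        return estimated_osnr_db - margin >= required_osnr_db

    # Same estimated QoT, two estimator accuracies:
    print(qot_feasible(16.0, 14.5, estimator_uncertainty_db=1.0))  # True
    print(qot_feasible(16.0, 14.5, estimator_uncertainty_db=2.0))  # False
    ```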

  20. Assessment of the water quality monitoring network of the Piabanha River experimental watersheds in Rio de Janeiro, Brazil, using autoassociative neural networks.

    PubMed

    Villas-Boas, Mariana D; Olivera, Francisco; de Azevedo, Jose Paulo S

    2017-09-01

    Water quality monitoring is a complex issue that requires support tools in order to provide information for water resource management. Budget constraints as well as an inadequate water quality network design call for the development of evaluation tools to provide efficient water quality monitoring. For this purpose, a nonlinear principal component analysis (NLPCA) based on an autoassociative neural network was performed to assess the redundancy of the parameters and monitoring locations of the water quality network in the Piabanha River watershed. Oftentimes, a small number of variables contain the most relevant information, while the others add little or no interpretation to the variability of water quality. Principal component analysis (PCA) is widely used for this purpose. However, conventional PCA is not able to capture the nonlinearities of water quality data, while neural networks can represent those nonlinear relationships. The results presented in this work demonstrate that NLPCA performs better than PCA in the reconstruction of the water quality data of Piabanha watershed, explaining most of data variance. From the results of NLPCA, the most relevant water quality parameter is fecal coliforms (FCs) and the least relevant is chemical oxygen demand (COD). Regarding the monitoring locations, the most relevant is Poço Tarzan (PT) and the least is Parque Petrópolis (PP).
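    As background on the autoassociative-network form of NLPCA named above, the sketch below trains a neural network to reproduce its own inputs through a narrow bottleneck layer, whose activations play the role of nonlinear principal components. The layer sizes and synthetic data are illustrative, not the study's architecture or the Piabanha dataset.

    ```python
    # Minimal autoassociative-network sketch of the NLPCA idea: an autoencoder
    # with a one-node bottleneck, trained to reconstruct its inputs. Synthetic
    # data and layer sizes are illustrative only.
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(42)
    t = rng.uniform(-1, 1, size=500)
    X = np.column_stack([t, t**2, np.sin(2 * t)])   # 3 correlated "parameters"
    X += rng.normal(0, 0.05, X.shape)               # measurement noise

    Xs = StandardScaler().fit_transform(X)
    autoencoder = MLPRegressor(hidden_layer_sizes=(8, 1, 8),  # 1-node bottleneck
                               activation="tanh", max_iter=5000, random_state=0)
    autoencoder.fit(Xs, Xs)                         # reproduce the inputs

    recon = autoencoder.predict(Xs)
    explained = 1 - np.var(Xs - recon) / np.var(Xs)
    print(f"variance explained by 1 nonlinear component: {explained:.2f}")
    ```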

  1. Development of a district information system for water management planning and strategic decision making

    NASA Astrophysics Data System (ADS)

    Loukas, A.; Tzabiras, J.; Spiliotopoulos, M.; Kokkinos, K.; Fafoutis, C.; Mylopoulos, N.

    2015-06-01

    The overall objective of this work is the development of a District Information System (DIS) which could be used by stakeholders for the purposes of district day-to-day water management as well as for planning and strategic decision-making. The DIS was developed from a GIS-based modeling approach, which integrates a generic crop model and a hydraulic model of the transport/distribution system, using land use maps generated from Landsat TM imagery. The main sub-objectives are: (i) the development of an operational algorithm to retrieve crop evapotranspiration from remote sensing data, (ii) the development of an information system with a friendly user interface for the database, the crop module and the hydraulic module, and (iii) the analysis and validation of management scenarios from model simulations predicting the respective behavior. The Lake Karla watershed is used in this study, but the overall methodology could be used as a basis for future analysis elsewhere. The Surface Energy Balance Algorithm for Land (SEBAL) was used to derive monthly actual evapotranspiration (ET) values from Landsat TM imagery. Meteorological data from the archives of the Institute for Research and Technology, Thessaly (I.RE.TE.TH) have also been used. The methodology was developed using high-quality Landsat TM images from the 2007 growing season. Monthly ET values are used as an input to the CROPWAT model. Outputs of the CROPWAT model are then used as input to the WEAP model. The developed scenario is based on the actual situation of the surface irrigation network of the Local Administration of Land Reclamation (LALR) of Pinios for the year 2007. The DIS is calibrated with observed data from that year, and the district parameterization is conducted based on the actual operation of the network. The operation of the surface irrigation network of the Pinios LALR is simulated using Technologismiki Works, while the operation of the closed-pipe irrigation network of the Lake Karla LALR is simulated using WaterCAD. Four alternative scenarios have been tested with the DIS: reduction of channel losses, alteration of irrigation methods, introduction of greenhouse cultivation, and operation of the future Lake Karla network. The results of the simulation for the historical period indicate that the water pumped from the Pinios LALR is not enough to serve irrigation requirements. The spatial and temporal variation of the unmet water demand has been estimated. Simulation of the four alternative scenarios indicated that the alteration of irrigation methods mainly increases the efficiency of the irrigation network.

  2. Cascaded neural networks for sequenced propagation estimation, multiuser detection, and adaptive radio resource control of third-generation wireless networks for multimedia services

    NASA Astrophysics Data System (ADS)

    Hortos, William S.

    1999-03-01

    A hybrid neural network approach is presented to estimate radio propagation characteristics and multiuser interference and to evaluate their combined impact on throughput, latency and information loss in third-generation (3G) wireless networks. The latter three performance parameters influence the quality of service (QoS) for multimedia services under consideration for 3G networks. These networks, based on a hierarchical architecture of overlaying macrocells on top of micro- and picocells, are planned to operate in mobile urban and indoor environments with service demands emanating from circuit-switched, packet-switched and satellite-based traffic sources. Candidate radio interfaces for these networks employ a form of wideband CDMA in 5-MHz and wider-bandwidth channels, with possible asynchronous operation of the mobile subscribers. The proposed neural network (NN) architecture allocates network resources to optimize QoS metrics. Parameters of the radio propagation channel are estimated, followed by control of an adaptive antenna array at the base station to minimize interference, and then joint multiuser detection is performed at the base station receiver. These adaptive processing stages are implemented as a sequence of NN techniques that provide their estimates as inputs to a final- stage Kohonen self-organizing feature map (SOFM). The SOFM optimizes the allocation of available network resources to satisfy QoS requirements for variable-rate voice, data and video services. As the first stage of the sequence, a modified feed-forward multilayer perceptron NN is trained on the pilot signals of the mobile subscribers to estimate the parameters of shadowing, multipath fading and delays on the uplinks. A recurrent NN (RNN) forms the second stage to control base stations' adaptive antenna arrays to minimize intra-cell interference. The third stage is based on a Hopfield NN (HNN), modified to detect multiple users on the uplink radio channels to mitigate multiaccess interference, control carrier-sense multiple-access (CSMA) protocols, and refine call handoff procedures. In the final stage, the Kohonen SOFM, operating in a hybrid continuous and discrete space, adaptively allocates the resources of antenna-based cell sectorization, activity monitoring, variable-rate coding, power control, handoff and caller admission to meet user demands for various multimedia services at minimum QoS levels. The performance of the NN cascade is evaluated through simulation of a candidate 3G wireless network using W-CDMA parameters in a small-cell environment. The simulated network consists of a representative number of cells. Mobile users with typical movement patterns are assumed. QoS requirements for different classes of multimedia services are considered. The proposed method is shown to provide relatively low probability of new call blocking and handoff dropping, while maintaining efficient use of the network's radio resources.
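    The final stage of the cascade described above is a Kohonen self-organizing feature map. The sketch below shows only the generic SOFM training mechanism (best-matching unit search plus neighborhood update); the paper's hybrid continuous/discrete resource-allocation map is far more elaborate, and the grid size, feature dimensions and schedules here are illustrative.

    ```python
    # Minimal Kohonen self-organizing feature map (SOFM) training loop in NumPy,
    # illustrating the mechanism of the final stage only. All parameters are
    # illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(3)
    grid = 8                                  # 8x8 map of neurons
    dim = 4                                   # e.g., (load, interference, rate, power)
    W = rng.random((grid, grid, dim))         # neuron weight vectors
    coords = np.stack(np.meshgrid(np.arange(grid), np.arange(grid),
                                  indexing="ij"), -1)

    X = rng.random((2000, dim))               # training vectors (illustrative)
    for step, x in enumerate(X):
        lr = 0.5 * np.exp(-step / 1000)                      # decaying learning rate
        radius = grid / 2 * np.exp(-step / 1000)             # shrinking neighborhood
        bmu = np.unravel_index(np.argmin(((W - x) ** 2).sum(-1)), (grid, grid))
        d2 = ((coords - np.array(bmu)) ** 2).sum(-1)         # grid distance to BMU
        h = np.exp(-d2 / (2 * radius**2))[..., None]         # neighborhood function
        W += lr * h * (x - W)                                # pull neighbors toward x
    print("trained SOFM weight grid:", W.shape)
    ```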

  3. 47 CFR 51.311 - Nondiscriminatory access to unbundled network elements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... elements. 51.311 Section 51.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... § 51.311 Nondiscriminatory access to unbundled network elements. (a) The quality of an unbundled network element, as well as the quality of the access to the unbundled network element, that an incumbent...

  4. 47 CFR 51.311 - Nondiscriminatory access to unbundled network elements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... elements. 51.311 Section 51.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... § 51.311 Nondiscriminatory access to unbundled network elements. (a) The quality of an unbundled network element, as well as the quality of the access to the unbundled network element, that an incumbent...

  5. 47 CFR 51.311 - Nondiscriminatory access to unbundled network elements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... elements. 51.311 Section 51.311 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON... § 51.311 Nondiscriminatory access to unbundled network elements. (a) The quality of an unbundled network element, as well as the quality of the access to the unbundled network element, that an incumbent...

  6. Objective Speech Quality Assessment Based on Payload Discrimination of Lost Packets for Cellular Phones in NGN Environment

    NASA Astrophysics Data System (ADS)

    Uemura, Satoshi; Fukumoto, Norihiro; Yamada, Hideaki; Nakamura, Hajime

    A feature of services provided in a Next Generation Network (NGN) is that the end-to-end quality is guaranteed. This is quite a challenging issue, given the considerable fluctuation in network conditions within a Fixed Mobile Convergence (FMC) network. A novel approach, whereby a network node and a mobile terminal such as a cellular phone cooperate with each other to control service quality, is therefore essential. In order to achieve such cooperation, the mobile terminal needs to become more intelligent, so that it can estimate the service quality, including the user's perceptual quality, and notify the network node of the measurement result. The network node then implements some kind of service control function, such as resource and admission control, based on the notification from the mobile terminal. This paper focuses on the role of the mobile terminal in such a collaborative system. As part of a QoS/QoE measurement system, we describe an objective speech quality assessment with payload discrimination of lost packets to measure the user's perceptual quality of VoIP. The proposed assessment method is simple enough to be implemented on a cellular phone, and we did so as part of the QoS/QoE measurement system. Using the implemented system, we can measure the user's perceptual quality of VoIP as well as network QoS metrics such as packet loss rate, jitter and burstiness in real time.
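
    As a sketch of the network-side metrics mentioned at the end of the abstract, the snippet below computes packet loss rate from sequence numbers and interarrival jitter with the RFC 3550 running-average estimator; the packet tuples are assumed test data, not the paper's measurement format.

```python
# Per-call QoS metrics (assumed packet records, not the paper's code):
# loss rate from RTP-style sequence numbers, and interarrival jitter as
# the RFC 3550 running average J += (|D| - J) / 16.

def qos_metrics(packets):
    """packets: list of (seq, send_ts, recv_ts) for received packets."""
    packets = sorted(packets)
    expected = packets[-1][0] - packets[0][0] + 1
    loss_rate = 1.0 - len(packets) / expected
    jitter = 0.0
    for (s0, t0, r0), (s1, t1, r1) in zip(packets, packets[1:]):
        d = (r1 - t1) - (r0 - t0)        # transit time difference
        jitter += (abs(d) - jitter) / 16.0
    return loss_rate, jitter

pkts = [(1, 0.00, 0.05), (2, 0.02, 0.08), (4, 0.06, 0.13)]  # seq 3 lost
print(qos_metrics(pkts))  # (0.25, small jitter value)
```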

  7. Network Centric Warfare Case Study: U.S. V Corps and 3rd Infantry Division (Mechanized) During Operation Iraqi Freedom Combat Operations (Mar-Apr 2003). Volume 3. Network Centric Warfare Insights

    DTIC Science & Technology

    2003-01-01

    [Fragmentary indexed text.] ... operations security (OPSEC), military deception, psychological operations (PSYOP), special information operations (IO), information assurance, physical security ... nonlethal effects ... The remainder of the indexed excerpt is an acronym glossary (... Support Operations Group; ASR: Alternate Supply Route, or Ammunition Supply Rate; ATACMS: Army Tactical Missile System; ATARS: Advanced ...).

  8. Development of Water Quality Forecasting Models Based on the SOM-ANN on TMDL Unit Watershed in Nakdong River

    NASA Astrophysics Data System (ADS)

    KIM, M.; Kim, J.; Baek, J.; Kim, C.; Shin, H.

    2013-12-01

    Flash floods and red/green tides are among the natural phenomena occurring with increasing frequency due to climate change and indiscriminate development of rivers and land. Water, being vital to humans, must be protected and managed against quality pollution; in water resources management, real-time watershed monitoring systems are operated to keep watch over and manage rivers. Monitoring and forecasting water quality at the watershed scale is therefore especially important. The study area selected was Nak_K, one of the TMDL unit watersheds of the Nakdong River. This study develops water quality forecasting models making full use of observation data collected at 8-day intervals by the Nakdong River Environment Research Center. When forecasting models for BOD, DO, COD, and chlorophyll-a are established, it is necessary to select, for each target variable, the water quality factors showing the strongest correlation with it. To analyze the correlations among the candidate factors (reservoir discharge, precipitation, air temperature, DO, BOD, COD, Tw (water temperature), TN, TP, chlorophyll-a), a self-organizing map was used, and cross-correlation analysis was also applied to compare the results obtained. Based on these results, forecasting models for BOD, DO, COD, and chlorophyll-a were developed for short lead times of 8, 16, 24, and 32 days at 8-day intervals. Each forecasting model is a neural network trained with the back-propagation algorithm; that is, the study couples a self-organizing map for analyzing correlations among factors with a neural network model for forecasting water quality. The approach is expected to be effective for managing water quality in many rivers, making it possible to monitor a variety of water quality incidents and helping to protect water quality and prevent increasingly serious environmental degradation before it occurs.
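
    A condensed sketch of the two-step scheme described above, on synthetic data: candidate factors are ranked by lagged cross-correlation with the target (standing in for the SOM analysis), and a back-propagation network, here scikit-learn's MLPRegressor, forecasts one 8-day step ahead. Variable names and coefficients are invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n, names = 200, ["discharge", "precip", "air_T", "DO", "COD", "TN"]
X = rng.normal(size=(n, len(names)))            # 8-day factor series
# synthetic target: next-step BOD driven by current COD and TN
y = 0.6 * X[:-1, 4] + 0.3 * X[:-1, 5] + 0.1 * rng.normal(size=n - 1)

# step 1: keep the factors most correlated with the next-step target
corr = [abs(np.corrcoef(X[:-1, j], y)[0, 1]) for j in range(len(names))]
top = np.argsort(corr)[-3:]
print("selected factors:", [names[j] for j in top])

# step 2: back-propagation network forecasting one 8-day step ahead
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=0)
model.fit(X[:-1][:, top], y)
print("training R^2:", round(model.score(X[:-1][:, top], y), 2))
```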

  9. Analysis of critical operating conditions for LV distribution networks with microgrids

    NASA Astrophysics Data System (ADS)

    Zehir, M. A.; Batman, A.; Sonmez, M. A.; Font, A.; Tsiamitros, D.; Stimoniaris, D.; Kollatou, T.; Bagriyanik, M.; Ozdemir, A.; Dialynas, E.

    2016-11-01

    Increasing penetration of Distributed Generation (DG) in distribution networks raises the risk of voltage limit violations and contributes to line losses. Especially in low voltage (LV) distribution networks (secondary distribution networks), the impacts of active power flows on bus voltages and network losses are dominant. As network operators must meet regulatory limits, they have to take into account the most critical operating conditions in their systems. This study aims to present the impact of the worst-case operating conditions of LV distribution networks comprising microgrids. Simulation studies are performed on a virtual test-bed based on field data. The simulations are repeated for several cases, covering different microgrid points of connection with different network loading and microgrid supply/demand conditions.
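
    A toy two-bus calculation of the effect described above, with invented feeder values: on a resistive LV feeder, active power injection from a microgrid raises the connection-point voltage, so the critical cases pair minimum load with maximum generation and vice versa.

```python
# Toy two-bus LV feeder (made-up values): approximate voltage deviation
# at the connection point, dV ~ (R*P + X*Q) / V; positive P injection
# from the microgrid raises the local voltage.
R, X = 0.30, 0.10          # feeder resistance/reactance, ohms
V = 230.0                  # nominal phase voltage, volts

def bus_voltage(p_inj_w, q_inj_var):
    return V + (R * p_inj_w + X * q_inj_var) / V

for label, p, q in [("max load, no DG   ", -10e3, -2e3),
                    ("min load, full DG ", +8e3, 0.0)]:
    v = bus_voltage(p, q)
    print(f"{label}: {v:.1f} V ({100 * v / V:.1f}% of nominal)")
```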

  10. Impact of peer-led quality improvement networks on quality of inpatient mental health care: study protocol for a cluster randomized controlled trial.

    PubMed

    Aimola, Lina; Jasim, Sarah; Tripathi, Neeraj; Tucker, Sarah; Worrall, Adrian; Quirk, Alan; Crawford, Mike J

    2016-09-21

    Quality improvement networks are peer-led programmes in which members of the network assess the quality of care colleagues provide according to agreed standards of practice. These networks aim to help members identify areas of service provision that could be improved and share good practice. Despite the widespread use of peer-led quality improvement networks, there is very little information about their impact. We are conducting a cluster randomized controlled trial of a quality improvement network for low-secure mental health wards to examine the impact of membership on the process and outcomes of care over a 12-month period. Standalone low-secure units in England and Wales that expressed an interest in joining the quality improvement network were recruited for the study from 2012 to 2014. Thirty-eight units were randomly allocated to either the active intervention (participation in the network, n = 18) or a control arm (delayed participation in the network, n = 20). Using a 5 % significance level and 90 % power, it was calculated that a sample size of 60 wards was required, taking into account a 10 % dropout rate. A total of 75 wards were assessed at baseline, and 8 wards dropped out of the study before data collection at follow-up. Researchers masked to the allocation status of the units assessed all study outcomes at baseline and at follow-up 12 months later. The primary outcome is the quality of the physical environment and facilities on the wards. The secondary outcomes are: safety of the ward, patient-rated satisfaction with care and mental well-being, staff burnout, and training and supervision. It is hypothesized that, relative to control wards, the quality of the physical environment and facilities will be higher on wards in the active arm of the trial 12 months after randomization. To our knowledge, this is the first randomized evaluation of a peer-led quality improvement network to examine the impact of participation on both patient-level and service-level outcomes. The study has the potential to help shape future efforts to improve the quality of inpatient care. Current Controlled Trials ISRCTN79614916. Retrospectively registered 28 March 2014.
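
    The sample-size sentence compresses a standard calculation; the sketch below reproduces the usual two-arm normal-approximation arithmetic with dropout inflation. The standardized effect size is a placeholder (the protocol excerpt does not state one), chosen only so the output lands near the quoted figure.

```python
from scipy.stats import norm

# Two-arm sample-size arithmetic (placeholder effect size; the protocol
# excerpt does not state one): n per arm for a standardized difference,
# then inflation for anticipated dropout.
alpha, power, delta, dropout = 0.05, 0.90, 0.90, 0.10

z_a = norm.ppf(1 - alpha / 2)            # ~1.96
z_b = norm.ppf(power)                    # ~1.28
n_per_arm = 2 * (z_a + z_b) ** 2 / delta ** 2
total = 2 * n_per_arm / (1 - dropout)    # inflate for 10% dropout
print(round(n_per_arm), "per arm ->", round(total), "units in total")  # ~26 -> ~58
```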

  11. Automated Network Mapping and Topology Verification

    DTIC Science & Technology

    2016-06-01

    [Fragmentary indexed text.] ... collection of information includes amplifying data about the networked devices such as hardware details, logical addressing schemes, operating ... The current military reliance on computer networks for operational missions and administrative duties makes network...

  12. Trace saver: A tool for network service improvement and personalised analysis of user centric statistics

    NASA Astrophysics Data System (ADS)

    Bilal, Muhammad; Asfand-e-Yar; Mockford, Steve; Khan, Wasiq; Awan, Irfan

    2012-11-01

    Mobile technology is among the fastest growing technologies in today's world, offering highly effective benefits at low cost. The most prominent and entertaining areas of mobile technology development and usage are location-based services, user-friendly networked applications and gaming applications. By contrast, attention to network operator service provision and improvement has been very limited. Portable applications that help improve network operator services, available for a range of mobile operating systems, are therefore desirable to mobile operators. This paper proposes a state-of-the-art mobile application, Tracesaver, which overcomes the barriers to gathering device- and network-related information that network operators need in order to improve their service provision. Tracesaver is available for a broad range of mobile devices with different mobile operating systems and computational capabilities. Its adoption has proliferated in the year since it was published. The survey and results show that Tracesaver is used by millions of mobile users and provides novel ways of network service improvement through its highly user-friendly interface.

  13. Analysis of straw row in the image to control the trajectory of the agricultural combine harvester

    NASA Astrophysics Data System (ADS)

    Shkanaev, Aleksandr Yurievich; Polevoy, Dmitry Valerevich; Panchenko, Aleksei Vladimirovich; Krokhina, Darya Alekseevna; Nailevish, Sadekov Rinat

    2018-04-01

    The paper proposes a solution for automatic operation of a combine harvester along straw rows by means of images from a camera installed in the cab of the harvester. A U-Net is used to recognize straw rows in the image. The edges of the row are approximated in the segmented image by curved lines and then converted into the harvester coordinate system for the automatic operating system. The new network architecture and approaches to row approximation improved the quality of the recognition task to 96% and the frame processing speed to 7.5 fps. Keywords: Grain harvester,
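
    A minimal sketch of the post-segmentation step: taking the left-most labelled pixel per image row of a segmentation mask and fitting a curved line with a polynomial. The mask below is synthetic and the second-degree fit is an assumption; the paper does not specify its approximation function.

```python
import numpy as np

# Row-edge approximation after segmentation (synthetic mask, assumed
# 2nd-degree fit): for each image row, take the left-most straw pixel,
# then fit x = f(y) to get a smooth edge curve.
h, w = 120, 160
mask = np.zeros((h, w), dtype=bool)
for y in range(h):                       # synthetic curved straw region
    x0 = int(40 + 0.2 * y + 0.001 * y * y)
    mask[y, x0:x0 + 30] = True

ys, xs = [], []
for y in range(h):
    cols = np.flatnonzero(mask[y])
    if cols.size:                        # left edge of the straw row
        ys.append(y)
        xs.append(cols[0])

coeffs = np.polyfit(ys, xs, deg=2)       # curved-line approximation
edge = np.polyval(coeffs, ys)            # edge x-position per image row
print("fit coefficients:", np.round(coeffs, 4))
```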

  14. Hunter New England Training (HNET): how to effect culture change in a psychiatry medical workforce.

    PubMed

    Cohen, Martin; Llewellyn, Anthony; Ditton-Phare, Philippa; Sandhu, Harsimrat; Vamos, Marina

    2011-12-01

    It is now recognized that education and training are at the core of quality systems in health care. In this paper we discuss the processes and drivers that underpinned the development of high quality education and training programs and placements for all junior doctors. The early identification and development of doctors interested in psychiatry as a career, engagement and co-operation with the broader junior doctor network, and the creation of teaching opportunities for trainees linked to their stage of development were identified as key to the success of the program. Targeted, high quality education programs and clinical placements, coupled with strategic development of the workforce, have reduced staff turnover, led to the stabilization of the medical workforce and created a culture where learning and supervision are highly valued.

  15. TCSC based filtering and improvement of power quality

    NASA Astrophysics Data System (ADS)

    Arulvendhan, K.; Dilli srinivasan, J.; Vinil, M.

    2018-04-01

    The Thyristor Controlled Series Capacitor (TCSC), as a dynamic device capable of increasing power transfer in transmission lines, can be used to mitigate various power system problems. TCSC benefits can be categorized as steady-state and transient. During a fault, a TCSC can improve power quality by reducing the fault current and helping to keep the voltage as high as possible. In this paper, the application of the TCSC to mitigate one of the vital power quality issues, voltage sag, is investigated. Different operating modes of the TCSC have different influences on the voltage of the bus to which the TCSC-equipped line is connected. Switching to bypass mode upon the occurrence of a fault is a significant capability of the TCSC for improving voltage sag. Simulations on a test network demonstrate these effects.
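
    A toy phasor sketch of the bypass-mode effect described above, with invented per-unit impedances: switching the TCSC to its bypass (inductive) mode adds reactance in the faulted path, limiting fault current and keeping the monitored bus voltage higher during the sag.

```python
# Toy phasor sketch (invented per-unit impedances): voltage at the bus
# upstream of a faulted line, with the line's TCSC either out of circuit
# or switched to bypass (inductive) mode, which limits the fault current.
Z_src = 0.1 + 0.5j       # source-side impedance up to the bus, pu
Z_fault = 0.05 + 0.15j   # line impedance from the bus to the fault, pu
Z_bypass = 0.2j          # TCSC in bypass mode: small inductive reactance

def sag_voltage(z_tcsc):
    # voltage divider: V_bus = (z_tcsc + Z_fault) / (Z_src + z_tcsc + Z_fault)
    return abs((z_tcsc + Z_fault) / (Z_src + z_tcsc + Z_fault))

print("bus voltage during fault, no TCSC :", round(sag_voltage(0), 3), "pu")
print("bus voltage during fault, bypassed:", round(sag_voltage(Z_bypass), 3), "pu")
```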

  16. Sugar Influx Sensing by the Phosphotransferase System of Escherichia coli

    PubMed Central

    Somavanshi, Rahul; Ghosh, Bhaswar; Sourjik, Victor

    2016-01-01

    The phosphotransferase system (PTS) plays a pivotal role in the uptake of multiple sugars in Escherichia coli and many other bacteria. In the cell, individual sugar-specific PTS branches are interconnected through a series of phosphotransfer reactions, thus creating a global network that not only phosphorylates incoming sugars but also regulates a number of cellular processes. Despite the apparent importance of the PTS network in bacterial physiology, the holistic function of the network in the cell remains unclear. Here we used Förster resonance energy transfer (FRET) to investigate the PTS network in E. coli, including the dynamics of protein interactions and the processing of different stimuli and their transmission to the chemotaxis pathway. Our results demonstrate that despite the seeming complexity of the cellular PTS network, its core part operates in a strikingly simple way, sensing the overall influx of PTS sugars irrespective of the sugar identity and distributing this information equally through all studied branches of the network. Moreover, it also integrates several other specific metabolic inputs. The integrated output of the PTS network is then transmitted linearly to the chemotaxis pathway, in stark contrast to the amplification of conventional chemotactic stimuli. Finally, we observe that default uptake through the uninduced PTS network correlates well with the quality of the carbon source, apparently representing an optimal regulatory strategy. PMID:27557415

  17. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework.

    NASA Astrophysics Data System (ADS)

    Grunberg, M.; Lambotte, S.; Engels, F.

    2014-12-01

    In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high quality data from the different elements of RESIF. Within this information system, EOST (in Strasbourg) is in charge of collecting the real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin, which allows it to benefit from high-end quality control based on the national and worldwide seismicity. Here we present the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists in applying a variety of processes to check the consistency of the whole system, from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, timing quality is critical for most scientific data applications. To face this challenge and to check the consistency of polarities and amplitudes, we deployed several high-end processes, including a noise correlation procedure to check timing accuracy (instrumental time errors result in a time-shift of the whole cross-correlation, clearly distinct from shifts due to changes in medium physical properties), and a systematic comparison of synthetic and real data for teleseismic earthquakes of magnitude larger than 6.5 to detect timing errors as well as polarity and amplitude problems.
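
    A small numpy sketch of the timing check described above, on synthetic traces: an instrumental clock error shifts the whole daily noise-correlation function relative to a reference, and the lag of their correlation peak measures that shift.

```python
import numpy as np

# Timing check sketch (synthetic traces): a clock error shifts the daily
# correlation function bodily; the lag of the peak of its correlation
# with a reference measures the shift in samples.
rng = np.random.default_rng(2)
fs = 20.0                            # samples per second
ref = rng.normal(size=2000)          # reference correlation function
shift = 7                            # simulated clock error, samples
daily = np.roll(ref, shift) + 0.1 * rng.normal(size=ref.size)

xc = np.correlate(daily, ref, mode="full")
lag = np.argmax(xc) - (ref.size - 1)
print(f"detected shift: {lag} samples = {lag / fs:.2f} s")  # 7 samples
```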

  18. NIC atomic operation unit with caching and bandwidth mitigation

    DOEpatents

    Hemmert, Karl Scott; Underwood, Keith D.; Levenhagen, Michael J.

    2016-03-01

    A network interface controller atomic operation unit and a network interface control method comprising, in an atomic operation unit of a network interface controller, using a write-through cache and employing a rate-limiting functional unit.
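
    Purely as a hedged illustration of the two ingredients named in this patent abstract, the sketch below models a NIC-side atomic unit that applies fetch-and-add in a local cache, writes every result through to a stand-in for host memory, and throttles those write-backs with a token bucket; all names and structure are invented.

```python
import time

# Illustrative model (all names invented, not the patented design):
# atomics applied in a NIC-side cache, every update written through to
# host memory, and write-backs throttled by a token bucket.
class AtomicUnit:
    def __init__(self, rate_per_s=1000.0, burst=10):
        self.cache = {}              # address -> current value
        self.host = {}               # stand-in for host memory
        self.tokens, self.burst = burst, burst
        self.rate, self.last = rate_per_s, time.monotonic()

    def _acquire_token(self):
        while True:
            now = time.monotonic()
            self.tokens = min(self.burst,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return
            time.sleep(1.0 / self.rate)   # limit write-through bandwidth

    def fetch_add(self, addr, value):
        old = self.cache.get(addr, self.host.get(addr, 0))
        self.cache[addr] = old + value
        self._acquire_token()
        self.host[addr] = self.cache[addr]   # write-through to host
        return old

unit = AtomicUnit()
for _ in range(5):
    unit.fetch_add(0x1000, 1)
print(unit.host[0x1000])  # 5: host memory tracks every cached update
```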

  19. An X-protocol based medical teleconsultation system using low or high speed networks. A specific-design approach.

    PubMed Central

    Dueñas, A.; González, M. A.; Muñoz, A.; Salvador, C. H.

    1994-01-01

    The objective of this proposal is to provide solutions for the needs of teleconsultation or telediagnosis among medical professionals, using workstations within the X-Windows environment, applicable to communication lines with an extensive range of bandwidths and independent of the operating system. Among the advantages sought are savings in transportation, improvement in the quality of the medical attention provided, and continued training for the medical professional. PMID:7949963

  20. Development of a demand assignment/TDMA system for international business satellite communications

    NASA Astrophysics Data System (ADS)

    Nohara, Mitsuo; Takeuchi, Yoshio; Takahata, Fumio; Hirata, Yasuo; Yamazaki, Yoshiharu

    An experimental IBS (international business satellite) communications system based on demand assignment and TDMA (time-division multiple-access) operation has been developed. The system utilizes a limited satellite resource efficiently and provides a comprehensive range of ISDN services. IBS network configurations suitable for international communications are discussed, and the developed communications system is described from the viewpoint of hardware and software implementation. Performance in terms of transmission quality and call processing is also demonstrated.
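
    A toy sketch of the demand-assignment idea, since the record does not describe the actual algorithm: a reference station apportions the TDMA frame's traffic slots among earth stations in proportion to requested capacity, using largest-remainder rounding.

```python
# Toy demand-assigned TDMA allocation (illustrative; the record does not
# describe the actual algorithm): divide the frame's traffic slots among
# earth stations in proportion to requested capacity, largest remainder.
def assign_slots(demands_kbps, slots_per_frame=64):
    total = sum(demands_kbps.values())
    exact = {s: slots_per_frame * d / total for s, d in demands_kbps.items()}
    alloc = {s: int(e) for s, e in exact.items()}
    leftover = slots_per_frame - sum(alloc.values())
    # hand the remaining slots to the stations with the largest remainders
    for s in sorted(exact, key=lambda s: exact[s] - alloc[s],
                    reverse=True)[:leftover]:
        alloc[s] += 1
    return alloc

print(assign_slots({"Tokyo": 384, "Sydney": 128, "Honolulu": 64}))
# {'Tokyo': 43, 'Sydney': 14, 'Honolulu': 7}
```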
