Science.gov

Sample records for end-to-end outage minimization

  1. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    ERIC Educational Resources Information Center

    Patrick, Christina M.

    2011-01-01

    This thesis presents a uniquely designed, high-performance I/O stack that minimizes end-to-end interference across multi-level shared buffer cache hierarchies accessing shared I/O servers. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  2. End-to-End Commitment

    NASA Technical Reports Server (NTRS)

    Newcomb, John

    2004-01-01

    The end-to-end test would verify the complex sequence of events from lander separation to landing. Due to the large distances involved and the significant delay time in sending a command and receiving verification, the lander needed to operate autonomously after it separated from the orbiter. It had to sense conditions, make decisions, and act accordingly. We were flying into a relatively unknown set of conditions: a Martian atmosphere of unknown pressure, density, and consistency, landing on a surface of unknown altitude and unknown bearing strength.

  3. Minimizing forced outage risk in generator bidding

    NASA Astrophysics Data System (ADS)

    Das, Dibyendu

    Competition in power markets has exposed the participating companies to physical and financial uncertainties. Generator companies bid to supply power in a day-ahead market. Once their bids are accepted by the ISO, they are bound to supply power. A random outage after acceptance of bids forces a generator to buy power from the expensive real-time hourly spot market and sell it to the ISO at the set day-ahead market clearing price, incurring losses. A risk management technique is developed to assess this financial risk associated with forced outages of generators and then minimize it. This work presents a risk assessment module which measures the financial risk of generators bidding in an open market under different bidding scenarios. The day-ahead power market auction is modeled using a Unit Commitment algorithm, and a combination of Normal and Cauchy distributions generates the real-time hourly spot market. Risk profiles are derived, and VaRs are calculated at the 98 percent confidence level as a measure of financial risk. Risk profiles and VaRs help the generators analyze the forced outage risk and the factors affecting it. The VaRs and the estimated total earnings for different bidding scenarios are used to develop a risk minimization module. This module develops a bidding strategy for the generator company that maximizes its estimated total earnings while keeping the VaR below a tolerable limit. This general framework of a risk management technique for generating companies bidding in a competitive day-ahead market can also inform decisions related to building new generators.
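
    The risk-assessment step lends itself to a compact Monte Carlo illustration. The following Python sketch is not the thesis model: the prices, outage probability, fuel cost, and Normal/Cauchy mixture weights are invented for illustration only. It simulates a generator's earnings when random forced outages force spot-market buybacks, then reports the 98 percent VaR of the earnings distribution.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical parameters, not taken from the thesis.
    da_price = 40.0       # day-ahead clearing price, $/MWh
    fuel_cost = 20.0      # marginal generation cost, $/MWh
    committed_mw = 100.0  # accepted bid quantity
    hours, outage_prob, n_scenarios = 24, 0.05, 100_000

    def spot_price(size):
        """Spot price as a 90/10 Normal/Cauchy mixture, truncated at zero."""
        normal = rng.normal(45.0, 10.0, size)
        cauchy = 45.0 + 15.0 * rng.standard_cauchy(size)
        return np.clip(np.where(rng.random(size) < 0.9, normal, cauchy), 0.0, None)

    # Hourly margin: earn the day-ahead spread normally; during a forced
    # outage, buy replacement power at the volatile spot price instead.
    out = rng.random((n_scenarios, hours)) < outage_prob
    margin = np.where(out, da_price - spot_price((n_scenarios, hours)),
                      da_price - fuel_cost)
    earnings = (committed_mw * margin).sum(axis=1)

    # One common VaR convention: shortfall of the 2nd-percentile outcome
    # relative to expected earnings (98 percent confidence level).
    var98 = earnings.mean() - np.quantile(earnings, 0.02)
    print(f"expected earnings: {earnings.mean():12,.0f} $")
    print(f"98% VaR:           {var98:12,.0f} $")
    ```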

  4. End-to-End Radiographic Systems Simulation

    SciTech Connect

    Mathews, A.; Kwan, T.; Buescher, K.; Snell, C.; Adams, K.

    1999-07-23

    This is the final report of a one-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of this project was to develop a validated end-to-end radiographic model that could be applied to both x-rays and protons. The specific objectives were to link hydrodynamic, transport, and magneto-hydrodynamic simulation software for purposes of modeling radiographic systems. In addition, optimization and analysis algorithms were to be developed to validate physical models and optimize the design of radiographic facilities.

  5. Measurements and analysis of end-to-end Internet dynamics

    SciTech Connect

    Paxson, V

    1997-04-01

    Accurately characterizing end-to-end Internet dynamics - the performance that a user actually obtains from the lengthy series of network links that comprise a path through the Internet - is exceptionally difficult, due to the network's immense heterogeneity. At the heart of this work is a 'measurement framework' in which a number of sites around the Internet host a specialized measurement service. By coordinating 'probes' between pairs of these sites one can measure end-to-end behavior along O(N^2) paths for a framework consisting of N sites. Consequently, one obtains a superlinear scaling that allows measuring a rich cross-section of Internet behavior without requiring huge numbers of observation points. 37 sites participated in this study, allowing the author to measure more than 1,000 distinct Internet paths. The first part of this work looks at the behavior of end-to-end routing: the series of routers over which a connection's packets travel. Based on 40,000 measurements made using this framework, the author analyzes: routing 'pathologies' such as loops, outages, and flutter; the stability of routes over time; and the symmetry of routing along the two directions of an end-to-end path. The author finds that pathologies increased significantly over the course of 1995 and that Internet paths are heavily dominated by a single route. The second part of this work studies end-to-end Internet packet dynamics. The author analyzes 20,000 TCP transfers of 100 Kbyte each to investigate the performance of both the TCP endpoints and the Internet paths. The measurements used for this part of the study are much richer than those for the first part, but require a great degree of attention to issues of calibration, which are addressed by applying self-consistency checks to the measurements whenever possible. The author finds that packet filters are capable of a wide range of measurement errors, some of which, if undetected, can significantly taint subsequent analysis.

  6. End-to-end power beaming model

    SciTech Connect

    Ponikvar, D.R.; Bell, J.P.; Schor, M.J.

    1994-12-31

    W.J. Schafer Associates, Inc. has produced an interactive end-to-end model of a laser power beaming system designed to deliver electrical power from a ground based free electron laser (FEL) to a satellite. The model includes a description of pertinent FEL physics, realistic atmospheric propagation effects, photovoltaic interactions for various semiconductor materials, and satellite onboard power conditioning. A detailed orbital model with graphical output is available, which visualizes the effect of electric propulsion for orbital reboost or orbit transfer. This flexible tool has been applied to specific examples of satellite battery charging for geostationary communication satellites, as well as parametric studies of photovoltaic cell performance. Preliminary results of system wavelength/power/aperture diameter trades will be presented.

  7. Evaluating Internet End-to-end Performance

    PubMed Central

    Wood, Fred B.; Cid, Victor H.; Siegel, Elliot R.

    1998-01-01

    Abstract Objective: An evaluation of Internet end-to-end performance was conducted for the purpose of better understanding the overall performance of Internet pathways typical of those used to access information in National Library of Medicine (NLM) databases and, by extension, other Internet-based biomedical information resources. Design: The evaluation used a three-level test strategy: 1) user testing to collect empirical data on Internet performance as perceived by users when accessing NLM Web-based databases, 2) technical testing to analyze the Internet paths between the NLM and the user's desktop computer terminal, and 3) technical testing between the NLM and the World Wide Web (“Web”) server computer at the user's institution to help characterize the relative performance of Internet pathways. Measurements: Time to download the front pages of NLM Web sites and conduct standardized searches of NLM databases, data transmission capacity between NLM and remote locations (known as the bulk transfer capacity [BTC]), “ping” round-trip time (RTT) as an indication of the latency of the network pathways, and the network routing of the data transmissions (number and sequencing of hops). Results: Based on 347 user tests spread over 16 locations, the median time per location to download the main NLM home page ranged from 2 to 59 seconds, and 1 to 24 seconds for the other NLM Web sites tested. The median time to conduct standardized searches and get search results ranged from 2 to 14 seconds for PubMed and 4 to 18 seconds for Internet Grateful Med. The overall problem rate was about 1 percent; that is, on average, users experienced a problem once every 100 test measurements. The user terminal tests at five locations and Web host tests at 13 locations provided profiles of BTC, RTT, and network routing for both dial-up and fixed Internet connections. Conclusion: The evaluation framework provided a profile of typical Internet performance and insights into network…

  8. Applying Trustworthy Computing to End-to-End Electronic Voting

    ERIC Educational Resources Information Center

    Fink, Russell A.

    2010-01-01

    "End-to-End (E2E)" voting systems provide cryptographic proof that the voter's intention is captured, cast, and tallied correctly. While E2E systems guarantee integrity independent of software, most E2E systems rely on software to provide confidentiality, availability, authentication, and access control; thus, end-to-end integrity is not…

  9. Minimize substation outage time by maximizing in-service testing

    SciTech Connect

    Lautenschlager, M.

    1994-05-01

    Most substation maintenance work is based on fixed schedules rather than on known need. Scheduled maintenance is essential and cannot be eliminated entirely, but priority should be given to equipment known to be deteriorated or defective. It makes no sense to perform costly, scheduled outage maintenance work when other equipment is failing because of undetected defects. The following should be included as major elements in an energized testing program: visual inspections; infrared inspection; corona inspection; percent oxygen in gas samples drawn from nitrogen-blanketed transformers; percent total combustible gas in gas samples drawn from nitrogen-blanketed transformers; dissolved gas analysis; oil quality tests; free water in the sample; dissolved water in oil; dissolved metals-in-oil analysis; furfural concentration analysis; SF6 analysis; battery testing; the substation grounding grid; and protective relays. 4 figs.

  10. Standardizing an End-to-end Accounting Service

    NASA Technical Reports Server (NTRS)

    Greenberg, Edward; Kazz, Greg

    2006-01-01

    Currently there are no space system standards available for space agencies to accomplish end-to-end accounting. Such a standard exists neither for spacecraft operations nor for tracing the relationship between the mission planning activities, the command sequences designed to perform those activities, the commands formulated to initiate those activities, and the mission data, specifically the mission data products, created by those activities. In order for space agencies to cross-support one another for data accountability/data tracing, and for inter-agency spacecraft to interoperate with each other, an international CCSDS standard for end-to-end data accountability/tracing needs to be developed. We will first describe the end-to-end accounting service model and the functionality that supports the service. This model will describe how science plans that are ultimately transformed into commands can be associated with the telemetry products generated as a result of their execution. Moreover, the interaction between end-to-end accounting and service management will be explored. Finally, we will show how the standard end-to-end accounting service can be applied to a real-life flight project, i.e., the Mars Reconnaissance Orbiter project.
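
    The tracing idea is easy to make concrete. In the toy Python sketch below, where every identifier is hypothetical and nothing reflects the actual CCSDS service definition, each command records which planned activity issued it and which data products its execution produced, so an accountability query becomes a pair of lookups.

    ```python
    # Toy end-to-end accountability ledger; all IDs are hypothetical.
    issued_by = {"CMD-101": "ACT-7", "CMD-102": "ACT-7", "CMD-200": "ACT-9"}
    produced = {"CMD-101": ["PROD-55"], "CMD-102": ["PROD-56", "PROD-57"]}

    def trace_activity(activity_id):
        """Map a planned activity to the data products its commands created,
        keyed by command; an empty list flags a command with no accounted
        telemetry product (a tracing gap)."""
        cmds = [c for c, a in issued_by.items() if a == activity_id]
        return {c: produced.get(c, []) for c in cmds}

    print(trace_activity("ACT-7"))  # {'CMD-101': ['PROD-55'], 'CMD-102': ['PROD-56', 'PROD-57']}
    print(trace_activity("ACT-9"))  # {'CMD-200': []} -> unaccounted command
    ```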

  11. Plastic end-to-end treatment of bulbar urethral stricture

    PubMed Central

    Hamza, Amir; Behrendt, Wolf; Tietze, Stefan

    2013-01-01

    For bulbar urethral strictures up to 2.5 cm in length, one-stage urethral plastic surgery with stricture excision and direct end-to-end anastomosis remains the best procedure to guarantee a high success rate. This retrospective review shows the results of 21 patients who underwent bulbar end-to-end anastomosis from 2010 to 2013. In 20 cases (95.3%) good results were achieved. The criteria of success were identified by pre- and postoperative radiological diagnostics and uroflowmetry. PMID:26504704

  12. Combining Simulation Tools for End-to-End Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min

    2015-01-01

    Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable for assessing individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS ascent tool (POST2), and a new tool was created to optimize the full problem by operating both simulations simultaneously.

  13. LWS/SET End-to-End Data System

    NASA Technical Reports Server (NTRS)

    Giffin, Geoff; Sherman, Barry; Colon, Gilberto (Technical Monitor)

    2002-01-01

    This paper describes the concept for the End-to-End Data System that will support NASA's Living With a Star Space Environment Testbed missions. NASA has initiated the Living With a Star (LWS) Program to develop a better scientific understanding to address the aspects of the connected Sun-Earth system that affect life and society. A principal goal of the program is to bridge the gap between science, engineering, and user application communities. The Space Environment Testbed (SET) Project is one element of LWS. The Project will enable future science, operational, and commercial objectives in space and atmospheric environments by improving engineering approaches to the accommodation and/or mitigation of the effects of solar variability on technological systems. The end-to-end data system allows investigators to access the SET control center, command their experiments, and receive data from their experiments back at their home facility, using the Internet. The logical functioning of major components of the end-to-end data system is described, including the GSFC Payload Operations Control Center (POCC), SET Payloads, the GSFC SET Simulation Lab, SET Experiment PI Facilities, and Host Systems. Host Spacecraft Operations Control Centers (SOCC) and the Host Spacecraft are essential links in the end-to-end data system, but are not directly under the control of the SET Project. Formal interfaces will be established between these entities and elements of the SET Project. The paper describes data flow through the system, from PI facilities connecting to the SET operations center via the Internet, communications to SET carriers and experiments via host systems, to telemetry returns to investigators from their flight experiments. It also outlines the techniques that will be used to meet mission requirements, while holding development and operational costs to a minimum. Additional information is included in the original extended abstract.

  14. Reconfigurable Protocol Sensing in an End-to-End Demonstration

    NASA Technical Reports Server (NTRS)

    Okino, Clayton M.; Gray, Andrew; Schoolcraft, Joshua

    2006-01-01

    In this work, we present sensing performance using an architecture for a reconfigurable protocol chip for space-based applications. Toward utilizing the IP packet architecture, data link layer framing structures for multiplexed data on a channel are the targeted application considered for demonstration purposes. Specifically, we examine three common framing standards and present the sensing performance for these standards and their relative de-correlation metrics. Some analysis is performed to investigate the impact of lossy links. Finally, we present results on a demonstration platform that integrated reconfigurable sensing technology into the Ground Station Interface Device (GRID) for End-to-End IP demonstrations in space.

  15. On routing algorithms with end-to-end delay guarantees

    SciTech Connect

    Rao, N.S.V.; Batsell, S.G.

    1998-11-01

    The authors consider the transmission of a message of size r from a source to a destination with guarantees on the end-to-end delay over a computer network with n nodes and m links. There are three sources of delays: (a) propagation delays along the links, (b) delays due to bandwidth availability on the links, and (c) queuing delays at the intermediate nodes. First, the authors consider that delays on various links and nodes are given as functions of the message size. If the delay in (b) is a non-increasing function of the bandwidth, they propose an O(m^2 + mn log n) time algorithm to compute a path with the minimum end-to-end delay for any given message size r. They then consider that the queuing delay in (c) is a random variable correlated with the message size according to an unknown distribution. At each node, measurements of queuing delays and message sizes are available. They propose two algorithms to compute paths whose delays are close to the optimal delays with high probability, irrespective of the distribution of the delays, and based entirely on measurements of sufficient size.
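
    In the deterministic setting, once every link delay is evaluated at a given message size r, finding the minimum end-to-end delay path reduces to a shortest-path computation. The sketch below runs plain Dijkstra over delays of the form propagation + r/bandwidth; it handles one fixed r, whereas the paper's O(m^2 + mn log n) algorithm treats all message sizes at once.

    ```python
    import heapq

    def min_delay_path(links, source, dest, r):
        """links: {node: [(neighbor, propagation_s, bandwidth_bytes_per_s)]}.
        Link delay for a message of size r is propagation + r / bandwidth,
        a non-increasing function of bandwidth, matching assumption (b)."""
        dist, prev = {source: 0.0}, {}
        heap = [(0.0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if u == dest:
                break
            if d > dist.get(u, float("inf")):
                continue  # stale queue entry
            for v, prop, bw in links.get(u, []):
                nd = d + prop + r / bw
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
        path, node = [], dest
        while node != source:
            path.append(node)
            node = prev[node]
        return [source] + path[::-1], dist[dest]

    # A large message favors the high-bandwidth route despite its longer
    # propagation delay.
    links = {"s": [("a", 0.010, 1e6), ("b", 0.002, 1e5)],
             "a": [("t", 0.010, 1e6)],
             "b": [("t", 0.002, 1e5)]}
    print(min_delay_path(links, "s", "t", r=100_000))  # (['s', 'a', 't'], 0.22)
    ```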

  16. End-to-end network/application performance troubleshooting methodology

    SciTech Connect

    Wu, Wenji; Bobyshev, Andrey; Bowden, Mark; Crawford, Matt; Demar, Phil; Grigaliunas, Vyto; Grigoriev, Maxim; Petravick, Don (Fermilab)

    2007-09-01

    The computing models for HEP experiments are globally distributed and grid-based. Obstacles to good network performance arise from many causes and can be a major impediment to the success of the computing models for HEP experiments. Factors that affect overall network/application performance exist on the hosts themselves (application software, operating system, hardware), in the local area networks that support the end systems, and within the wide area networks. Since the computer and network systems are globally distributed, it can be very difficult to locate and identify the factors that are hurting application performance. In this paper, we present an end-to-end network/application performance troubleshooting methodology developed and in use at Fermilab. The core of our approach is to narrow down the problem scope with a divide and conquer strategy. The overall complex problem is split into two distinct sub-problems: host diagnosis and tuning, and network path analysis. After satisfactorily evaluating, and if necessary resolving, each sub-problem, we conduct end-to-end performance analysis and diagnosis. The paper will discuss tools we use as part of the methodology. The long term objective of the effort is to enable site administrators and end users to conduct much of the troubleshooting themselves, before (or instead of) calling upon network and operating system 'wizards,' who are always in short supply.
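
    A minimal sketch of the divide-and-conquer idea (not Fermilab's actual toolkit): check the host first, then the path, and only then proceed to combined end-to-end tests. The buffer threshold and RTT limit are invented, and the output parsing assumes Linux-style ping.

    ```python
    import socket
    import subprocess

    def host_ok(min_buf=4 * 1024 * 1024):
        """Host-side check (illustrative): is the default TCP receive buffer
        plausibly large enough for a high bandwidth-delay-product path?"""
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        buf = s.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF)
        s.close()
        return buf >= min_buf

    def path_rtt_ms(host, count=4):
        """Path-side check: average RTT from the system ping utility."""
        out = subprocess.run(["ping", "-c", str(count), host],
                             capture_output=True, text=True, check=True).stdout
        # Summary line: "rtt min/avg/max/mdev = 10.1/10.5/11.0/0.3 ms"
        return float(out.rsplit("=", 1)[1].split()[0].split("/")[1])

    def diagnose(host):
        findings = []
        if not host_ok():
            findings.append("host: enlarge TCP buffers before blaming the network")
        if path_rtt_ms(host) > 100:
            findings.append("path: high latency, examine WAN segments")
        return findings or ["host and path look sane: run end-to-end tests"]

    print(diagnose("example.org"))
    ```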

  17. Recirculating Linac Acceleration - End-to-End Simulation

    SciTech Connect

    Alex Bogacz

    2010-03-01

    A conceptual design of a high-pass-number Recirculating Linear Accelerator (RLA) for muons is presented. The scheme involves three superconducting linacs (201 MHz): a single-pass linear Pre-accelerator followed by a pair of multi-pass (4.5-pass) 'Dogbone' RLAs. Acceleration starts after ionization cooling at 220 MeV/c and proceeds to 12.6 GeV. The Pre-accelerator captures a large muon phase space and accelerates muons to relativistic energies, while adiabatically decreasing the phase-space volume, so that effective acceleration in the RLA is possible. The RLA further compresses and shapes the longitudinal and transverse phase-spaces, while increasing the energy. An appropriate choice of multi-pass linac optics based on FODO focusing assures a large number of passes in the RLA. The proposed 'Dogbone' configuration facilitates simultaneous acceleration of both μ± species through the requirement of mirror-symmetric optics in the return 'droplet' arcs. Finally, the presented end-to-end simulation validates the efficiency and acceptance of the accelerator system.

  18. Key management for large scale end-to-end encryption

    SciTech Connect

    Witzke, E.L.

    1994-07-01

    Symmetric end-to-end encryption requires separate keys for each pair of communicating confidants. This is a problem of order N^2. Other factors, such as multiple sessions per pair of confidants and multiple encryption points in the ISO Reference Model, complicate key management by linear factors. Public-key encryption can reduce the number of keys managed to a linear problem, which is good for scalability of key management, but comes with complicating issues and performance penalties. Authenticity is the primary ingredient of key management. If each potential pair of communicating confidants can authenticate data from each other, then any number of public encryption keys of any type can be communicated with requisite integrity. These public encryption keys can be used with the corresponding private keys to exchange symmetric cryptovariables for high-data-rate privacy protection. The Digital Signature Standard (DSS), which has been adopted by the United States Government, has both public and private components, similar to a public-key cryptosystem. The Digital Signature Algorithm of the DSS is intended for authenticity but not for secrecy. In this paper, the authors show how the use of the Digital Signature Algorithm combined with both symmetric and asymmetric (public-key) encryption techniques can provide a practical solution to key management scalability problems, by reducing the key management complexity to a problem of order N, without sacrificing the encryption speed necessary to operate in high performance networks.
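
    The pattern described, signatures for authenticity plus public-key exchange of a symmetric cryptovariable, can be sketched with the third-party Python cryptography package as a modern stand-in; the paper predates these exact primitives, and the DSA key size and X25519/HKDF choices below are illustrative assumptions.

    ```python
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import dsa, x25519
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Long-lived DSA identity key: authenticity only, order-N key management
    # (each confidant publishes a single verification key).
    alice_id = dsa.generate_private_key(key_size=2048)

    # Ephemeral asymmetric key carries the actual exchange.
    alice_eph = x25519.X25519PrivateKey.generate()
    eph_pub = alice_eph.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)

    # Alice signs her ephemeral public key; Bob verifies it against her
    # published DSA key (verify() raises on failure), then responds in kind.
    sig = alice_id.sign(eph_pub, hashes.SHA256())
    alice_id.public_key().verify(sig, eph_pub, hashes.SHA256())

    bob_eph = x25519.X25519PrivateKey.generate()
    shared = bob_eph.exchange(alice_eph.public_key())  # Alice derives the same

    # Derive the symmetric cryptovariable for high-rate bulk encryption.
    key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"session").derive(shared)
    ```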

  19. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  20. OGC standards for end-to-end sensor network integration

    NASA Astrophysics Data System (ADS)

    Headley, K. L.; Broering, A.; O'Reilly, T. C.; Toma, D.; Del Rio, J.; Bermudez, L. E.; Zedlitz, J.; Johnson, G.; Edgington, D.

    2010-12-01

    …technology, and can communicate with any sensor whose protocol can be described by a SID. The SID interpreter transfers retrieved sensor data to a Sensor Observation Service, and transforms tasks submitted to a Sensor Planning Service into actual sensor commands. The proposed SWE PUCK protocol complements SID by providing a standard way to associate a sensor with a SID, thereby completely automating the sensor integration process. PUCK protocol is implemented in sensor firmware, and provides a means to retrieve a universally unique identifier, metadata and other information from the device itself through its communication interface. Thus the SID interpreter can retrieve a SID directly from the sensor through PUCK protocol. Alternatively, the interpreter can retrieve the sensor's SID from an external source, based on the unique sensor ID provided by PUCK protocol. In this presentation, we describe the end-to-end integration of several commercial oceanographic instruments into a sensor network using PUCK, SID and SWE services. We also present a user-friendly, graphical tool to generate SIDs and tools to visualize sensor data.

  1. Screening California Current fishery management scenarios using the Atlantis end-to-end ecosystem model

    NASA Astrophysics Data System (ADS)

    Kaplan, Isaac C.; Horne, Peter J.; Levin, Phillip S.

    2012-09-01

    …value. However, this cost was minimal when local conservation actions were part of a concerted coast-wide plan. The simulations demonstrate the utility of using the Atlantis end-to-end ecosystem model within NOAA's Integrated Ecosystem Assessment, by illustrating an end-to-end modeling tool that allows consideration of multiple management alternatives that are relevant to numerous state, federal and private interests.

  2. Quality metrics for measuring end-to-end distortion in packet-switched video communication systems

    NASA Astrophysics Data System (ADS)

    Eisenberg, Yiftach; Zhai, Fan; Pappas, Thrasyvoulos N.; Berry, Randall; Katsaggelos, Aggelos K.

    2004-06-01

    A critical component of any video transmission system is an objective metric for evaluating the quality of the video signal as it is seen by the end-user. In packet-based communication systems, such as a wireless channel or the Internet, the quality of the received signal is affected by both signal compression and packet losses. Due to the probabilistic nature of the channel, the distortion in the reconstructed signal is a random variable. In addition, the quality of the reconstructed signal depends on the error concealment strategy. A common approach is to use the expected mean squared error of the end-to-end distortion as the performance metric. It can be shown that this approach leads to unpredictable perceptual artifacts. A better approach is to account for both the mean and the variance of the end-to-end distortion. We explore the perceptual benefits of this approach. By accounting for the variance of the distortion, the difference between the transmitted and the reconstructed signal can be decreased without a significant increase in the expected value of the distortion. Our experimental results indicate that for low to moderate probability of loss, the proposed approach offers significant advantages over strictly minimizing the expected distortion. We demonstrate that controlling the variance of the distortion limits perceptually annoying artifacts such as persistent errors.
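
    The mean-plus-variance criterion is straightforward to state in code. In the toy sketch below the loss model, distortion numbers, and weighting lambda are invented rather than the authors' formulation; the point is only that the score penalizes unpredictable quality instead of average quality alone.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def distortion_samples(p_loss, concealment_gain, n=20_000):
        """Per-frame end-to-end MSE over random packet-loss realizations
        (purely synthetic numbers)."""
        base = 10.0                    # coding distortion, always present
        lost = rng.random(n) < p_loss  # probabilistic channel
        return base + lost * (200.0 / concealment_gain)

    def score(samples, lam=0.5):
        """Mean-variance criterion: E[D] + lam * Var[D]."""
        return samples.mean() + lam * samples.var()

    for gain in (1.0, 2.0, 4.0):
        d = distortion_samples(p_loss=0.05, concealment_gain=gain)
        print(f"concealment gain {gain}: E[D]={d.mean():6.2f}  "
              f"Var[D]={d.var():8.2f}  score={score(d):8.2f}")
    ```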

  3. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Oliver; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and Exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.

  4. Resolving piping analysis issues to minimize impact on installation activities during refueling outage at nuclear power plants

    SciTech Connect

    Bhavnani, D.

    1996-12-01

    While it is required to maintain piping code compliance for all phases of installation activities during outages at a nuclear plant, it is equally essential to reduce challenges to the installation personnel regarding how plant modification work should be performed. Plant betterment activities that incorporate proposed design changes are continually implemented during the outages. Supporting analyses are performed to back these activities for operable systems. The goal is to reduce engineering and craft man-hours and minimize outage time. This paper outlines how the plant modification process can be streamlined so that construction teams can perform tasks that involve safety-related piping. In this manner, installation can proceed while minimizing on-the-spot analytical effort and reducing downtime to support the proposed modifications. Examples are provided that permit performance of installation work in any sequence. Piping and hangers, including the branch lines, are prequalified and determined operable. The system is analyzed up front for all possible scenarios. The modification instructions in the work packages are flexible enough to permit any possible installation sequence. The benefit of this approach is that valuable outage time is not extended and on-site analytical work is not required.

  5. Experimental demonstration of software defined data center optical networks with Tbps end-to-end tunability

    NASA Astrophysics Data System (ADS)

    Zhao, Yongli; Zhang, Jie; Ji, Yuefeng; Li, Hui; Wang, Huitao; Ge, Chao

    2015-10-01

    End-to-end tunability is important for provisioning elastic channels for the bursty traffic of data center optical networks. How, then, can end-to-end tunability be achieved over elastic optical networks? A software-defined networking (SDN) based end-to-end tunability solution is proposed for software-defined data center optical networks, and the protocol extension and implementation procedure are designed accordingly. For the first time, flexible-grid all-optical networks with a Tbps end-to-end tunable transport and switch system have been demonstrated online for data center interconnection, controlled by an OpenDaylight (ODL) based controller. The performance of the end-to-end tunable transport and switch system has been evaluated with wavelength number tuning, bit rate tuning, and transmit power tuning procedures.

  6. A Computer Program for the Distribution of End-to-End Distances in Polymer Molecules

    ERIC Educational Resources Information Center

    Doorne, William Van; And Others

    1976-01-01

    Describes a Fortran program that illustrates how the end-to-end distances in randomly coiled polymer molecules are affected by varying the number and lengths of chains and the angles between them. (MLH)
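
    The demonstration ports readily from Fortran to a few lines of Python. The sketch below is a freely jointed chain stand-in rather than the original program: sample random bond orientations, form each chain's end-to-end vector, and check the ideal-chain prediction <R^2> = N b^2.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def end_to_end_distances(n_chains=10_000, n_bonds=100, bond_length=1.0):
        """Freely jointed chain: each bond is a random unit vector in 3-D."""
        v = rng.normal(size=(n_chains, n_bonds, 3))
        v *= bond_length / np.linalg.norm(v, axis=2, keepdims=True)
        return np.linalg.norm(v.sum(axis=1), axis=1)

    r = end_to_end_distances()
    # Ideal chain: <R^2> = N b^2, so the RMS distance should be near
    # sqrt(100) = 10 bond lengths.
    print(f"RMS end-to-end distance: {np.sqrt((r ** 2).mean()):.2f}")
    ```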

  7. An end-to-end communications architecture for condition-based maintenance applications

    NASA Astrophysics Data System (ADS)

    Kroculick, Joseph

    2014-06-01

    This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission which aligns with the Army's Network Modernization Strategy. The Army's Network Modernization Strategy is based on rolling out network capabilities which connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that allows data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from available network capabilities.

  8. End-to-end Coronagraphic Modeling Including a Low-order Wavefront Sensor

    NASA Technical Reports Server (NTRS)

    Krist, John E.; Trauger, John T.; Unwin, Stephen C.; Traub, Wesley A.

    2012-01-01

    To evaluate space-based coronagraphic techniques, end-to-end modeling is necessary to simulate realistic fields containing speckles caused by wavefront errors. Real systems will suffer from pointing errors and thermal and motion-induced mechanical stresses that introduce time-variable wavefront aberrations that can reduce the field contrast. A low-order wavefront sensor (LOWFS) is needed to measure these changes at a sufficiently high rate to maintain the contrast level during observations. We implement here a LOWFS and corresponding low-order wavefront control subsystem (LOWFCS) in end-to-end models of a space-based coronagraph. Our goal is to be able to accurately duplicate the effect of the LOWFS+LOWFCS without explicitly evaluating the end-to-end model at numerous time steps.

  9. End-to-End Performance of the Future MOMA Instrument Aboard the ExoMars Mission

    NASA Astrophysics Data System (ADS)

    Pinnick, V. T.; Buch, A.; Szopa, C.; Grand, N.; Danell, R.; Grubisic, A.; van Amerom, F. H. W.; Glavin, D. P.; Freissinet, C.; Coll, P. J.; Stalport, F.; Humeau, O.; Arevalo, R. D., Jr.; Brinckerhoff, W. B.; Steininger, H.; Goesmann, F.; Raulin, F.; Mahaffy, P. R.

    2015-12-01

    Following the SAM experiment aboard the Curiosity rover, the Mars Organic Molecule Analyzer (MOMA) experiment aboard the 2018 ExoMars mission will be the continuation of the search for organic matter on the Mars surface. One advancement with the ExoMars mission is that the sample will be extracted from as deep as 2 meters below the Martian surface to minimize the effects of radiation and oxidation on organic materials. To analyze the wide range of organic composition (volatile and non-volatile compounds) of the Martian soil, MOMA is equipped with a dual ion source ion trap mass spectrometer utilizing UV laser desorption/ionization (LDI) and pyrolysis gas chromatography (pyr-GC). In order to analyze refractory organic compounds and chiral molecules during GC-ITMS analysis, samples may be submitted to a derivatization process, consisting of the reaction of the sample components with specific reactants (MTBSTFA [1], DMF-DMA [2] or TMAH [3]). Previous experimental reports have focused on coupling campaigns between the breadboard versions of the GC, provided by the French team (LISA, LATMOS, CentraleSupelec), and the MS, provided by the US team (NASA-GSFC). This work focuses on the performance verification and optimization of the GC-ITMS experiment using the Engineering Test Unit (ETU) models, which are representative of the form, fit and function of the flight instrument, including a flight-like pyrolysis oven and tapping station provided by the German team (MPS). The results obtained demonstrate the current status of the end-to-end performance of the gas chromatography-mass spectrometry mode of operation. References: [1] Buch, A. et al. (2009) J. Chrom. A, 43, 143-151. [2] Freissinet et al. (2011) J. Chrom. A, 1306, 59-71. [3] Geffroy-Rodier, C. et al. (2009) JAAP, 85, 454-459.

  10. A Robust Method to Integrate End-to-End Mission Architecture Optimization Tools

    NASA Technical Reports Server (NTRS)

    Lugo, Rafael; Litton, Daniel; Qu, Min; Shidner, Jeremy; Powell, Richard

    2016-01-01

    End-to-end mission simulations include multiple phases of flight. For example, an end-to-end Mars mission simulation may include launch from Earth, interplanetary transit to Mars, and entry, descent and landing. Each phase of flight is optimized to meet specified constraints and often depends on and impacts subsequent phases. The design and optimization tools and methodologies used to combine different aspects of the end-to-end framework and their impact on mission planning are presented. This work focuses on a robust implementation of a Multidisciplinary Design Analysis and Optimization (MDAO) method that offers the flexibility to quickly adapt to changing mission design requirements. Different simulations tailored to the liftoff, ascent, and atmospheric entry phases of a trajectory are integrated and optimized in the MDAO program Isight, which provides the user a graphical interface to link simulation inputs and outputs. This approach provides many advantages to mission planners, as it is easily adapted to different mission scenarios and can improve the understanding of the integrated system performance within a particular mission configuration. A Mars direct entry mission using the Space Launch System (SLS) is presented as a generic end-to-end case study. For the given launch period, the SLS launch performance is traded for improved orbit geometry alignment, resulting in an optimized net payload that is comparable to that in the SLS Mission Planner's Guide.

  11. Scalable end-to-end encryption technology for supra-gigabit/second networking

    SciTech Connect

    Pierson, L.G.; Tarman, T.D.; Witzke, E.L.

    1997-05-01

    End-to-end encryption can protect proprietary information as it passes through a complex inter-city computer network, even if the intermediate systems are untrusted. This technique involves encrypting the body of computer messages while leaving network addressing and control information unencrypted for processing by intermediate network nodes. Because high speed implementations of end-to-end encryption with easy key management for standard network protocols are unavailable, this technique is not widely used today. Specifically, no end-to-end encryptors exist to protect Asynchronous Transfer Mode (ATM) traffic, nor to protect Switched Multi-megabit Data Service (SMDS), which is the first "Broadband Integrated Services Digital Network" (BISDN) service now being used by long distance telephone companies. This encryption technology is required for the protection of data in transit between industrial sites and central Massively Parallel Supercomputing Centers over high bandwidth, variable bit rate (BISDN) services. This research effort investigated techniques to scale end-to-end encryption technology from today's state of the art (approximately 0.001 Gb/s) to 2.4 Gb/s and higher. A cryptosystem design has been developed which scales for implementation beyond SONET OC-48 (2.4 Gb/s) data rates. A prototype for use with OC-3 (0.155 Gb/s) ATM variable bit rate services was developed.

  12. End-to-End Data System Architecture for the Space Station Biological Research Project

    NASA Technical Reports Server (NTRS)

    Mian, Arshad; Scimemi, Sam; Adeni, Kaiser; Picinich, Lou; Ramos, Rubin (Technical Monitor)

    1998-01-01

    The Space Station Biological Research Project (SSBRP) is developing hardware referred to as the "facility" for providing life sciences research capability on the International Space Station. This hardware includes several biological specimen habitats, habitat holding racks, a centrifuge and a glovebox. An SSBRP end-to-end data system architecture has been developed to allow command and control of the facility from the ground, either with crew assistance or autonomously. The data system will be capable of handling commands, sensor data, and video from multiple cameras. The data will traverse several onboard and ground networks and processing entities, including the SSBRP and Space Station onboard and ground data systems. A large number of onboard and ground entities of the data system are being developed by the Space Station Program, other NASA centers and the International Partners. The SSBRP part of the system, which includes the habitats, holding racks, and the ground operations center, the User Operations Facility (UOF), will be developed by a multitude of geographically distributed development organizations. The SSBRP has the responsibility to define the end-to-end data and communications systems to make the interfaces manageable and verifiable with multiple contractors with widely varying development constraints and schedules. This paper provides an overview of the SSBRP end-to-end data system. Specifically, it describes the hardware, software and functional interactions of individual systems, and interface requirements among various entities of the end-to-end system.

  13. End-to-end network models encompassing terrestrial, wireless, and satellite components

    NASA Astrophysics Data System (ADS)

    Boyarko, Chandler L.; Britton, John S.; Flores, Phil E.; Lambert, Charles B.; Pendzick, John M.; Ryan, Christopher M.; Shankman, Gordon L.; Williams, Ramon P.

    2004-08-01

    Development of network models that reflect true end-to-end architectures, such as the Transformational Communications Architecture, needs to encompass terrestrial, wireless and satellite components to truly represent all of the complexities in a worldwide communications network. Use of best-in-class tools, including OPNET, Satellite Tool Kit (STK), Popkin System Architect and their well-known XML-friendly definitions, such as OPNET Modeler's Data Type Description (DTD), or socket-based data transfer modules, such as STK/Connect, enables the sharing of data between applications for more rapid development of end-to-end system architectures and a more complete system design. By sharing the results of and integrating best-in-class tools we are able to (1) promote sharing of data, (2) enhance the fidelity of our results and (3) allow network and application performance to be viewed in the context of the entire enterprise and its processes.

  14. The International Space Station Alpha (ISSA) End-to-End On-Orbit Maintenance Process Flow

    NASA Technical Reports Server (NTRS)

    Zingrebe, Kenneth W., II

    1995-01-01

    As a tool for construction and refinement of the on-orbit maintenance system to sustain the International Space Station Alpha (ISSA), the Mission Operations Directorate (MOD) developed an end-to-end on-orbit maintenance process flow. This paper discusses and demonstrates that process flow. This tool is being used by MOD to identify areas which require further work in preparation for MOD's role in the conduct of on-orbit maintenance operations.

  15. Surgical Outcome of Excision and End-to-End Anastomosis for Bulbar Urethral Stricture

    PubMed Central

    Suh, Jun-Gyo; Choi, Woo Suk; Paick, Jae-Seung

    2013-01-01

    Purpose Although direct-vision internal urethrotomy can be performed for the management of short, bulbar urethral strictures, excision and end-to-end anastomosis remains the best procedure to guarantee a high success rate. We performed a retrospective evaluation of patients who underwent bulbar end-to-end anastomosis to assess the factors affecting surgical outcome. Materials and Methods We reviewed 33 patients with an average age of 55 years who underwent bulbar end-to-end anastomosis. Stricture etiology was blunt perineal trauma (54.6%), iatrogenic (24.2%), idiopathic (12.1%), and infection (9.1%). A total of 21 patients (63.6%) underwent urethrotomy, dilation, or multiple treatments before referral to our center. Clinical outcome was considered a treatment failure when any postoperative instrumentation was needed. Results Mean operation time was 151 minutes (range, 100 to 215 minutes) and mean excised stricture length was 1.5 cm (range, 0.8 to 2.3 cm). At a mean follow-up of 42.6 months (range, 8 to 96 months), 29 patients (87.9%) were symptom-free and required no further procedure. Strictures recurred in 4 patients (12.1%) within 5 months after surgery. Of four recurrences, one patient was managed successfully by urethrotomy, whereas the remaining three did not respond to urethrotomy or dilation and required additional urethroplasty. The recurrence rate was significantly higher in the patients with nontraumatic causes (iatrogenic in three, infection in one patient) than in the patients with traumatic etiology. Conclusions Excision and end-to-end anastomosis for short, bulbar urethral stricture has an acceptable success rate of 87.9%. However, careful consideration is needed to decide on the surgical procedure if the stricture etiology is nontraumatic. PMID:23878686

  16. End-to-end calculation of the radiation characteristics of VVER-1000 spent fuel assemblies

    NASA Astrophysics Data System (ADS)

    Linge, I. I.; Mitenkova, E. F.; Novikov, N. V.

    2012-12-01

    The results of end-to-end calculation of the radiation characteristics of VVER-1000 spent nuclear fuel are presented. Details of formation of neutron and gamma-radiation sources are analyzed. Distributed sources of different types of radiation are considered. A comparative analysis of calculated radiation characteristics is performed with the use of nuclear data from different ENDF/B and EAF files and ANSI/ANS and ICRP standards.

  17. CHEETAH: circuit-switched high-speed end-to-end transport architecture

    NASA Astrophysics Data System (ADS)

    Veeraraghavan, Malathi; Zheng, Xuan; Lee, Hyuk; Gardner, M.; Feng, Wuchun

    2003-10-01

    Leveraging the dominance of Ethernet in LANs and SONET/SDH in MANs and WANs, we propose a service called CHEETAH (Circuit-switched High-speed End-to-End Transport ArcHitecture). The service concept is to provide end hosts with high-speed, end-to-end circuit connectivity on a call-by-call shared basis, where a "circuit" consists of Ethernet segments at the ends that are mapped into Ethernet-over-SONET long-distance circuits. This paper focuses on the file-transfer application for such circuits. For this application, the CHEETAH service is proposed as an add-on to the primary Internet access service already in place for enterprise hosts. This allows an end host that is sending a file to first attempt setting up an end-to-end Ethernet/EoS circuit, and if rejected, fall back to the TCP/IP path. If the circuit setup is successful, the end host will enjoy a much shorter file-transfer delay than on the TCP/IP path. To determine the conditions under which an end host with access to the CHEETAH service should attempt circuit setup, we analyze mean file-transfer delays as a function of call blocking probability in the circuit-switched network, probability of packet loss in the IP network, round-trip times, link rates, and so on.
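
    The attempt-or-fall-back decision reduces to comparing two expected delays. The sketch below is a deliberately simplified illustration rather than the paper's analysis: a single circuit-setup attempt, a fixed-throughput TCP model, and invented numbers.

    ```python
    def mean_transfer_delay(file_bytes, p_block, setup_s, circuit_bps,
                            tcp_bps, rtt_s):
        """Expected delay when the host first attempts a circuit and falls
        back to the TCP/IP path if the setup call is blocked."""
        circuit = setup_s + 8 * file_bytes / circuit_bps
        tcp = rtt_s + 8 * file_bytes / tcp_bps  # crude fixed-rate TCP model
        return (1 - p_block) * circuit + p_block * (setup_s + tcp)

    # Attempting the circuit pays off unless blocking is near certain.
    tcp_only = 0.05 + 8e9 / 1e8
    for p_block in (0.01, 0.3, 0.9):
        d = mean_transfer_delay(1e9, p_block, setup_s=0.05,
                                circuit_bps=1e9, tcp_bps=1e8, rtt_s=0.05)
        print(f"p_block={p_block}: circuit-first {d:6.2f} s "
              f"vs TCP-only {tcp_only:6.2f} s")
    ```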

  18. An end-to-end approach to developing biological and chemical detector requirements

    NASA Astrophysics Data System (ADS)

    Teclemariam, Nerayo P.; Purvis, Liston K.; Foltz, Greg W.; West, Todd; Edwards, Donna M.; Fruetel, Julia A.; Gleason, Nathaniel J.

    2009-05-01

    Effective defense against chemical and biological threats requires an "end-to-end" strategy that encompasses the entire problem space, from threat assessment and target hardening to response planning and recovery. A key element of the strategy is the definition of appropriate system requirements for surveillance and detection of threat agents. Our end-to-end approach to venue chem/bio defense is captured in the Facilities Weapons of Mass Destruction Decision Analysis Capability (FacDAC), an integrated system-of-systems toolset that can be used to generate requirements across all stages of detector development. For example, in the early stage of detector development the approach can be used to develop performance targets (e.g., sensitivity, selectivity, false positive rate) to provide guidance on what technologies to pursue. In the development phase, after a detector technology has been selected, the approach can aid in determining performance trade-offs and down-selection of competing technologies. During the application stage, the approach can be employed to design optimal defensive architectures that make the best use of available technology to maximize system performance. This presentation will discuss the end-to-end approach to defining detector requirements and demonstrate the capabilities of the FacDAC toolset using examples from a number of studies for the Department of Homeland Security.

  19. End-to-end distribution for a wormlike chain in arbitrary dimensions.

    PubMed

    Mehraeen, Shafigh; Sudhanshu, Bariz; Koslover, Elena F; Spakowitz, Andrew J

    2008-06-01

    We construct an efficient methodology for calculating wormlike chain statistics in arbitrary D dimensions over all chain rigidities, from fully rigid to completely flexible. The structure of our exact analytical solution for the end-to-end distribution function for a wormlike chain in arbitrary D dimensions in Fourier-Laplace space (i.e., Fourier-transformed end position and Laplace-transformed chain length) adopts the form of an infinite continued fraction, which is advantageous for its compact structure and stability for numerical implementation. We then proceed to present a step-by-step methodology for performing the Fourier-Laplace inversion in order to make full use of our results in general applications. Asymptotic methods for evaluating the Laplace inversion (power-law expansion and Rayleigh-Schrödinger perturbation theory) are employed in order to improve the accuracy of the numerical inversions of the end-to-end distribution function in real space. We adapt our results to the evaluation of the single-chain structure factor, rendering simple, closed-form expressions that facilitate comparison with scattering experiments. Using our techniques, the accuracy of the end-to-end distribution function is enhanced up to the limit of the machine precision. We demonstrate the utility of our methodology with realizations of the chain statistics, giving a general methodology that can be applied to a wide range of biophysical problems. PMID:18643291

  20. End-to-end distribution for a wormlike chain in arbitrary dimensions

    NASA Astrophysics Data System (ADS)

    Mehraeen, Shafigh; Sudhanshu, Bariz; Koslover, Elena F.; Spakowitz, Andrew J.

    2008-06-01

    We construct an efficient methodology for calculating wormlike chain statistics in arbitrary D dimensions over all chain rigidities, from fully rigid to completely flexible. The structure of our exact analytical solution for the end-to-end distribution function for a wormlike chain in arbitrary D dimensions in Fourier-Laplace space (i.e., Fourier-transformed end position and Laplace-transformed chain length) adopts the form of an infinite continued fraction, which is advantageous for its compact structure and stability for numerical implementation. We then proceed to present a step-by-step methodology for performing the Fourier-Laplace inversion in order to make full use of our results in general applications. Asymptotic methods for evaluating the Laplace inversion (power-law expansion and Rayleigh-Schrödinger perturbation theory) are employed in order to improve the accuracy of the numerical inversions of the end-to-end distribution function in real space. We adapt our results to the evaluation of the single-chain structure factor, rendering simple, closed-form expressions that facilitate comparison with scattering experiments. Using our techniques, the accuracy of the end-to-end distribution function is enhanced up to the limit of the machine precision. We demonstrate the utility of our methodology with realizations of the chain statistics, giving a general methodology that can be applied to a wide range of biophysical problems.
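
    Numerically, an infinite continued fraction is evaluated by truncating at a finite depth and recursing backward, a compact and stable technique. The sketch below is generic: the coefficient functions are illustrative placeholders, not the paper's wormlike-chain coefficients.

    ```python
    def continued_fraction(a, b, depth):
        """Evaluate b(0) + a(1)/(b(1) + a(2)/(b(2) + ...)), truncated at
        `depth`, by numerically stable backward recursion."""
        tail = b(depth)
        for j in range(depth, 0, -1):
            tail = b(j - 1) + a(j) / tail
        return tail

    # Sanity check on a classical identity: 1 + 1/(1 + 1/(1 + ...))
    # converges to the golden ratio.
    phi = continued_fraction(a=lambda j: 1.0, b=lambda j: 1.0, depth=50)
    print(phi)  # 1.618033988749895
    ```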

  1. End-to-end security in telemedical networks--a practical guideline.

    PubMed

    Wozak, Florian; Schabetsberger, Thomas; Ammenwerth, Elske

    2007-01-01

    The interconnection of medical networks in different healthcare institutions will be constantly increasing over the next few years, which will require concepts for securing medical data during transfer, since transmitting patient related data via potentially insecure public networks is considered a violation of data privacy. The aim of our work was to develop a model-based approach towards end-to-end security, which is defined as continuous security from point of origin to point of destination in a communication process. We show that end-to-end security must be seen as a holistic security concept, which comprises the following three major parts: authentication and access control, transport security, as well as system security. For integration into existing security infrastructures, abuse case models were used, which extend UML use cases by elements necessary to describe abusive interactions. Abuse case models can be constructed for each part mentioned above, allowing for potential security risks in communication from point of origin to point of destination to be identified and counteractive measures to be directly derived from the abuse case models. The model-based approach is a guideline to continuous risk assessment and improvement of end-to-end security in medical networks. Validity and relevance to practice will be systematically evaluated using close-to-reality test networks as well as in production environments. PMID:17097916

  2. End-to-end modeling of the ozone mapping and profiler suite

    NASA Astrophysics Data System (ADS)

    McComas, Brian K.; Seftor, Colin; Remund, Quinn; Larsen, Jack; Wright, Carter; Raine, Erica

    2004-09-01

    The Ozone Mapping and Profiler Suite (OMPS) is an instrument suite in the National Polar-orbiting Operational Environmental Satellite System (NPOESS). The OMPS instrument is designed to globally retrieve both total column ozone and ozone profiles. To do this, OMPS consists of three sensors, two nadir instruments and one limb instrument. Each OMPS sensor has an End-to-End Model (ETEM) developed using the Toolkit for Remote Sensing, Analysis, Design, Evaluation, and Simulation (TRADES), a Ball Aerospace proprietary set of software tools developed in Matlab. The end-to-end modeling activities, which include a radiative transfer model, the ETEM, and retrieval algorithms, have three fundamental objectives: sensor performance validation, aiding algorithm development, and algorithm robustness validation. The end-to-end modeling activities are key to showing that sensor performance meets the system-level Environmental Data Record (EDR) requirements. To do this, the ETEM incorporates sensor data, including point spread functions, stray light, dispersion, bandpass, and focal plane array (FPA) noise parameters. The sensor model characteristics are first implemented with predictions and updated as component test data become available. To evaluate the system's EDR performance, the input radiance derived from the radiative transfer model is entered into the ETEM, which outputs a simulated image. The retrieval algorithms process the simulated image to determine the ozone amount. The system-level EDR performance is determined by comparing the retrieved ozone amount with the truth, which was entered into the forward model. Additionally, the ETEM aids algorithm development by simulating the expected sensor and calibration data with the expected noise characteristics. Finally, the algorithm robustness can be validated against extreme conditions using the ETEM.
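
    In miniature, an end-to-end model closes the loop: truth, forward radiance, sensor effects, retrieval, comparison against truth. The 1-D Python sketch below invents all of the physics (a toy absorption law, a boxcar PSF, Gaussian noise) purely to show the shape of that loop; the real ETEM carries detailed PSF, stray light, dispersion, and FPA models.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def forward_model(ozone, wavelengths):
        """Toy radiative transfer: deeper absorption with more ozone."""
        return np.exp(-ozone * 0.002 * (1 + np.cos(wavelengths / 40.0)))

    def sensor_model(radiance):
        """Toy sensor: 3-sample boxcar PSF blur plus additive noise."""
        blurred = np.convolve(radiance, np.ones(3) / 3.0, mode="same")
        return blurred + rng.normal(0.0, 0.002, radiance.size)

    def retrieve(image, wavelengths):
        """Brute-force retrieval: the ozone amount whose forward radiance
        best matches the simulated image in least squares."""
        grid = np.linspace(200.0, 400.0, 2001)
        resid = [np.sum((forward_model(c, wavelengths) - image) ** 2)
                 for c in grid]
        return grid[int(np.argmin(resid))]

    wavelengths = np.linspace(300.0, 380.0, 64)
    truth = 315.0
    image = sensor_model(forward_model(truth, wavelengths))
    print(f"truth {truth}, retrieved {retrieve(image, wavelengths):.1f}")
    ```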

  3. End-to-end planning and scheduling systems technology for space operations

    NASA Astrophysics Data System (ADS)

    Moe, Karen L.

    1992-08-01

    Consideration is given to planning and scheduling operations concepts from an end-to-end perspective, through both mission operations and institutional support functions. An operations concept is proposed which is based on a flexible request language used to state resource requirements and mission constraints to a scheduling system. The language has the potential to evolve into an international standard for exchanging service request information on international space networks. The key benefit of the flexible scheduling request concept is the shift of a significant conflict resolution effort from humans to computers, reducing the time for generating a week's worth of schedules to hours instead of days.

  4. End-to-end planning and scheduling systems technology for space operations

    NASA Technical Reports Server (NTRS)

    Moe, Karen L.

    1992-01-01

    Consideration is given to planning and scheduling operations concepts from an end-to-end perspective, through both mission operations and institutional support functions. An operations concept is proposed which is based on a flexible request language used to state resource requirements and mission constraints to a scheduling system. The language has the potential to evolve into an international standard for exchanging service request information on international space networks. The key benefit of the flexible scheduling request concept is the shift of a significant conflict resolution effort from humans to computers, reducing the time for generating a week's worth of schedules to hours instead of days.
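
    A toy illustration of shifting conflict resolution from humans to computers (the request fields and the greedy resolver are invented here, not the proposed standard): each request states a resource, a duration, and an acceptable window, and the scheduler searches for a conflict-free assignment, flagging only unresolvable requests for human attention.

    ```python
    def schedule(requests, horizon=24):
        """Greedy assignment of single-resource requests, tightest window
        first. Each request: (name, resource, duration_h, earliest_start_h,
        latest_end_h)."""
        busy, plan = {}, {}
        for name, res, dur, lo, hi in sorted(requests, key=lambda r: r[4] - r[2]):
            slots = busy.setdefault(res, [])
            start = lo
            while start + dur <= min(hi, horizon):
                if all(start + dur <= s or start >= e for s, e in slots):
                    slots.append((start, start + dur))
                    plan[name] = (res, start)
                    break
                start += 1
            else:
                plan[name] = (res, None)  # unresolvable: refer to a human
        return plan

    requests = [("pass-A", "antenna-1", 2, 0, 10),
                ("pass-B", "antenna-1", 3, 1, 6),
                ("pass-C", "antenna-1", 2, 2, 5)]
    print(schedule(requests))
    # pass-B starts at 1, pass-A at 4; pass-C cannot fit and is flagged None
    ```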

  5. Satellite/Terrestrial Networks: End-to-End Communication Interoperability Quality of Service Experiments

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    1998-01-01

    Various issues associated with satellite/terrestrial end-to-end communication interoperability are presented in viewgraph form. Specific topics include: 1) Quality of service; 2) ATM performance characteristics; 3) MPEG-2 transport stream mapping to AAL-5; 4) Observation and discussion of compressed video tests over ATM; 5) Digital video over satellites status; 6) Satellite link configurations; 7) MPEG-2 over ATM with binomial errors; 8) MPEG-2 over ATM channel characteristics; 9) MPEG-2 over ATM over emulated satellites; 10) MPEG-2 transport stream with errors; and 11) a dual decoder test.
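
    One of the listed topics, the MPEG-2 transport stream mapping to AAL-5, comes down to simple cell arithmetic. The sketch below assumes the common packing of two 188-byte TS packets per AAL-5 PDU; with the 8-byte CPCS trailer this fills exactly eight ATM cells, which is why that packing wastes no padding.

```python
import math

TS_PACKET = 188      # MPEG-2 transport stream packet (bytes)
AAL5_TRAILER = 8     # AAL-5 CPCS-PDU trailer (bytes)
CELL_PAYLOAD = 48    # ATM cell payload (bytes)

def cells_per_pdu(ts_packets_per_pdu):
    payload = ts_packets_per_pdu * TS_PACKET + AAL5_TRAILER
    return math.ceil(payload / CELL_PAYLOAD)

# 1 TS packet  -> 196 bytes -> 5 cells (44 bytes of padding wasted)
# 2 TS packets -> 384 bytes -> exactly 8 cells (no padding)
for n in (1, 2):
    print(n, "TS packet(s) ->", cells_per_pdu(n), "ATM cells")
```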

  6. Information adaptive system of NEEDS [of NASA End-to-End Data System]

    NASA Technical Reports Server (NTRS)

    Howle, W. M., Jr.; Kelly, W. L.

    1979-01-01

    The NASA End-to-End Data System (NEEDS) program was initiated by NASA to improve significantly the state of the art in acquisition, processing, and distribution of space-acquired data for the mid-1980s and beyond. The information adaptive system (IAS) is a program element under NEEDS Phase II which addresses sensor specific processing on board the spacecraft. The IAS program is a logical first step toward smart sensors, and IAS developments - particularly the system components and key technology improvements - are applicable to future smart efforts. The paper describes the design goals and functional elements of the IAS. In addition, the schedule for IAS development and demonstration is discussed.

  7. EFFIS: an End-to-end Framework for Fusion Integrated Simulation

    SciTech Connect

    Cummings, Julian; Schwan, Karsten; Sim, Alexander S; Shoshani, Arie; Docan, Ciprian; Parashar, Manish; Klasky, Scott A; Podhorszki, Norbert

    2010-01-01

    The purpose of the Fusion Simulation Project is to develop a predictive capability for integrated modeling of magnetically confined burning plasmas. In support of this mission, the Center for Plasma Edge Simulation has developed an End-to-end Framework for Fusion Integrated Simulation (EFFIS) that combines critical computer science technologies in an effective manner to support leadership class computing and the coupling of complex plasma physics models. We describe here the main components of EFFIS and how they are being utilized to address our goal of integrated predictive plasma edge simulation.

  8. An End-To-End Test of A Simulated Nuclear Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    VanDyke, Melissa; Hrbud, Ivana; Goddfellow, Keith; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    The Safe Affordable Fission Engine (SAFE) test series addresses Phase I Space Fission Systems issues, in particular non-nuclear testing and system integration issues leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shutdowns), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, conversion system, and a thruster, where the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  9. A Bottom-up Route to a Chemically End-to-End Assembly of Nanocellulose Fibers.

    PubMed

    Yang, Han; van de Ven, Theo G M

    2016-06-13

    In this work, we take advantage of the rod-like structure of electrosterically stabilized nanocrystalline cellulose (ENCC, with a width of about 7 nm and a length of about 130 nm), which has dicarboxylated cellulose (DCC) chains protruding from both ends that provide electrosteric stability for ENCC particles, to chemically assemble these particles end-to-end into nanocellulose fibers. ENCC with shorter DCC chains can be obtained by mild hydrolysis of ENCC with HCl; the hydrolyzed ENCC (HENCC, with a width of about 6 nm and a length of about 120 nm) is then suitable for assembly into high-aspect-ratio nanofibers by chemically cross-linking HENCC from one end to another. Two sets of HENCC were prepared by carbodiimide-mediated formation of an alkyne and an azide derivative, respectively. Cross-linking these two sets of HENCC was performed by a click reaction. HENCCs were also end-to-end cross-linked by a bioconjugation reaction with a diamine. From atomic force microscopy (AFM) images, about ten HENCC nanoparticles were cross-linked and formed high-aspect-ratio nanofibers with a width of about 6 nm and a length of more than 1 μm. PMID:27211496

  10. End-to-end validation process for the INTA-Nanosat-1B Attitude Control System

    NASA Astrophysics Data System (ADS)

    Polo, Óscar R.; Esteban, Segundo; Cercos, Lorenzo; Parra, Pablo; Angulo, Manuel

    2014-01-01

    This paper describes the end-to-end validation process for the Attitude Control Subsystem (ACS) of the satellite INTA-NanoSat-1B (NS-1B). This satellite was launched in July 2009 and has been fully operative since then. The development of its ACS modules required an exhaustive integration and a system-level validation program. Some of the tests were centred on the validation of the drivers of sensors and actuators and were carried out over the flying model of the satellite. Others, more complex, constituted end-to-end tests where the concurrency of modules, the real-time control requirements and even the well-formedness of the telemetry data were verified. This work presents an incremental and highly automated way of performing the ACS validation program based on two development suites and an end-to-end validation environment. The validation environment combines a Flat Satellite (FlatSat) configuration and a real-time emulator working in closed-loop. The FlatSat is built using the NS-1B Qualification Model (QM) hardware and it can run a complete version of the on-board software with the ACS modules fully integrated. The real-time emulator, running on an industrial PC, samples the actuation signals and emulates the sensor signals to close the control loop with the FlatSat. This validation environment constitutes a low-cost alternative to the classical three-axis tilt table, with the advantage of being easily configured for working under specific orbit conditions, in accordance with any of the selected tests. The approach has been successfully applied to the NS-1B in order to verify different ACS modes under multiple orbit scenarios, providing an exhaustive coverage and reducing the risk of eventual errors during the satellite's lifetime. The strategy was also applied during the validation of the maintenance and reconfiguration procedures required once the satellite was launched. This paper describes in detail the complete ACS validation process that was
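
    As an illustration of the closed-loop principle behind the FlatSat plus real-time emulator setup, the toy below runs a single-axis attitude loop: an "emulator" integrates rigid-body dynamics and synthesizes a noisy gyro signal, and an "on-board" PD law closes the loop. Every constant (inertia, gains, noise level) is an invented placeholder, not an NS-1B parameter.

```python
import numpy as np

I = 0.05           # spacecraft inertia about one axis (kg m^2), assumed
dt = 0.1           # emulator sample period (s)
kp, kd = 0.02, 0.08   # PD gains, assumed

theta, omega = np.radians(10.0), 0.0   # initial attitude error (rad), rate

for step in range(600):
    gyro = omega + np.random.normal(0.0, 1e-4)   # emulated sensor signal
    torque = -kp * theta - kd * gyro             # on-board ACS control law
    omega += (torque / I) * dt                   # emulated plant dynamics
    theta += omega * dt

print("final pointing error (deg):", np.degrees(theta))
```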

  11. Cyberinfrastructure to support Real-time, End-to-End, High Resolution, Localized Forecasting

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Lindholm, D.; Baltzer, T.; Domenico, B.

    2004-12-01

    From natural disasters such as flooding and forest fires to man-made disasters such as toxic gas releases, the impact of weather-influenced severe events on society can be profound. Understanding, predicting, and mitigating such local, mesoscale events calls for a cyberinfrastructure to integrate multidisciplinary data, tools, and services as well as the capability to generate and use high resolution data (such as wind and precipitation) from localized models. The need for such end to end systems -- including data collection, distribution, integration, assimilation, regionalized mesoscale modeling, analysis, and visualization -- has been realized to some extent in many academic and quasi-operational environments, especially for atmospheric sciences data. However, many challenges still remain in the integration and synthesis of data from multiple sources and the development of interoperable data systems and services across those disciplines. Over the years, the Unidata Program Center has developed several tools that have either directly or indirectly facilitated these local modeling activities. For example, the community is using Unidata technologies such as the Internet Data Distribution (IDD) system, Local Data Manager (LDM), decoders, netCDF libraries, Thematic Realtime Environmental Distributed Data Services (THREDDS), and the Integrated Data Viewer (IDV) in their real-time prediction efforts. In essence, these technologies for data reception and processing, local and remote access, cataloging, and analysis and visualization coupled with technologies from others in the community are becoming the foundation of a cyberinfrastructure to support an end-to-end regional forecasting system. To build on these capabilities, the Unidata Program Center is pleased to be a significant contributor to the Linked Environments for Atmospheric Discovery (LEAD) project, a NSF-funded multi-institutional large Information Technology Research effort. The goal of LEAD is to create an

  12. End-to-end performance analysis using engineering confidence models and a ground processor prototype

    NASA Astrophysics Data System (ADS)

    Kruse, Klaus-Werner; Sauer, Maximilian; Jäger, Thomas; Herzog, Alexandra; Schmitt, Michael; Huchler, Markus; Wallace, Kotska; Eisinger, Michael; Heliere, Arnaud; Lefebvre, Alain; Maher, Mat; Chang, Mark; Phillips, Tracy; Knight, Steve; de Goeij, Bryan T. G.; van der Knaap, Frits; Van't Hof, Adriaan

    2015-10-01

    The European Space Agency (ESA) and the Japan Aerospace Exploration Agency (JAXA) are co-operating to develop the EarthCARE satellite mission with the fundamental objective of improving the understanding of the processes involving clouds, aerosols and radiation in the Earth's atmosphere. The EarthCARE Multispectral Imager (MSI) is relatively compact for a spaceborne imager. As a consequence, the immediate point-spread function (PSF) of the instrument will be mainly determined by the diffraction caused by the relatively small optical aperture. In order to still achieve a high-contrast image, de-convolution processing is applied to remove the impact of diffraction on the PSF. A Lucy-Richardson algorithm has been chosen for this purpose. This paper will describe the system setup and the necessary data pre-processing and post-processing steps applied in order to compare the end-to-end image quality with the L1b performance required by the science community.
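
    The Lucy-Richardson (Richardson-Lucy) step named above has a compact standard form. The sketch below is the basic unregularized iteration applied to a synthetic diffraction-style PSF; it is illustrative only, not the mission's flight or ground implementation.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=30):
    """Basic unregularized Richardson-Lucy deconvolution."""
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(image, image.mean())
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Demo: blur a point-like scene with a diffraction-style PSF, then restore it.
x = np.linspace(-3, 3, 15)
psf = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2))
psf /= psf.sum()
scene = np.zeros((64, 64))
scene[32, 32] = 1.0
observed = fftconvolve(scene, psf, mode="same")
restored = richardson_lucy(observed, psf)
print("peak before/after deconvolution:", observed.max(), restored.max())
```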

  13. End-to-End Network Simulation Using a Site-Specific Radio Wave Propagation Model

    SciTech Connect

    Djouadi, Seddik M; Kuruganti, Phani Teja; Nutaro, James J

    2013-01-01

    The performance of systems that rely on a wireless network depends on the propagation environment in which that network operates. To predict how these systems and their supporting networks will perform, simulations must take into consideration the propagation environment and how it affects the performance of the wireless network. Network simulators typically use empirical models of the propagation environment. However, these models are not intended for, and cannot be used to, predict how a wireless system will perform in a specific location, e.g., in the center of a particular city or the interior of a specific manufacturing facility. In this paper, we demonstrate how a site-specific propagation model and the NS3 simulator can be used to predict the end-to-end performance of a wireless network.
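
    For contrast with the site-specific approach, the snippet below shows the kind of empirical model the abstract argues is insufficient on its own: a generic log-distance law whose exponent and reference loss are assumed rather than measured for any particular site.

```python
import math

def log_distance_path_loss_db(d_m, d0=1.0, pl0_db=40.0, n=3.0):
    """Empirical log-distance path loss. In the site-specific case, n and
    pl0_db would instead come from measurements or a ray-tracing solver;
    the values here are assumptions."""
    return pl0_db + 10.0 * n * math.log10(d_m / d0)

tx_power_dbm, rx_sensitivity_dbm = 15.0, -85.0   # assumed radio parameters
for d in (10, 50, 200):
    rx = tx_power_dbm - log_distance_path_loss_db(d)
    status = "link up" if rx > rx_sensitivity_dbm else "link down"
    print(f"{d:4d} m: Rx = {rx:6.1f} dBm -> {status}")
```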

  14. Data analysis pipeline for EChO end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Waldmann, Ingo P.; Pascale, E.

    2015-12-01

    Atmospheric spectroscopy of extrasolar planets is an intricate business. Atmospheric signatures typically require a photometric precision of 1×10⁻⁴ in flux over several hours. Such precision demands high instrument stability as well as an understanding of stellar variability and an optimal data reduction and removal of systematic noise. In the context of the EChO mission concept, we here discuss the data reduction and analysis pipeline developed for the EChO end-to-end simulator EChOSim. We present and discuss the step-by-step procedures required in order to obtain the final exoplanetary spectrum from the EChOSim `raw data' using a simulated observation of the secondary eclipse of the hot Neptune 55 Cnc e.

  15. End-to-end interoperability and workflows from building architecture design to one or more simulations

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  16. The Kepler End-to-End Data Pipeline: From Photons to Far Away Worlds

    NASA Technical Reports Server (NTRS)

    Cooke, Brian; Thompson, Richard; Standley, Shaun

    2012-01-01

    Launched by NASA on 6 March 2009, the Kepler Mission has been observing more than 100,000 targets in a single patch of sky between the constellations Cygnus and Lyra almost continuously for the last two years looking for planetary systems using the transit method. As of October 2011, the Kepler spacecraft has collected and returned to Earth just over 290 GB of data, identifying 1235 planet candidates, with 25 of these candidates confirmed as planets via ground observation. Extracting the telltale signature of a planetary system from stellar photometry, where valid signal transients can be as small as 40 ppm, is a difficult and exacting task. The end-to-end processing of determining planetary candidates from noisy, raw photometric measurements is discussed.
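
    To put the 40 ppm figure in perspective, the back-of-envelope sketch below folds a hypothetical transit over many events under a white-noise-only assumption; the per-cadence noise, cadence count, and number of transits are all invented for illustration.

```python
import math

depth_ppm = 40.0                  # transit depth from the abstract
per_cadence_noise_ppm = 200.0     # assumed photometric noise per 30-min cadence
cadences_in_transit = 12          # assumed ~6-hour transit at 30-min cadence
transits = 100                    # assumed events folded over the mission

# Noise on the folded in-transit mean shrinks as 1/sqrt(N); the signal does not.
folded_noise = per_cadence_noise_ppm / math.sqrt(cadences_in_transit * transits)
print(f"folded detection SNR ~= {depth_ppm / folded_noise:.1f}")
```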

  17. End-to-end performance measurement of Internet based medical applications.

    PubMed Central

    Dev, P.; Harris, D.; Gutierrez, D.; Shah, A.; Senger, S.

    2002-01-01

    We present a method to obtain an end-to-end characterization of the performance of an application over a network. This method is not dependent on any specific application or type of network. The method requires characterization of network parameters, such as latency and packet loss, between the expected server or client endpoints, as well as characterization of the application's constraints on these parameters. A subjective metric is presented that integrates these characterizations and that operates over a wide range of applications and networks. We believe that this method may be of wide applicability as research and educational applications increasingly make use of computation and data servers that are distributed over the Internet. PMID:12463816

  18. Development of a Dynamic, End-to-End Free Piston Stirling Convertor Model

    NASA Technical Reports Server (NTRS)

    Regan, Timothy F.; Gerber, Scott S.; Roth, Mary Ellen

    2004-01-01

    A dynamic model for a free-piston Stirling convertor is being developed at the NASA Glenn Research Center. The model is an end-to-end system model that includes the cycle thermodynamics, the dynamics, and electrical aspects of the system. The subsystems of interest are the heat source, the springs, the moving masses, the linear alternator, the controller, and the end-user load. The envisioned use of the model will be in evaluating how changes in a subsystem could affect the operation of the convertor. The model under development will speed the evaluation of improvements to a subsystem and aid in determining the areas in which the most significant improvements may be found. One of the first uses of the end-to-end model will be in the development of controller architectures. Another related area is in evaluating changes to details in the linear alternator.
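
    The subsystem coupling the abstract lists (thermodynamic forcing, springs, moving mass, alternator, load) can be caricatured as a driven, electrically damped oscillator. The sketch below is exactly that caricature; the mass, spring rate, alternator constant, and load resistance are invented numbers, not Glenn convertor data.

```python
import numpy as np

m, k = 0.5, 3.0e4          # moving mass (kg), gas-spring rate (N/m), assumed
alpha = 20.0               # alternator force constant (N/A), assumed
R_load = 8.0               # end-user load resistance (ohm), assumed
F0 = 60.0                  # thermodynamic drive amplitude (N), assumed
w_drive = np.sqrt(k / m)   # drive the piston at its mechanical resonance

dt, t_end = 1e-5, 2.0
x, v, P_avg = 0.0, 0.0, 0.0
for n in range(int(t_end / dt)):
    i = alpha * v / R_load                         # alternator current into load
    F = F0 * np.cos(w_drive * n * dt) - k * x - alpha * i
    v += (F / m) * dt                              # semi-implicit Euler step
    x += v * dt
    P_avg += R_load * i * i * dt / t_end           # time-averaged load power

print(f"mean electrical power to load ~= {P_avg:.1f} W")
```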

  19. End-to-End QoS for Differentiated Services and ATM Internetworking

    NASA Technical Reports Server (NTRS)

    Su, Hongjun; Atiquzzaman, Mohammed

    2001-01-01

    The Internet was initially designed for non-real-time data communications and hence does not provide any Quality of Service (QoS). The next generation Internet will be characterized by high speed and QoS guarantees. The aim of this paper is to develop a prioritized early packet discard (PEPD) scheme for ATM switches to provide service differentiation and QoS guarantees to end applications running over the next generation Internet. The proposed PEPD scheme differs from previous schemes by taking into account the priority of packets generated from different applications. We develop a Markov chain model for the proposed scheme and verify the model with simulation. Numerical results show that the results from the model and computer simulation are in close agreement. Our PEPD scheme provides service differentiation to the end-to-end applications.
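
    The paper's analytical Markov chain model is not reproduced here; the sketch below is only a hypothetical discrete-event caricature of priority-dependent early discard, with invented thresholds and a Bernoulli service process, to show the mechanism that produces service differentiation.

```python
import random

LOW, HIGH = 0, 1
THRESHOLD = {LOW: 70, HIGH: 90}   # assumed discard thresholds (queue length)
random.seed(1)

def pepd(arrivals, service_p=0.5):
    """Prioritized early packet discard (sketch): once the queue crosses a
    priority-dependent threshold, a newly arriving packet of that priority
    is discarded whole, protecting packets already admitted."""
    queue, dropped = 0, {LOW: 0, HIGH: 0}
    for prio in arrivals:
        if queue >= THRESHOLD[prio]:
            dropped[prio] += 1            # early discard of the whole packet
        else:
            queue += 1
        if random.random() < service_p:   # crude departure process
            queue = max(0, queue - 1)
    return dropped

print(pepd([random.choice([LOW, HIGH]) for _ in range(10000)]))
```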

  20. The End-to-End Pipeline for HST Slitless Spectra PHLAG

    NASA Astrophysics Data System (ADS)

    Kümmel, M.; Albrecht, R.; Fosbury, R.; Freudling, W.; Haase, J.; Hook, R. N.; Kuntschner, H.; Micol, A.; Rosa, M. R.; Walsh, J. R.

    The Space Telescope-European Coordinating Facility (ST-ECF) is undertaking a joint project with the Canadian Astronomy Data Centre and the Space Telescope Science Institute to build a Hubble Legacy Archive (HLA) that contains science ready high level data products to be used in the Virtual Observatory (VO). The ST-ECF will provide extracted slitless spectra to the HLA, and for this purpose has developed the Pipeline for Hubble Legacy Archive Grism data (PHLAG). PHLAG is an end-to-end pipeline that performs an unsupervised reduction of slitless data taken with the Advanced Camera for Surveys (ACS) or the Near Infrared Camera and Multi Object Spectrometer (NICMOS) and ingests the VO compatible spectra into the HLA. PHLAG is a modular pipeline, and the various modules and their roles are discussed. In a pilot study, PHLAG is applied to NICMOS data taken with the G141 grism, and the first results of a run on all available data are shown.

  1. Orion MPCV GN and C End-to-End Phasing Tests

    NASA Technical Reports Server (NTRS)

    Neumann, Brian C.

    2013-01-01

    End-to-end integration tests are critical risk reduction efforts for any complex vehicle. Phasing tests are an end-to-end integrated test that validates system directional phasing (polarity) from sensor measurement through software algorithms to end effector response. Phasing tests are typically performed on a fully integrated and assembled flight vehicle where sensors are stimulated by moving the vehicle and the effectors are observed for proper polarity. The Orion Multi-Purpose Crew Vehicle (MPCV) Pad Abort 1 (PA-1) Phasing Test was conducted from inertial measurement to Launch Abort System (LAS). Orion Exploration Flight Test 1 (EFT-1) has two end-to-end phasing tests planned. The first test, from inertial measurement to Crew Module (CM) reaction control system thrusters, uses navigation and flight control system software algorithms to process commands. The second test, from inertial measurement to CM S-Band Phased Array Antenna (PAA), uses navigation and communication system software algorithms to process commands. Future Orion flights include Ascent Abort Flight Test 2 (AA-2) and Exploration Mission 1 (EM-1). These flights will include additional or updated sensors, software algorithms and effectors. This paper will explore the implementation of end-to-end phasing tests on a flight vehicle, which involves many constraints, trade-offs and compromises. The Orion PA-1 Phasing Test was conducted at White Sands Missile Range (WSMR) from March 4-6, 2010. This test decreased the risk of mission failure by demonstrating proper flight control system polarity. Demonstration was achieved by stimulating the primary navigation sensor, processing sensor data to commands and viewing propulsion response. The PA-1 primary navigation sensor was a Space Integrated GPS/INS (SIGI), which has onboard processing and an INS (three accelerometers and three rate gyros) but no GPS receiver. SIGI data was processed by GN&C software into thrust magnitude and

  2. The Consolidation of the End-to-End Avionics Systems Testbench

    NASA Astrophysics Data System (ADS)

    Wijnands, Quirien; Torelli, Felice; Blommestijn, Robert; Kranz, Stephan; Koster, Jean-Paul

    2014-08-01

    Over the past years, the Avionics System Test Bench (ATB) has been used to support the demonstration and validation of upcoming space avionics-related standards and technologies in a representative environment. Another main use case of the facility has been to support projects in assessing particular technology-related issues. In doing so, it was necessary to add activity- and project-specific features to different configurations of the ATB, leading to a proliferation of facilities and technologies. In some cases, however, the results and lessons learned from these efforts and activities were considered valuable to the ATB concept in general and therefore needed to be preserved in the ATB mainstream for future reuse. Activities are currently ongoing to consolidate the End-To-End Avionics Systems TestBench (E2E-ATB). In this paper the resulting details of these activities are described as enhancements and improvements per ATB configuration.

  3. Enhancing End-to-End Performance of Information Services Over Ka-Band Global Satellite Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul B.; Glover, Daniel R.; Ivancic, William D.; vonDeak, Thomas C.

    1997-01-01

    The Internet has been growing at a rapid rate as the key medium to provide information services such as e-mail, WWW and multimedia; however, its global reach is limited. Ka-band communication satellite networks are being developed to increase the accessibility of information services via the Internet on a global scale. There is a need to assess satellite networks in their ability to provide these services and interconnect seamlessly with existing and proposed terrestrial telecommunication networks. In this paper, the significant issues and requirements in providing end-to-end high performance for the delivery of information services over satellite networks, based on the various layers in the OSI reference model, are identified. Key experiments have been performed to evaluate the performance of digital video and Internet traffic over satellite-like testbeds. The results of the early developments in ATM and TCP protocols over satellite networks are summarized.

  4. End-to-end communication test on variable length packet structures utilizing AOS testbed

    NASA Technical Reports Server (NTRS)

    Miller, Warner H.; Sank, V.; Fong, Wai; Miko, J.; Powers, M.; Folk, John; Conaway, B.; Michael, K.; Yeh, Pen-Shu

    1994-01-01

    This paper describes a communication test that successfully demonstrated the transfer of losslessly compressed images in an end-to-end system. These compressed images were first formatted into variable-length Consultative Committee for Space Data Systems (CCSDS) packets in the Advanced Orbiting System Testbed (AOST). The CCSDS data structures were transferred from the AOST to the Radio Frequency Simulations Operations Center (RFSOC) via a fiber optic link, where the data was then transmitted through the Tracking and Data Relay Satellite System (TDRSS). The received data acquired at the White Sands Complex (WSC) was transferred back to the AOST, where the data was captured and decompressed back to the original images. This paper describes the compression algorithm, the AOST configuration, key flight components, data formats, and the communication link characteristics and test results.
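
    For readers unfamiliar with the packet format involved, the sketch below builds a variable-length CCSDS space packet: a 6-byte primary header (version, type, secondary-header flag, APID, sequence flags and count, and a length field equal to the data length minus one) followed by the payload. The APID, sequence count, and payload are arbitrary examples.

```python
import struct

def ccsds_packet(apid, seq_count, payload, version=0, pkt_type=0, sec_hdr=0):
    """Build a CCSDS space packet: 6-byte primary header + variable payload."""
    word1 = (version << 13) | (pkt_type << 12) | (sec_hdr << 11) | (apid & 0x7FF)
    word2 = (0b11 << 14) | (seq_count & 0x3FFF)   # '11' = unsegmented data
    word3 = len(payload) - 1                      # packet data length field
    return struct.pack(">HHH", word1, word2, word3) + payload

pkt = ccsds_packet(apid=0x1AB, seq_count=42, payload=b"compressed image segment")
print(len(pkt), "bytes, header:", pkt[:6].hex())
```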

  5. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGESBeta

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; Bi, Changhao; Elsbree, Nick; Jiao, Hong; Kim, Jungkyu; Mathies, Richard; Keasling, Jay D.; Hillson, Nathan J.

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform's capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Taken together, we demonstrate the microfluidic platform's potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  6. Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Hines, Glenn D.; Paschall, Stephen, II; Cohanim, Babak E.; Fill, Thomas; Johnson, Michael C.; Bishop, Robert H.; DeMars, Kyle J.; Sostaric, Ronald R.; Johnson, Andrew E.

    2008-01-01

    Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining design and integration capability and system performance of the lunar descent and landing system and environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. The POST2 simulation provides a six degree-of-freedom capability necessary to test, design and operate a descent and landing system for successful lunar landing. This paper presents advances in the development and model-implementation of the POST2 simulation, as well as preliminary system performance analysis, used for the testing and evaluation of ALHAT project system models.

  7. Kinetics of end-to-end collision in short single-stranded nucleic acids.

    PubMed

    Wang, Xiaojuan; Nau, Werner M

    2004-01-28

    A novel fluorescence-based method, which entails contact quenching of the long-lived fluorescent state of 2,3-diazabicyclo[2.2.2]oct-2-ene (DBO), was employed to measure the kinetics of end-to-end collision in short single-stranded oligodeoxyribonucleotides of the type 5'-DBO-(X)n-dG with X = dA, dC, dT, or dU and n = 2 or 4. The fluorophore was covalently attached to the 5' end and dG was introduced as an efficient intrinsic quencher at the 3' terminus. The end-to-end collision rates, which can be directly related to the efficiency of intramolecular fluorescence quenching, ranged from 0.1 to 9.0 × 10⁶ s⁻¹. They were strongly dependent on the strand length and the base sequence, as well as the temperature. Oligonucleotides containing dA in the backbone displayed much slower collision rates and significantly higher positive activation energies than strands composed of pyrimidine bases, suggesting a higher intrinsic rigidity of oligoadenylate. Comparison of the measured collision rates in short single-stranded oligodeoxyribonucleotides with the previously reported kinetics of hairpin formation indicates that the intramolecular collision is significantly faster than the nucleation step of hairpin closing. This is consistent with the configurational diffusion model suggested by Ansari et al. (Ansari, A.; Kuznetsov, S. V.; Shen, Y. Proc. Natl. Acad. Sci. USA 2001, 98, 7771-7776), in which the formation of misfolded loops is thought to slow hairpin formation. PMID:14733555
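
    Schematically, the collision rate in this kind of contact-quenching experiment is the extra decay rate that the 3'-dG quencher adds to the DBO fluorescence. The lifetimes below are invented to land inside the quoted range, purely for illustration.

```python
# Collision rate from intramolecular fluorescence quenching (sketch):
# k_collision = 1/tau(with quencher) - 1/tau(reference, no quencher).
tau_ref = 320e-9   # assumed DBO lifetime without the dG quencher (s)
tau_obs = 120e-9   # assumed lifetime with dG at the far end (s)

k_collision = 1.0 / tau_obs - 1.0 / tau_ref
print(f"end-to-end collision rate ~= {k_collision:.2e} s^-1")
```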

  8. Integrating end-to-end threads of control into object-oriented analysis and design

    NASA Technical Reports Server (NTRS)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than looking at the individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.

  9. End-to-end test of spatial accuracy in Gamma Knife treatments for trigeminal neuralgia

    SciTech Connect

    Brezovich, Ivan A. Wu, Xingen; Duan, Jun; Popple, Richard A.; Shen, Sui; Benhabib, Sidi; Huang, Mi; Christian Dobelbower, M.; Fisher III, Winfield S.

    2014-11-01

    Purpose: Spatial accuracy is most crucial when small targets like the trigeminal nerve are treated. Although current quality assurance procedures typically verify that each individual apparatus, like the MRI scanner, CT scanner, Gamma Knife, etc., meets specifications, the cumulative error of all equipment and procedures combined may exceed safe margins. This study uses an end-to-end approach to assess the overall targeting errors that may have occurred in individual patients previously treated for trigeminal neuralgia. Methods: The trigeminal nerve is simulated by a 3 mm long, 3.175 mm (1/8 in.) diameter MRI-contrast filled cavity embedded within a PMMA plastic capsule. The capsule is positioned within the head frame such that the location of the cavity matches the Gamma Knife coordinates of an arbitrarily chosen, previously treated patient. Gafchromic EBT2 film is placed at the center of the cavity in coronal and sagittal orientations. The films are marked with a pinprick to identify the cavity center. Treatments are planned for radiation delivery with 4 mm collimators according to MRI and CT scans using the clinical localizer boxes and acquisition protocols. Shots are planned so that the 50% isodose surface encompasses the cavity. Following irradiation, the films are scanned and analyzed. Targeting errors are defined as the distance between the pinprick, which represents the intended target, and the centroid of the 50% isodose line, which is the center of the radiation field that was actually delivered. Results: Averaged over ten patient simulations, targeting errors along the x, y, and z coordinates (patient’s left-to-right, posterior-to-anterior, and head-to-foot) were, respectively, −0.060 ± 0.363, −0.350 ± 0.253, and 0.348 ± 0.204 mm when MRI was used for treatment planning. Planning according to CT exhibited generally smaller errors, namely, 0.109 ± 0.167, −0.191 ± 0.144, and 0.211 ± 0.094 mm. The largest errors along individual axes in MRI
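
    The error metric itself, the distance between the pinprick mark and the centroid of the 50% isodose region, is straightforward to compute from a scanned film. The sketch below does so on a synthetic Gaussian "dose spot" with an assumed 0.1 mm pixel pitch; both are stand-ins for real film scans.

```python
import numpy as np

def targeting_error_mm(dose_film, pin_rc, pixel_mm=0.1):
    """Distance between the intended target (pinprick, row/col in pixels)
    and the centroid of the 50% isodose region on a film image."""
    mask = dose_film >= 0.5 * dose_film.max()
    rows, cols = np.nonzero(mask)
    centroid = np.array([rows.mean(), cols.mean()])
    return float(np.linalg.norm((centroid - np.asarray(pin_rc)) * pixel_mm))

# Synthetic film: Gaussian dose spot offset from the pinprick by 3 pixels.
y, x = np.mgrid[0:200, 0:200]
film = np.exp(-((y - 103) ** 2 + (x - 100) ** 2) / (2 * 15.0 ** 2))
print("targeting error (mm):", targeting_error_mm(film, pin_rc=(100, 100)))
```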

  10. MRI simulation: end-to-end testing for prostate radiation therapy using geometric pelvic MRI phantoms.

    PubMed

    Sun, Jidi; Dowling, Jason; Pichler, Peter; Menk, Fred; Rivest-Henault, David; Lambert, Jonathan; Parker, Joel; Arm, Jameen; Best, Leah; Martin, Jarad; Denham, James W; Greer, Peter B

    2015-04-21

    To clinically implement MRI simulation or MRI-alone treatment planning requires comprehensive end-to-end testing to ensure an accurate process. The purpose of this study was to design and build a geometric phantom simulating a human male pelvis that is suitable for both CT and MRI scanning and use it to test geometric and dosimetric aspects of MRI simulation including treatment planning and digitally reconstructed radiograph (DRR) generation. A liquid filled pelvic shaped phantom with simulated pelvic organs was scanned in a 3T MRI simulator with dedicated radiotherapy couch-top, laser bridge and pelvic coil mounts. A second phantom with the same external shape but with an internal distortion grid was used to quantify the distortion of the MR image. Both phantoms were also CT scanned as the gold-standard for both geometry and dosimetry. Deformable image registration was used to quantify the MR distortion. Dose comparison was made using a seven-field IMRT plan developed on the CT scan with the fluences copied to the MR image and recalculated using bulk electron densities. Without correction the maximum distortion of the MR compared with the CT scan was 7.5 mm across the pelvis, while this was reduced to 2.6 and 1.7 mm by the vendor's 2D and 3D correction algorithms, respectively. Within the locations of the internal organs of interest, the distortion was <1.5 and <1 mm with 2D and 3D correction algorithms, respectively. The dose at the prostate isocentre calculated on CT and MRI images differed by 0.01% (1.1 cGy). Positioning shifts were within 1 mm when setup was performed using MRI generated DRRs compared to setup using CT DRRs. The MRI pelvic phantom allows end-to-end testing of the MRI simulation workflow with comparison to the gold-standard CT based process. MRI simulation was found to be geometrically accurate with organ dimensions, dose distributions and DRR based setup within acceptable limits compared to CT. PMID:25803177

  11. MRI simulation: end-to-end testing for prostate radiation therapy using geometric pelvic MRI phantoms

    NASA Astrophysics Data System (ADS)

    Sun, Jidi; Dowling, Jason; Pichler, Peter; Menk, Fred; Rivest-Henault, David; Lambert, Jonathan; Parker, Joel; Arm, Jameen; Best, Leah; Martin, Jarad; Denham, James W.; Greer, Peter B.

    2015-04-01

    To clinically implement MRI simulation or MRI-alone treatment planning requires comprehensive end-to-end testing to ensure an accurate process. The purpose of this study was to design and build a geometric phantom simulating a human male pelvis that is suitable for both CT and MRI scanning and use it to test geometric and dosimetric aspects of MRI simulation including treatment planning and digitally reconstructed radiograph (DRR) generation. A liquid filled pelvic shaped phantom with simulated pelvic organs was scanned in a 3T MRI simulator with dedicated radiotherapy couch-top, laser bridge and pelvic coil mounts. A second phantom with the same external shape but with an internal distortion grid was used to quantify the distortion of the MR image. Both phantoms were also CT scanned as the gold-standard for both geometry and dosimetry. Deformable image registration was used to quantify the MR distortion. Dose comparison was made using a seven-field IMRT plan developed on the CT scan with the fluences copied to the MR image and recalculated using bulk electron densities. Without correction the maximum distortion of the MR compared with the CT scan was 7.5 mm across the pelvis, while this was reduced to 2.6 and 1.7 mm by the vendor’s 2D and 3D correction algorithms, respectively. Within the locations of the internal organs of interest, the distortion was <1.5 and <1 mm with 2D and 3D correction algorithms, respectively. The dose at the prostate isocentre calculated on CT and MRI images differed by 0.01% (1.1 cGy). Positioning shifts were within 1 mm when setup was performed using MRI generated DRRs compared to setup using CT DRRs. The MRI pelvic phantom allows end-to-end testing of the MRI simulation workflow with comparison to the gold-standard CT based process. MRI simulation was found to be geometrically accurate with organ dimensions, dose distributions and DRR based setup within acceptable limits compared to CT.

  12. End-to-end simulation of bunch merging for a muon collider

    SciTech Connect

    Bao, Yu; Stratakis, Diktys; Hanson, Gail G.; Palmer, Robert B.

    2015-05-03

    Muon accelerator beams are commonly produced indirectly through pion decay by interaction of a charged particle beam with a target. Efficient muon capture requires the muons to be first phase-rotated by rf cavities into a train of 21 bunches with much reduced energy spread. Since luminosity is proportional to the square of the number of muons per bunch, it is crucial for a Muon Collider to use relatively few bunches with many muons per bunch. In this paper we will describe a bunch merging scheme that should achieve this goal. We present for the first time a complete end-to-end simulation of a 6D bunch merger for a Muon Collider. The 21 bunches arising from the phase-rotator, after some initial cooling, are merged in longitudinal phase space into seven bunches, which then go through seven paths with different lengths and reach the final collecting "funnel" at the same time. The final single bunch has a transverse and a longitudinal emittance that matches well with the subsequent 6D rectilinear cooling scheme.

  13. End-To-End performance test of the LINC-NIRVANA Wavefront-Sensor system.

    NASA Astrophysics Data System (ADS)

    Berwein, Juergen; Bertram, Thomas; Conrad, Al; Briegel, Florian; Kittmann, Frank; Zhang, Xiangyu; Mohr, Lars

    2011-09-01

    LINC-NIRVANA is an imaging Fizeau interferometer, for use in near infrared wavelengths, being built for the Large Binocular Telescope. Multi-conjugate adaptive optics (MCAO) increases the sky coverage and the field of view over which diffraction limited images can be obtained. For its MCAO implementation, LINC-NIRVANA utilizes four total wavefront sensors; each of the two beams is corrected by both a ground-layer wavefront sensor (GWS) and a high-layer wavefront sensor (HWS). The GWS controls the adaptive secondary deformable mirror (DM), which is based on a DSP slope computing unit, whereas the HWS controls an internal DM via computations provided by an off-the-shelf multi-core Linux system. Using wavefront sensor data collected from a prior lab experiment, we have shown via simulation that the Linux-based system is sufficient to operate at 1 kHz, with jitter well below the needs of the final system. Based on that setup we tested the end-to-end performance and latency through all parts of the system, which includes the camera, the wavefront controller, and the deformable mirror. We will present our loop control structure and the results of those performance tests.

  14. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites.

    PubMed

    Gutierrez-Heredia, Luis; Benzoni, Francesca; Murphy, Emma; Reynaud, Emmanuel G

    2016-01-01

    Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, yet only a very small percentage of reefs has been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also single-colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determine percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies ranging in size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high-quality texture mapping and detailed topographic morphology were produced, and Surface Area and Volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony- and polyp-scale analysis. PMID:26901845
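
    The Surface Area and Volume measurements mentioned above can be computed directly from any watertight triangle mesh the photogrammetry tools export, using the standard cross-product and signed-tetrahedron formulas. The tetrahedron demo data below are, of course, just a stand-in for a digitized colony.

```python
import numpy as np

def mesh_area_volume(vertices, faces):
    """Surface area and enclosed volume of a closed, consistently oriented
    triangle mesh (e.g. exported as OBJ/STL from a photogrammetry tool)."""
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    cross = np.cross(v1 - v0, v2 - v0)
    area = 0.5 * np.linalg.norm(cross, axis=1).sum()
    # Sum of signed tetrahedron volumes against the origin.
    volume = np.abs(np.einsum("ij,ij->i", v0, np.cross(v1, v2)).sum()) / 6.0
    return area, volume

# Demo on a unit tetrahedron (expected volume 1/6).
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
tris = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
print(mesh_area_volume(verts, tris))
```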

  15. Availability and End-to-end Reliability in Low Duty Cycle Multihop Wireless Sensor Networks.

    PubMed

    Suhonen, Jukka; Hämäläinen, Timo D; Hännikäinen, Marko

    2009-01-01

    A wireless sensor network (WSN) is an ad-hoc technology that may even consist of thousands of nodes, which necessitates autonomic, self-organizing and multihop operations. A typical WSN node is battery powered, which makes the network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation; however, this alone is not enough. WSNs are deployed for different uses, each requiring acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling on reaching availability and reliability. The results are obtained by simulations with ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER. PMID:22574002
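
    The headline trade-off, end-to-end reliability at an overhead that scales with PER, can be seen in two lines of arithmetic: without retransmissions, delivery probability decays geometrically with hop count, while per-hop retries restore reliability at an expected cost of 1/(1 - PER) transmissions per hop. The sketch below assumes independent losses on an example 5-hop path.

```python
def e2e_success_without_retries(per, hops):
    """Probability a packet crosses all hops with no retransmissions."""
    return (1.0 - per) ** hops

def expected_transmissions(per, hops):
    """Expected total transmissions with per-hop retries until delivery
    (geometric with per-hop success probability 1 - per)."""
    return hops / (1.0 - per)

for per in (0.05, 0.15, 0.30):
    print(f"PER={per:.2f}: e2e success without retries = "
          f"{e2e_success_without_retries(per, 5):.2f}, "
          f"expected tx with per-hop retries = {expected_transmissions(per, 5):.1f}")
```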

  16. SPOKES: An end-to-end simulation facility for spectroscopic cosmological surveys

    NASA Astrophysics Data System (ADS)

    Nord, B.; Amara, A.; Réfrégier, A.; Gamper, La.; Gamper, Lu.; Hambrecht, B.; Chang, C.; Forero-Romero, J. E.; Serrano, S.; Cunha, C.; Coles, O.; Nicola, A.; Busha, M.; Bauer, A.; Saunders, W.; Jouvel, S.; Kirk, D.; Wechsler, R.

    2016-04-01

    The nature of dark matter, dark energy and large-scale gravity pose some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science, and computational performance results of the simulation pipeline. The framework is general, but for the benchmark tests, we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming project, the Dark Energy Spectroscopic Instrument (DESI). We discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements optimally.

  17. End-to-end performance modeling of passive remote sensing systems

    SciTech Connect

    Smith, B.W.; Borel, C.C.; Clodius, W.B.; Theiler, J.; Laubscher, B.; Weber, P.G.

    1996-07-01

    The ultimate goal of end-to-end system modeling is to simulate all known physical effects which determine the content of the data, before flying an instrument system. In remote sensing, one begins with a scene, viewed either statistically or dynamically, computes the radiance in each spectral band, renders the scene, transfers it through representative atmospheres to create the radiance field at an aperture, and integrates over sensor pixels. We have simulated a comprehensive sequence of realistic instrument hardware elements and the transfer of simulated data to an analysis system. This analysis package is the same as that intended for use of data collections from the real system. By comparing the analyzed image to the original scene, the net effect of nonideal system components can be understood. Iteration yields the optimum values of system parameters to achieve performance targets. We have used simulation to develop and test improved multispectral algorithms for (1) the robust retrieval of water surface temperature, water vapor column, and other quantities; (2) the preservation of radiometric accuracy during atmospheric correction and pixel registration on the ground; and (3) exploitation of on-board multispectral measurements to assess the atmosphere between ground and aperture.
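
    As a toy version of the chain the abstract walks through (scene radiance, atmosphere, aperture, pixels, analysis), the sketch below pushes a known surface radiance through an assumed one-band atmosphere and sensor, then inverts it; all constants are invented, and comparing the retrieval against the input scene mirrors the "analyzed image versus original scene" check described.

```python
import numpy as np

rng = np.random.default_rng(0)

L_surface = 50.0                 # surface-leaving radiance (arbitrary units)
tau, L_path = 0.8, 6.0           # assumed transmittance and path radiance
gain, read_noise = 120.0, 25.0   # assumed electrons per radiance unit, noise rms

L_aperture = tau * L_surface + L_path                      # radiance at aperture
electrons = rng.normal(gain * L_aperture, read_noise, 10000)   # sensor samples
L_retrieved = (electrons / gain - L_path) / tau            # atmospheric correction

print("retrieval bias:", L_retrieved.mean() - L_surface, "rms:", L_retrieved.std())
```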

  18. End-To-End Simulation of Launch Vehicle Trajectories Including Stage Separation Dynamics

    NASA Technical Reports Server (NTRS)

    Albertson, Cindy W.; Tartabini, Paul V.; Pamadi, Bandu N.

    2012-01-01

    The development of methodologies, techniques, and tools for analysis and simulation of stage separation dynamics is critically needed for successful design and operation of multistage reusable launch vehicles. As a part of this activity, the Constraint Force Equation (CFE) methodology was developed and implemented in the Program to Optimize Simulated Trajectories II (POST2). The objective of this paper is to demonstrate the capability of POST2/CFE to simulate a complete end-to-end mission. The vehicle configuration selected was the Two-Stage-To-Orbit (TSTO) Langley Glide Back Booster (LGBB) bimese configuration, an in-house concept consisting of a reusable booster and an orbiter having identical outer mold lines. The proximity and isolated aerodynamic databases used for the simulation were assembled using wind-tunnel test data for this vehicle. POST2/CFE simulation results are presented for the entire mission, from lift-off, through stage separation, orbiter ascent to orbit, and booster glide back to the launch site. Additionally, POST2/CFE stage separation simulation results are compared with results from industry standard commercial software used for solving dynamics problems involving multiple bodies connected by joints.

  19. Availability and End-to-end Reliability in Low Duty Cycle Multihop Wireless Sensor Networks

    PubMed Central

    Suhonen, Jukka; Hämäläinen, Timo D.; Hännikäinen, Marko

    2009-01-01

    A wireless sensor network (WSN) is an ad-hoc technology that may even consist of thousands of nodes, which necessitates autonomic, self-organizing and multihop operations. A typical WSN node is battery powered, which makes the network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation; however, this alone is not enough. WSNs are deployed for different uses, each requiring acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling on reaching availability and reliability. The results are obtained by simulations with ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER. PMID:22574002

  20. End-to-end flood risk assessment: A coupled model cascade with uncertainty estimation

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary K.; Brasington, James

    2008-03-01

    This paper presents the case for an `End-to-End' flood inundation modeling strategy: the creation of a coupled system of models to allow continuous simulation methodology to be used to predict the magnitude and simulate the effects of high return period flood events. The framework brings together the best in current thinking on reduced complexity modeling to formulate an efficient, process-based methodology which meets the needs of today's flood mitigation strategies. The model chain is subject to stochasticity and parameter uncertainty, and integral methods to allow the propagation and quantification of uncertainty are essential in order to produce robust estimates of flood risk. Results from an experimental application are considered in terms of their implications for successful floodplain management, and compared against the deterministic methodology more commonly in use for flood risk assessment applications. The provenance of predictive uncertainty is also considered in order to identify those areas where future effort in terms of data collection or model refinement might best be directed in order to narrow prediction bounds and produce a more precise forecast.
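
    A minimal illustration of a stochastic model cascade with uncertainty propagation, in the spirit described above (not the authors' actual models): each stage below passes Monte Carlo samples to the next, and the spread of the final depths is reported as a prediction interval rather than a single deterministic value. All distributions and scalings are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 5000   # Monte Carlo samples pushed through the model cascade

# Stage 1: synthetic storm rainfall (mm), stochastic weather-generator stand-in
rain = rng.gamma(shape=4.0, scale=20.0, size=N)
# Stage 2: rainfall-runoff with an uncertain runoff coefficient
runoff_coeff = rng.uniform(0.3, 0.6, size=N)
peak_flow = runoff_coeff * rain * 12.0            # peak discharge, toy scaling
# Stage 3: hydraulic stage-discharge with an uncertain exponent
exponent = rng.normal(0.6, 0.05, size=N)
depth = 0.05 * peak_flow ** exponent              # floodplain depth (m)

print("median depth: %.2f m" % np.median(depth))
print("90%% prediction interval: %.2f - %.2f m" %
      tuple(np.percentile(depth, [5, 95])))
```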

  1. Telecommunications end-to-end systems monitoring on TOPEX/Poseidon: Tools and techniques

    NASA Technical Reports Server (NTRS)

    Calanche, Bruno J.

    1994-01-01

    The TOPEX/Poseidon Project Satellite Performance Analysis Team's (SPAT) roles and responsibilities have grown to include functions that are typically performed by other teams on JPL Flight Projects. In particular, SPAT Telecommunication's role has expanded beyond the nominal function of monitoring, assessing, characterizing, and trending the spacecraft (S/C) RF/Telecom subsystem to one of End-to-End Information Systems (EEIS) monitoring. This has been accomplished by taking advantage of the spacecraft and ground data system structures and protocols. By processing both the ground-generated CRC flags on received spacecraft telemetry minor frames and the NASCOM block poly error flags, bit error rates (BER) for each link segment can be determined. This provides the capability to characterize the separate link segments, determine science data recovery, and perform fault/anomaly detection and isolation. By monitoring and managing the links, TOPEX has successfully recovered approximately 99.9 percent of the science data with an integrity (BER) of better than 1 × 10⁻⁸. This paper presents the algorithms used to process the above flags and the techniques used for EEIS monitoring.
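
    The link-segment bookkeeping reduces to counting flagged units. The sketch below, with invented frame and block sizes and counts, turns CRC-flagged telemetry minor frames and NASCOM poly-error-flagged blocks into per-segment BER estimates, assuming one bit error per flagged unit (so the figures are floor estimates).

```python
MINOR_FRAME_BITS = 1024 * 8   # assumed telemetry minor frame size (bits)
NASCOM_BLOCK_BITS = 4800      # assumed NASCOM block size (bits)

def ber_estimate(flagged_units, total_units, bits_per_unit):
    """BER assuming exactly one bit error per flagged unit (a floor,
    since a flagged frame may contain more than one errored bit)."""
    return flagged_units / (total_units * bits_per_unit)

print("space-ground segment:", ber_estimate(3, 2_000_000, MINOR_FRAME_BITS))
print("ground segment      :", ber_estimate(1, 5_000_000, NASCOM_BLOCK_BITS))
```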

  2. Automated End-to-End Workflow for Precise and Geo-accurate Reconstructions using Fiducial Markers

    NASA Astrophysics Data System (ADS)

    Rumpler, M.; Daftry, S.; Tscharf, A.; Prettenthaler, R.; Hoppe, C.; Mayer, G.; Bischof, H.

    2014-08-01

    Photogrammetric computer vision systems have been well established in many scientific and commercial fields during the last decades. Recent developments in image-based 3D reconstruction systems in conjunction with the availability of affordable high quality digital consumer grade cameras have resulted in an easy way of creating visually appealing 3D models. However, many of these methods require manual steps in the processing chain and for many photogrammetric applications such as mapping, recurrent topographic surveys or architectural and archaeological 3D documentations, high accuracy in a geo-coordinate system is required which often cannot be guaranteed. Hence, in this paper we present and advocate a fully automated end-to-end workflow for precise and geo-accurate 3D reconstructions using fiducial markers. We integrate an automatic camera calibration and georeferencing method into our image-based reconstruction pipeline based on binary-coded fiducial markers as artificial, individually identifiable landmarks in the scene. Additionally, we facilitate the use of these markers in conjunction with known ground control points (GCP) in the bundle adjustment, and use an online feedback method that allows assessment of the final reconstruction quality in terms of image overlap, ground sampling distance (GSD) and completeness, and thus provides flexibility to adopt the image acquisition strategy already during image recording. An extensive set of experiments is presented which demonstrates the accuracy benefits: a highly accurate and geographically aligned reconstruction is obtained with an absolute point position uncertainty of about 1.5 times the ground sampling distance.

  3. Telemetry Ranging: Laboratory Validation Tests and End-to-End Performance

    NASA Astrophysics Data System (ADS)

    Hamkins, J.; Kinman, P.; Xie, H.; Vilnrotter, V.; Dolinar, S.; Adams, N.; Sanchez, E.; Millard, W.

    2016-08-01

    This article reports on a set of laboratory tests of telemetry ranging conducted at Development Test Facility 21 (DTF-21) in Monrovia, California. An uplink pseudorandom noise (PN) ranging signal was generated by DTF-21, acquired by the Frontier Radio designed and built at the Johns Hopkins University Applied Physics Laboratory, and downlink telemetry frames from the radio were recorded by an open-loop receiver. In four of the tests, the data indicate that telemetry ranging can resolve the two-way time delay to a standard deviation of 2.1-3.4 ns, corresponding to about 30 to 51 cm in (one-way) range accuracy, when 30 s averaging of timing estimates is used. Other tests performed worse because of unsatisfactory receiver sampling rate, quantizer resolution, dc bias, improper configuration, or other reasons. The article also presents an analysis of the expected end-to-end performance of the telemetry ranging system. In one case considered, the theoretically-predicted performance matches the test results, within 10 percent, which provides a reasonable validation that the expected performance was achieved by the test. The analysis also shows that in one typical ranging scenario, one-way range accuracy of 1 m can be achieved with telemetry ranging when the data rate is above 2 kbps.
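
    The quoted numbers follow directly from sigma_range = c * sigma_t / 2 for a two-way delay measurement, as the sketch below checks against the 2.1-3.4 ns figures.

```python
C = 299_792_458.0   # speed of light (m/s)

def one_way_range_sigma_m(two_way_delay_sigma_s):
    """Two-way delay uncertainty maps to one-way range as c * sigma / 2."""
    return C * two_way_delay_sigma_s / 2.0

for ns in (2.1, 3.4):
    print(f"sigma_t = {ns} ns -> range sigma ~= {one_way_range_sigma_m(ns * 1e-9):.2f} m")
```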

  4. Functional Partitioning to Optimize End-to-End Performance on Many-core Architectures

    SciTech Connect

    Li, Min; Vazhkudai, Sudharshan S; Butt, Ali R; Meng, Fei; Ma, Xiaosong; Kim, Youngjae; Engelmann, Christian; Shipman, Galen M

    2010-01-01

    Scaling computations on emerging massive-core supercomputers is a daunting task, which, coupled with the significantly lagging system I/O capabilities, degrades applications' end-to-end performance. The I/O bottleneck often negates potential performance benefits of assigning additional compute cores to an application. In this paper, we address this issue via a novel functional partitioning (FP) runtime environment that allocates cores to specific application tasks - checkpointing, de-duplication, and scientific data format transformation - so that the deluge of cores can be brought to bear on the entire gamut of application activities. The focus is on utilizing the extra cores to support HPC application I/O activities and also leverage solid-state disks in this context. For example, our evaluation shows that dedicating 1 core on an oct-core machine for checkpointing and its assist tasks using FP can improve overall execution time of a FLASH benchmark on 80 and 160 cores by 43.95% and 41.34%, respectively.

  5. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites

    PubMed Central

    Gutierrez-Heredia, Luis; Benzoni, Francesca; Murphy, Emma; Reynaud, Emmanuel G.

    2016-01-01

    Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, yet only a very small percentage of reefs have been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also single-colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determining percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on underwater colonies of 9 species ranging in size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high-quality texture mapping and detailed topographic morphology were produced, and surface area and volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony- and polyp-scale analysis. PMID:26901845
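
    For a watertight triangle mesh of the kind these photogrammetry tools export, both quantities can be computed directly from vertices and faces; a minimal NumPy sketch (the cube at the end is only a sanity check):

    ```python
    import numpy as np

    def mesh_area_volume(V: np.ndarray, F: np.ndarray):
        """Surface area and enclosed volume of a closed, consistently oriented mesh."""
        a, b, c = V[F[:, 0]], V[F[:, 1]], V[F[:, 2]]
        area = 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1).sum()
        # Sum of signed tetrahedron volumes against the origin (divergence theorem)
        volume = abs(np.einsum("ij,ij->i", a, np.cross(b, c)).sum()) / 6.0
        return area, volume

    # Unit cube (12 outward-oriented triangles): expect area 6.0, volume 1.0
    V = np.array([[0,0,0],[1,0,0],[1,1,0],[0,1,0],
                  [0,0,1],[1,0,1],[1,1,1],[0,1,1]], dtype=float)
    F = np.array([[0,2,1],[0,3,2],[4,5,6],[4,6,7],[0,1,5],[0,5,4],
                  [1,2,6],[1,6,5],[2,3,7],[2,7,6],[3,0,4],[3,4,7]])
    print(mesh_area_volume(V, F))  # -> (6.0, 1.0)
    ```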

  6. The X-IFU end-to-end simulations performed for the TES array optimization exercise

    NASA Astrophysics Data System (ADS)

    Peille, Philippe; Wilms, J.; Brand, T.; Cobo, B.; Ceballos, M. T.; Dauser, T.; Smith, S. J.; Barret, D.; den Herder, J. W.; Piro, L.; Barcons, X.; Pointecouteau, E.; Bandler, S.; den Hartog, R.; de Plaa, J.

    2015-09-01

    The focal plane assembly of the Athena X-ray Integral Field Unit (X-IFU) includes as its baseline an array of ~4000 single-size calorimeters based on Transition Edge Sensors (TES). Other sensor array configurations could, however, be considered, combining TES of different properties (e.g., size). In an attempt to improve the X-IFU performance in terms of field of view, count rate capability, and even spectral resolution, two alternative TES array configurations to the baseline have been simulated, each combining a small and a large pixel array. With the X-IFU end-to-end simulator, a sub-sample of the Athena core science goals, selected by the X-IFU science team as potentially driving the optimal TES array configuration, has been simulated so that the results can be scientifically assessed and compared. In this contribution, we describe the simulation set-up for the various array configurations and highlight some of the results of the simulated test cases.

  7. A Framework for End to End Simulations of the Large Synoptic Survey Telescope

    NASA Astrophysics Data System (ADS)

    Gibson, R. R.; Ahmad, Z.; Bankert, J.; Bard, D.; Connolly, A. J.; Chang, C.; Gilmore, K.; Grace, E.; Hannel, M.; Jernigan, J. G.; Jones, L.; Kahn, S. M.; Krughoff, K. S.; Lorenz, S.; Marshall, S.; Nagarajan, S.; Peterson, J. R.; Pizagno, J.; Rasmussen, A. P.; Shmakova, M.; Silvestri, N.; Todd, N.; Young, M.

    2011-07-01

    As observatories get bigger and more complicated to operate, risk mitigation techniques become increasingly important. Additionally, the size and complexity of data coming from the next generation of surveys will present enormous challenges in how we process, store, and analyze these data. End-to-end simulations of telescopes with the scope of LSST are essential to correct problems and verify science capabilities as early as possible. A simulator can also determine how defects and trade-offs in individual subsystems impact the overall design requirements. Here, we present the architecture, implementation, and results of the source simulation framework for the Large Synoptic Survey Telescope (LSST). The framework creates time-based realizations of astronomical objects and formats the output for use in many different survey contexts (i.e., image simulation, reference catalogs, calibration catalogs, and simulated science outputs). The simulations include Milky Way, cosmological, and solar system models as well as transient and variable objects. All model objects can be sampled with the LSST cadence from any operations simulator run. The result is a representative, full-sky simulation of LSST data that can be used to determine telescope performance, the feasibility of science goals, and strategies for processing LSST-scale data volumes.

  8. Advanced end-to-end fiber optic sensing systems for demanding environments

    NASA Astrophysics Data System (ADS)

    Black, Richard J.; Moslehi, Behzad

    2010-09-01

    Optical fibers are small in diameter, light in weight, immune to electromagnetic interference, electrically passive, chemically inert, flexible, embeddable into different materials, and enable distributed sensing; they can also be temperature and radiation tolerant. With appropriate processing and/or packaging, they can be very robust and well suited to demanding environments. In this paper, we review a range of complete end-to-end fiber optic sensor systems that IFOS has developed, comprising not only (1) packaged sensors and mechanisms for integration with demanding environments, but also (2) ruggedized sensor interrogators and (3) intelligent decision-aid software systems. We examine the following examples. Fiber Bragg Grating (FBG) optical sensor systems supporting arrays of environmentally conditioned multiplexed FBG point sensors on single or multiple optical fibers: in conjunction with advanced signal processing, decision-aid algorithms and reasoners, FBG-based structural health monitoring (SHM) systems are expected to play an increasing role in extending the life and reducing the costs of new generations of aerospace systems. Further, FBG-based structural state sensing systems have the potential to considerably enhance the performance of dynamic structures interacting with their environment (including jet aircraft, unmanned aerial vehicles (UAVs), and medical or extravehicular space robots). Raman-based distributed temperature sensing systems: the complete length of optical fiber acts as a very long distributed sensor, which may be placed down an oil well or wrapped around a cryogenic tank.
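
    As a quick illustration of how FBG point sensors are read out (textbook relations with assumed parameter values, not IFOS specifics): the grating reflects at the Bragg wavelength, and strain is recovered from the measured wavelength shift:

    ```python
    # Bragg condition: lambda_B = 2 * n_eff * Lambda. A shift d_lambda maps to
    # strain via d_lambda / lambda_B = k * strain, with k ~ 0.78 for silica fiber.
    def bragg_wavelength_nm(n_eff: float, period_nm: float) -> float:
        return 2.0 * n_eff * period_nm

    def microstrain_from_shift(d_lambda_pm: float, lambda_b_nm: float, k: float = 0.78) -> float:
        return (d_lambda_pm * 1e-3 / lambda_b_nm) / k * 1e6

    lam = bragg_wavelength_nm(n_eff=1.447, period_nm=535.6)  # ~1550 nm (assumed values)
    print(f"lambda_B = {lam:.1f} nm")
    print(f"12 pm shift -> {microstrain_from_shift(12.0, lam):.1f} microstrain")
    ```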

  9. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    SciTech Connect

    Zhu, Michelle M.; Wu, Chase Q.

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and, in many cases, application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. The system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  10. An end-to-end analysis of drought from smallholder farms in southwest Jamaica

    NASA Astrophysics Data System (ADS)

    Curtis, W. R. S., III; Gamble, D. W.; Popke, J.

    2015-12-01

    Drought can be defined in many ways: meteorological, hydrological, agricultural, and socio-economic. Another way to approach drought is from a "perception" perspective, where individuals whose livelihood is highly dependent on precipitation take adaptive actions. In this study we use two years of data collected from twelve smallholder farms in southern St. Elizabeth, Jamaica to undertake an end-to-end analysis of drought. At each farm, temperature and soil moisture were recorded at 6-hour intervals along with tipping-bucket rainfall from June 2013 to June 2015, and twice monthly farmers indicated whether they were experiencing drought and whether they irrigated (hand-watering, drip irrigation, or pipe and sprinkler). In many cases half of the farmers considered themselves in a drought while the others did not, even though the largest separation among farms was about 20 km. This study will use analysis of variance to test the following hypotheses: drought perception is related to (a) absolute amounts of precipitation at the time, (b) other environmental cues at the time (soil moisture, temperature), or (c) relative amounts of precipitation as compared to the same time last year. Irrigation actions and water use following the perception of drought will also be examined.
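
    The proposed test amounts to a one-way ANOVA of an environmental variable grouped by reported perception; a minimal sketch with SciPy on hypothetical rainfall totals (the numbers are invented for illustration):

    ```python
    from scipy import stats

    # Fortnightly rainfall totals (mm), grouped by whether the farm reported drought
    rain_drought_yes = [4.2, 0.0, 1.3, 2.8, 0.5, 3.1]
    rain_drought_no  = [12.4, 8.9, 15.2, 6.7, 9.9, 11.3]

    f_stat, p_value = stats.f_oneway(rain_drought_yes, rain_drought_no)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    # A small p-value would support hypothesis (a): perception tracks absolute rainfall.
    ```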

  11. An end-to-end architecture for distributing weather alerts to wireless handsets

    NASA Astrophysics Data System (ADS)

    Jones, Karen L.; Nguyen, Hung

    2005-06-01

    This paper describes the National Weather Service's (NWS) current system for providing weather alerts in the U.S. and reviews how the existing end-to-end architecture is being leveraged to provide non-weather alerts, also known as "all-hazard alerts," to the general public. The paper then describes how a legacy system that transmits weather and all-hazard alerts can be extended via commercial wireless networks and protocols to reach the 154 million Americans who carry cell phones. This approach uses commercial SATCOM and existing wireless carriers and services such as the Short Messaging Service (SMS) for text and the emerging Multimedia Messaging Service (MMS) protocol, which would allow photos, maps, audio and video alerts to be sent to end users. This wireless broadcast alert delivery architecture is designed to be open and to embrace the National Weather Service's mandate to become an "all-hazards" warning system for the general public. Examples of other public and private sector applications that require timely and intelligent push mechanisms using this alert dissemination approach are also given.

  12. End-to-end differential contactless conductivity sensor for microchip capillary electrophoresis.

    PubMed

    Fercher, Georg; Haller, Anna; Smetana, Walter; Vellekoop, Michael J

    2010-04-15

    In this contribution, a novel measurement approach for miniaturized capillary electrophoresis (CE) devices is presented: end-to-end differential capacitively coupled contactless conductivity measurement. This measurement technique is applied to a miniaturized CE device fabricated in low-temperature cofired ceramics (LTCC) multilayer technology. The working principle is based on the placement of two distinct detector areas near the two ends (fluid inlet and outlet) of the separation channel. The two output signals are subtracted from each other, and the resulting differential signal is amplified and measured. This measurement approach has several advantages over established single-end detectors: the high baseline level resulting from parasitic stray capacitance and buffer conductivity is reduced, leading to a better signal-to-noise ratio and hence higher measurement sensitivity. Furthermore, temperature and, thus, baseline drift effects are diminished owing to the differentiating nature of the system. By comparing the peak widths measured with both detectors, valuable information about zone dispersion effects arising during the separation is obtained. Additionally, the novel measurement scheme allows the determination of dispersion effects that occur at the time of sample injection. Optical means of dispersion evaluation are ineffective because of the opaque LTCC substrate. Electrophoretic separation experiments on inorganic ions show sensitivity enhancements by about a factor of 30-60 compared to the single-end measurement scheme. PMID:20337422
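
    The benefit of the differential arrangement can be sketched numerically: a baseline (stray capacitance, buffer conductivity, drift) common to both detectors cancels in the subtraction, while an analyte zone, which passes the two detectors at different times, survives. A minimal NumPy sketch with synthetic signals:

    ```python
    import numpy as np

    t = np.linspace(0.0, 60.0, 6000)                   # time, s
    baseline = 5.0 + 0.01 * t                          # common-mode level plus drift
    zone = lambda t0: np.exp(-((t - t0) ** 2) / 0.5)   # analyte zone passing a detector

    inlet_detector  = baseline + zone(20.0)            # zone reaches the inlet end first
    outlet_detector = baseline + zone(45.0)            # same zone at the outlet end, later

    differential = inlet_detector - outlet_detector    # baseline and drift cancel
    print(f"residual baseline: {differential[0]:.3f}") # ~0 instead of ~5
    ```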

  13. Semantic Complex Event Processing over End-to-End Data Flows

    SciTech Connect

    Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like smart power grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns and engines that match events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of temporal boundaries that exist between event streams and their repository to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.

  14. The efficacy of end-to-end and end-to-side nerve repair (neurorrhaphy) in the rat brachial plexus

    PubMed Central

    Liao, Wen-Chieh; Chen, Jeng-Rung; Wang, Yueh-Jan; Tseng, Guo-Fang

    2009-01-01

    Proximal nerve injury often requires nerve transfer to restore function. Here we evaluated the efficacy of end-to-end and end-to-side neurorrhaphy of the rat musculocutaneous nerve, the recipient, to the ulnar nerve, the donor. The donor was transected for end-to-end neurorrhaphy, while an epineurial window was exposed for end-to-side neurorrhaphy. Retrograde tracing showed that 70% of donor motor and sensory neurons grew into the recipient 3 months following end-to-end neurorrhaphy, compared to 40–50% at 6 months following end-to-side neurorrhaphy. In end-to-end neurorrhaphy, regenerating axons appeared as thick fibers which regained diameters comparable to those of controls in 3–4 months. However, end-to-side neurorrhaphy induced slowly sprouting fibers of mostly thin collaterals that barely approached control diameters by 6 months. The motor end plates regained their control density at 4 months following end-to-end neurorrhaphy but remained low 6 months following end-to-side neurorrhaphy. The short-latency compound muscle action potential, typical of that of controls, was readily restored following end-to-end neurorrhaphy. End-to-side neurorrhaphy yielded low amplitudes and wide-ranging latencies at 4 months and failed to regain control values by 6 months. Performance in the grooming test recovered successfully at 3 and 6 months following end-to-end and end-to-side neurorrhaphy, respectively, suggesting that powerful muscle contraction was not required. In short, both neurorrhaphies resulted in functional recovery, but end-to-end neurorrhaphy was quicker and better, albeit at the expense of donor function. End-to-side neurorrhaphy supplemented with factors to overcome the slow collateral sprouting and weak motor recovery may warrant further exploration. PMID:19682138

  15. End-to-end small bowel anastomosis by temperature controlled CO2 laser soldering and an albumin stent: a feasibility study

    NASA Astrophysics Data System (ADS)

    Simhon, David; Kopelman, Doron; Hashmonai, Moshe; Vasserman, Irena; Dror, Michael; Vasilyev, Tamar; Halpern, Marissa; Kariv, Naam; Katzir, Abraham

    2004-07-01

    Introduction: A feasibility study of small intestinal end-to-end anastomosis was performed in a rabbit model using a temperature-controlled CO2 laser system and an albumin stent. Compared with standard suturing or clipping, this method does not introduce foreign materials into the repaired wound and may therefore lead to better and faster wound healing of the anastomotic site. Methods: Transected rabbit small intestines were either laser soldered using 47% bovine serum albumin and an intraluminal albumin stent, or served as controls in which a conventional continuous two-layer end-to-end anastomosis was performed manually. The integrity of the anastomosis was investigated on the 14th postoperative day. Results: The postoperative course in both groups was uneventful. The sutured group presented signs of partial bowel obstruction. Macroscopically, no signs of intraluminal fluid leakage were observed in either group. Yet, laser-soldered intestinal anastomoses demonstrated significant superiority with respect to adhesions and narrowing of the intestinal lumen. Serial histological examinations revealed better wound healing characteristics at the laser-soldered anastomotic site. Conclusion: Laser soldering of an intestinal end-to-end anastomosis provides a faster surgical procedure, compared to the standard suture technique, with better wound healing results. It is expected that this technique may be adopted in the future for minimally invasive surgery.

  16. End-to-End Models for Effects of System Noise on LIMS Analysis of Igneous Rocks

    SciTech Connect

    Clegg, Samuel M; Bender, Steven; Wiens, R. C.; Carmosino, Marco L; Speicher, Elly A; Dyar, M. D.

    2010-12-23

    The ChemCam instrument on the Mars Science Laboratory will be the first extraterrestrial deployment of laser-induced breakdown spectroscopy (LIBS) for remote geochemical analysis. LIBS instruments are also being proposed for future NASA missions. In quantitative LIBS applications using multivariate analysis techniques, it is essential to understand the effects of key instrument parameters and their variability on the elemental predictions. Baseline experiments were run on a laboratory instrument in conditions reproducing ChemCam performance on Mars. These experiments employed a Nd:YAG laser producing 17 mJ/pulse on target with a 200 µm FWHM spot size on the surface of a sample. The emission is collected by a telescope, imaged onto a fiber optic and then interfaced to a demultiplexer capable of >40% transmission into each spectrometer. We report here on an integrated end-to-end system performance model that simulates the effects of output signal degradation that might result from the input signal chain, and the impact on multivariate model predictions. There are two approaches to modifying signal-to-noise ratio (SNR): degrade the signal and/or increase the noise. Ishibashi used a much smaller data set to show that the addition of noise had a significant impact, while degradation of spectral resolution had much less impact on accuracy and precision. Here, we specifically focus on aspects of remote LIBS instrument performance as they relate to various types of signal degradation. To assess the sensitivity of LIBS analysis to signal-to-noise ratio and spectral resolution, the signal in each spectrum from a suite of 50 laboratory spectra of igneous rocks was variably degraded by increasing the peak widths (simulating misalignment) and decreasing the spectral amplitude (simulating decreases in SNR).
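
    The two degradations described, broadened peaks and reduced amplitude with added noise, are straightforward to emulate on a spectrum; a minimal sketch on a synthetic emission line (all values assumed):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    rng = np.random.default_rng(0)
    wavelength = np.linspace(240.0, 850.0, 6144)              # nm
    spectrum = np.exp(-((wavelength - 589.0) ** 2) / 0.01)    # one synthetic line

    def degrade(spec, broaden_px=0.0, amp=1.0, noise_rms=0.0):
        out = gaussian_filter1d(spec, sigma=broaden_px) if broaden_px > 0 else spec.copy()
        return amp * out + rng.normal(0.0, noise_rms, size=out.shape)

    worse_resolution = degrade(spectrum, broaden_px=3.0)            # wider peaks
    worse_snr        = degrade(spectrum, amp=0.5, noise_rms=0.02)   # weaker, noisier
    ```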

  17. In vivo laser assisted end-to-end anastomosis with ICG-infused chitosan patches

    NASA Astrophysics Data System (ADS)

    Rossi, Francesca; Matteini, Paolo; Esposito, Giuseppe; Scerrati, Alba; Albanese, Alessio; Puca, Alfredo; Maira, Giulio; Rossi, Giacomo; Pini, Roberto

    2011-07-01

    Laser-assisted vascular repair is a new optimized technique based on the use of an ICG-infused chitosan patch to close a vessel wound, with or even without a few supporting single stitches. We present an in vivo experimental study on an innovative end-to-end laser-assisted vascular anastomosis (LAVA) technique, performed with the application of ICG-infused chitosan patches. The photostability and the mechanical properties of ICG-infused chitosan films were preliminarily measured. The in vivo study was performed in 10 New Zealand rabbits. After anesthesia, a 3-cm segment of the right common carotid artery was exposed and clamped proximally and distally. The artery was then interrupted by means of a full-thickness cut. Three single microsutures were used to approximate the two vessel edges. The ICG-infused chitosan patch was rolled all over the anastomotic site and welded by the use of a diode laser emitting at 810 nm and equipped with a 300 μm diameter optical fiber. Welding was obtained by delivering single laser spots to induce local patch/tissue adhesion. The result was an immediate closure of the anastomosis, with no bleeding at clamp release. The animals then underwent different follow-up periods in order to evaluate the welded vessels over time. At follow-up examinations, all the anastomoses were patent and no signs of bleeding were documented. Samples of welded vessels underwent histological examination. Results showed that this technique offers several advantages over conventional suturing methods: simplification of the surgical procedure, shortening of the operative time, better re-endothelialization and an optimal vascular healing process.

  18. Advanced End-to-end Simulation for On-board Processing (AESOP)

    NASA Technical Reports Server (NTRS)

    Mazer, Alan S.

    1994-01-01

    Developers of data compression algorithms typically use their own software together with commercial packages to implement, evaluate and demonstrate their work. While convenient for an individual developer, this approach makes it difficult to build on or use another's work without intimate knowledge of each component. When several people or groups work on different parts of the same problem, the larger view can be lost. What's needed is a simple piece of software to stand in the gap and link together the efforts of different people, enabling them to build on each other's work, and providing a base for engineers and scientists to evaluate the parts as a cohesive whole and make design decisions. AESOP (Advanced End-to-end Simulation for On-board Processing) attempts to meet this need by providing a graphical interface to a developer-selected set of algorithms, interfacing with compiled code and standalone programs, as well as procedures written in the IDL and PV-Wave command languages. As a proof of concept, AESOP is outfitted with several data compression algorithms integrating previous work on different processors (AT&T DSP32C, TI TMS320C30, SPARC). The user can specify at run time the processor on which individual parts of the compression should run. Compressed data is then fed through simulated transmission and decompression to evaluate the effects of compression parameters, noise and error correction algorithms. The following sections describe AESOP in detail. Section 2 describes fundamental goals for usability. Section 3 describes the implementation. Sections 4 and 5 describe how to add new functionality to the system and present the existing data compression algorithms. Sections 6 and 7 discuss portability and future work.

  19. A NASA Climate Model Data Services (CDS) End-to-End System to Support Reanalysis Intercomparison

    NASA Astrophysics Data System (ADS)

    Carriere, L.; Potter, G. L.; McInerney, M.; Nadeau, D.; Shen, Y.; Duffy, D.; Schnase, J. L.; Maxwell, T. P.; Huffer, E.

    2014-12-01

    The NASA Climate Model Data Service (CDS) and the NASA Center for Climate Simulation (NCCS) are collaborating to provide an end-to-end system for the comparative study of the major reanalysis projects: currently ECMWF ERA-Interim, NASA/GMAO MERRA, NOAA/NCEP CFSR, NOAA/ESRL 20CR, and JMA JRA25. Components of the system include the full spectrum of Climate Model Data Services: Data, Compute Services, Data Services, Analytic Services and Knowledge Services. The Data include standard reanalysis model output and will be expanded to include gridded observations and gridded innovations (O-A and O-F). The NCCS High Performance Science Cloud provides the compute environment (storage, servers, and network). Data Services are provided through an Earth System Grid Federation (ESGF) data node complete with Live Access Server (LAS), Web Map Service (WMS) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) for visualization, as well as a collaborative interface through the Earth System CoG. Analytic Services include UV-CDAT for analysis and MERRA/AS, accessed via the CDS API, for computation services, both part of the CDS Climate Analytics as a Service (CAaaS). Knowledge Services include access to an ontology browser, ODISEES, for metadata search and data retrieval. The result is a system that gives both reanalysis scientists and scientists in need of reanalysis output the ability to identify the data of interest, compare, compute, visualize, and research without the need to transfer large volumes of data, perform time-consuming format conversions, or write code for frequently run computations and visualizations.

  20. SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.

    PubMed

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani

    2016-01-01

    Monitoring life-long diseases requires continuous measurement and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained device resources. The integration of the architecture's components provides information about disease recurrences as soon as they occur to expedite taking necessary actions, and thus prevent severe consequences. Our architecture is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture to be easy to deploy and integrate, and are supported by Cloud infrastructure and services to allow high scalability and availability of the processes and of the data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed

  1. SPoRT - An End-to-End R2O Activity

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.

    2009-01-01

    Established in 2002 to demonstrate the weather and forecasting application of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown into an end-to-end research-to-operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and with operational end users at 13 WFOs to develop and test new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and on total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA/NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short-term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide observing capabilities similar to those of the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to ensure the continuity of its activities.

  2. Implementation and evaluation of an end-to-end IGRT test.

    PubMed

    Kry, Stephen F; Jones, Jimmy; Childress, Nathan L

    2012-01-01

    The goal of this work was to develop and evaluate an end-to-end test for determining and verifying image-guided radiation therapy setup accuracy relative to the radiation isocenter. This was done by placing a cube phantom with a central tungsten sphere directly on the treatment table and offsetting it from isocenter either by 5.0 mm in the longitudinal, lateral, and vertical dimensions or by a random amount. A high-resolution cone-beam CT image was acquired and aligned with the tungsten sphere in the reference CT image. The table was shifted per this alignment, and megavoltage anterior-posterior and lateral images were acquired with the electronic portal imaging device. Agreement between the radiation isocenter (based on the MV field) and the center of the sphere (i.e., the alignment point based on kV imaging) was determined for each image via Winston-Lutz analysis. This procedure was repeated 10 times to determine short-term reproducibility, and then repeated daily for 51 days in a clinical setting. The short-term reproducibility test yielded a mean 3D vector displacement of 0.9 ± 0.15 mm between the imaging-based isocenter and the radiation isocenter, with a maximum displacement of 1.1 mm. The clinical reproducibility test yielded a mean displacement of 1.1 ± 0.4 mm with a maximum of 2.0 mm when the cube was offset by 5.0 mm, and a mean displacement of 0.9 ± 0.3 mm with a maximum of 1.8 mm when the cube was offset by a random amount. These differences were observed in all directions and were independent of the magnitude of the couch shift. This test was quick and easy to implement clinically and highlighted setup inaccuracies in an image-guided radiation therapy environment. PMID:22955659
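
    The Winston-Lutz figure of merit used here is simply the 3D vector length between the kV-imaging alignment point and the MV radiation isocenter; a minimal NumPy sketch with hypothetical per-run offsets:

    ```python
    import numpy as np

    # Hypothetical (lateral, longitudinal, vertical) offsets of the imaging-based
    # isocenter relative to the radiation isocenter for three repeated runs, in mm.
    offsets = np.array([[0.3, -0.5, 0.6],
                        [0.2, -0.6, 0.7],
                        [0.4, -0.4, 0.5]])

    d3 = np.linalg.norm(offsets, axis=1)  # per-run 3D vector displacement
    print(f"mean {d3.mean():.2f} mm +/- {d3.std(ddof=1):.2f} mm, max {d3.max():.2f} mm")
    ```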

  3. Astra: Interdisciplinary study on enhancement of the end-to-end accuracy for spacecraft tracking techniques

    NASA Astrophysics Data System (ADS)

    Iess, Luciano; Di Benedetto, Mauro; James, Nick; Mercolino, Mattia; Simone, Lorenzo; Tortora, Paolo

    2014-02-01

    Navigation of deep-space probes is accomplished through a variety of different radio observables, namely Doppler, ranging and Delta-Differential One-Way Ranging (Delta-DOR). The particular mix of observations used for navigation mainly depends on the available on-board radio system, the mission phase and orbit determination requirements. The accuracy of current ESA and NASA tracking systems is at the level of 0.1 mm/s at 60 s integration time for Doppler, 1-5 m for ranging and 6-15 nrad for Delta-DOR measurements in a wide range of operational conditions. The ASTRA study, funded under ESA's General Studies Programme (GSP), addresses ways to improve the end-to-end accuracy of Doppler, ranging and Delta-DOR systems by roughly a factor of 10. The target accuracies were set to 0.01 mm/s at 60 s integration time for Doppler, 20 cm for ranging and 1 nrad for Delta-DOR. The companies and universities that took part in the study were the University of Rome Sapienza, ALMASpace, BAE Systems and Thales Alenia Space Italy. The analysis of an extensive data set of radio-metric observables and dedicated tests of the ground station allowed the error budget for each measurement technique to be consolidated. The radio-metric data set comprises X/X, X/Ka and Ka/Ka range and Doppler observables from the Cassini and Rosetta missions. It also includes measurements from the Advanced Media Calibration System (AMCS) developed by JPL for the radio science experiments of the Cassini mission. The error budget for the three radio-metric observables was consolidated by comparing the statistical properties of the data set with the expected error models. The analysis confirmed the contribution from some error sources, but also revealed some discrepancies and ultimately led to improved error models. The error budget reassessment provides adequate information for building guidelines and strategies to effectively improve the navigation accuracies of future deep space missions. We report both on updated

  4. Identifying Elusive Electromagnetic Counterparts to Gravitational Wave Mergers: An End-to-end Simulation

    NASA Astrophysics Data System (ADS)

    Nissanke, Samaya; Kasliwal, Mansi; Georgieva, Alexandra

    2013-04-01

    Combined gravitational wave (GW) and electromagnetic (EM) observations of compact binary mergers should enable detailed studies of astrophysical processes in the strong-field gravity regime. This decade, ground-based GW interferometers promise to routinely detect compact binary mergers. Unfortunately, networks of GW interferometers have poor angular resolution on the sky and their EM signatures are predicted to be faint. Therefore, a challenging goal will be to unambiguously pinpoint the EM counterparts of GW mergers. We perform the first comprehensive end-to-end simulation that focuses on: (1) GW sky localization, distance measures, and volume errors with two compact binary populations and four different GW networks; (2) subsequent EM detectability by a slew of multiwavelength telescopes; and (3) final identification of the merger counterpart amidst a sea of possible astrophysical false positives. First, we find that double neutron star binary mergers can be detected out to a maximum distance of 400 Mpc (or 750 Mpc) by three (or five) detector GW networks, respectively. Neutron-star-black-hole binary mergers can be detected a factor of 1.5 further out; their median to maximum sky localizations are 50-170 deg2 (or 6-65 deg2) for a three (or five) detector GW network. Second, by optimizing depth, cadence, and sky area, we quantify relative fractions of optical counterparts that are detectable by a suite of different aperture-size telescopes across the globe. Third, we present five case studies to illustrate the diversity of scenarios in secure identification of the EM counterpart. We discuss the case of a typical binary, neither beamed nor nearby, and the challenges associated with identifying an EM counterpart at both low and high Galactic latitudes. For the first time, we demonstrate how construction of low-latency GW volumes in conjunction with local universe galaxy catalogs can help solve the problem of false positives. We conclude with strategies that would

  5. SU-E-T-150: End to End Tests On the First Clinical EDGE™

    SciTech Connect

    Scheib, S; Schmelzer, P; Vieira, S; Greco, C

    2014-06-01

    Purpose: To quantify the sub-millimeter overall accuracy of EDGE™, the dedicated linac-based SRS/SABR treatment platform from Varian, using a novel End-to-End (E2E) test phantom. Methods: The new E2E test phantom developed by Varian consists of a cube with an outer dimension of 15×15×15 cm3. The phantom is equipped with an exchangeable inner cube (7×7×7 cm3) to hold radiochromic films or a tungsten ball (diameter = 5 mm) for Winston-Lutz tests. 16 ceramic balls (diameter = 5 mm) are embedded in the outer cube. Three embedded Calypso transponders allow for Calypso-based monitoring. The outer surface of the phantom is tracked using the Optical Surface Monitoring System (OSMS). The phantom is positioned using kV, MV and CBCT images. A simCT of the phantom was acquired and SRS/SABR plans were treated using the new phantom on the first clinically installed EDGE™. As a first step, a series of EPID-based Winston-Lutz tests was performed. As a second step, the calculated dose distribution applied to the phantom was verified with radiochromic films in orthogonal planes. The measured dose distribution is compared with the calculated (Eclipse) one based on the known isocenter of both dose distributions. The geometrical shift needed to match both dose distributions is the overall accuracy and is determined using dose profiles, isodose lines or gamma pass rates (3%, 1 mm). Results: Winston-Lutz tests using the central tungsten BB demonstrated a targeting accuracy of 0.44±0.18 mm for jaw-defined (2 cm × 2 cm), 0.39±0.19 mm for MLC-defined (2 cm × 2 cm) and 0.37±0.15 mm for cone-defined (12.5 mm) fields. A treated patient plan (spinal metastasis lesion with integrated boost) showed a dosimetric dose localization accuracy of 0.6 mm. Conclusion: Geometric and dosimetric E2E tests on EDGE™ show sub-millimeter E2E targeting and dose localization accuracy.

  6. An End-to-End System to Enable Quick, Easy and Inexpensive Deployment of Hydrometeorological Stations

    NASA Astrophysics Data System (ADS)

    Celicourt, P.; Piasecki, M.

    2014-12-01

    The high cost of hydro-meteorological data acquisition, communication and publication systems, along with limited qualified human resources, is considered the main reason why hydro-meteorological data collection remains a challenge, especially in developing countries. Despite significant advances in sensor network technologies, which gave birth to open hardware and software and to low-cost (less than $50), low-power (on the order of a few milliwatts) sensor platforms in the last two decades, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and thus expensive task. These factors give rise to the need for an affordable, simple-to-deploy, scalable and self-organizing end-to-end (from sensor to publication) system suitable for deployment in such countries. The design of the envisioned system consists of a few Sensed-And-Programmed Arduino-based sensor nodes with low-cost sensors measuring parameters relevant to hydrological processes, and a Raspberry Pi micro-computer hosting the in-the-field back-end data management. The latter comprises the Python/Django model of the CUAHSI Observations Data Model (ODM), namely DjangODM, backed by a PostgreSQL database server. We are also developing a Python-based data processing script which will be paired with the data autoloading capability of Django to populate the DjangODM database with the incoming data. To publish the data, WOFpy (WaterOneFlow Web Services in Python), developed by the Texas Water Development Board for 'Water Data for Texas', which can produce WaterML web services from a variety of back-end database installations such as SQLite, MySQL, and PostgreSQL, will be used. A step further would be the development of an appealing online visualization tool using Python statistics and analytics tools (SciPy, NumPy, Pandas) showing the spatial distribution of variables across an entire watershed as a time-variant layer on top of a basemap.
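
    The ingest step, parsing an observation line from the sensor node and staging it for the ODM-style database, can be sketched in a few lines (hypothetical message format and schema; SQLite stands in for the PostgreSQL back end):

    ```python
    import csv, io, sqlite3

    LINE = "2014-09-12T14:00:00Z,air_temperature,28.4"  # timestamp,variable,value

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE observations (ts TEXT, variable TEXT, value REAL)")

    ts, variable, value = next(csv.reader(io.StringIO(LINE)))
    db.execute("INSERT INTO observations VALUES (?, ?, ?)", (ts, variable, float(value)))
    print(db.execute("SELECT * FROM observations").fetchall())
    ```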

  7. IDENTIFYING ELUSIVE ELECTROMAGNETIC COUNTERPARTS TO GRAVITATIONAL WAVE MERGERS: AN END-TO-END SIMULATION

    SciTech Connect

    Nissanke, Samaya; Georgieva, Alexandra; Kasliwal, Mansi

    2013-04-20

    Combined gravitational wave (GW) and electromagnetic (EM) observations of compact binary mergers should enable detailed studies of astrophysical processes in the strong-field gravity regime. This decade, ground-based GW interferometers promise to routinely detect compact binary mergers. Unfortunately, networks of GW interferometers have poor angular resolution on the sky and their EM signatures are predicted to be faint. Therefore, a challenging goal will be to unambiguously pinpoint the EM counterparts of GW mergers. We perform the first comprehensive end-to-end simulation that focuses on: (1) GW sky localization, distance measures, and volume errors with two compact binary populations and four different GW networks; (2) subsequent EM detectability by a slew of multiwavelength telescopes; and (3) final identification of the merger counterpart amidst a sea of possible astrophysical false positives. First, we find that double neutron star binary mergers can be detected out to a maximum distance of 400 Mpc (or 750 Mpc) by three (or five) detector GW networks, respectively. Neutron-star-black-hole binary mergers can be detected a factor of 1.5 further out; their median to maximum sky localizations are 50-170 deg² (or 6-65 deg²) for a three (or five) detector GW network. Second, by optimizing depth, cadence, and sky area, we quantify relative fractions of optical counterparts that are detectable by a suite of different aperture-size telescopes across the globe. Third, we present five case studies to illustrate the diversity of scenarios in secure identification of the EM counterpart. We discuss the case of a typical binary, neither beamed nor nearby, and the challenges associated with identifying an EM counterpart at both low and high Galactic latitudes. For the first time, we demonstrate how construction of low-latency GW volumes in conjunction with local universe galaxy catalogs can help solve the problem of false positives. We conclude with strategies

  8. End-to-End Self-Assembly of Semiconductor Nanorods in Water by Using an Amphiphilic Surface Design.

    PubMed

    Taniguchi, Yuki; Takishita, Takao; Kawai, Tsuyoshi; Nakashima, Takuya

    2016-02-01

    One-dimensional (1D) self-assemblies of nanocrystals are of interest because of their vectorial and polymer-like dynamic properties. Herein, we report a simple method to prepare elongated assemblies of semiconductor nanorods (NRs) through end-to-end self-assembly. Short-chained water-soluble thiols were employed as surface ligands for CdSe NRs having a wurtzite crystal structure. The site-specific capping of NRs with these ligands rendered the surface of the NRs amphiphilic. The amphiphilic CdSe NRs self-assembled to form elongated wires by end-to-end attachment driven by the hydrophobic effect operating between uncapped NR ends. The end-to-end assembly technique was further applied to CdS NRs and CdSe tetrapods (TPs) with a wurtzite structure. PMID:26836341

  9. Mechanism of Shaft End-To-End Voltage Generation by Asymmetry in an Inverter-Driven Motor

    NASA Astrophysics Data System (ADS)

    Asakura, Yusuke; Akagi, Hirofumi

    This paper deals with the shaft end-to-end voltage resulting from asymmetric stray capacitances in an inverter-driven motor. The origin of the voltage can be any of the following: a ground leakage current, dielectric breakdown in bearings, or asymmetric stray capacitances on the stator windings. The third origin seems to be related to the differential-mode current, but the details of the relationship have not been clarified. In this study, differential-mode tests are carried out on an ungrounded motor rated at 400 V and 15 kW, and the generation of the shaft end-to-end voltage by the asymmetric stray capacitances is theoretically discussed. Finally, a winding model is presented for the purpose of understanding the mechanism responsible for the shaft end-to-end voltage.

  10. Achieving End-to-End QoS in the Next Generation Internet: Integrated Services over Differentiated Service Networks

    NASA Technical Reports Server (NTRS)

    Bai, Haowei; Atiquzzaman, Mohammed; Ivancic, William

    2001-01-01

    Currently there are two approaches to providing Quality of Service (QoS) in the next generation Internet: an early one is Integrated Services (IntServ), with the goal of allowing end-to-end QoS to be provided to applications; the other is the Differentiated Services (DiffServ) architecture, providing QoS in the backbone. In this context, a DiffServ network may be viewed as a network element in the total end-to-end path. The objective of this paper is to investigate the possibility of providing end-to-end QoS when IntServ runs over a DiffServ backbone in the next generation Internet. Our results show that the QoS requirements of IntServ applications can be successfully achieved when IntServ traffic is mapped to the DiffServ domain in the next generation Internet.
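
    The key step when IntServ traffic enters the DiffServ backbone is mapping each flow's service class to a DiffServ code point (DSCP) at the edge router; a minimal sketch of one plausible mapping (illustrative, not taken from the paper):

    ```python
    # Standard DSCPs: EF = 46 (0x2E), AF11 = 10 (0x0A), default = 0.
    INTSERV_TO_DSCP = {
        "guaranteed":      0x2E,  # EF for flows with hard delay/bandwidth bounds
        "controlled-load": 0x0A,  # AF11 for better-than-best-effort flows
        "best-effort":     0x00,  # default per-hop behavior
    }

    def mark_packet(packet: dict) -> dict:
        """Set the DSCP field from the flow's IntServ class at the DiffServ edge."""
        packet["dscp"] = INTSERV_TO_DSCP.get(packet.get("intserv_class", "best-effort"), 0x00)
        return packet

    print(mark_packet({"intserv_class": "guaranteed", "payload": b"..."}))
    ```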

  11. Achieving End-to-End QoS in the Next Generation Internet: Integrated Services Over Differentiated Service Networks

    NASA Technical Reports Server (NTRS)

    Bai, Haowei; Atiquzzaman, Mohammed; Ivancic, William

    2001-01-01

    Currently there are two approaches to providing Quality of Service (QoS) in the next generation Internet: an early one is Integrated Services (IntServ), with the goal of allowing end-to-end QoS to be provided to applications; the other is the Differentiated Services (DiffServ) architecture, providing QoS in the backbone. In this context, a DiffServ network may be viewed as a network element in the total end-to-end path. The objective of this paper is to investigate the possibility of providing end-to-end QoS when IntServ runs over a DiffServ backbone in the next generation Internet. Our results show that the QoS requirements of IntServ applications can be successfully achieved when IntServ traffic is mapped to the DiffServ domain in the next generation Internet.

  12. Metal-Metal and π-π Interactions Directed End-to-End Assembly of Gold Nanorods.

    PubMed

    Leung, Frankie Chi-Ming; Leung, Sammual Yu-Lut; Chung, Clive Yik-Sham; Yam, Vivian Wing-Wah

    2016-03-01

    The end-to-end aggregation of gold nanorods (GNRs) has been demonstrated to be directed by a thioacetate-containing alkynylplatinum(II) terpyridine complex. The in situ deprotected complex preferentially attaches to the ends of the GNRs and induces their aggregation in an "end-to-end" manner through Pt···Pt and π-π interactions, as characterized by electron microscopy, energy-dispersive X-ray (EDX) analysis, and UV-vis absorption spectroscopy. The assembly of the nanorods into chain-like nanostructures can be controlled by the concentration of the Pt(II) complex. PMID:26914346

  13. Influence of end-to-end diffusion on intramolecular energy transfer as observed by frequency-domain fluorometry

    NASA Astrophysics Data System (ADS)

    Lakowicz, Joseph R.; Wiczk, Wieslaw M.; Gryczynski, Ignacy; Szmacinski, Henryk; Johnson, Michael L.

    1990-05-01

    We investigated the influence of end-to-end diffusion on intramolecular energy transfer between a naphthalene donor and a dansyl acceptor linked by a polymethylene chain. Viscosities ranging from 0.6 to 200 cP were obtained using propylene glycol at different temperatures (0-80°C) and methanol at 20°C. The intensity decays of naphthalene were measured in the frequency domain. Several theoretical models, including distance distributions, were used to fit the data. The results indicate that end-to-end diffusion of flexible donor-acceptor pairs can be readily detected and quantified using frequency-domain fluorometry.
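
    In the static (no-diffusion) limit, the observable is the transfer efficiency averaged over the end-to-end distance distribution; a minimal NumPy sketch using generic Forster relations with assumed parameters (not the paper's fitted values):

    ```python
    import numpy as np

    R0 = 22.0                                 # assumed Forster radius, angstroms
    r = np.linspace(5.0, 60.0, 2000)          # donor-acceptor distance, angstroms
    p = np.exp(-((r - 25.0) ** 2) / (2 * 6.0 ** 2))
    p /= np.trapz(p, r)                       # normalized Gaussian distance distribution

    efficiency = 1.0 / (1.0 + (r / R0) ** 6)  # transfer efficiency at fixed r
    print(f"<E> in the static limit: {np.trapz(efficiency * p, r):.3f}")
    # Fast end-to-end diffusion raises the observed efficiency above this value.
    ```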

  14. On the importance of risk knowledge for an end-to-end tsunami early warning system

    NASA Astrophysics Data System (ADS)

    Post, Joachim; Strunz, Günter; Riedlinger, Torsten; Mück, Matthias; Wegscheider, Stephanie; Zosseder, Kai; Steinmetz, Tilmann; Gebert, Niklas; Anwar, Herryal

    2010-05-01

    context has been worked out. The generated results contribute significantly to the fields of (1) warning decision and warning levels, (2) warning dissemination and warning message content, (3) early warning chain planning, (4) increasing response capabilities and protective systems, (5) emergency relief and (6) enhancing communities' awareness of and preparedness for tsunami threats. Additionally, examples will be given of the potential for operational use of risk information in early warning systems, as first experiences exist for the tsunami early warning center in Jakarta, Indonesia. Besides this, the importance of linking national-level early warning information with tsunami risk information available at the local level (e.g. linking warning message information on expected intensity with the respective tsunami hazard zone maps at community level for effective evacuation) will be demonstrated through experiences gained in three pilot areas in Indonesia. The presentation seeks to provide new insights into the benefits of using risk information in early warning and will provide further evidence that practical use of risk information is an important and indispensable component of end-to-end early warning.

  15. SensorKit: An End-to-End Solution for Environmental Sensor Networking

    NASA Astrophysics Data System (ADS)

    Silva, F.; Graham, E.; Deschon, A.; Lam, Y.; Goldman, J.; Wroclawski, J.; Kaiser, W.; Benzel, T.

    2008-12-01

    Modern day sensor network technology has shown great promise to transform environmental data collection. However, despite the promise, these systems have remained the purview of the engineers and computer scientists who design them rather than a useful tool for the environmental scientists who need them. SensorKit is conceived of as a way to make wireless sensor networks accessible to The People: it is an advanced, powerful tool for sensor data collection that does not require advanced technological know-how. We are aiming to make wireless sensor networks for environmental science as simple to set up as a standard home computer network by providing simple, tested configurations of commercially available hardware, free and easy-to-use software, and step-by-step tutorials. We designed and built SensorKit using a simplicity-through-sophistication approach, supplying users a powerful sensor-to-database end-to-end system with a simple and intuitive user interface. Our objective in building SensorKit was to make the prospect of using environmental sensor networks as simple as possible. We built SensorKit from off-the-shelf hardware components, using the Compact RIO platform from National Instruments for data acquisition due to its modular architecture and flexibility to support a large number of sensor types. In SensorKit, we support various types of analog, digital and networked sensors. Our modular software architecture allows us to abstract sensor details and provide users a common way to acquire data and to command different types of sensors. SensorKit is built on top of the Sensor Processing and Acquisition Network (SPAN), a modular framework for acquiring data in the field, moving it reliably to the scientist's institution, and storing it in an easily accessible database. SPAN allows real-time access to the data in the field by providing various options for long-haul communication, such as cellular and satellite links. Our system also features reliable data storage

  16. Ocean Acidification Scientific Data Stewardship: An approach for end-to-end data management and integration

    NASA Astrophysics Data System (ADS)

    Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.

    2012-12-01

    As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the

  17. Unidata's Vision for Providing Comprehensive and End-to-end Data Services

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.

    2009-05-01

    This paper presents Unidata's vision for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and capabilities for submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users, no matter where they are or how they are connected to the Internet, will be able to find and access a plethora of geosciences data and use Unidata-provided tools and services both productively and creatively in their research and education. What that vision means for the Unidata community is elucidated by drawing a simple analogy. Most users are familiar with the Amazon and eBay e-commerce sites and content-sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time, and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods and millions of customers not only buy those goods, but provide a review or opinion of the products they buy and share their experiences as purchasers. Similarly, YouTube and Flickr are sites tailored to video- and photo-sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites, together with social-networking applications like MySpace and Facebook, have enabled is a sense of a virtual community in which users can search and browse products or content, comment on and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases. Unidata believes that similar approaches, appropriately tailored to meet the needs of the scientific

  18. A vision for end-to-end data services to foster international partnerships through data sharing

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M.; Yoksas, T.

    2009-04-01

    Increasingly, the conduct of science requires scientific partnerships and sharing of knowledge, information, and other assets. This is particularly true in our field where the highly-coupled Earth system and its many linkages have heightened the importance of collaborations across geographic, disciplinary, and organizational boundaries. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. As articulated in the NSF Strategic Plan: FY 2006-2011, "…discovery increasingly requires expertise of individuals from different disciplines, with diverse perspectives, and often from different nations, working together to accommodate the extraordinary complexity of today's science and engineering challenges." The Nobel Prize winning IPCC assessments are a prime example of such an effort. Earth science education is also uniquely suited to drawing connections between the dynamic Earth system and societal issues. Events like the 2004 Indian Ocean tsunami and Hurricane Katrina provide ample evidence of this relevance, as they underscore the importance of timely and interdisciplinary integration and synthesis of data. Our success in addressing such complex problems and advancing geosciences depends on the availability of a state-of-the-art and robust cyberinfrastructure, transparent and timely access to high-quality data from diverse sources, and requisite tools to integrate and use the data effectively, toward creating new knowledge. To that end, Unidata's vision calls for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are, how they are connected to the Internet, or what

  19. Integration proposal through standard-based design of an end-to-end platform for p-Health environments.

    PubMed

    Martínez, I; Trigo, J D; Martínez-Espronceda, M; Escayola, J; Muñoz, P; Serrano, L; García, J

    2009-01-01

    Interoperability among medical devices and compute engines in the personal environment of the patient, and with healthcare information systems in the remote monitoring and management process, is a key need that requires developments supported by standards-based design. Even though there have been some international initiatives to combine different standards, the vision of an entire end-to-end standards-based system is the next challenge. This paper presents the implementation guidelines of a ubiquitous platform for Personal Health (p-Health). It is standards-based, using the two main medical standards in this context: ISO/IEEE11073 in the patient environment for medical device interoperability, and EN13606 to allow interoperable communication of the patient's Electronic Healthcare Record. Furthermore, a new protocol for End-to-End Standard Harmonization (E2ESHP) is proposed in order to make end-to-end standard integration possible. The platform has been designed to comply with the latest available ISO/IEEE11073 and EN13606 versions, and tested in a laboratory environment as a proof-of-concept illustrating its feasibility as an end-to-end standards-based solution. PMID:19963613

  1. End-to-End Network QoS via Scheduling of Flexible Resource Reservation Requests

    SciTech Connect

    Sharma, S.; Katramatos, D.; Yu, D.

    2011-11-14

    Modern data-intensive applications move vast amounts of data between multiple locations around the world. To enable predictable and reliable data transfer, next-generation networks allow such applications to reserve network resources for exclusive use. In this paper, we solve an important problem (called SMR3) of accommodating multiple concurrent network reservation requests between a pair of end-sites. Given the varying availability of bandwidth within the network, our goal is to accommodate as many reservation requests as possible while minimizing the total time needed to complete the data transfers. We first prove that SMR3 is an NP-hard problem. Then we solve it by developing a polynomial-time heuristic, called RRA. The RRA algorithm hinges on an efficient mechanism for accommodating a large number of requests by minimizing bandwidth wastage. Finally, via numerical results, we show that RRA constructs schedules that accommodate a significantly larger number of requests than other, seemingly efficient, heuristics.
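    The abstract does not spell out RRA's internals, but its stated objective (accept as many transfers as possible under time-varying bandwidth while limiting wastage) can be illustrated with a minimal greedy scheduler. The slotted-bandwidth model, the earliest-deadline ordering, and all names below are assumptions made for illustration, not the authors' algorithm.

    ```python
    # Illustrative greedy scheduler for flexible bandwidth reservations.
    # Assumptions (not from the paper): time is slotted, each request has a
    # data volume and a deadline, and leftover slot bandwidth is "wastage".

    def schedule(requests, avail, slot_len=1.0):
        """requests: list of (volume, deadline_slot); avail: per-slot bandwidth."""
        accepted = []
        avail = list(avail)                      # remaining bandwidth per slot
        # Earliest-deadline-first tends to leave room for later requests.
        for volume, deadline in sorted(requests, key=lambda r: r[1]):
            need = volume
            plan = {}
            for t in range(min(deadline, len(avail))):
                if need <= 0:
                    break
                take = min(avail[t] * slot_len, need)
                if take > 0:
                    plan[t] = take
                    need -= take
            if need <= 0:                        # request fits before its deadline
                for t, take in plan.items():
                    avail[t] -= take / slot_len
                accepted.append((volume, deadline))
        return accepted, avail

    reqs = [(10, 3), (4, 2), (8, 5)]             # (volume, deadline in slots)
    print(schedule(reqs, avail=[5, 5, 5, 5, 5]))
    ```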

  2. Wiener restoration of sampled image data - End-to-end analysis

    NASA Technical Reports Server (NTRS)

    Fales, Carl L.; Huck, Friedrich O.; Mccormick, Judith A.; Park, Stephen K.

    1988-01-01

    The Wiener filter is formulated as a function of the basic image-gathering and image-reconstruction constraints, thereby providing a method for minimizing the mean-squared error between the (continuous-input) radiance field and its restored (continuous-output) representation. This formulation of the Wiener filter is further extended to the Wiener-characteristic filter, which provides a method for explicitly specifying the desired representation. Two specific examples of Wiener filters are presented.
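    The abstract stops short of the filter itself. For orientation, the standard minimum-mean-squared-error (Wiener) restoration filter, written in generic notation rather than the authors', is:

    ```latex
    % Standard Wiener restoration filter in the frequency domain (generic
    % notation). In the sampled-image formulation, aliasing introduced by
    % image gathering is folded in as additional noise.
    \Psi(\nu) \;=\;
      \frac{\tau^{*}(\nu)\,\Phi_{L}(\nu)}
           {\lvert \tau(\nu) \rvert^{2}\,\Phi_{L}(\nu) + \Phi_{N}(\nu)}
    ```

    Here τ(ν) is the image-gathering response and Φ_L, Φ_N are the radiance-field and noise power spectral densities; the exact constraint terms used in the paper differ in detail.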

  3. End-to-End Demonstrator of the Safe Affordable Fission Engine (SAFE) 30: Power Conversion and Ion Engine Operation

    NASA Technical Reports Server (NTRS)

    Hrbud, Ivana; VanDyke, Melissa; Houts, Mike; Goodfellow, Keith; Schafer, Charles (Technical Monitor)

    2001-01-01

    The Safe Affordable Fission Engine (SAFE) test series addresses Phase 1 Space Fission Systems issues, in particular non-nuclear testing and system integration issues, leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shut-downs), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, a conversion system, and a thruster, in which the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  4. End-to-End demonstrator of the Safe Affordable Fission Engine (SAFE) 30: Power conversion and ion engine operation

    NASA Astrophysics Data System (ADS)

    Hrbud, Ivana; van Dyke, Melissa; Houts, Mike; Goodfellow, Keith

    2002-01-01

    The Safe Affordable Fission Engine (SAFE) test series addresses Phase 1 Space Fission Systems issues, in particular non-nuclear testing and system integration issues, leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shut-downs), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, a conversion system, and a thruster, in which the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  5. POST2 End-To-End Descent and Landing Simulation for the Autonomous Landing and Hazard Avoidance Technology Project

    NASA Technical Reports Server (NTRS)

    Fisher, Jody L.; Striepe, Scott A.

    2007-01-01

    The Program to Optimize Simulated Trajectories II (POST2) is used as the basis for an end-to-end descent and landing trajectory simulation that is essential in determining the design and performance capability of lunar descent and landing system models and lunar environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This POST2-based ALHAT simulation provides descent and landing simulation capability by integrating lunar environment and lander system models (including terrain, sensor, guidance, navigation, and control models), along with the data necessary to design and operate a landing system for robotic, human, and cargo lunar-landing success. This paper presents the current and planned development and model validation of the POST2-based end-to-end trajectory simulation used for testing, performance assessment, and evaluation of ALHAT project systems and models.

  6. Refinery Outages

    EIA Publications

    2015-01-01

    Semiannual reporting on refinery outages and their potential implications for available refinery capacity, petroleum product markets, and supply of gasoline, diesel fuel, and heating oil. Dissemination of such analyses can be beneficial to market participants who may otherwise be unable to access such information.

  7. Power Outages

    MedlinePlus

    ... car’s gas tank full; gas stations rely on electricity to power their pumps. If you use your car to ... or garage, or connect it to your home's electrical system. For more information about generators visit: After a Power Outage: Throw away any food that has been ...

  8. Modified end-to-end anastomosis for the treatment of congenital tracheal stenosis with a bridging bronchus.

    PubMed

    Stock, Cameron; Nathan, Meena; Murray, Ryan; Rahbar, Reza; Fynn-Thompson, Francis

    2015-01-01

    An infant with a ventricular septal defect; Vertebral anomalies, Anal atresia, Cardiac anomalies, Tracheo-Esophageal fistula (TEF), Renal anomalies, Limb anomalies (VACTERL) syndrome; and tracheal stenosis with a bridging bronchus underwent repair of the ventricular septal defect and tracheobronchial reconstruction at age 11 months. Herein we describe our surgical approach to resection of the bridging bronchus and a technique using a modified end-to-end tracheal anastomosis for the correction of this complex anomaly. PMID:25555968

  9. End-to-end wireless TCP with noncongestion packet loss detection and handling

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Joon; Liu, Fang; Kuo, C.-C. Jay

    2003-07-01

    Traditional TCP performance degrades over lossy links because the TCP sender assumes that packet loss is caused by congestion in the network path and thus reduces the sending rate by cutting the congestion window multiplicatively. A mechanism to overcome this limitation is investigated in this research. Our scheme identifies the network path condition to determine whether congestion has occurred and responds accordingly. The basic idea for separating congestion-caused from non-congestion-caused losses is to compare the estimated current available bandwidth with the average available bandwidth. To minimize the effect of temporary measurement fluctuations, we estimate the available bandwidth with a higher weight on stable measurements and a lower weight on unstable fluctuations. In our scheme, packet loss due to congestion invokes the TCP NewReno procedure. In cases of random loss unrelated to congestion, the multiplicative decrease of the sending rate is avoided to achieve higher throughput. In addition, each duplicate acknowledgement after a fast retransmission increases the congestion window to fully recover the sending rate. Extensive simulation results show that our differentiation algorithm achieves high accuracy. Accordingly, a TCP connection over a lossy link with the proposed scheme provides higher throughput than TCP NewReno.
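    As a concrete illustration of the differentiation step, the sketch below keeps two bandwidth estimates, weighting stable measurements more heavily, and labels a loss as congestion only when the current estimate falls clearly below the long-run average. All weights and thresholds are illustrative assumptions, not values from the paper.

    ```python
    # Illustrative loss differentiation: weight stable available-bandwidth
    # measurements more heavily, then call a loss "congestion" only if the
    # current estimate has dropped clearly below its long-run average.

    class LossClassifier:
        def __init__(self):
            self.cur_bw = None    # short-term, stability-weighted estimate
            self.avg_bw = None    # long-run average available bandwidth
            self.prev = None

        def update(self, sample):
            stable = self.prev is not None and abs(sample - self.prev) < 0.2 * sample
            a = 0.9 if stable else 0.3      # trust stable measurements more
            self.cur_bw = sample if self.cur_bw is None else a * sample + (1 - a) * self.cur_bw
            self.avg_bw = sample if self.avg_bw is None else 0.05 * sample + 0.95 * self.avg_bw
            self.prev = sample

        def loss_is_congestion(self):
            return self.cur_bw < 0.8 * self.avg_bw

    clf = LossClassifier()
    for bw in [10, 10.2, 9.9, 10.1, 4.0, 3.8]:  # available-bandwidth samples
        clf.update(bw)
    print(clf.loss_is_congestion())             # True: treat loss as congestion
    ```

    On a True result the sender would fall back to NewReno's multiplicative decrease; on False it would keep its congestion window and simply retransmit.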

  10. An End-to-End Architecture for Science Goal Driven Observing

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Koratkar, Anuradha; Memarsadeghi, Nargess; Wolf, Karl; Obenschain, Arthur F. (Technical Monitor)

    2002-01-01

    New observatories will have greater on-board storage capacity and on-board processing capabilities. The new bottleneck will be download capacity. The cost of downlink time and limitations of bandwidth will end the era where all exposure data is downloaded and all data processing is performed on the ground. In addition, observing campaigns involving inherently variable targets will need scheduling flexibility to focus observing time and data download on exposures that are scientifically interesting. The ability to quickly recognize and react to such events by re-prioritizing the observing schedule will be an essential characteristic for maximizing scientific returns. It will also be a step towards increasing spacecraft autonomy, a major goal of NASA's strategic plan. The science goal monitoring (SGM) system is a proof-of-concept effort to address these challenges. We are developing an interactive distributed system that will use on-board processing and storage combined with event-driven interfaces with ground-based processing and operations, to enable fast re-prioritization of observing schedules, and to minimize time spent on non-optimized observations. SGM is initially aimed towards time-tagged observing modes used frequently in spectroscopic studies of varying targets. In particular, the SGM is collaborating with the proposed MIDEX-class mission Kronos team. The variable targets that Kronos seeks to study make an adaptive system such as SGM particularly valuable for achieving mission goals. However, the architecture and interfaces will also be designed for easy adaptability to other observing platforms, including ground-based systems and to work with different scheduling and pipeline processing systems. This talk will focus on our strategy for developing SGM and the technical challenges that we have encountered. We will discuss the SGM architecture as it applies to the Kronos mission and explain how it is scalable to other missions.

  11. Mechanical loading of peripheral nerves during remobilisation of the affected member after end-to-end anastomosis.

    PubMed

    Orf, G; Wüst, R

    1979-01-01

    Our study involved simulating end-to-end neurorrhaphy of the sciatic nerve in a number of rabbits and analysing in vivo the mechanical loads acting on the nerve while the affected member was being remobilised. We found both the suture and mobilisation loads to be related to the size of the nerve defect. In each case, traction force, strain, and stress were proportional. The effect which these experimental findings may have on the future use of flexing neighbouring joints as a "manipulative" measure to achieve a tension-free nerve suture will be discussed. PMID:525461

  12. End-to-end testing. [to verify electrical equipment failure due to carbon fibers released in aircraft-fuel fires

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1979-01-01

    The principal objective of the demonstration tests discussed is to verify whether carbon fibers released by burning composite parts in aircraft-fuel fires can produce failures in electrical equipment. A secondary objective is to experimentally validate the analytical models for some of the key elements in the risk analysis. The approach to this demonstration testing is twofold: limited end-to-end tests are to be conducted in a shock tube, and planning for some large outdoor burn tests is being done.

  13. End-to-End Study of the Transfer of Energy from Magnetosheath Ion Precipitation to the Cusp

    NASA Technical Reports Server (NTRS)

    Coffey, V. N.; Chandler, M. O.; Singh, Nagendra; Avanov, Levon

    2005-01-01

    This paper describes a study of the effects of unstable magnetosheath distributions on the cusp ionosphere. An end-to-end numerical model was used to study, first, the evolved distributions from precipitation due to reconnection and, second, the energy transfer into the high-latitude ionosphere based on these solar wind/magnetosheath inputs. Using inputs of several representative examples of magnetosheath injections, waves were generated at the lower hybrid frequency and energy was transferred to the ionospheric electrons and ions. The resulting wave spectra and ion and electron particle heating were analyzed. Keywords: ion heating; magnetosheath/ionosphere coupling; particle/wave interactions; simulations.

  14. Automated Design of Propellant-Optimal, End-to-End, Low-Thrust Trajectories for Trojan Asteroid Tours

    NASA Technical Reports Server (NTRS)

    Stuart, Jeffrey; Howell, Kathleen; Wilson, Roby

    2013-01-01

    The Sun-Jupiter Trojan asteroids are celestial bodies of great scientific interest as well as potential sources of water and other mineral resources for long-term human exploration of the solar system. Previous investigations under this project have addressed the automated design of tours within the asteroid swarm. This investigation expands the current automation scheme by incorporating options for a complete trajectory design approach to the Trojan asteroids. Computational aspects of the design procedure are automated such that end-to-end trajectories are generated with a minimum of human interaction after key elements and constraints associated with a proposed mission concept are specified.

  15. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.

  16. Evaluation of the end-to-end distance of chains solubilized in a polymer Langmuir monolayer by atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Kumaki, Jiro

    Polymer chain packing in the two-dimensional (2D) condensed state is still not well understood. Direct observation of the chain packing in a monolayer should be the best way to understand it; however, this is still difficult even using atomic force microscopy (AFM), except for extraordinarily thick polymers. In this study, we evaluate the end-to-end distance of the chains in a Langmuir-Blodgett monolayer composed of a conventional polymer by AFM. We solubilized a small amount of a polystyrene-b-poly(methyl methacrylate)-b-polystyrene (PS-b-PMMA-b-PS) triblock copolymer in a PMMA Langmuir monolayer, with the PS blocks condensing into single-PS-block particles that could be used as probes of the positions of the chain ends. The evaluated end-to-end distance was 2.5 times longer than that of the 2D ideal chain, indicating that the chains in the 2D monolayer are not strongly segregated but interpenetrate other chains.
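    The 2D ideal-chain baseline used in that comparison is the standard random-walk result; in generic notation (N statistical segments of length b, symbols not taken from the abstract):

    ```latex
    % Ideal (random-walk) end-to-end distance, the baseline the abstract
    % compares against; a fully self-avoiding 2D chain would instead scale
    % as R ~ b N^{3/4}.
    \langle R^{2} \rangle_{0} = N\,b^{2}
      \quad\Longrightarrow\quad
      R_{0} = b\,\sqrt{N}
    ```

    The measured distance of about 2.5 R_0 is what the abstract reports relative to this baseline.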

  17. Context-driven, prescription-based personal activity classification: methodology, architecture, and end-to-end implementation.

    PubMed

    Xu, James Y; Chang, Hua-I; Chien, Chieh; Kaiser, William J; Pottie, Gregory J

    2014-05-01

    Enabling large-scale monitoring and classification of a range of motion activities is of primary importance due to the need by healthcare and fitness professionals to monitor exercises for quality and compliance. Past work has not fully addressed the unique challenges that arise from scaling. This paper presents a novel end-to-end system solution to some of these challenges. The system is built on the prescription-based context-driven activity classification methodology. First, we show that by refining the definition of context, and introducing the concept of scenarios, a prescription model can provide personalized activity monitoring. Second, through a flexible architecture constructed from interface models, we demonstrate the concept of a context-driven classifier. Context classification is achieved through a classification committee approach, and activity classification follows by means of context specific activity models. Then, the architecture is implemented in an end-to-end system featuring an Android application running on a mobile device, and a number of classifiers as core classification components. Finally, we use a series of experimental field evaluations to confirm the expected benefits of the proposed system in terms of classification accuracy, rate, and sensor operating life. PMID:24107984
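    The two-stage flow described here (a committee classifies the context, then a context-specific model classifies the activity) can be sketched compactly. Everything below, from the function names to the toy stand-in models, is illustrative, not the paper's implementation.

    ```python
    # Sketch of a context-driven classification pipeline: a committee votes
    # on the current context, then a context-specific activity model
    # classifies the motion sample. All names are hypothetical.

    from collections import Counter

    def committee_vote(classifiers, features):
        votes = Counter(clf(features) for clf in classifiers)
        return votes.most_common(1)[0][0]

    def classify(sample, context_committee, activity_models):
        context = committee_vote(context_committee, sample)   # e.g. "gym"
        return context, activity_models[context](sample)

    # Toy stand-ins for trained models:
    ctx_committee = [lambda s: "gym", lambda s: "gym", lambda s: "home"]
    activities = {"gym": lambda s: "squat", "home": lambda s: "walking"}
    print(classify([0.1, 0.9], ctx_committee, activities))    # ('gym', 'squat')
    ```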

  18. Far-Infrared Therapy Promotes Nerve Repair following End-to-End Neurorrhaphy in Rat Models of Sciatic Nerve Injury

    PubMed Central

    Chen, Tai-Yuan; Yang, Yi-Chin; Sha, Ya-Na; Chou, Jiun-Rou

    2015-01-01

    This study employed a rat model of sciatic nerve injury to investigate the effects of postoperative low-power far-infrared (FIR) radiation therapy on nerve repair following end-to-end neurorrhaphy. The rat models were divided into the following 3 groups: (1) nerve injury without FIR biostimulation (NI/sham group); (2) nerve injury with FIR biostimulation (NI/FIR group); and (3) noninjured controls (normal group). Walking-track analysis results showed that the NI/FIR group exhibited significantly higher sciatic functional indices at 8 weeks after surgery (P < 0.05) compared with the NI/sham group. The decreased expression of CD4 and CD8 in the NI/FIR group indicated that FIR irradiation modulated the inflammatory process during recovery. Compared with the NI/sham group, the NI/FIR group exhibited a significant reduction in muscle atrophy (P < 0.05). Furthermore, histomorphometric assessment indicated that the nerves regenerated more rapidly in the NI/FIR group than in the NI/sham group; furthermore, the NI/FIR group regenerated neural tissue over a larger area, as well as nerve fibers of greater diameter and with thicker myelin sheaths. Functional recovery, inflammatory response, muscular reinnervation, and histomorphometric assessment all indicated that FIR radiation therapy can accelerate nerve repair following end-to-end neurorrhaphy of the sciatic nerve. PMID:25722734

  19. Effect of swirling flow on platelet concentration distribution in small-caliber artificial grafts and end-to-end anastomoses

    NASA Astrophysics Data System (ADS)

    Zhan, Fan; Fan, Yu-Bo; Deng, Xiao-Yan

    2011-10-01

    Platelet concentration near the blood vessel wall is one of the major factors in the adhesion of platelets to the wall. In our previous studies, it was found that swirling flows could suppress platelet adhesion in small-caliber artificial grafts and end-to-end anastomoses. In order to better understand the beneficial effect of the swirling flow, we numerically analyzed the near-wall concentration distribution of platelets in a straight tube and a sudden tubular expansion tube under both swirling flow and normal flow conditions. The numerical models were created based on our previous experimental studies. The simulation results revealed that when compared with the normal flow, the swirling flow could significantly reduce the near-wall concentration of platelets in both the straight tube and the expansion tube. The present numerical study therefore indicates that the reduction in platelet adhesion under swirling flow conditions in small-caliber arterial grafts, or in end-to-end anastomoses as observed in our previous experimental study, was possibly through a mechanism of platelet transport, in which the swirling flow reduced the near-wall concentration of platelets.

  20. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.

  1. A GF-Matrix Approach to the End-to-End Coupling in Ethane-like Molecules

    NASA Astrophysics Data System (ADS)

    Dilauro, C.; Lattanzi, F.

    1993-12-01

    We examine the effect of end-to-end coupling on the degenerate vibrational deformations of ethane-like molecules by considering the form of the dependence of the elements of the G and F matrices on the internal rotation angle γ. This can be done by simple geometrical considerations, in a basis of internal vibrational coordinates. After transformation to symmetry coordinates belonging to different species of the G36(EM) extended molecular group, the product G(0)F(0) of the γ-independent parts of the G and F matrices is diagonalized. The resulting zero-order normal modes and their conjugate momenta are used in building up the vibration-torsion Hamiltonian, including the γ-dependent terms. We find that (i) in the case of a low barrier hindering the internal rotation the most convenient sets of degenerate normal coordinates are either Gs (in the case of a weak effect of the end-to-end coupling on the relative deformations) or E1d, E2d; (ii) degenerate vibrational coordinates whose top and frame components have at least one common atom lead to E1d, E2d normal modes regardless of the barrier height; (iii) "unpaired" degenerate vibrational coordinates, such as the skeletal bending of dimethylzinc, always contribute an E1d normal mode; and (iv) in the case of high or moderate barriers, Gs normal modes are unlikely to occur, and the most probable normal mode symmetries are E1d, E2d.
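    For context, the diagonalization step described above is the standard Wilson GF secular problem; in conventional notation (not specific to this paper):

    ```latex
    % Wilson GF secular problem for the gamma-independent zero-order part.
    % The gamma-dependent remainders of G and F re-enter the Hamiltonian
    % as vibration-torsion coupling terms.
    \mathbf{G}^{(0)}\mathbf{F}^{(0)}\,\mathbf{L} = \mathbf{L}\,\boldsymbol{\Lambda},
    \qquad
    \Lambda_{kk} = \lambda_{k} = 4\pi^{2}c^{2}\tilde{\nu}_{k}^{\,2}
    ```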

  2. HITSZ_CDR: an end-to-end chemical and disease relation extraction system for BioCreative V

    PubMed Central

    Li, Haodi; Tang, Buzhou; Chen, Qingcai; Chen, Kai; Wang, Xiaolong; Wang, Baohua; Wang, Zhe

    2016-01-01

    In this article, an end-to-end system is proposed for the challenge task of disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction in BioCreative V, where DNER includes disease mention recognition (DMR) and normalization (DN). Evaluation on the challenge corpus showed that our system achieved the highest F1-scores: 86.93% on DMR, 84.11% on DN, and 43.04% on CID relation extraction. The F1-score on DMR is higher than our previous result reported by the challenge organizers (86.76%), which was the highest F1-score of the challenge. Database URL: http://database.oxfordjournals.org/content/2016/baw077 PMID:27270713

  3. End-To-End Risk Assessment: From Genes and Protein to Acceptable Radiation Risks for Mars Exploration

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis A.; Schimmerling, Walter

    2000-07-01

    The human exploration of Mars will impose unavoidable health risks from galactic cosmic rays (GCR) and possibly solar particle events (SPE). It is the goal of NASA's Space Radiation Health Program to develop the capability to predict health risks with significant accuracy, to ensure that risks are kept well below acceptable levels, and to allow mitigation approaches to be effective at reasonable cost. End-to-end risk assessment is the approach being followed to understand proton and heavy-ion damage at the molecular, cellular, and tissue levels in order to predict the probability of the major health risks, including cancer, neurological disorders, hereditary effects, cataracts, and acute radiation sickness, and to develop countermeasures for mitigating these risks.

  4. NASA End-to-End Data System /NEEDS/ information adaptive system - Performing image processing onboard the spacecraft

    NASA Technical Reports Server (NTRS)

    Kelly, W. L.; Howle, W. M.; Meredith, B. D.

    1980-01-01

    The Information Adaptive System (IAS) is an element of the NASA End-to-End Data System (NEEDS) Phase II and is focused toward onboard image processing. Since the IAS is a data preprocessing system which is closely coupled to the sensor system, it serves as a first step in providing a 'Smart' imaging sensor. Some of the functions planned for the IAS include sensor response nonuniformity correction, geometric correction, data set selection, data formatting, packetization, and adaptive system control. The inclusion of these sensor data preprocessing functions onboard the spacecraft will significantly improve the extraction of information from the sensor data in a timely and cost-effective manner and provide the opportunity to design sensor systems which can be reconfigured in near real time for optimum performance. The purpose of this paper is to present the preliminary design of the IAS and the plans for its development.

  5. End-to-End Trajectory for Conjunction Class Mars Missions Using Hybrid Solar-Electric/Chemical Transportation System

    NASA Technical Reports Server (NTRS)

    Chai, Patrick R.; Merrill, Raymond G.; Qu, Min

    2016-01-01

    NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and solar-electric propulsion systems are used to deliver crew and cargo to exploration destinations. By combining chemical and solar-electric propulsion into a single spacecraft and applying each where it is most effective, the hybrid architecture enables a series of Mars trajectories that are more fuel-efficient than an all-chemical propulsion architecture without significant increases in trip time. The architecture calls for the aggregation of exploration assets in cislunar space prior to departure for Mars and utilizes high-energy, lunar-distant high Earth orbits for the final staging prior to departure. This paper presents the detailed analysis of various cislunar operations for the EMC Hybrid architecture as well as the results of the higher-fidelity end-to-end trajectory analysis to understand the implications of the design choices on the Mars exploration campaign.

  6. Left Ventricular Assist Device End-to-End Connection to the Left Subclavian Artery: An Alternative Technique.

    PubMed

    Bortolussi, Giacomo; Lika, Alban; Bejko, Jonida; Gallo, Michele; Tarzia, Vincenzo; Gerosa, Gino; Bottio, Tomaso

    2015-10-01

    We describe a modified implantation technique for the HeartWare ventricular assist device. We access the apex through a left minithoracotomy. The outflow graft is tunneled through a small incision in the fourth intercostal space and then subcutaneously to the subclavian region. After division of the left axillary artery, an end-to-end anastomosis is performed to the proximal part, and the distal vessel is connected end-to-side through a fenestration in the outflow graft. We believe that this technique, particularly suitable for redo scenarios or severely calcified aorta, achieves a more direct blood flow into the aorta and reduces cerebrovascular events while avoiding excessive flow to the arm. PMID:26434488

  7. The MARS pathfinder end-to-end information system: A pathfinder for the development of future NASA planetary missions

    NASA Technical Reports Server (NTRS)

    Cook, Richard A.; Kazz, Greg J.; Tai, Wallace S.

    1996-01-01

    The development of the Mars Pathfinder is considered with emphasis on the End-to-End Information System (EEIS) development approach. The primary mission objective is to successfully develop and deliver a single flight system to the Martian surface, demonstrating entry, descent, and landing. The EEIS is a set of functions distributed throughout the flight, ground, and Mission Operation Systems (MOS) that interoperate in order to control, collect, transport, process, store, and analyze the uplink and downlink information flows of the mission. Coherence between the mission systems is achieved through the EEIS architecture. The key characteristics of the system are: a concurrent engineering approach to the development of flight, ground, and mission operation systems; the fundamental EEIS architectural heuristics; a phased, incremental EEIS development and test approach; and an EEIS design deploying flight, ground, and MOS operability features, including integrated ground- and flight-based toolsets.

  8. HITSZ_CDR: an end-to-end chemical and disease relation extraction system for BioCreative V.

    PubMed

    Li, Haodi; Tang, Buzhou; Chen, Qingcai; Chen, Kai; Wang, Xiaolong; Wang, Baohua; Wang, Zhe

    2016-01-01

    In this article, an end-to-end system is proposed for the challenge task of disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction in BioCreative V, where DNER includes disease mention recognition (DMR) and normalization (DN). Evaluation on the challenge corpus showed that our system achieved the highest F1-scores: 86.93% on DMR, 84.11% on DN, and 43.04% on CID relation extraction. The F1-score on DMR is higher than our previous result reported by the challenge organizers (86.76%), which was the highest F1-score of the challenge. Database URL: http://database.oxfordjournals.org/content/2016/baw077. PMID:27270713

  9. End-To-End Risk Assessment: From Genes and Protein to Acceptable Radiation Risks for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Schimmerling, Walter

    2000-01-01

    The human exploration of Mars will impose unavoidable health risks from galactic cosmic rays (GCR) and possibly solar particle events (SPE). It is the goal of NASA's Space Radiation Health Program to develop the capability to predict health risks with significant accuracy, to ensure that risks are kept well below acceptable levels, and to allow mitigation approaches to be effective at reasonable cost. End-to-end risk assessment is the approach being followed to understand proton and heavy-ion damage at the molecular, cellular, and tissue levels in order to predict the probability of the major health risks, including cancer, neurological disorders, hereditary effects, cataracts, and acute radiation sickness, and to develop countermeasures for mitigating these risks.

  10. End-to-end stacking and liquid crystal condensation of 6- to 20-base pair DNA duplexes.

    SciTech Connect

    Nakata, M.; Zanchetta, G.; Chapman, B.D.; Christopher, D.; Jones, D.; Cross, J.O.; Pindak, R.; Bellini, T.; Noel, N.; X-Ray Science Division; Univ. of Colorado; Univ. di Milano; BNL

    2007-11-23

    Short complementary B-form DNA oligomers, 6 to 20 base pairs in length, are found to exhibit nematic and columnar liquid crystal phases, even though such duplexes lack the shape anisotropy required for liquid crystal ordering. Structural study shows that these phases are produced by the end-to-end adhesion and consequent stacking of the duplex oligomers into polydisperse anisotropic rod-shaped aggregates, which can order into liquid crystals. Upon cooling mixed solutions of short DNA oligomers, in which only a small fraction of the DNA present is complementary, the duplex-forming oligomers phase-separate into liquid crystal droplets, leaving the unpaired single strands in isotropic solution. In a chemical environment where oligomer ligation is possible, such ordering and condensation would provide an autocatalytic link whereby complementarity promotes the extended polymerization of complementary oligomers.

  11. The initial data products from the EUVE software - A photon's journey through the End-to-End System

    NASA Technical Reports Server (NTRS)

    Antia, Behram

    1993-01-01

    The End-to-End System (EES) is a unique collection of software modules created for use at the Center for EUV Astrophysics. The 'pipeline' is a shell script which executes selected EES modules and creates initial data products: skymaps, data sets for individual sources (called 'pigeonholes') and catalogs of sources. This article emphasizes the data from the all-sky survey, conducted between July 22, 1992 and January 21, 1993. A description of each of the major data products will be given and, as an example of how the pipeline works, the reader will follow a photon's path through the software pipeline into a pigeonhole. These data products are the primary goal of the EUVE all-sky survey mission, and so their relative importance for the follow-up science will also be discussed.

  12. Minimization of outage probability of WiMAX link supported by laser link between a high-altitude platform and a satellite.

    PubMed

    Arnon, Shlomi

    2009-07-01

    Various technologies for the implementation of a WiMAX (IEEE 802.16) base station on board a high-altitude platform (HAP) are currently being researched. The network configuration under consideration includes a satellite, several HAPs, and subscribers on the ground. The WiMAX base station is positioned on the satellite and connects with the HAP via an analog RF-over-laser communication (LC) link. Each HAP houses a transparent transponder that converts the optical signal to a WiMAX RF signal and vice versa. The LC system consists of a laser transmitter and an optical receiver that need to be strictly aligned to achieve a line-of-sight link. However, mechanical vibration and electronic noise in the control system challenge the transmitter-receiver alignment and cause pointing errors. The outcome of pointing errors is fading of the received signal, which leads to impaired link performance. In this paper, we derive the value of laser transmitter gain that minimizes the outage probability of the WiMAX link. The results indicate that the optimum value of the laser transmitter gain is not a function of the pointing error statistics. PMID:19568289
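    The trade-off being optimized can be reproduced with a toy model: on-axis received power grows with transmitter gain G, but the beam narrows, so a radial pointing error theta costs a factor of roughly exp(-G*theta^2) for a Gaussian beam. The Monte Carlo below, with a Rayleigh-distributed pointing error and an arbitrary normalized power threshold, illustrates that trade-off only; the model, parameters, and names are assumptions, not the paper's derivation.

    ```python
    # Toy Monte Carlo of the gain/outage trade-off: higher transmitter gain G
    # raises on-axis power (~G, normalized units) but narrows the beam, so a
    # radial pointing error theta costs ~exp(-G*theta^2) (Gaussian-beam
    # approximation). sigma, threshold, and the units are illustrative.

    import math, random

    def outage_prob(G, sigma=10e-6, threshold=2e8, n=200_000):
        rng = random.Random(1)
        misses = 0
        for _ in range(n):
            # Radial pointing error: Rayleigh-distributed with parameter sigma.
            theta = sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))
            if G * math.exp(-G * theta * theta) < threshold:
                misses += 1
        return misses / n

    # Scan gains: outage is minimized at an interior optimum, not at max gain.
    for G in (3e8, 5e8, 1e9, 3e9, 1e10):
        print(f"G = {G:.0e}  P_out = {outage_prob(G):.5f}")
    ```

    In this toy model the minimizing gain works out to e times the power threshold, independent of sigma, which at least echoes the paper's conclusion that the optimum gain does not depend on the pointing-error statistics.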

  13. Evaluation of Techniques to Detect Significant Network Performance Problems using End-to-End Active Network Measurements

    SciTech Connect

    Cottrell, R.Les; Logg, Connie; Chhaparia, Mahesh; Grigoriev, Maxim; Haro, Felipe; Nazir, Fawad; Sandford, Mark

    2006-01-25

    End-to-end fault and performance problem detection in wide-area production networks is becoming increasingly hard as the complexity of the paths, the diversity of the performance, and dependency on the network increase. Several monitoring infrastructures have been built to monitor different network metrics and collect monitoring information from thousands of hosts around the globe. Typically there are hundreds to thousands of time-series plots of network metrics which need to be examined to identify network performance problems or anomalous variations in the traffic. Furthermore, most commercial products rely on comparison with user-configured static thresholds and often require access to SNMP-MIB information, to which a typical end user does not usually have access. In this paper we propose new techniques to detect network performance problems proactively, in close to real time, without relying on static thresholds or SNMP-MIB information. We describe and compare several different algorithms that we have implemented to detect persistent network problems using anomalous-variation analysis in real end-to-end Internet performance measurements. We also provide methods and guidance for setting the user-settable parameters. The measurements are based on active probes running on 40 production network paths with bottlenecks varying from 0.5 Mbit/s to 1000 Mbit/s. For well-behaved data (no missed measurements and no very large outliers) with small seasonal changes, most algorithms identify similar events. We compare the algorithms' robustness with respect to false positives and missed events, especially when there are large seasonal effects in the data. Our proposed techniques cover a wide variety of network paths and traffic patterns. We also discuss the applicability of the algorithms in terms of their intuitiveness, their speed of execution as implemented, and areas of applicability. Our encouraging results compare and evaluate the accuracy of our detection
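    One simple member of this family of detectors, shown below purely as an illustration (it is not one of the paper's algorithms), tracks an exponentially weighted baseline and flags only persistent deviations, so isolated outliers do not trip it. All parameters are illustrative.

    ```python
    # Illustrative change detector for a network metric series: track an
    # EWMA baseline and flag only sustained deviations from it.

    def detect_events(series, alpha=0.05, k=3.0, persist=5):
        mean, var, run, events = series[0], 0.0, 0, []
        for i, x in enumerate(series[1:], start=1):
            sig = max(var ** 0.5, 1.0)          # floor avoids alarms on flat data
            dev = x - mean
            if abs(dev) > k * sig:
                run += 1
                if run >= persist:              # sustained shift, not a blip
                    events.append(i)
                    mean, var, run = x, 0.0, 0  # re-baseline after the event
                continue                        # keep outliers out of the baseline
            run = 0
            mean += alpha * dev                 # EWMA mean
            var = (1 - alpha) * (var + alpha * dev * dev)  # EWMA variance
        return events

    rtt = [50] * 50 + [51, 49] * 10 + [120] * 30   # step change at sample 70
    print(detect_events(rtt))                       # -> [74]
    ```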

  14. End-to-End Information System design at the NASA Jet Propulsion Laboratory. [data transmission between user and space-based sensor

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1978-01-01

    In recognition of a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote, space-based sensor, an end-to-end approach to the design of information systems has been adopted at JPL. This paper reviews End-to-End Information System (EEIS) activity at JPL, with attention given to the scope of the EEIS transfer function and the functional and physical elements of the EEIS. The relationship between the EEIS and the NASA End-to-End Data System program is discussed.

  15. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: Unidata's Plans and Directions

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M.

    2005-12-01

    work together in a fundamentally different way. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational drivers and discuss recent developments in cyberinfrastructure and Unidata's role and directions in providing robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  16. Towards end-to-end models for investigating the effects of climate and fishing in marine ecosystems

    NASA Astrophysics Data System (ADS)

    Travers, M.; Shin, Y.-J.; Jennings, S.; Cury, P.

    2007-12-01

    End-to-end models that represent ecosystem components from primary producers to top predators, linked through trophic interactions and affected by the abiotic environment, are expected to provide valuable tools for assessing the effects of climate change and fishing on ecosystem dynamics. Here, we review the main process-based approaches used for marine ecosystem modelling, focusing on the extent of the food web modelled, the forcing factors considered, the trophic processes represented, as well as the potential use and further development of the models. We consider models of a subset of the food web, models which represent the first attempts to couple low and high trophic levels, integrated models of the whole ecosystem, and size spectrum models. Comparisons within and among these groups of models highlight the preferential use of functional groups at low trophic levels and species at higher trophic levels and the different ways in which the models account for abiotic processes. The model comparisons also highlight the importance of choosing an appropriate spatial dimension for representing organism dynamics. Many of the reviewed models could be extended by adding components and by ensuring that the full life cycles of species components are represented, but end-to-end models should provide full coverage of ecosystem components, the integration of physical and biological processes at different scales and two-way interactions between ecosystem components. We suggest that this is best achieved by coupling models, but there are very few existing cases where the coupling supports true two-way interaction. The advantages of coupling models are that the extent of discretization and representation can be targeted to the part of the food web being considered, making their development time- and cost-effective. Processes such as predation can be coupled to allow the propagation of forcing factors effects up and down the food web. However, there needs to be a stronger focus

  17. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: A vision for the future

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.

    2006-05-01

    yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML - a widely accepted format for exchanging data and corresponding semantics over the Internet - enables applications, computer systems, and information processes to work together in fundamentally different ways. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational landscape, discuss recent developments in cyberinfrastructure, and describe Unidata's role in and vision for providing easy-to-use, robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  18. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy.

    PubMed

    Gallas, Raya R; Hünemohr, Nora; Runz, Armin; Niebuhr, Nina I; Jäkel, Oliver; Greilich, Steffen

    2015-12-01

    With the increasing complexity of external beam therapy, "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot fully be used for those tests, and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As the basis for the anthropomorphic head shape we used a set of patient computed tomography (CT) images. The phantom recipient, consisting of epoxy resin, was produced using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire workflow of a proton therapy could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials, especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification. PMID:26189015

  19. Mapping Water Vapor Bands using AIRS Measurements for NPOESS/NPP VIIRS Pre-launch End-to-End Testing

    NASA Astrophysics Data System (ADS)

    Qu, J. J.; Hao, X.; Hauss, B.; Wang, C.; Xiong, J.

    2005-12-01

    NPOESS/NPP pre-launch end-to-end testing is very important for establishing long-term, high-quality Environmental Data Records (EDRs). In our earlier studies, we developed spatial and spectral mapping technology and successfully demonstrated AIRS-MODIS-VIIRS band mapping approaches. In this paper, we focus on VIIRS water vapor band mapping for generating proxy datasets, based on our recently established proxy database, which includes AIRS-simulated MODIS, AIRS-simulated VIIRS, and aggregated MODIS radiances/brightness temperatures. We demonstrate the efficacy of this approach by presenting results of the cross-comparison of water vapor band measurements from AIRS, MODIS, and simulated VIIRS. We also investigate how the quality of the water vapor band mapping depends on the surface emissivity spectrum, phenomenology, and atmospheric conditions. The same approach can be used to map CrIS to VIIRS for post-launch calibration and validation. It is also valuable for maintaining continuity between MODIS and VIIRS water vapor measurements. This approach can provide increased confidence in evaluating the performance of EDR retrieval algorithms. It can also be used to map the 6.75 μm band using AIRS or CrIS measurements for water vapor algorithm testing.
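    Mechanically, this kind of band mapping amounts to weighting hyperspectral radiances with the target channel's spectral response function (SRF). The sketch below shows that step only; the SRF shape, channel center, and spectrum are placeholder values, not AIRS or VIIRS instrument data.

    ```python
    # Sketch of spectral band mapping: simulate a broadband water-vapor
    # channel by weighting hyperspectral radiances with the target band's
    # spectral response function (SRF). All numbers are placeholders.

    import numpy as np

    def map_band(wavenumbers, radiances, srf):
        """SRF-weighted mean radiance: integral(SRF * L) / integral(SRF)."""
        w = np.trapz(srf, wavenumbers)
        return np.trapz(srf * radiances, wavenumbers) / w

    nu = np.linspace(1300.0, 1700.0, 2000)            # cm^-1, around 6.7 um
    L = 0.05 + 0.01 * np.sin(nu / 40.0)               # fake hyperspectral spectrum
    center, width = 1480.0, 60.0                      # fake channel definition
    srf = np.exp(-0.5 * ((nu - center) / width) ** 2) # Gaussian stand-in SRF
    print(f"simulated band radiance: {map_band(nu, L, srf):.4f}")
    ```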

  20. Hardware and Methods of the Optical End-to-End Test of the Far Ultraviolet Spectroscopic Explorer (FUSE)

    NASA Technical Reports Server (NTRS)

    Conard, Steven J.; Redman, Kevin W.; Barkhouser, Robert H.; McGuffey, Doug B.; Smee, Stephen; Ohl, Raymond G.; Kushner, Gary

    1999-01-01

    The Far Ultraviolet Spectroscopic Explorer (FUSE), currently being tested and scheduled for a 1999 launch, is an astrophysics satellite designed to provide high spectral resolving power (λ/Δλ = 24,000-30,000) over the interval 90.5-118.7 nm. The FUSE optical path consists of four co-aligned, normal-incidence, off-axis parabolic primary mirrors which illuminate separate Rowland circle spectrograph channels equipped with holographic gratings and delay-line microchannel plate detectors. We describe the hardware and methods used for the optical end-to-end test of the FUSE instrument during satellite integration and test. Cost and schedule constraints forced us to devise a simplified version of the planned optical test which occurred in parallel with satellite thermal-vacuum testing. The optical test employed a collimator assembly which consisted of four co-aligned, 15" Cassegrain telescopes which were positioned above the FUSE instrument, providing a collimated beam for each optical channel. A windowed UV light source, remotely adjustable in three axes, was mounted at the focal plane of each collimator. Problems with the UV light sources, including high F-number and window failures, were the only major difficulties encountered during the test. The test succeeded in uncovering a significant problem with the secondary structure used for the instrument closeout cavity and, furthermore, showed that the mechanical solution was successful. The hardware was also used extensively for simulations of science observations, providing both UV light for spectra and visible light for the fine error sensor camera.

  1. A novel end-to-end fault detection and localization protocol for wavelength-routed WDM networks

    NASA Astrophysics Data System (ADS)

    Zeng, Hongqing; Vukovic, Alex; Huang, Changcheng

    2005-09-01

    Recently, wavelength division multiplexing (WDM) networks have become prevalent in telecommunication networks. However, even a very short disruption of service caused by network faults may lead to high data loss in such networks due to the high data rates and the increased wavelength number and density. Therefore, network survivability is critical and has been intensively studied; fault detection and localization is the vital part but has received disproportionately little attention. In this paper we describe and analyze an end-to-end lightpath fault detection scheme in the data plane with fault notification in the control plane. The endeavor is focused on reducing the fault detection time. In this protocol, the source node of each lightpath keeps sending hello packets to the destination node, exactly following the path for data traffic. The destination node generates an alarm once a certain number of consecutive hello packets are missed within a given time period. The network management unit then collects all alarms and locates the fault based on the network topology, and sends fault notification messages via the control plane to either the source node or all upstream nodes along the lightpath. The performance evaluation shows that such a protocol can achieve fast fault detection and, at the same time, that the overhead added to the user data by hello packets is negligible.
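    The destination-side rule is simple enough to sketch directly: declare an alarm after k consecutive missed hellos, which bounds detection time at roughly k times the hello interval plus propagation delay. The interface and names below are illustrative assumptions, not the paper's implementation.

    ```python
    # Destination-side hello monitor: alarm after k consecutive missed hellos.

    def monitor(receive_hello, interval=0.01, k=3):
        """receive_hello(timeout) -> True if a hello arrived within `timeout` s."""
        missed = 0
        while True:
            if receive_hello(timeout=interval):
                missed = 0                  # path alive: reset the miss counter
            else:
                missed += 1
                if missed >= k:             # k consecutive misses => raise alarm
                    return "ALARM: lightpath failure suspected"

    arrivals = iter([True, True, False, False, False])
    print(monitor(lambda timeout: next(arrivals, False)))
    ```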

  2. Objective end-to-end (mouth-to-ear) conversational speech quality tests for VoIP scenarios

    NASA Astrophysics Data System (ADS)

    Kettler, Frank; Gierlich, Hans W.

    2001-07-01

    From the speech quality point of view, the differentiation between terminals and network in communications over IP is no longer possible. Consequently, the overall speech quality assessment has to take this into account and requires end-to-end tests. Suitable test setups are introduced, including the terminal acoustics, using artificial head technology as a close-to-reality interface. In a second part, the influence of various subjectively relevant parameters on speech quality is discussed. Correlated objective parameters like delay, echo, double-talk capability, listening speech quality, and parameters determining background noise transmission quality are described, and appropriate analysis methods are given. The discussion points out the influence of delay on conversation dynamics impairments and on echo perception, because the expected delay in VoIP scenarios is probably higher than typically recommended for telephone conversations. Optimization criteria are introduced for implemented echo cancellers, as well as test methods to assess one-way speech sound quality, double-talk performance, and background noise transmission.

  3. End-to-end simulation of high-contrast imaging systems: methods and results for the PICTURE mission family

    NASA Astrophysics Data System (ADS)

    Douglas, Ewan S.; Hewasawam, Kuravi; Mendillo, Christopher B.; Cahoy, Kerri L.; Cook, Timothy A.; Finn, Susanna C.; Howe, Glenn A.; Kuchner, Marc J.; Lewis, Nikole K.; Marinan, Anne D.; Mawet, Dimitri; Chakrabarti, Supriya

    2015-09-01

    We describe a set of numerical approaches to modeling the performance of space flight high-contrast imaging payloads. Mission design for high-contrast imaging requires numerical wavefront error propagation to ensure accurate component specifications. For constructed instruments, wavelength- and angle-dependent throughput and contrast models allow detailed simulations of science observations, allowing mission planners to select the most productive science targets. The PICTURE family of missions seeks to quantify the optical brightness of scattered light from extrasolar debris disks via several high-contrast imaging techniques: sounding rocket (the Planet Imaging Concept Testbed Using a Rocket Experiment) and balloon flights of a visible nulling coronagraph, as well as a balloon flight of a vector vortex coronagraph (the Planetary Imaging Concept Testbed Using a Recoverable Experiment - Coronagraph, PICTURE-C). The rocket mission employs an on-axis 0.5 m Gregorian telescope, while the balloon flights will share an unobstructed off-axis 0.6 m Gregorian. This work details the flexible approach to polychromatic, end-to-end physical optics simulations used for both the balloon vector vortex coronagraph and rocket visible nulling coronagraph missions. We show the preliminary PICTURE-C telescope and vector vortex coronagraph design will achieve 10^-8 contrast without post-processing as limited by realistic optics, but not considering polarization or low-order errors. Simulated science observations of the predicted warm ring around Epsilon Eridani illustrate the performance of both missions.
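    The building block of such end-to-end physical optics simulations is numerical propagation of a pupil wavefront to the focal plane. The sketch below (not the PICTURE code; aperture size, sampling, and offsets are arbitrary assumptions) shows the simplest monochromatic case, a Fraunhofer propagation of an unobstructed circular pupil, and reads off the raw PSF contrast at a fixed angular offset.

```python
import numpy as np

n = 512
x = np.linspace(-1.0, 1.0, n)
xx, yy = np.meshgrid(x, x)
# Idealized unobstructed circular pupil, diameter = 256 pixels
aperture = ((xx**2 + yy**2) <= 0.5**2).astype(float)

# Fraunhofer (far-field) propagation: focal-plane amplitude is the FFT of the pupil
field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))
psf = np.abs(field)**2
psf /= psf.max()                  # normalize to the on-axis peak

# One resolution element (lambda/D) spans n / 256 = 2 pixels here,
# so 20 pixels from the core is roughly 10 lambda/D
print("raw contrast at ~10 lambda/D:", psf[n // 2, n // 2 + 20])
```

    A real end-to-end model repeats this propagation across wavelengths and through every optic, inserting measured surface errors and the coronagraph masks between planes.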

  4. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    NASA Astrophysics Data System (ADS)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built on the model of human-driven execution. Another challenge is to orchestrate the various components in the publishing and print production pipeline so that they work seamlessly, enabling the system to detect potential failures automatically and take corrective actions proactively. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole, creating an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.
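    The kind of exception handling such an architecture implies can be sketched in a few lines: each prepress step either succeeds or raises, and the orchestrator retries or escalates instead of stalling on a human. This is a hypothetical illustration; the step names are invented and are not WARP's actual components.

```python
def preflight(job):
    # Hypothetical check that a print-ready file exists
    if "pdf" not in job:
        raise ValueError("missing print-ready file")

def impose(job):
    job["imposed"] = True        # page imposition (stand-in)

def rip(job):
    job["rasterized"] = True     # raster image processing (stand-in)

PIPELINE = [preflight, impose, rip]

def run_job(job, retries=1):
    for step in PIPELINE:
        for attempt in range(retries + 1):
            try:
                step(job)
                break
            except Exception as err:
                if attempt == retries:   # corrective actions exhausted
                    print(f"escalating {step.__name__}: {err}")
                    return False
    return True

print(run_job({"pdf": "job.pdf"}))       # True: fully automated path
```

    The point of the sketch is the control flow: corrective action is attempted automatically, and a human is involved only as the final escalation path.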

  5. Results from solar reflective band end-to-end testing for VIIRS F1 sensor using T-SIRCUS

    NASA Astrophysics Data System (ADS)

    McIntire, Jeff; Moyer, David; McCarthy, James K.; Brown, Steven W.; Lykke, Keith R.; De Luccia, Frank; Xiong, Xiaoxiong; Butler, James J.; Guenther, Bruce

    2011-10-01

    Verification of the Visible Infrared Imager Radiometer Suite (VIIRS) End-to-End (E2E) sensor calibration is highly recommended before launch, to identify any anomalies and to improve our understanding of the sensor's on-orbit calibration performance. E2E testing of the Reflective Solar Bands (RSB) calibration cycle was performed pre-launch for the VIIRS Flight 1 (F1) sensor at the Ball Aerospace facility in Boulder, CO in March 2010. The VIIRS reflective band calibration cycle is very similar to that of the heritage sensor MODIS in that solar illumination, via a diffuser, is used to correct for temporal variations in the instrument responsivity. Monochromatic light from the NIST T-SIRCUS (Traveling Spectral Irradiance and Radiance Responsivity Calibrations using Uniform Sources) was used to illuminate both the Earth View (EV), via an integrating sphere, and the Solar Diffuser (SD) view, through a collimator. The collimator illumination was cycled through a series of angles intended to simulate the range of possible angles at which solar radiation will be incident on the solar attenuation screen on-orbit. Ideally, the measured instrument responsivity (defined here as the ratio of the detector response to the at-sensor radiance) should be the same whether the EV or SD view is illuminated. The ratio of the measured responsivities was determined at each collimator angle and wavelength. In addition, the Solar Diffuser Stability Monitor (SDSM), a ratioing radiometer designed to track the temporal variation in the SD Bidirectional Reflectance Factor (BRF) by direct comparison to solar radiation, was illuminated by the collimator. The measured SDSM ratio was compared to the predicted ratio. An uncertainty analysis was also performed on both the SD and SDSM calibrations.
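    The central check reduces to a simple computation: responsivity is detector response divided by at-sensor radiance, and the EV/SD responsivity ratio should be close to unity at every collimator angle and wavelength. A minimal sketch of that bookkeeping, with placeholder numbers rather than VIIRS data, is given below.

```python
import numpy as np

def responsivity(detector_response, at_sensor_radiance):
    """Instrument responsivity: detector response per unit at-sensor radiance."""
    return detector_response / at_sensor_radiance

# Hypothetical measurements indexed by (collimator angle, wavelength)
radiance = np.array([[100.0, 150.0], [100.0, 150.0]])   # at-sensor radiance
ev_dn = np.array([[410.0, 620.0], [405.0, 615.0]])      # Earth View response
sd_dn = np.array([[412.0, 618.0], [409.0, 610.0]])      # Solar Diffuser view response

ratio = responsivity(ev_dn, radiance) / responsivity(sd_dn, radiance)
print(np.round(ratio, 3))    # ideally 1.0 everywhere; deviations flag anomalies
```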

  6. An unusual structural motif of antimicrobial peptides containing end-to-end macrocycle and cystine-knot disulfides.

    PubMed

    Tam, J P; Lu, Y A; Yang, J L; Chiu, K W

    1999-08-01

    Four macrocyclic cystine-knot peptides of 29-31 residues, kalata, circulin A and B (CirA and CirB), and cyclopsychotride, have been isolated from coffee plants but have undetermined physiological functions. These macrocycles and 10 of their analogs prepared by chemical synthesis were tested against nine strains of microbes. Kalata and CirA were specific for the Gram-positive Staphylococcus aureus with a minimum inhibitory concentration of approximately 0.2 microM. They were relatively ineffective against Gram-negative bacteria such as Escherichia coli and Pseudomonas aeruginosa. However, CirB and cyclopsychotride were active against both Gram-positive and Gram-negative bacteria. In particular, CirB showed potent activity against E. coli with a minimum inhibitory concentration of 0.41 microM. All four cyclic peptides were moderately active against two strains of fungi, Candida kefyr and Candida tropicalis, but were inactive against Candida albicans. These macrocycles are cytotoxic and lysed human red blood cells with a 50% lethal dose of 400 microM. Modifying the Arg residue in kalata with a keto aldehyde significantly reduced its activity against S. aureus, whereas blocking the Arg in CirA produced no significant effect. The two-disulfide variants and their scrambled disulfide isomers exhibited antimicrobial profiles and potency similar to those of their native peptides. However, in high-salt assays (100 mM NaCl), few of these macrocyclic peptides, native or analog, retained antimicrobial activity. These results show that the macrocyclic peptides possess specific and potent antimicrobial activity that is salt-dependent and that their initial interactions with the microbial surfaces may be electrostatic, an effect commonly found in defensin antimicrobial peptides. Furthermore, their end-to-end cyclic structure with a cystine-knot motif represents a distinctive molecular structure among antimicrobials and may provide a useful template for the design of novel peptide antibiotics.

  7. End-to-end gene fusions and their impact on the production of multifunctional biomass degrading enzymes

    SciTech Connect

    Rizk, Mazen; Antranikian, Garabed; Elleuche, Skander

    2012-11-09

    Highlights: • Multifunctional enzymes offer an interesting approach for biomass degradation. • Size and conformation of the separate constructs play a role in the effectiveness of chimeras. • A connecting linker allows for maximal flexibility and increased thermostability. • Genes with functional similarities are the best choice for fusion candidates. -- Abstract: The depletion of fossil fuels, coupled with rising prices, has made the search for alternative energy resources more pressing. One of the topics gaining interest quickly is the utilization of lignocellulose, the main component of plants. Its primary constituents, cellulose and hemicellulose, can be degraded into simple sugars by a series of enzymes present in microorganisms, the sugars later being used for bioethanol production. Thermophilic bacteria have proven to be an interesting source of the enzymes required for hydrolysis, since they can withstand the high and denaturing temperatures usually required for processes involving biomass degradation. However, the cost associated with the whole enzymatic process is staggering. One route to cost-effective and highly active production is the construction of multifunctional enzyme complexes harboring the functions of more than one enzyme needed for the hydrolysis process. Strategies for degrading complex biomass range from regulation of the enzymes involved, to cellulosomes, to proteins harboring more than one enzymatic activity. In this review, the construction of multifunctional biomass-degrading enzymes through end-to-end gene fusions, and the impact of the choice of enzymes and linkers on production and activity, is assessed.

  8. End-to-end sensor simulation for spectral band selection and optimization with application to the Sentinel-2 mission.

    PubMed

    Segl, Karl; Richter, Rudolf; Küster, Theres; Kaufmann, Hermann

    2012-02-01

    An end-to-end sensor simulation is a suitable tool for predicting a sensor's performance over a range of conditions that cannot easily be measured. In this study, such a tool has been developed to assess the optimum spectral resolution configuration of a sensor based on key applications. It employs the spectral molecular absorption and scattering properties of materials used for identifying and determining the abundances of surface and atmospheric constituents, together with their interdependence on spatial resolution and signal-to-noise ratio, as a basis for the detailed design and consolidation of spectral bands for the future Sentinel-2 sensor. The developed tools allow the computation of synthetic Sentinel-2 spectra that form the frame for the subsequent twofold analysis of bands in the atmospheric absorption and window regions. One part of the study comprises the assessment of optimal spatial and spectral resolution configurations for the bands used for atmospheric correction, optimized with regard to the retrieval of aerosols and water vapor and the detection of cirrus clouds. The second part of the study presents the optimization of thematic bands, mainly driven by the spectral characteristics of vegetation constituents and minerals. The investigation is performed for different wavelength ranges because most remote sensing applications require specific band combinations rather than single bands. Results from the important "red-edge" and "short-wave infrared" domains are presented. The recommended optimum spectral design predominantly confirms the sensor parameters given by the European Space Agency. The system is capable of retrieving atmospheric and geobiophysical parameters with enhanced quality compared to existing multispectral sensors. Minor spectral changes of single bands are discussed in the context of typical remote sensing applications, supplemented by the recommendation of a few new bands for

  9. Results from Solar Reflective Band End-to-End Testing for VIIRS F1 Sensor Using T-SIRCUS

    NASA Technical Reports Server (NTRS)

    McIntire, Jeff; Moyer, David; McCarthy, James K.; DeLuccia, Frank; Xiong, Xiaoxiong; Butler, James J.; Guenther, Bruce

    2011-01-01

    Verification of the Visible Infrared Imager Radiometer Suite (VIIRS) End-to-End (E2E) sensor calibration is highly recommended before launch, to identify any anomalies and to improve our understanding of the sensor on-orbit calibration performance. E2E testing of the Reflective Solar Bands (RSB) calibration cycle was performed pre-launch for the VIIRS Flight 1 (F1) sensor at the Ball Aerospace facility in Boulder, CO in March 2010. The VIIRS reflective band calibration cycle is very similar to that of the heritage sensor MODIS in that solar illumination, via a diffuser, is used to correct for temporal variations in the instrument responsivity. Monochromatic light from the NIST T-SIRCUS was used to illuminate both the Earth View (EV), via an integrating sphere, and the Solar Diffuser (SD) view, through a collimator. The collimator illumination was cycled through a series of angles intended to simulate the range of possible angles at which solar radiation will be incident on the solar attenuation screen on-orbit. Ideally, the measured instrument responsivity (defined here as the ratio of the detector response to the at-sensor radiance) should be the same whether the EV or SD view is illuminated. The ratio of the measured responsivities was determined at each collimator angle and wavelength. In addition, the Solar Diffuser Stability Monitor (SDSM), a ratioing radiometer designed to track the temporal variation in the SD BRF by direct comparison to solar radiation, was illuminated by the collimator. The measured SDSM ratio was compared to the predicted ratio. An uncertainty analysis was also performed on both the SD and SDSM calibrations.

  10. Computational simulation of flow in the end-to-end anastomosis of a rigid graft and a compliant artery.

    PubMed

    Qiu, Y; Tarbell, J M

    1996-01-01

    Implanted vascular grafts often fail because of the development of intimal hyperplasia in the anastomotic region, and compliance mismatch between the host artery and graft exacerbates the problem. This study focused on the effects of radial artery wall motion and the phase angle between pressure and flow waves (impedance phase angle [IPA]) on wall shear rate (WSR) behavior near model end-to-end vascular graft anastomoses connecting rigid grafts and compliant arteries. A finite element model with transient flow and moving boundaries was set up to simulate oscillatory flow through a graft model 16% undersized in mean diameter. During the simulations, artery diameter variations over a cycle (DV) and IPAs were varied within the physiologic range for an oscillatory flow (mean Re = 150, peak Re = 300, unsteadiness parameter alpha = 3.9). The results show that for normal physiologic conditions (DV = 6%, IPA = -45 degrees) in a 16% undersized graft, the minimum distal mean WSR is reduced by 60% compared to steady flow at the mean Re; the minimum distal WSR amplitude increases 50% when IPA changes from -5 degrees to -85 degrees, and increases 60% when DV changes from 2% to 10%. This indicates that compliance mismatch induces lower mean WSR and more oscillatory WSR in the distal anastomotic region, which may contribute to intimal hyperplasia. In addition, the convergent-divergent geometry of the 16% undersized graft model can significantly affect the force pattern applied to the local endothelial cell layer near the anastomosis by altering the local phase angle between the flow-induced tangential force (synchronous with WSR) and the cyclic hoop strain induced by radial artery expansion (synchronous with DV). This local phase angle is decreased by 65 degrees in the distal divergent geometry, and increased by 15 degrees in the proximal convergent geometry. PMID:8944971
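    The "unsteadiness parameter alpha" quoted above is conventionally the Womersley number; under that standard reading (an assumption, since the record does not define its symbols), the governing dimensionless groups are:

```latex
% Standard definitions assumed for the dimensionless groups quoted above
\[
  \alpha = R\sqrt{\frac{\omega}{\nu}}, \qquad
  Re = \frac{\bar{u}\,D}{\nu},
\]
% where R is the vessel radius, \omega the angular frequency of the flow
% waveform, \nu the kinematic viscosity, \bar{u} the mean velocity, and
% D the diameter; \alpha = 3.9 indicates moderately unsteady oscillatory flow.
```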

  11. Designing an End-to-End System for Data Storage, Analysis, and Visualization for an Urban Environmental Observatory

    NASA Astrophysics Data System (ADS)

    McGuire, M. P.; Welty, C.; Gangopadhyay, A.; Karabatis, G.; Chen, Z.

    2006-05-01

    The urban environment is formed by complex interactions between natural and human-dominated systems, the study of which requires the collection and analysis of very large datasets that span many disciplines. Recent advances in sensor technology and automated data collection have improved the ability to monitor urban environmental systems and are making the idea of an urban environmental observatory a reality. This in turn has created a number of challenges in data management and analysis. We present the design of an end-to-end system to store, analyze, and visualize data from a prototype urban environmental observatory based at the Baltimore Ecosystem Study, a National Science Foundation Long Term Ecological Research site (BES LTER). We first present an object-relational design for an operational database to store high-resolution spatial datasets, data from sensor networks, archived data from the BES LTER, data from external sources such as USGS NWIS and EPA STORET, and metadata. The second component of the system design is a spatiotemporal data warehouse consisting of a data staging plan and a multidimensional data model designed for the spatiotemporal analysis of monitoring data. The system design also includes applications for multi-resolution exploratory data analysis, multi-resolution data mining, and spatiotemporal visualization based on the spatiotemporal data warehouse, as well as interfaces with water quality models such as HSPF, SWMM, and SWAT, and applications for real-time sensor network visualization, data discovery, data download, QA/QC, and backup and recovery, all based on the operational database. The system design includes both internet and workstation-based interfaces. Finally, we present the design of a laboratory for spatiotemporal analysis and visualization as well as real-time monitoring of the sensor network.
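    The operational-store / warehouse split described above can be made concrete with a toy schema. The sketch below uses SQLite for illustration; all table and column names are invented and do not reflect the actual BES LTER design.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- operational database: raw sensor observations plus metadata
CREATE TABLE sensor (sensor_id INTEGER PRIMARY KEY, site TEXT, variable TEXT);
CREATE TABLE observation (
    sensor_id INTEGER REFERENCES sensor(sensor_id),
    obs_time  TEXT,      -- ISO-8601 timestamp
    value     REAL,
    qa_flag   TEXT
);
-- warehouse fact table for spatiotemporal analysis, pre-aggregated by day
CREATE TABLE daily_fact (sensor_id INTEGER, day TEXT, mean_value REAL, n_obs INTEGER);
""")
con.execute("INSERT INTO sensor VALUES (1, 'Gwynns Falls', 'nitrate')")
con.execute("INSERT INTO observation VALUES (1, '2006-05-01T00:15:00', 2.4, 'ok')")

# staging step: roll operational observations up into the warehouse fact table
con.execute("""INSERT INTO daily_fact
    SELECT sensor_id, substr(obs_time, 1, 10), avg(value), count(*)
    FROM observation GROUP BY sensor_id, substr(obs_time, 1, 10)""")
print(con.execute("SELECT * FROM daily_fact").fetchall())
```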

  12. Pre-Launch End-to-End Testing Plans for the SPAce Readiness Coherent Lidar Experiment (SPARCLE)

    NASA Technical Reports Server (NTRS)

    Kavaya, Michael J.

    1999-01-01

    The SPAce Readiness Coherent Lidar Experiment (SPARCLE) mission was proposed as a low-cost technology demonstration mission, using a 2-micron, 100-mJ, 6-Hz, 25-cm, coherent lidar system based on demonstrated technology. SPARCLE was selected in late October 1997 to be NASA's New Millennium Program (NMP) second earth-observing (EO-2) mission. To maximize the success probability of SPARCLE, NASA/MSFC desired expert guidance in the areas of coherent laser radar (CLR) theory, CLR wind measurement, fielding of CLR systems, CLR alignment validation, and space lidar experience. This led to the formation of the NASA/MSFC Coherent Lidar Technology Advisory Team (CLTAT) in December 1997. A threefold purpose for the advisory team was identified: 1) guidance to the SPARCLE mission, 2) advice regarding the roadmap of post-SPARCLE coherent Doppler wind lidar (CDWL) space missions and the desired matching technology development plan, and 3) general coherent lidar theory, simulation, hardware, and experiment information exchange. The current membership of the CLTAT is shown. Membership does not result in any NASA or other funding at this time. We envision the business of the CLTAT to be conducted mostly by email, teleconference, and occasional meetings. The three meetings of the CLTAT to date, in January 1998, July 1998, and January 1999, have all been collocated with previously scheduled meetings of the Working Group on Space-Based Lidar Winds, and have been very productive. Topics discussed include the SPARCLE technology validation plan, including pre-launch end-to-end testing; the space-based wind mission roadmap beyond SPARCLE and its implications for the resultant technology development; current values of, and proposed future advancements in, lidar system efficiency; the difference between single-mode fiber optical mixing and traditional free-space optical mixing; attitude information from lidar and non-lidar sensors; and pointing knowledge algorithms.

  13. Volumetric-Modulated Arc Therapy: Effective and Efficient End-to-End Patient-Specific Quality Assurance

    SciTech Connect

    O'Daniel, Jennifer; Das, Shiva; Wu, Q. Jackie; Yin Fangfang

    2012-04-01

    Purpose: To explore an effective and efficient end-to-end patient-specific quality-assurance (QA) protocol for volumetric modulated arc radiotherapy (VMAT) and to evaluate the suitability of a stationary radiotherapy QA device (two-dimensional [2D] ion chamber array) for VMAT QA. Methods and Materials: Three methods were used to analyze 39 VMAT treatment plans for brain, spine, and prostate: ion chamber (one-dimensional absolute, n = 39), film (2D relative, coronal/sagittal, n = 8), and 2D ion chamber array (ICA, 2D absolute, coronal/sagittal, n = 39) measurements. All measurements were compared with the treatment planning system dose calculation either via gamma analysis (3%, 3- to 4-mm distance-to-agreement criteria) or absolute point dose comparison. The film and ion chamber results were similarly compared with the ICA measurements. Results: Absolute point dose measurements agreed well with treatment planning system computed doses (ion chamber: median deviation, 1.2%, range, -0.6% to 3.3%; ICA: median deviation, 0.6%, range, -1.8% to 2.9%). The relative 2D dose measurements also showed good agreement with computed doses (>93% of pixels in all films passing gamma, >90% of pixels in all ICA measurements passing gamma). The ICA relative dose results were highly similar to those of film (>90% of pixels passing gamma). The coronal and sagittal ICA measurements were statistically indistinguishable by the paired t test with a hypothesized mean difference of 0.1%. The ion chamber and ICA absolute dose measurements showed a similar trend but had disparities of 2-3% in 18% of plans. Conclusions: After validating the new VMAT implementation with ion chamber, film, and ICA, we were able to maintain an effective yet efficient patient-specific VMAT QA protocol by reducing from five (ion chamber, film, and ICA) to two measurements (ion chamber and single ICA) per plan. The ICA (MatriXX®, IBA Dosimetry) was validated for VMAT QA, but ion chamber measurements are
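    The gamma analysis referenced above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion: a point passes if its generalized distance to any point of the comparison distribution is at most 1. The minimal 1-D sketch below (synthetic profiles, global 3%/3 mm criteria) illustrates the computation; clinical tools apply the same per-point test on 2-D or 3-D grids.

```python
import numpy as np

def gamma_1d(ref, meas, x, dd=0.03, dta=3.0):
    """Return the gamma value at each reference point (a point passes if <= 1)."""
    gammas = []
    for xi, di in zip(x, ref):
        # minimum generalized distance over all measured points
        g2 = ((x - xi) / dta) ** 2 + ((meas - di) / (dd * ref.max())) ** 2
        gammas.append(np.sqrt(g2.min()))
    return np.array(gammas)

x = np.arange(0.0, 50.0, 1.0)              # positions in mm
ref = np.exp(-((x - 25.0) / 10.0) ** 2)    # planned profile (arbitrary shape)
meas = np.exp(-((x - 25.5) / 10.0) ** 2)   # measured profile, shifted 0.5 mm

passing = (gamma_1d(ref, meas, x) <= 1).mean() * 100
print(f"gamma passing rate: {passing:.1f}%")
```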

  14. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    NASA Astrophysics Data System (ADS)

    Arndt, J.; Kreimer, J.

    2010-09-01

    The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since its successful docking and activation, this manned laboratory forms part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes foreseen by the Mission Increment planning but also configuration changes on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. Subject to the safety verification is, on the one hand, the on-orbit configuration itself, including the hardware and software products, and on the other hand the related ground facilities needed for commanding of and communication to the on-orbit system. The operational products, e.g., the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process rigid enough to provide reliable verification of on-board safety while likewise providing enough

  15. SBSS Demonstrator: A design for efficient demonstration of Space-based Space Surveillance end-to-end capabilities

    NASA Astrophysics Data System (ADS)

    Utzmann, Jens; Flohrer, Tim; Schildknecht, Thomas; Wagner, Axel; Silha, Jiri; Willemsen, Philip; Teston, Frederic

    This paper presents the capabilities of a Space-Based Space Surveillance (SBSS) demonstration mission for Space Surveillance and Tracking (SST) based on a micro-satellite platform. The results have been produced in the frame of ESA's "Assessment Study for Space Based Space Surveillance Demonstration Mission" performed by the Airbus Defence and Space consortium. Space Surveillance and Tracking is part of Space Situational Awareness (SSA) and covers the detection, tracking and cataloguing of space debris and satellites. Derived SST services comprise a catalogue of these man-made objects, collision warning, detection and characterisation of in-orbit fragmentations, sub-catalogue debris characterisation, etc. The assessment of SBSS in an SST system architecture has shown that both an operational SBSS and even a well-designed space-based demonstrator can provide substantial performance in terms of surveillance and tracking of beyond-LEO objects. In particular, the early deployment of a demonstrator, made possible by using standard equipment, could boost initial operating capability and create a self-maintained object catalogue. Furthermore, unique statistical information about small-size LEO debris (mm size) can be collected in-situ. Unlike classical technology demonstration missions, the primary goal is the demonstration and optimisation of the functional elements in a complex end-to-end chain (mission planning, observation strategies, data acquisition, processing and fusion, etc.) until the final products can be offered to the users. Past and current missions by the US (SBV, SBSS) and Canada (Sapphire, NEOSSat) also underline the advantages of space-based space surveillance. The presented SBSS system concept takes the ESA SST System Requirements (derived within the ESA SSA Preparatory Program) into account and aims at fulfilling SST core requirements in a stand-alone manner. Additionally, requirements for detection and characterisation of small-sized LEO debris are

  16. WE-G-BRD-08: End-To-End Targeting Accuracy of the Gamma Knife for Trigeminal Neuralgia

    SciTech Connect

    Brezovich, I; Wu, X; Duan, J; Benhabib, S; Huang, M; Shen, S; Cardan, R; Popple, R

    2014-06-15

    Purpose: Current QA procedures verify the accuracy of individual equipment parameters, but may not include CT and MRI localizers. This study uses an end-to-end approach to measure the overall targeting errors in individual patients previously treated for trigeminal neuralgia. Methods: The trigeminal nerve is simulated by a 3 mm long, 3.175 mm (1/8 inch) diameter MRI contrast-filled cavity embedded within a PMMA plastic capsule. The capsule is positioned within the head frame such that the cavity position matches the Gamma Knife coordinates of 10 previously treated patients. Gafchromic EBT2 film is placed at the center of the cavity in coronal and sagittal orientations. The films are marked with a pin prick to identify the cavity center. Treatments are planned for delivery with 4 mm collimators using MRI and CT scans acquired with the clinical localizer boxes and acquisition protocols. Coordinates of shots are chosen so that the cavity is centered within the 50% isodose volume. Following irradiation, the films are scanned and analyzed. Targeting errors are defined as the distance between the pin prick and the centroid of the 50% isodose line. Results: Averaged over 10 patient simulations, targeting errors along the x, y and z coordinates (patient left-to-right, posterior-anterior, head-to-foot) were, respectively, -0.060 ± 0.363, -0.350 ± 0.253, and 0.364 ± 0.191 mm when MRI was used for treatment planning. Planning according to CT exhibited generally smaller errors, namely 0.109 ± 0.167, -0.191 ± 0.144, and 0.211 ± 0.094 mm. The largest errors in MRI- and CT-planned treatments were, respectively, y = -0.761 and x = 0.428 mm. Conclusion: Unless patient motion or stronger MRI image distortion in actual treatments caused additional errors, all patients received the prescribed dose, i.e., the targeted section of the trigeminal nerve was contained within the 50% isodose surface in all cases.
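    The film analysis step lends itself to a compact illustration: threshold the scanned dose at 50% of maximum, take the centroid of the resulting region, and measure its distance to the pin-prick mark. The sketch below runs on a synthetic dose spot (geometry and pixel pitch are invented), not on actual film data.

```python
import numpy as np

def targeting_error(dose, pin_rc, pixel_mm=0.1):
    """Distance (mm) from pin-prick pixel (row, col) to the 50% isodose centroid."""
    mask = dose >= 0.5 * dose.max()          # 50% isodose region
    rows, cols = np.nonzero(mask)
    centroid = np.array([rows.mean(), cols.mean()])
    return np.linalg.norm(centroid - np.asarray(pin_rc)) * pixel_mm

# Synthetic shot: Gaussian dose spot offset 3 pixels (0.3 mm) from the pin prick
yy, xx = np.mgrid[0:200, 0:200]
dose = np.exp(-(((yy - 103) ** 2 + (xx - 100) ** 2) / (2 * 20.0 ** 2)))

print(f"targeting error = {targeting_error(dose, (100, 100)):.2f} mm")
```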

  17. END-TO-END VERSUS END-TO-SIDE ANASTOMOSIS IN THE TREATMENT OF ESOPHAGEAL ATRESIA OR TRACHEO-ESOPHAGEAL FISTULA

    PubMed Central

    ASKARPOUR, Shahnam; OSTADIAN, Nasrollah; PEYVASTEH, Mehran; ALAVI, Mostafa; JAVAHERIZADEH, Hazhir

    2016-01-01

    Background: Dehiscence of esophageal anastomosis is frequent, and there are still controversies over which type of anastomosis is preferred to diminish its incidence. Aim: To compare end-to-end anastomosis versus end-to-side anastomosis in terms of anastomotic leakage, esophageal stricture, and gastroesophageal reflux symptoms. Methods: This study was carried out over two years starting in 2012. End-to-end and end-to-side anastomoses were compared in terms of anastomotic leakage, esophageal stricture, gastroesophageal reflux symptoms, length of surgery, and packed cell infusion. Results: For end-to-end and end-to-side anastomosis respectively, duration of surgery was 127.63±13.393 minutes and 130.29±10.727 minutes (p=0.353); esophageal stricture was noted in two (5.9%) and eight (21.1%) cases (p=0.09); gastroesophageal reflux disease was detected in six (15.8%) and three (8.8%) cases (p=0.485); anastomotic leakage was found in five (13.2%) and one (2.9%) cases (p=0.203); duration of neonatal intensive care unit admission was significantly shorter in end-to-end (11.05±2.438 days) compared to end-to-side anastomosis (13.88±2.306 days) (p<0.0001). Conclusion: There were no significant differences between end-to-end and end-to-side anastomosis except for length of neonatal intensive care unit admission, which was significantly shorter in the end-to-end anastomosis group. PMID:27120740

  18. Development and evaluation of an end-to-end test for head and neck IMRT with a novel multiple-dosimetric modality phantom.

    PubMed

    Zakjevskii, Viatcheslav V; Knill, Cory S; Rakowski, Joseph T; Snyder, Michael G

    2016-01-01

    A comprehensive end-to-end test for head and neck IMRT treatments was developed using a custom phantom designed to utilize multiple dosimetry devices. The initial end-to-end test and custom H&N phantom were designed to yield maximum information in anatomical regions significant to H&N plans with respect to: (i) geometric accuracy, (ii) dosimetric accuracy, and (iii) treatment reproducibility. The phantom was designed in collaboration with Integrated Medical Technologies. The phantom was imaged on a CT simulator and the CT was reconstructed with 1 mm slice thickness and imported into Varian's Eclipse treatment planning system. OARs and the PTV were contoured with the aid of Smart Segmentation. A clinical template was used to create an eight-field IMRT plan and dose was calculated with heterogeneity correction on. Plans were delivered with a TrueBeam equipped with a high-definition MLC. Preliminary end-to-end results were measured using film, ion chambers, and optically stimulated luminescent dosimeters (OSLDs). Ion chamber dose measurements were compared to the treatment planning system. Films were analyzed with FilmQA Pro using the composite gamma index. OSLDs were read with a MicroStar reader using a custom calibration curve. The final phantom design incorporated two axial and one coronal film planes with 18 OSLD locations adjacent to those planes, as well as four locations for IMRT ionization chambers below the inferior film plane. The end-to-end test was consistently reproducible, resulting in an average gamma pass rate greater than 99% using 3%/3 mm analysis criteria, and average OSLD and ion chamber measurements within 1% of planned dose. After initial calibration of the OSLD and film systems, the end-to-end test provides next-day results, allowing for integration in routine clinical QA. Preliminary trials have demonstrated that our end-to-end test is a reproducible QA tool that enables the ongoing evaluation of dosimetric and geometric accuracy of clinical head and neck treatments.

  19. Modelling and simulation of the mechanical response of a Dacron graft in the pressurization test and an end-to-end anastomosis.

    PubMed

    Bustos, Claudio A; García-Herrera, Claudio M; Celentano, Diego J

    2016-08-01

    This work presents the modeling and simulation of the mechanical response of a Dacron graft in the pressurization test and its clinical application in the analysis of an end-to-end anastomosis. Both problems are studied via an anisotropic constitutive model that was calibrated by means of previously reported uniaxial tensile tests. First, the simulation of the pressurization test allows validation of the experimental material characterization, which included tests carried out at different levels of axial stretching. Then, an analysis of an end-to-end anastomosis under an idealized geometry is proposed. This case consists of evaluating the mechanical performance of the graft together with the stresses and deformations in the neighborhood of the junction between the Dacron and the artery. This research contributes important data toward understanding the functioning of the graft and the possibility of extending the analysis to complex numerical cases such as its insertion in the aortic arch. PMID:26826765

  20. Including 10-Gigabit-capable Passive Optical Network under End-to-End Generalized Multi-Protocol Label Switching Provisioned Quality of Service

    NASA Astrophysics Data System (ADS)

    Brewka, Lukasz; Gavler, Anders; Wessing, Henrik; Dittmann, Lars

    2012-04-01

    End-to-end quality of service provisioning is still a challenging task despite many years of research and development in this area. Considering a generalized multi-protocol label switching based core/metro network and resource reservation protocol capable home gateways, it is the access part of the network where quality of service signaling is bridged. This article proposes strategies for generalized multi-protocol label switching control over the emerging next-generation passive optical network standard, i.e., the 10-gigabit-capable passive optical network. Node management and resource allocation approaches are discussed, and possible issues are raised. The analysis shows that treating a 10-gigabit-capable passive optical network as a generalized multi-protocol label switching controlled domain is valid and may advance end-to-end quality of service provisioning for passive optical network based customers.

  1. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    SciTech Connect

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.; Qualls, A. L.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  2. An end-to-end system in support of a broad scope of GOES-R sensor and data processing study

    NASA Astrophysics Data System (ADS)

    Huang, Hung-Lung

    2005-08-01

    The mission of NOAA's Geostationary Operational Environmental Satellite System (GOES) R series satellites, in the 2012 time frame, is to provide continuous, near real-time meteorological, oceanographic, solar, and space environment data that supports NOAA's strategic mission goals. It presents an exciting opportunity to explore new instruments, satellite designs, and system architectures utilizing new communication and instrument technologies in order to meet the ever-increasing demands made of Earth observation systems by national agencies and end users alike. The GOES-R sensor suite includes a 16 spectral band Advanced Baseline Imager (ABI), an approximately 1500 high spectral resolution band Hyperspectral Environmental Suite (HES), plus other sensors designed to detect lightning and to explore the ocean, solar, and space environment. The Cooperative Institute for Meteorological Satellite Studies (CIMSS), part of the Space Science and Engineering Center (SSEC) of the University of Wisconsin-Madison and a long-time partner of NOAA, developed the first operational end-to-end processing system for GOES. Based on this heritage, and with recent support from the NASA/NOAA Geosynchronous Imaging FTS (GIFTS) project, the Navy's Multiple University Research Initiative (MURI), and NOAA's GOES-R Risk Reduction program, SSEC has built a near-complete end-to-end system that is capable of simulating sensor measurements from top-of-atmosphere radiances, raw sensor data (level 0), through calibrated and navigated sensor physical measurements (level 1), to processed products (level 2). In this paper, the SSEC Hyperspectral Imaging and Sounding Simulator and Processor (HISSP) is presented in detail. HISSP is capable of demonstrating most of the processing functions such as data compression/decompression, sensor calibration, data processing, algorithm development, and product generation. In summary, HISSP is an end-to-end system designed to support both government and

  3. On-Orbit Performance Verification and End-To-End Characterization of the TDRS-H Ka-band Communications Payload

    NASA Technical Reports Server (NTRS)

    Toral, Marco; Wesdock, John; Kassa, Abby; Pogorelc, Patsy; Jenkens, Robert (Technical Monitor)

    2002-01-01

    In June 2000, NASA launched the first of three next generation Tracking and Data Relay Satellites (TDRS-H) equipped with a Ka-band forward and return service capability. This Ka-band service supports forward data rates of up to 25 Mb/sec using the 22.55-23.55 GHz space-to-space allocation. Return services are supported via channel bandwidths of 225 and 650 MHz for data rates up to at least 800 Mb/sec using the 25.25 - 27.5 GHz space-to-space allocation. As part of NASA's acceptance of the TDRS-H spacecraft, an extensive on-orbit calibration, verification and characterization effort was performed to ensure that on-orbit spacecraft performance is within specified limits. This process verified the compliance of the Ka-band communications payload with all performance specifications, and demonstrated an end-to-end Ka-band service capability. This paper summarizes the results of the TDRS-H Ka-band communications payload on-orbit performance verification and end-to-end service characterization. Performance parameters addressed include antenna gain pattern, antenna Gain-to-System Noise Temperature (G/T), Effective Isotropically Radiated Power (EIRP), antenna pointing accuracy, frequency tunability, channel magnitude response, and Ka-band service Bit-Error-Rate (BER) performance.
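    The quantities verified here (EIRP, G/T, channel response, BER at a given data rate) tie together through the standard link-budget relation; as a reminder of how they combine (a textbook formula, not taken from this paper):

```latex
% Textbook downlink budget (decibel form) relating the verified quantities:
\[
  C/N_0 \;=\; \mathrm{EIRP} + G/T - L_{\mathrm{fs}} - L_{\mathrm{misc}} - k,
\]
% where C/N_0 is the carrier-to-noise-density ratio (dB-Hz), EIRP is in dBW,
% G/T in dB/K, L_fs is the free-space path loss, L_misc gathers pointing and
% atmospheric losses, and k = -228.6 dBW/(K Hz) is Boltzmann's constant in
% decibels, so subtracting k adds 228.6 dB.
```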

  4. On-Orbit Performance Verification and End-to-End Characterization of the TDRS-H Ka-Band Communications Payload

    NASA Technical Reports Server (NTRS)

    Toral, Marco; Wesdock, John; Kassa, Abby; Pogorelc, Patsy; Jenkens, Robert (Technical Monitor)

    2002-01-01

    In June 2000, NASA launched the first of three next generation Tracking and Data Relay Satellites (TDRS-H) equipped with a Ka-band forward and return service capability. This Ka-band service supports forward data rates up to 25 Mb/sec using the 22.55 - 23.55 GHz space-to-space allocation. Return services are supported via channel bandwidths of 225 and 650 MHz for data rates up to 800 Mb/sec (QPSK) using the 25.25 - 27.5 GHz space-to-space allocation. As part of NASA's acceptance of the TDRS-H spacecraft, an extensive on-orbit calibration, verification and characterization effort was performed to ensure that on-orbit spacecraft performance is within specified limits. This process verified the compliance of the Ka-band communications payload with all performance specifications and demonstrated an end-to-end Ka-band service capability. This paper summarizes the results of the TDRS-H Ka-band communications payload on-orbit performance verification and end-to-end service characterization. Performance parameters addressed include Effective Isotropically Radiated Power (EIRP), antenna Gain-to-System Noise Temperature (G/T), antenna gain pattern, frequency tunability and accuracy, channel magnitude response, and Ka-band service Bit-Error-Rate (BER) performance.

  5. User-oriented end-to-end transport protocols for the real-time distribution of telemetry data from NASA spacecraft

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1979-01-01

    A set of standard telemetry protocols for downlink data flow, facilitating the end-to-end transport of instrument data from the spacecraft to the user in real time, is proposed. The direct switching of data by autonomous message 'packets' that are assembled by the source instrument on the spacecraft is discussed. The data system is thus formatted on a message rather than a word basis, and such packet telemetry would include standardized protocol headers. Standards are being developed within the NASA End-to-End Data System (NEEDS) program for the source packet and transport frame protocols. The source packet protocol contains identification of both the sequence number of the packet as it is generated by the source and the total length of the packet, while the transport frame protocol includes a sequence count defining the serial number of the frame as it is generated by the spacecraft data system, and a field specifying any 'options' selected in the format of the frame itself.
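    The two header layers described can be pictured as follows; this is a hypothetical sketch for illustration, with invented field names and no claim to match the actual NEEDS byte layout.

```python
from dataclasses import dataclass

@dataclass
class SourcePacketHeader:
    """Assembled by the source instrument on the spacecraft."""
    apid: int            # identifies the source instrument (invented field)
    sequence_count: int  # packet serial number as generated by the source
    packet_length: int   # total length of the packet in octets

@dataclass
class TransportFrameHeader:
    """Wraps packets for downlink by the spacecraft data system."""
    frame_count: int     # serial number of the frame from the data system
    options: int         # bit field selecting optional frame-format features

pkt = SourcePacketHeader(apid=0x1A, sequence_count=42, packet_length=512)
frame = TransportFrameHeader(frame_count=1007, options=0b0001)
print(pkt, frame)
```

    The appeal of this layering is that ground systems can route and reassemble packets from the source fields alone, without knowing the word-level format of each instrument.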

  6. Mixed integer nonlinear programming model of wireless pricing scheme with QoS attribute of bandwidth and end-to-end delay

    NASA Astrophysics Data System (ADS)

    Irmeilyana, Puspita, Fitri Maya; Indrawati

    2016-02-01

    Pricing for wireless networks is developed by considering linearity factors, price elasticity, and price factors. A mixed-integer nonlinear programming wireless pricing model is proposed as a nonlinear programming problem that can be solved optimally using LINGO 13.0. The solutions are expected to give some information about the connections between the acceptance factor and the price. Previous work focused on bandwidth as the sole QoS attribute, with models attempting to maximize the total price for a connection based on that QoS parameter. Here, the QoS attributes used are the bandwidth and the end-to-end delay, both of which affect the traffic. The maximum total price is achieved when the provider determines the required increment or decrement of the price in response to changes in QoS value.
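    The trade-off such a model captures can be illustrated with a toy search (all coefficients invented; the actual model is a mixed-integer nonlinear program solved in LINGO): choose a price and QoS levels (bandwidth, end-to-end delay) that maximize revenue subject to a minimum user-acceptance factor that falls with price and delay and rises with bandwidth.

```python
import itertools

best = None
for p, b, d in itertools.product(range(1, 11),   # price level
                                 range(1, 6),    # bandwidth units
                                 range(1, 6)):   # delay class (1 = lowest delay)
    # Linear demand proxy: acceptance falls with price and delay, rises with bandwidth
    acceptance = 1.0 - 0.05 * p + 0.08 * b - 0.06 * d
    if acceptance < 0.5:      # provider's minimum acceptance factor (assumed)
        continue
    revenue = p * acceptance
    if best is None or revenue > best[0]:
        best = (revenue, p, b, d)

print("revenue %.2f at price=%d, bandwidth=%d, delay class=%d" % best)
```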

  7. Demonstration of a fully-coupled end-to-end model for small pelagic fish using sardine and anchovy in the California Current

    NASA Astrophysics Data System (ADS)

    Rose, Kenneth A.; Fiechter, Jerome; Curchitser, Enrique N.; Hedstrom, Kate; Bernal, Miguel; Creekmore, Sean; Haynie, Alan; Ito, Shin-ichi; Lluch-Cota, Salvador; Megrey, Bernard A.; Edwards, Chris A.; Checkley, Dave; Koslow, Tony; McClatchie, Sam; Werner, Francisco; MacCall, Alec; Agostini, Vera

    2015-11-01

    We describe and document an end-to-end model of anchovy and sardine population dynamics in the California Current as a proof of principle that such coupled models can be developed and implemented. The end-to-end model is 3-dimensional, time-varying, and multispecies, and consists of four coupled submodels: hydrodynamics, Eulerian nutrient-phytoplankton-zooplankton (NPZ), an individual-based full life cycle anchovy and sardine submodel, and an agent-based fishing fleet submodel. A predator roughly mimicking albacore was included as individuals that consumed anchovy and sardine. All submodels were coded within the ROMS open-source community model, used the same resolution spatial grid, and were solved simultaneously to allow for possible feedbacks among the submodels. We used a super-individual approach and solved the coupled models on a distributed-memory parallel computer, both of which created challenging but resolvable bookkeeping problems. The anchovy and sardine growth, mortality, reproduction, and movement, and the fishing fleet submodel, were each calibrated using simplified grids before being inserted into the full end-to-end model. A historical simulation of 1959-2008 was performed, and the latter 45 years analyzed. Sea surface height (SSH) and sea surface temperature (SST) for the historical simulation showed strong horizontal gradients and multi-year scale temporal oscillations related to various climate indices (PDO, NPGO), and both showed responses to ENSO variability. Simulated total phytoplankton was lower during strong El Nino events and higher for the strong 1999 La Nina event. The three zooplankton groups generally corresponded to the spatial and temporal variation in simulated total phytoplankton. Simulated biomasses of anchovy and sardine were within the historical range of observed biomasses, but predicted biomasses showed much less inter-annual variation. Anomalies of annual biomasses of anchovy and sardine showed a switch in the mid

  8. End to end assembly of CaO and ZnO nanosheets to propeller-shaped architectures by orientation attachment approaches

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Liu, Fang

    2015-06-01

    Inspired by the agitation effect of propellers, heterogeneous propeller-shaped CaO/ZnO architectures were assembled in aqueous solution. Preferred nucleation and growth of CaO and ZnO nuclei produced hexagonal nanosheets, which were combined end-to-end into propeller-shaped architectures by oriented rotation and attachment reactions. When the propeller-shaped CaO/ZnO product was used as a solid base catalyst to synthesize biodiesel, a high biodiesel yield of 97.5% was achieved. The predominant exposure of active O2- on the CaO(0 0 2) and ZnO(0 0 0 2) planes of the propeller-shaped CaO/ZnO led to good catalytic activity and a high yield of biodiesel.

  9. Clinical evaluation of a closed, one-stage, stapled, functional, end-to-end jejuno-ileal anastomosis in 5 horses

    PubMed Central

    Anderson, Stacy L.; Blackford, James T.; Kelmer, S. Gal

    2012-01-01

    This study describes the outcome and complications in horses that had a closed, one-stage, stapled, functional, end-to-end (COSFE) jejuno-ileal anastomosis (JIA) following resection of compromised small intestine. Medical records were reviewed to identify all horses that had a COSFE JIA performed during exploratory laparotomy and to determine post-operative complications and final outcome. All 5 horses that were identified had successful COSFE JIA with resection of various amounts of distal jejunum and proximal ileum. Post-operative ileus occurred in 1 of the 5 horses. All horses survived at least 1 year after surgery. The survival times and incidence of post-operative ileus compared favorably with published results for other types of small intestinal resection and anastomoses. A COSFE JIA is a viable surgical procedure to correct lesions of the distal jejunum and proximal ileum. PMID:23450864

  10. End-to-End Study of the Transfer of Energy from Magnetosheath Ion Precipitation to the Ionospheric Cusp and Resulting Ion Outflow to the Magnetosphere

    NASA Technical Reports Server (NTRS)

    Coffey, Victoria; Chandler, Michael; Singh, Nagendra; Avanov, Levon

    2003-01-01

    We will show results from an end-to-end study of the energy transfer from injected magnetosheath plasmas to the near-Earth magnetospheric and ionospheric plasmas and the resulting ion outflow to the magnetosphere. This study includes modeling of the evolution of the magnetosheath precipitation in the cusp using a kinetic code with a realistic magnetic field configuration. These evolved, highly non-Maxwellian distributions are used as input to a 2D PIC code to analyze the resulting wave generation. The wave analysis is used in the kinetic code as input to the cold ionospheric ions to study the transfer of energy to these ions and their outflow to the magnetosphere. Observations from the Thermal Ion Dynamics Experiment (TIDE) and other instruments on the Polar Spacecraft will be compared to the modeling.

  11. An end-to-end approach to the EUCLID NISP on-board pre-processing operations: tests and latest results

    NASA Astrophysics Data System (ADS)

    Bonoli, Carlotta; Bortoletto, Favio; D'Alessandro, Maurizio; Corcione, Leonardo; Ligori, Sebastiano; Nicastro, Luciano; Trifoglio, Massimo; Valenziano, Luca; Zerbi, Filippo M.; Crouzet, Pierre-Elie; Jung, Andreas

    2012-09-01

    NISP is the near-IR spectrophotometer instrument of the Cosmic Vision Euclid mission. In this paper we describe an end-to-end simulation scheme developed in the framework of the NISP design study to cover the expected focal-plane on-board pre-processing operations. Non-destructive detector readouts are simulated for a number of different readout strategies, taking into account scientific and calibration observations; the resulting frames are passed through a series of steps emulating the foreseen on-board pipeline, then compressed to give the final result. In order to verify final frame quality and the resulting computational and memory load, we tested this architecture on a number of hardware platforms similar to those possible for the final NISP computing unit. Here we give the results of the latest tests. This paper mainly reports the technical status at the end of the Definition Phase and is presented on behalf of the Euclid Consortium.

  12. Application of modified direct denitration to support the ORNL coupled-end-to-end demonstration in production of mixed oxides suitable for pellet fabrication

    SciTech Connect

    Walker, E.A.; Vedder, R.J.; Felker, L.K.; Marschman, S.C.

    2007-07-01

    The current and future development of the Modified Direct Denitration (MDD) process is in support of Oak Ridge National Laboratory's (ORNL) Coupled End-to-End (CETE) research, development, and demonstration (R and D) of proposed advanced fuel reprocessing and fuel fabrication processes. This work will involve the co-conversion of the U/Pu/Np product streams from the UREX+3 separation flow sheet utilizing the existing MDD glove-box setup and the in-cell co-conversion of the U/Pu/Np/Am/Cm product streams from the UREX+1a flow sheet. Characterization equipment is being procured and installed. Oxide powder studies are being done on calcination/reduction variables, as well as pressing and sintering of pellets to permit metallographic examinations. (authors)

  13. The role of environmental controls in determining sardine and anchovy population cycles in the California Current: Analysis of an end-to-end model

    NASA Astrophysics Data System (ADS)

    Fiechter, Jerome; Rose, Kenneth A.; Curchitser, Enrique N.; Hedstrom, Katherine S.

    2015-11-01

    Sardine and anchovy are two forage species of particular interest because of their low-frequency cycles in adult abundance in boundary current regions, combined with a commercially relevant contribution to the global marine food catch. While several hypotheses have been put forth to explain decadal shifts in sardine and anchovy populations, a mechanistic basis for how the physics, biogeochemistry, and biology combine to produce patterns of synchronous variability across widely separated systems has remained elusive. The present study uses a 50-year (1959-2008) simulation of a fully coupled end-to-end ecosystem model configured for sardine and anchovy in the California Current System to investigate how environmental processes control their population dynamics. The results illustrate that slightly different temperature and diet preferences can lead to significantly different responses to environmental variability. Simulated adult population fluctuations are associated with age-1 growth (via age-2 egg production) and prey availability for anchovy, while they depend primarily on age-0 survival and temperature for sardine. The analysis also hints at potential linkages to known modes of climate variability, whereby changes in adult abundance are related to ENSO for anchovy and to the PDO for sardine. The connection to the PDO and ENSO is consistent with modes of interannual and decadal variability that would alternatively favor anchovy during years of cooler temperatures and higher prey availability, and sardine during years of warmer temperatures and lower prey availability. While the end-to-end ecosystem model provides valuable insight on potential relationships between environmental conditions and sardine and anchovy population dynamics, understanding the complex interplay, and potential lags, between the full array of processes controlling their abundances in the California Current System remains an on-going challenge.

  14. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    NASA Astrophysics Data System (ADS)

    Bowen, S. R.; Nyflot, M. J.; Herrmann, C.; Groh, C. M.; Meyer, J.; Wollenweber, S. D.; Stearns, C. W.; Kinahan, P. E.; Sandison, G. A.

    2015-05-01

    Effective positron emission tomography / computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10-20%, treatment planning errors were 5-10%, and treatment delivery errors were 5-30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5-10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning, and RT

  15. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    PubMed Central

    Bowen, S R; Nyflot, M J; Hermann, C; Groh, C; Meyer, J; Wollenweber, S D; Stearns, C W; Kinahan, P E; Sandison, G A

    2015-01-01

    Effective positron emission tomography/computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by 6 different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy (VMAT) were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses (EUD), and 2%-2mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10–20%, treatment planning errors were 5–10%, and treatment delivery errors were 5–30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5–10% in PET/CT imaging, < 5% in treatment planning, and < 2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT

  16. Overview of Non-nuclear Testing of the Safe, Affordable 30-kW Fission Engine, Including End-to-End Demonstrator Testing

    NASA Technical Reports Server (NTRS)

    VanDyke, M. K.; Martin, J. J.; Houts, M. G.

    2003-01-01

    Successful development of space fission systems will require an extensive program of affordable and realistic testing. In addition to tests related to design/development of the fission system, realistic testing of the actual flight unit must also be performed. At the power levels under consideration (3-300 kW electric power), almost all technical issues are thermal or stress related and will not be strongly affected by the radiation environment. These issues can be resolved more thoroughly, less expensively, and in a more timely fashion with nonnuclear testing, provided it is prototypic of the system in question. This approach was used for the safe, affordable fission engine test article development program and accomplished via cooperative efforts with Department of Energy labs, industry, universities, and other NASA centers. This Technical Memorandum covers the analysis, testing, and data reduction of a 30-kW simulated reactor as well as an end-to-end demonstrator, including a power conversion system and an electric propulsion engine, the first of its kind in the United States.

  17. Performances of the fractal iterative method with an internal model control law on the ESO end-to-end ELT adaptive optics simulator

    NASA Astrophysics Data System (ADS)

    Béchet, C.; Le Louarn, M.; Tallon, M.; Thiébaut, É.

    2008-07-01

    Adaptive Optics systems under study for the Extremely Large Telescopes have given rise to a new generation of algorithms for both wavefront reconstruction and the control law. In the first place, the large number of controlled actuators imposes the use of computationally efficient methods. Secondly, the performance criterion is no longer based solely on nulling residual measurements; priors on the turbulence must be incorporated. To satisfy these two requirements, we proposed associating the Fractal Iterative Method for the estimation step with an Internal Model Control. This combination has now been tested on an end-to-end adaptive optics numerical simulator at ESO, named Octopus. Results are presented here and the performance of our method is compared to the classical Matrix-Vector Multiplication combined with a pure integrator. In the light of a theoretical analysis of our control algorithm, we investigate the influence of several error contributions on our simulations. The reconstruction error varies with the signal-to-noise ratio but is limited by the use of priors. The ratio between the system loop delay and the wavefront coherence time also affects the reachable Strehl ratio. While no instabilities are observed, correction quality is clearly degraded at low flux, when subaperture extinctions are frequent. Last but not least, the simulations have demonstrated the robustness of the method with respect to sensor modeling errors and actuator misalignments.
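
    The baseline referred to above is the classical Matrix-Vector Multiplication (MVM) reconstructor closed with a pure integrator. The following minimal sketch of that baseline loop uses invented dimensions, gain, and noise levels, and stands in for the far larger systems simulated in Octopus:

      import numpy as np

      # Toy closed-loop AO baseline: least-squares MVM reconstructor plus a
      # pure integrator. All sizes and values are illustrative only.
      rng = np.random.default_rng(0)
      n_slopes, n_act = 80, 40                  # hypothetical sensor/actuator counts
      D = rng.normal(size=(n_slopes, n_act))    # interaction matrix
      R = np.linalg.pinv(D)                     # MVM reconstructor (pseudo-inverse)

      gain = 0.5                                # integrator gain
      cmd = np.zeros(n_act)                     # accumulated actuator commands
      phi = rng.normal(size=n_act)              # unknown aberration (actuator space)

      for step in range(50):
          residual = phi - cmd                  # wavefront left after correction
          slopes = D @ residual + 0.01 * rng.normal(size=n_slopes)  # noisy WFS
          cmd += gain * (R @ slopes)            # pure-integrator control update

      print("residual rms:", np.std(phi - cmd))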

  18. Ecosystem limits to food web fluxes and fisheries yields in the North Sea simulated with an end-to-end food web model

    NASA Astrophysics Data System (ADS)

    Heath, Michael R.

    2012-09-01

    Equilibrium yields from an exploited fish stock represent the surplus production remaining after accounting for losses due to predation. However, most estimates of maximum sustainable yield, upon which fisheries management targets are partly based, assume that productivity and predation rates are constant in time or at least stationary, meaning there is no recognition of the potential for interaction between different fishing sectors. Here, an end-to-end ecosystem model is developed to explore the possible scale and mechanisms of interactions between pelagic and demersal fishing in the North Sea. The model simulates fluxes of nitrogen between detritus, inorganic nutrient, and guilds of taxa spanning phytoplankton to mammals. The structure strikes a balance between graininess in space, taxonomy, and demography, and the need to constrain the parameter count sufficiently to enable automatic parameter optimization. Simulated annealing is used to locate the maximum-likelihood parameter set, given the model structure and a suite of observations of annual rates of production and fluxes between guilds. Simulations of the impact of fishery harvesting rates showed that equilibrium yields of pelagic and demersal fish were strongly interrelated due to a variety of top-down and bottom-up food web interactions. The results clearly show that management goals based on simultaneously achieving maximum sustainable biomass yields from all commercial fish stocks are simply unattainable. Trade-offs between, for example, pelagic and demersal fishery sectors and other properties of the ecosystem have to be considered in devising an overall harvesting strategy.
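
    The parameter estimation described above is a stochastic global search. A toy sketch of simulated annealing locating a maximum-likelihood parameter set follows; the quadratic objective is a stand-in for the real likelihood, which would score simulated fluxes against observed production rates:

      import math, random

      def neg_log_likelihood(params):
          # Hypothetical objective with its minimum at (1.0, -2.0); the real
          # model would compare simulated and observed rates and fluxes.
          a, b = params
          return (a - 1.0) ** 2 + (b + 2.0) ** 2

      def anneal(x0, steps=20000, t0=1.0, cooling=0.9995, scale=0.1):
          x, fx = list(x0), neg_log_likelihood(x0)
          best, fbest, t = list(x), fx, t0
          for _ in range(steps):
              cand = [xi + random.gauss(0.0, scale) for xi in x]
              fc = neg_log_likelihood(cand)
              # always accept downhill moves; accept uphill moves with
              # Boltzmann probability so the search can escape local minima
              if fc < fx or random.random() < math.exp((fx - fc) / t):
                  x, fx = cand, fc
                  if fx < fbest:
                      best, fbest = list(x), fx
              t *= cooling                      # geometric cooling schedule
          return best, fbest

      print(anneal([5.0, 5.0]))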

  19. AFCI Coupled End-to-End Research, Development and Demonstration Project: Integrated Off-gas Treatment System Design and Initial Performance - 9226

    SciTech Connect

    Jubin, Robert Thomas; Patton, Bradley D; Ramey, Dan W; Spencer, Barry B

    2009-01-01

    Oak Ridge National Laboratory is conducting a complete, coupled end-to-end (CETE) demonstration of advanced nuclear fuel reprocessing to support the Advanced Fuel Cycle Initiative. This small-scale reprocessing operation provides a unique opportunity to test integrated off-gas treatment systems designed to recover the primary volatile fission and activation products (H-3, C-14, Kr-85, and I-129) released from the spent nuclear fuel (SNF). The CETE project will demonstrate an advanced head-end process, referred to as voloxidation, designed to condition the SNF, separate the SNF from the cladding, and release tritium contained in the fuel matrix. The off-gas from the dry voloxidation process as well as from the more traditional fuel dissolution process will be treated separately and the volatile components recovered. This paper provides descriptions of the off-gas treatment systems for both the voloxidation process and the fuel dissolution process, and provides preliminary results from the initial CETE processing runs. Impacts of processing parameters on the relative quantities of volatile components released and on recovery efficiencies are evaluated.

  20. Morphological study of the healing process after diode laser-assisted end-to-end microanastomosis: comparison with conventional manual suture

    NASA Astrophysics Data System (ADS)

    Tang, Jing; Rouy, Simone; Prudhomme, Michel; Godlewski, Guilhem; Chambettaz, Francois; Delacretaz, Guy P.; Salathe, Rene-Paul

    1996-01-01

    A series of carotid end-to-end diode laser-assisted microvascular anastomoses (LAMA) versus contralateral conventional suture microanastomoses (CMA) were performed in 120 Wistar rats (in the same animal, LAMA was performed on the left side and CMA on the right). Optical and scanning electron microscopic examinations were performed from day 0 to day 210. The results revealed that on day 0 LAMA gave rise to protein denaturation and collagen fusion of the media and adventitia in the arterial wall. Re-endothelialization of the anastomotic line began on day 3, accompanied by a large number of inflammatory cells aggregated in the adventitia. On day 10 the endothelial cells were restored at the anastomotic site and a collagenous network developed in the media. On day 90 proliferation and disorientation of the elastic fibers appeared. Part of the elastic laminae had been reconstructed by day 210. In the CMA group, re-endothelialization developed later than in LAMA, and reconstruction of the elastic laminae had still not occurred by day 210. These data suggest that long-term healing after diode LAMA is better than after CMA in normal artery repair.

  1. Land Mobile Satellite Service (LMSS) channel simulator: An end-to-end hardware simulation and study of the LMSS communications links

    NASA Technical Reports Server (NTRS)

    Salmasi, A. B. (Editor); Springett, J. C.; Sumida, J. T.; Richter, P. H.

    1984-01-01

    The design and implementation of the Land Mobile Satellite Service (LMSS) channel simulator as a facility for an end-to-end hardware simulation of the LMSS communications links, primarily with the mobile terminal, is described. A number of studies are reported which show the application of the channel simulator as a facility for validation and assessment of the LMSS design requirements and capabilities, by performing quantitative measurements and qualitative audio evaluations for various link design parameters and channel impairments under simulated LMSS operating conditions. As a first application, the LMSS channel simulator was used in the evaluation of a system based on the voice processing and modulation (e.g., NBFM with 30 kHz of channel spacing and a 2 kHz rms frequency deviation for average talkers) selected for the Bell System's Advanced Mobile Phone Service (AMPS). The various details of the hardware design, the qualitative audio evaluation techniques, the signal-to-channel-impairment measurement techniques, the justification of the criteria used to select the voice processing and modulation parameters, and the results of a number of parametric studies are further described.

  2. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high-speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but it is unusual in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high-speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large-scale, high-speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high-speed distributed applications. Finally, the DPSS is part of an overall architecture for using high-speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.

  3. Design of a satellite end-to-end mission performance simulator for imaging spectrometers and its application to the ESA's FLEX/Sentinel-3 tandem mission

    NASA Astrophysics Data System (ADS)

    Vicent, Jorge; Sabater, Neus; Tenjo, Carolina; Acarreta, Juan R.; Manzano, María.; Rivera, Juan P.; Jurado, Pedro; Franco, Raffaella; Alonso, Luis; Moreno, Jose

    2015-09-01

    The performance analysis of a satellite mission requires specific tools that can simulate the behavior of the platform, its payload, and the acquisition of scientific data from synthetic scenes. These software tools, called End-to-End Mission Performance Simulators (E2ES), are promoted by the European Space Agency (ESA) with the goal of consolidating the instrument and mission requirements as well as optimizing the implemented data processing algorithms. Nevertheless, most E2ES developed to date are designed for a specific satellite mission and can hardly be adapted to other missions. In the frame of ESA's FLEX mission activities, an E2ES is being developed based on a generic architecture for passive optical missions. The FLEX E2ES implements a state-of-the-art synthetic scene generator that is coupled with dedicated algorithms that model the platform and instrument characteristics. This work describes the flexibility of the FLEX E2ES to simulate complex synthetic scenes with a variety of land cover classes, topography and cloud cover that are observed separately by each instrument (FLORIS, OLCI and SLSTR). The implemented algorithms allow modelling of the sensor behavior, i.e. the spectral/spatial resampling of the input scene, the geometry of acquisition, the sensor noise and non-uniformity effects (e.g. stray light, spectral smile and radiometric noise), and the full retrieval scheme up to Level-2 products. It is expected that the design methodology implemented in the FLEX E2ES can be used as a baseline for other imaging spectrometer missions and will be further expanded towards a generic E2ES software tool.
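
    One representative instrument-model step named above is the spectral resampling of the high-resolution input scene onto the instrument's bands. A hedged sketch, with invented band centers and a Gaussian spectral response function (SRF), could look like this:

      import numpy as np

      # Resample a high-resolution scene spectrum onto coarse instrument bands
      # by integrating against a Gaussian SRF per band. Centers/FWHM invented.
      wl = np.linspace(400.0, 900.0, 5001)           # scene wavelength grid (nm)
      radiance = 100.0 + 20.0 * np.sin(wl / 30.0)    # synthetic scene radiance

      centers = np.arange(500.0, 801.0, 10.0)        # hypothetical band centers (nm)
      fwhm = 5.0                                     # hypothetical SRF width (nm)
      sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))

      def band_radiance(center):
          srf = np.exp(-0.5 * ((wl - center) / sigma) ** 2)
          srf /= np.trapz(srf, wl)                   # normalize SRF to unit area
          return np.trapz(radiance * srf, wl)        # SRF-weighted radiance

      resampled = np.array([band_radiance(c) for c in centers])
      print(resampled[:5])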

  4. Assessing the value of seasonal climate forecast information through an end-to-end forecasting framework: Application to U.S. 2012 drought in central Illinois

    NASA Astrophysics Data System (ADS)

    Shafiee-Jood, Majid; Cai, Ximing; Chen, Ligang; Liang, Xin-Zhong; Kumar, Praveen

    2014-08-01

    This study proposes an end-to-end forecasting framework that incorporates operational seasonal climate forecasts to help farmers improve decisions made prior to the crop growth season, decisions that are vulnerable to unanticipated drought conditions. The framework couples a crop growth model with a decision-making model for rainfed agriculture and translates probabilistic seasonal forecasts into more user-relevant information that can support farmers' decisions on crop type and some market choices (e.g., contracts with an ethanol refinery). The regional Climate-Weather Research and Forecasting model (CWRF), driven by two operational general circulation models (GCMs), is used to provide the seasonal forecasts of weather parameters. To better assess the developed framework, CWRF is also driven by observational reanalysis data, which can theoretically be considered the best seasonal forecast. The proposed framework is applied to the Salt Creek watershed in Illinois, which experienced an extreme drought event during the 2012 crop growth season. The results show that the forecasts cannot capture the 2012 drought condition in Salt Creek, and the suggested decisions would therefore make farmers worse off if adopted. Alternatively, the optimal decisions based on reanalysis-based CWRF forecasts, which can capture the 2012 drought conditions, make farmers better off by suggesting "no-contract" with ethanol refineries. This study suggests that the conventional metric used for ex ante value assessment is not capable of providing meaningful information in the case of extreme drought. It is also observed that institutional interventions (e.g., crop insurance) highly influence farmers' decisions and, thereby, the assessment of forecast value.
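
    The conventional ex ante value metric mentioned above is commonly computed as the gain in expected payoff from acting on the forecast rather than on climatology. A back-of-envelope sketch with invented probabilities and payoffs:

      # Ex ante forecast value: expected payoff of the forecast-based decision
      # minus expected payoff of the climatology-based decision, both evaluated
      # under the forecast distribution. All numbers are illustrative.
      forecast = {"drought": 0.6, "normal": 0.4}      # forecast probabilities
      climatology = {"drought": 0.1, "normal": 0.9}   # historical baseline
      payoff = {                                      # profit by (decision, state)
          ("contract", "drought"): -50, ("contract", "normal"): 120,
          ("no_contract", "drought"): 20, ("no_contract", "normal"): 80,
      }

      def expected(decision, probs):
          return sum(p * payoff[(decision, s)] for s, p in probs.items())

      decisions = ("contract", "no_contract")
      d_fc = max(decisions, key=lambda d: expected(d, forecast))
      d_cl = max(decisions, key=lambda d: expected(d, climatology))
      value = expected(d_fc, forecast) - expected(d_cl, forecast)
      print(d_fc, d_cl, value)                        # no_contract contract 26.0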

  5. The Hurricane-Flood-Landslide Continuum: An Integrated, End-to-end Forecast and Warning System for Mountainous Islands in the Tropics

    NASA Astrophysics Data System (ADS)

    Golden, J.; Updike, R. G.; Verdin, J. P.; Larsen, M. C.; Negri, A. J.; McGinley, J. A.

    2004-12-01

    In the 10 days of 21-30 September 1998, Hurricane Georges left a trail of destruction in the Caribbean region and U.S. Gulf Coast. Later that year, Hurricane Mitch caused widespread destruction and loss of life in four Central American nations, and in December 1999 a tropical disturbance impacted the north coast of Venezuela, causing hundreds of deaths and several million dollars of property loss. More recently, an off-season disturbance in the Central Caribbean dumped nearly 250 mm of rainfall over Hispaniola during the 24-hr period of May 23, 2004. The resultant flash floods and debris flows in the Dominican Republic and Haiti killed at least 1400 people. In each instance, the tropical system served as the catalyst for major flooding and landslides at landfall. Our goal is to develop and transfer an end-to-end warning system for a prototype region in the Central Caribbean, specifically the islands of Puerto Rico and Hispaniola, which experience frequent tropical cyclones and other disturbances. The envisioned system would include satellite and surface-based observations to track and nowcast dangerous levels of precipitation, atmospheric and hydrological models to predict short-term runoff and streamflow changes, geological models to warn when and where landslides and debris flows are imminent, and the capability to communicate forecast guidance products via satellite to vital government offices in Puerto Rico, Haiti, and the Dominican Republic. In this paper, we present a preliminary proof-of-concept study of the May 21-24, 2004 floods and debris flows over Hispaniola to show that the envisaged flow of data, models, and graphical products can produce the desired warning outputs. The multidisciplinary research and technology transfer effort will require blending the talents of hydrometeorologists, geologists, remote sensing and GIS experts, and social scientists to ensure timely delivery of tailored graphical products to both weather offices and local

  6. OpenCyto: an open source infrastructure for scalable, robust, reproducible, and automated, end-to-end flow cytometry data analysis.

    PubMed

    Finak, Greg; Frelinger, Jacob; Jiang, Wenxin; Newell, Evan W; Ramey, John; Davis, Mark M; Kalams, Spyros A; De Rosa, Stephen C; Gottardo, Raphael

    2014-08-01

    Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier of entry to automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory efficient manner to the large flow data sets that arise in clinical trials, and integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption. It can rapidly leverage new developments in computational
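
    The text-based pipeline definition described above can be pictured as a table of gates applied down a population hierarchy. The sketch below is a conceptual Python illustration only; it is not OpenCyto's actual template format or API (OpenCyto itself is an R/BioConductor package):

      import csv, io

      # Each CSV row derives a child population from a parent via a 1-D
      # threshold gate. Channel names, thresholds, and events are invented.
      template = io.StringIO(
          "parent,child,channel,op,threshold\n"
          "root,live,viability,<,1000\n"
          "live,CD3+,CD3,>,500\n"
          "CD3+,CD8+,CD8,>,800\n"
      )

      events = [{"viability": 300, "CD3": 700, "CD8": 900},
                {"viability": 300, "CD3": 700, "CD8": 100},
                {"viability": 2000, "CD3": 900, "CD8": 900}]

      populations = {"root": list(events)}
      for row in csv.DictReader(template):
          parent = populations[row["parent"]]
          thr = float(row["threshold"])
          if row["op"] == "<":
              populations[row["child"]] = [e for e in parent if e[row["channel"]] < thr]
          else:
              populations[row["child"]] = [e for e in parent if e[row["channel"]] > thr]

      print({k: len(v) for k, v in populations.items()})
      # -> {'root': 3, 'live': 2, 'CD3+': 2, 'CD8+': 1}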

  7. RTEMP: Exploring an end-to-end, agnostic platform for multidisciplinary real-time analytics in the space physics community and beyond

    NASA Astrophysics Data System (ADS)

    Chaddock, D.; Donovan, E.; Spanswick, E.; Jackel, B. J.

    2014-12-01

    Large-scale, real-time, sensor-driven analytics are a highly effective set of tools in many research environments; however, the barrier to entry is high and the learning curve is steep. These systems need to operate efficiently from end to end, with the key aspects being data transmission, acquisition, management and organization, and retrieval. When building a generic multidisciplinary platform, acquisition and data management need to be designed with scalability and flexibility as the primary focus. Additionally, in order to leverage current sensor web technologies, the integration of common sensor data standards (i.e., SensorML and SWE Services) should be supported. Perhaps most important, researchers should be able to get started and integrate the platform into their set of research tools as easily and quickly as possible. The largest issue with current platforms is that the sensor data must be formed and described using the previously mentioned standards. As useful as these standards are for organizing data, they are cumbersome to adopt, often restrictive, and required to be geospatially driven. Our solution, RTEMP (Real-time Environment Monitoring Platform), is a real-time analytics platform with over ten years and an estimated two million dollars of investment. It has been developed for our continuously expanding requirements of operating and building remote sensors and supporting equipment for space physics research. A key benefit of our approach is RTEMP's ability to manage agnostic data. This allows data that flows through the system to be structured in any way that best addresses the needs of the sensor operators and data users, enabling extensive flexibility and streamlined development and research. Here we begin with an overview of RTEMP and how it is structured. Additionally, we will showcase the ways that we are using RTEMP and how it is being adopted by researchers in an increasingly broad range of other research fields. We will lay out a

  8. A novel PON based UMTS broadband wireless access network architecture with an algorithm to guarantee end to end QoS

    NASA Astrophysics Data System (ADS)

    Sana, Ajaz; Hussain, Shahab; Ali, Mohammed A.; Ahmed, Samir

    2007-09-01

    In this paper we propose a novel Passive Optical Network (PON) based broadband wireless access network architecture to provide multimedia services (video telephony, video streaming, mobile TV, mobile email, etc.) to mobile users. In conventional wireless access networks, the base stations (Node B) and Radio Network Controllers (RNC) are connected by point-to-point T1/E1 lines (the Iub interface). The T1/E1 lines are expensive and add to operating costs. Also, the resources (transceivers and T1/E1 lines) are dimensioned for peak-hour traffic, so most of the time the dedicated resources are idle and wasted. Furthermore, the T1/E1 lines are not capable of supporting the bandwidth (BW) required by next-generation wireless multimedia services proposed by High Speed Packet Access (HSPA, Rel. 5) for the Universal Mobile Telecommunications System (UMTS) and Evolution-Data Optimized (EV-DO) for Code Division Multiple Access 2000 (CDMA2000). The proposed PON-based backhaul can provide gigabit data rates, and the Iub interface can be dynamically shared by Node Bs: BW is dynamically allocated, and unused BW from lightly loaded Node Bs is assigned to heavily loaded Node Bs. We also propose a novel algorithm to provide end-to-end Quality of Service (QoS) between the RNC and the user equipment. The algorithm provides QoS bounds in the wired domain as well as in the wireless domain, with compensation for wireless link errors. Because of the air interface, there are times when the user equipment (UE) is unable to communicate with the Node B (usually referred to as a link error); such link errors are bursty and location dependent. In the proposed approach, the scheduler at the Node B maps QoS priorities and weights into the wireless MAC. Compensation for errored links is provided by swapping service between the active users, and the user data is divided into flows, with flows allowed to lag or lead. The algorithm guarantees (1) delay and throughput for error-free flows, (2) short-term fairness
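
    The lead/lag compensation idea can be sketched in a few lines. The following toy scheduler (not the paper's exact algorithm; weights, error rates, and slot counts are invented) lets a flow whose link errors fall behind and repays it with priority service once its link recovers:

      import random

      # Two flows share slots; link errors make a flow lag, and lagging flows
      # are served first when their link is clean, restoring weighted fairness.
      random.seed(7)
      flows = {"A": {"weight": 2, "err": 0.3}, "B": {"weight": 1, "err": 0.0}}
      lag = {f: 0 for f in flows}        # slots owed to each flow
      served = {f: 0 for f in flows}

      for slot in range(6000):
          # most-lagging flow first; ties broken by weight
          order = sorted(flows, key=lambda k: (lag[k], flows[k]["weight"]),
                         reverse=True)
          for f in order:
              if random.random() < flows[f]["err"]:
                  lag[f] += 1            # link error: this flow lags, the next leads
                  continue
              served[f] += 1
              lag[f] = max(0, lag[f] - 1)   # repay lag while the link is clean
              break

      print(served)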

  9. OpenCyto: An Open Source Infrastructure for Scalable, Robust, Reproducible, and Automated, End-to-End Flow Cytometry Data Analysis

    PubMed Central

    Finak, Greg; Frelinger, Jacob; Jiang, Wenxin; Newell, Evan W.; Ramey, John; Davis, Mark M.; Kalams, Spyros A.; De Rosa, Stephen C.; Gottardo, Raphael

    2014-01-01

    Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier of entry to automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory efficient manner to the large flow data sets that arise in clinical trials, and integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption. It can rapidly leverage new developments in computational

  10. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    SciTech Connect

    Corradini, N; Leick, M; Bonetti, M; Negretti, L

    2015-06-15

    Purpose: To determine the MVCT imaging uncertainty of the TomoHDA system for intracranial radiosurgery treatments, and to determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from the butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine the uncertainty of the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in combination with differing scan lengths. A sensitivity test was performed to determine the registration algorithm's ability to find the absolute position of known translations/rotations of the pig head. The algorithm's ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and a radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, measuring less than 0.10 mm and 0.12 deg for all axes. In the sensitivity tests, the bone registration algorithm performed better than the human observers, and a maximum difference of 0.3 mm and 0.4 deg was observed across the axes. E2E tests showed absolute position differences of 0.03 ± 0.21 mm in the x-axis and 0.28 ± 0.18 mm in the y-axis, with maximum differences of 0.32 and 0.66 mm in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have submillimeter overall accuracy for intracranial radiosurgery.

  11. Quantifying residual ionospheric errors in GNSS radio occultation bending angles based on ensembles of profiles from end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Liu, C. L.; Kirchengast, G.; Zhang, K.; Norman, R.; Li, Y.; Zhang, S. C.; Fritzer, J.; Schwaerz, M.; Wu, S. Q.; Tan, Z. X.

    2015-07-01

    The radio occultation (RO) technique using signals from the Global Navigation Satellite System (GNSS), in particular from the Global Positioning System (GPS) so far, is currently widely used to observe the atmosphere for applications such as numerical weather prediction and global climate monitoring. The ionosphere is a major error source in RO measurements at stratospheric altitudes, and a linear ionospheric correction of dual-frequency RO bending angles is commonly used to remove the first-order ionospheric effect. However, the residual ionospheric error (RIE) can still be significant so that it needs to be further mitigated for high-accuracy applications, especially above about 30 km altitude where the RIE is most relevant compared to the magnitude of the neutral atmospheric bending angle. Quantification and careful analyses for better understanding of the RIE is therefore important for enabling benchmark-quality stratospheric RO retrievals. Here we present such an analysis of bending angle RIEs covering the stratosphere and mesosphere, using quasi-realistic end-to-end simulations for a full-day ensemble of RO events. Based on the ensemble simulations we assessed the variation of bending angle RIEs, both biases and standard deviations, with solar activity, latitudinal region and with or without the assumption of ionospheric spherical symmetry and co-existing observing system errors. We find that the bending angle RIE biases in the upper stratosphere and mesosphere, and in all latitudinal zones from low to high latitudes, have a clear negative tendency and a magnitude increasing with solar activity, which is in line with recent empirical studies based on real RO data although we find smaller bias magnitudes, deserving further study in the future. The maximum RIE biases are found at low latitudes during daytime, where they amount to within -0.03 to -0.05 μrad, the smallest at high latitudes (0 to -0.01 μrad; quiet space weather and winter conditions
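
    The linear dual-frequency correction referred to above combines the L1 and L2 bending angles so that the first-order (1/f²) ionospheric term cancels (the combination introduced by Vorob'ev and Krasil'nikova, 1994). A small numerical sketch with invented bending-angle values:

      # Linear dual-frequency ionospheric correction of RO bending angles:
      # alpha_c = (f1^2 * alpha1 - f2^2 * alpha2) / (f1^2 - f2^2).
      f1 = 1575.42e6   # GPS L1 carrier frequency (Hz)
      f2 = 1227.60e6   # GPS L2 carrier frequency (Hz)

      def corrected_bending_angle(alpha1, alpha2):
          return (f1**2 * alpha1 - f2**2 * alpha2) / (f1**2 - f2**2)

      # Illustrative values (radians): a neutral term plus a 1/f^2 ionospheric
      # term with a made-up coefficient k; the combination recovers alpha_n.
      alpha_n = 1.0e-4
      k = 5.0e12
      alpha1 = alpha_n + k / f1**2
      alpha2 = alpha_n + k / f2**2
      print(corrected_bending_angle(alpha1, alpha2))   # ~1.0e-4, first order removed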

  12. Quantifying residual ionospheric errors in GNSS radio occultation bending angles based on ensembles of profiles from end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Liu, C. L.; Kirchengast, G.; Zhang, K.; Norman, R.; Li, Y.; Zhang, S. C.; Fritzer, J.; Schwaerz, M.; Wu, S. Q.; Tan, Z. X.

    2015-01-01

    The radio occultation (RO) technique using signals from the Global Navigation Satellite System (GNSS), in particular from the Global Positioning System (GPS) so far, is now widely used to observe the atmosphere for applications such as numerical weather prediction and global climate monitoring. The ionosphere is a major error source in RO measurements at stratospheric altitudes, and a linear ionospheric correction of dual-frequency RO bending angles is commonly used to remove the first-order ionospheric effect. However, the residual ionospheric error (RIE) can still be significant, so it needs to be further mitigated for high-accuracy applications, especially above about 30 km altitude where the RIE is most relevant compared to the magnitude of the neutral atmospheric bending angle. Quantification and careful analysis for better understanding of the RIE are therefore important toward enabling benchmark-quality stratospheric RO retrievals. Here we present such an analysis of bending angle RIEs covering the stratosphere and mesosphere, using quasi-realistic end-to-end simulations for a full-day ensemble of RO events. Based on the ensemble simulations we assessed the variation of bending angle RIEs, both biases and SDs, with solar activity, latitudinal region, and with or without the assumption of ionospheric spherical symmetry and of co-existing observing system errors. We find that the bending angle RIE biases in the upper stratosphere and mesosphere, and in all latitudinal zones from low to high latitudes, have a clear negative tendency and a magnitude increasing with solar activity, in line with recent empirical studies based on real RO data. The maximum RIE biases are found at low latitudes during daytime, where they amount to within -0.03 to -0.05 μrad, the smallest at high latitudes (0 to -0.01 μrad; quiet space weather and winter conditions). Ionospheric spherical symmetry or asymmetries about the RO event location have only a minor influence on

  13. SU-E-J-194: Dynamic Tumor Tracking End-To-End Testing Using a 4D Thorax Phantom and EBT3 Films

    SciTech Connect

    Su, Z; Wu, J; Li, Z; Mamalui-Hunter, M

    2014-06-01

    Purpose: To quantify the dosimetric accuracy of Vero linac dynamic tumor tracking treatment using EBT3 film embedded in a 4D thorax phantom. Methods: A dynamic thorax phantom with tissue-equivalent materials and a film insert was used in this study. The thorax phantom was scanned in 4DCT mode with a viscoil embedded in its film insert, which was composed of lung-equivalent material. Dynamic tracking planning was performed using the 50% phase CT set with 5 conformal beams at gantry angles of 330, 15, 60, 105 and 150 degrees. Each field was a 3 cm by 3 cm square centered on the viscoil, since there was no solid mass target. Three different 1 - 2cos⁴ motion profiles with varied motion magnitudes and cycle frequencies were used. Before treatment plan irradiation, a 4D motion model of the target was established using a series of acquired fluoroscopic images and infrared marker motion positions. During irradiation, fluoroscopic imaging of the viscoil motion was performed to verify model validity. The irradiated films were scanned and the dose maps were compared to the planned Monte Carlo dose distributions. Gamma analyses using 3%-3 mm, 2%-3 mm, 3%-2 mm and 2%-2 mm criteria were performed and are presented. Results: For each motion pattern, a 4D motion model was built successfully and the target tracking performance was verified with fluoroscopic monitoring of the viscoil motion and its model-predicted locations. The film gamma analysis showed average pass rates among the 3 motion profiles of 98.14%, 96.2%, 91.3% and 85.61% for the 3%-3 mm, 2%-3 mm, 3%-2 mm and 2%-2 mm criteria, respectively. Conclusion: Target dynamic tracking was performed using patient-like breathing patterns in a 4D thorax phantom with an EBT3 film insert and a viscoil. There was excellent agreement between acquired and planned dose distributions for all three target motion patterns. This study performed end-to-end testing and verified the treatment accuracy of tumor dynamic tracking.
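
    The gamma criteria quoted above combine a dose-difference tolerance with a distance-to-agreement tolerance. Below is a brute-force sketch of a 1-D, globally normalized gamma pass-rate calculation on synthetic profiles; clinical tools do the same in 2-D/3-D:

      import numpy as np

      def gamma_pass_rate(dose_ref, dose_eval, x, dd=0.03, dta=3.0):
          # A point passes if some evaluated point lies inside the combined
          # dose-difference / distance ellipsoid (gamma <= 1).
          d_max = dose_ref.max()                    # global normalization dose
          passes = []
          for xi, di in zip(x, dose_ref):
              g2 = ((dose_eval - di) / (dd * d_max)) ** 2 + ((x - xi) / dta) ** 2
              passes.append(np.sqrt(g2.min()) <= 1.0)
          return np.mean(passes)

      x = np.linspace(0.0, 100.0, 201)              # positions (mm)
      ref = np.exp(-((x - 50.0) / 15.0) ** 2)       # synthetic reference profile
      meas = 1.01 * np.exp(-((x - 51.0) / 15.0) ** 2)   # shifted, scaled measurement
      print(f"3%/3mm pass rate: {gamma_pass_rate(ref, meas, x):.1%}")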

  14. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large, complex systems engineering challenge, which is being addressed in part by focusing on how the specific subsystems handle off-nominal missions and fault tolerance. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA has also formed an M&FM team for addressing fault management early in the development lifecycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes. Risk reduction is addressed by working with other organizations such as S&MA

  15. Micro-ARES, an electric-field sensor for ExoMars 2016: Electric fields modelling, sensitivity evaluations and end-to-end tests.

    NASA Astrophysics Data System (ADS)

    Déprez, Grégoire; Montmessin, Franck; Witasse, Olivier; Lapauw, Laurent; Vivat, Francis; Abbaki, Sadok; Granier, Philippe; Moirin, David; Trautner, Roland; Hassen-Khodja, Rafik; d'Almeida, Éric; Chardenal, Laurent; Berthelier, Jean-Jacques; Esposito, Francesca; Debei, Stefano; Rafkin, Scott; Barth, Erika

    2014-05-01

    Earth and transposed to the Martian atmospheric parameters. Knowing the expected electric fields and simulating them, the next step in evaluating the performance of the instrument is to determine its sensitivity by modelling the instrument's response. The last step is to compare the instrument model's expected results for a given signal against the actual outputs of the electronics board fed with the same signal as input. To achieve this end-to-end test, we use a signal generator followed by an electrical circuit reproducing the electrode behaviour in the Martian environment to inject a realistic electric signal into the processing board, and finally compare the produced formatted data with the expected data.

  16. Planning for Mars Sample Return: Results from the MEPAG Mars Sample Return End-to-End International Science Analysis Group (E2E-iSAG)

    NASA Astrophysics Data System (ADS)

    McLennan, S. M.; Sephton, M.; Mepag E2E-Isag

    2011-12-01

    The National Research Council 2011 Planetary Decadal Survey (2013-2022) placed beginning a Mars sample return campaign (MSR) as the top priority for large Flagship missions in the coming decade. Recent developments in NASA-ESA collaborations and Decadal Survey recommendations indicate MSR likely will be an international effort. A joint ESA-NASA 2018 rover (combining the previously proposed ExoMars and MAX-C missions), designed, in part, to collect and cache samples, would thus represent the first of a 3-mission MSR campaign. The End-to-End International Science Analysis Group (E2E-iSAG) was chartered by MEPAG in August 2010 to develop and prioritize MSR science objectives and investigate implications of these objectives for defining the highest priority sample types, landing site selection criteria (and identification of reference landing sites to support engineering planning), requirements for in situ characterization on Mars to support sample selection, and priorities/strategies for returned sample analyses to determine sample sizes and numbers that would meet the objectives. MEPAG approved the E2E-iSAG report in June 2011. Science objectives, summarized in priority order, are: (1) critically assess any evidence for past life or its chemical precursors, and place constraints on past habitability and potential for preservation of signs of life, (2) quantitatively constrain age, context and processes of accretion, early differentiation and magmatic and magnetic history, (3) reconstruct history of surface and near-surface processes involving water, (4) constrain magnitude, nature, timing, and origin of past climate change, (5) assess potential environmental hazards to future human exploration, (6) assess history and significance of surface modifying processes, (7) constrain origin and evolution of the Martian atmosphere, (8) evaluate potential critical resources for future human explorers. All returned samples also would be fully evaluated for extant life as a

  17. Outage management and health physics issue, 2009

    SciTech Connect

    Agnihotri, Newal

    2009-05-15

    The focus of the May-June issue is on outage management and health physics. Major articles include the following: Planning and scheduling to minimize refueling outage, by Pat McKenna, AmerenUE; Prioritizing safety, quality and schedule, by Tom Sharkey, Dominion; Benchmarking to high standards, by Margie Jepson, Entergy Nuclear; Benchmarking against U.S. standards, by Magnox North, United Kingdom; Enabling suppliers for new build activity, by Marcus Harrington, GE Hitachi Nuclear Energy; Identifying, cultivating and qualifying suppliers, by Thomas E. Silva, AREVA NP; Creating new U.S. jobs, by Francois Martineau, AREVA NP. Industry innovation articles include: MSL acoustic source load reduction, by Amir Shahkarami, Exelon Nuclear; Dual-methodology NDE of CRDM nozzles, by Michael Stark, Dominion Nuclear; and Electronic circuit board testing, by James Amundsen, FirstEnergy Nuclear Operating Company. The plant profile article is titled The future is now, by Julia Milstead, Progress Energy Service Company, LLC.

  18. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen

    2015-01-01

    integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test cases into flight software compounded with potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW

  19. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM

  20. SU-E-T-109: Development of An End-To-End Test for the Varian TrueBeam™ with a Novel Multiple-Dosimetric-Modality H&N Phantom

    SciTech Connect

    Zakjevskii, V; Knill, C; Rakowski, J; Snyder, M

    2014-06-01

    Purpose: To develop a comprehensive end-to-end test for Varian's TrueBeam linear accelerator for head and neck (H&N) IMRT using a custom phantom designed to utilize multiple dosimetry devices. Methods: The initial end-to-end test and custom H&N phantom were designed to yield maximum information in anatomical regions significant to H&N plans with respect to: i) geometric accuracy, ii) dosimetric accuracy, and iii) treatment reproducibility. The phantom was designed in collaboration with Integrated Medical Technologies. A CT image was acquired with a 1 mm slice thickness and imported into Varian's Eclipse treatment planning system, where OARs and the PTV were contoured. A clinical template was used to create an eight-field static-gantry-angle IMRT plan. After optimization, dose was calculated using the Analytic Anisotropic Algorithm with inhomogeneity correction. Plans were delivered with a TrueBeam equipped with a high-definition MLC. Preliminary end-to-end results were measured using film and ion chambers; ion chamber dose measurements were compared to the TPS, and films were analyzed with FilmQAPro using the composite gamma index. Results: Film analysis for the initial end-to-end plan with a geometrically simple PTV showed average gamma pass rates >99% with a passing criterion of 3%/3 mm. Film analysis of a plan with a more realistic, i.e. complex, PTV yielded pass rates >99% in clinically important regions containing the PTV, spinal cord and parotid glands. Ion chamber measurements were on average within 1.21% of calculated dose for both plans. Conclusion: Trials have demonstrated that our end-to-end testing methods provide baseline values for the dosimetric and geometric accuracy of Varian's TrueBeam system.

  1. Design and implementation of a secure and user-friendly broker platform supporting the end-to-end provisioning of e-homecare services.

    PubMed

    Van Hoecke, Sofie; Steurbaut, Kristof; Taveirne, Kristof; De Turck, Filip; Dhoedt, Bart

    2010-01-01

    We designed a broker platform for e-homecare services using web service technology. The broker allows efficient data communication and guarantees quality requirements such as security, availability and cost-efficiency by dynamic selection of services, minimizing user interactions and simplifying authentication through a single user sign-on. A prototype was implemented, with several e-homecare services (alarm, telemonitoring, audio diary and video chat). It was evaluated by patients with diabetes and multiple sclerosis. The patients found the start-up time and overhead imposed by the platform satisfactory. Having all e-homecare services integrated into a single application, which required only one login, resulted in a high quality of experience for the patients. PMID:20086267
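
    The dynamic service selection mentioned above can be sketched as a constrained choice: among functionally equivalent service instances, pick the cheapest one that meets the availability and latency requirements. Instance names and numbers below are invented:

      # Select the cheapest service instance satisfying QoS constraints.
      candidates = [
          {"name": "telemonitoring-eu1", "availability": 0.999, "latency_ms": 80, "cost": 1.2},
          {"name": "telemonitoring-eu2", "availability": 0.995, "latency_ms": 40, "cost": 0.8},
          {"name": "telemonitoring-us1", "availability": 0.999, "latency_ms": 150, "cost": 0.6},
      ]

      def select(services, min_avail=0.999, max_latency_ms=120):
          ok = [s for s in services
                if s["availability"] >= min_avail and s["latency_ms"] <= max_latency_ms]
          if not ok:
              raise RuntimeError("no service instance satisfies the QoS requirements")
          return min(ok, key=lambda s: s["cost"])   # cheapest QoS-compliant instance

      print(select(candidates)["name"])             # -> telemonitoring-eu1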

  2. [A case of tracheo-bronchial stenosis after extended end-to-end aortic arch anastomosis for interrupted aortic arch treated with suspension of the ascending artery and pulmonary artery].

    PubMed

    Watanabe, T; Hoshino, S; Iwaya, F; Igari, T; Ono, T; Takahashi, K

    2001-02-01

    A 9-day-old boy underwent pulmonary artery banding and extended end-to-end aortic arch anastomosis for ventricular septal defect (VSD) and type A interrupted aortic arch. Severe dyspnea gradually developed. At 3 months of age, intracardiac repair of the VSD was performed, but weaning from the ventilator was difficult. Endoscopic examination and chest CT revealed stenosis of the right and left main bronchi and compression of the tracheal bifurcation and the right and left main bronchi by the ascending aorta and pulmonary artery. Suspension of the ascending aorta and pulmonary artery was performed 15 days after VSD closure. Nine days after this procedure, the patient was weaned from the respirator. The postoperative course was uneventful. Bronchial stenosis may be caused by extended end-to-end aortic arch anastomosis. PMID:11211771

  3. Outage management: A case study

    SciTech Connect

    Haber, S.B.; Barriere, M.T. ); Roberts, K.H. . Walter A. Haas School of Business)

    1992-01-01

    Outage management issues identified from a field study conducted at a two-unit commercial pressurized water reactor (PWR), when one unit was in a refueling outage and the other unit was at full power operation, are the focus of this paper. The study was conducted as part of the US Nuclear Regulatory Commission's (NRC) organizational factors research program, and therefore the issues to be addressed are from an organizational perspective. Topics discussed refer to areas identified by the NRC as critical for safety during shutdown operations, including outage planning and control, personnel stress, and improvements in training and procedures. Specifically, issues in communication, management attention, involvement and oversight, administrative processes, organizational culture, and human resources relevant to each of the areas are highlighted by example from field data collection. Insights regarding future guidance in these areas are presented based upon additional data collection subsequent to the original study.

  4. Outage management: A case study

    SciTech Connect

    Haber, S.B.; Barriere, M.T.; Roberts, K.H.

    1992-09-01

    Outage management issues identified from a field study conducted at a two-unit commercial pressurized water reactor (PWR), when one unit was in a refueling outage and the other unit was at full power operation, are the focus of this paper. The study was conducted as part of the US Nuclear Regulatory Commission's (NRC) organizational factors research program, and therefore the issues to be addressed are from an organizational perspective. Topics discussed refer to areas identified by the NRC as critical for safety during shutdown operations, including outage planning and control, personnel stress, and improvements in training and procedures. Specifically, issues in communication, management attention, involvement and oversight, administrative processes, organizational culture, and human resources relevant to each of the areas are highlighted by example from field data collection. Insights regarding future guidance in these areas are presented based upon additional data collection subsequent to the original study.

  5. Optimizing end-to-end system performance for millimeter and submillimeter spectroscopy of protostars : wideband heterodyne receivers and sideband-deconvolution techniques for rapid molecular-line surveys

    NASA Astrophysics Data System (ADS)

    Sumner, Matthew Casey

    signal and broader tuning range of the Gunn continue to make it the preferred choice. The receiver and high-resolution spectrometer system were brought into a fully operational state late in 2007, when they were used to perform unbiased molecular-line surveys of several galactic sources, including the Orion KL hot core and a position in the L1157 outflow. In order to analyze these data, a new data pipeline was needed to deconvolve the double-sideband signals from the receiver and to model the molecular spectra. A highly automated sideband-deconvolution system has been created, and spectral-analysis tools are currently being developed. The sideband deconvolution relies on chi-square minimization to determine the optimal single-sideband spectrum in the presence of unknown sideband-gain imbalances and spectral baselines. Analytic results are presented for several different methods of approaching the problem, including direct optimization, nonlinear root finding, and a hybrid approach that utilizes a two-stage process to separate out the relatively weak nonlinearities so that the majority of the parameters can be found with a fast linear solver. Analytic derivations of the Jacobian matrices for all three cases are presented, along with a new Mathematica utility that enables the calculation of arbitrary gradients. The direct-optimization method has been incorporated into software, along with a spectral simulation engine that allows different deconvolution scenarios to be tested. The software has been validated through the deconvolution of simulated data sets, and initial results from L1157 and Orion are presented. Both surveys demonstrate the power of the wideband receivers and improved data pipeline to enable exciting scientific studies. The L1157 survey was completed in only 20 hours of telescope time and offers moderate sensitivity over a > 50-GHz range, from 220 GHz to approximately 270 or 280 GHz. The speed with which this survey was completed implies that the new
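
    The core of the sideband deconvolution described above can be illustrated as a linear inverse problem. In the toy sketch below, each double-sideband (DSB) scan sums two single-sideband (SSB) channels, and scans at shifted LO settings give an overdetermined system solved by least squares; the real pipeline additionally fits sideband-gain imbalances and baselines, which is what makes the full problem mildly nonlinear and motivates the hybrid solver. All sizes and line positions are invented:

      import numpy as np

      rng = np.random.default_rng(2)
      n = 60                                    # single-sideband (SSB) channels
      true_ssb = np.zeros(n)
      true_ssb[[12, 30, 47]] = [5.0, 3.0, 7.0]  # synthetic spectral lines

      rows, obs = [], []
      for shift in range(n // 2):               # LO offsets change the pairing
          for i in range(n // 2 - shift):
              r = np.zeros(n)
              r[i] = r[i + n // 2 + shift] = 1.0   # USB and LSB fold together
              rows.append(r)
              obs.append(r @ true_ssb + 0.05 * rng.normal())
      rows.append(np.eye(n)[0])                 # pin one line-free channel to zero
      obs.append(0.0)                           # to remove the toy setup's degeneracy

      A, y = np.array(rows), np.array(obs)
      ssb, *_ = np.linalg.lstsq(A, y, rcond=None)
      print(np.round(ssb[[12, 30, 47]], 2))     # approximately [5.0, 3.0, 7.0]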

  6. FPL's Christmas 1991 transmission outages

    SciTech Connect

    Burnham, J.T.; Busch, D.W.; Renowden, J.D. (Transmission Line Dept.)

    1993-10-01

    A record number of contamination related outages occurred on FPL transmission lines during Christmas of 1991 and resulted in an investigation of inservice insulator performance. The field investigation process used was enhanced by recent improvements in outage data recording. Also used in the analysis were weather information, the results of recently completed accelerated aging tests of polymers, and specially conducted tests on the effects of weathering steel stain on porcelain insulators. Specific insulator problems were identified and actions taken to reduce the possibility of recurrence.

  7. Demonstration of end-to-end cloud-DSL with a PON-based fronthaul supporting 5.76-Gb/s throughput with 48 eCDMA-encoded 1024-QAM discrete multi-tone signals.

    PubMed

    Fang, Liming; Zhou, Lei; Liu, Xiang; Zhang, Xiaofeng; Sui, Meng; Effenberger, Frank; Zhou, Jun

    2015-05-18

    We experimentally demonstrate an end-to-end ultra-broadband cloud-DSL network using passive optical network (PON) based fronthaul with electronic code-division-multiple-access (eCDMA) encoding and decoding. Forty-eight signals that are compliant with the very-high-bit-rate digital subscriber line 2 (VDSL2) standard are transmitted with a record throughput of 5.76 Gb/s over a hybrid link consisting of a 20-km standard single-mode fiber and a 100-m twisted pair. PMID:26074597

  8. CASTOR end-to-end monitoring

    NASA Astrophysics Data System (ADS)

    Rekatsinas, Theodoros; Duellmann, Dirk; Pokorski, Witold; Ponce, Sébastien; Rabaçal, Bartolomeu; Waldron, Dennis; Wojcieszuk, Jacek

    2010-04-01

    With the start of the Large Hadron Collider approaching, storage and management of raw event data, as well as reconstruction and analysis data, is of crucial importance for the researchers. The CERN Advanced STORage system (CASTOR) is a hierarchical system developed at CERN, used to store physics production files and user files. CASTOR, as one of the essential software tools used by the LHC experiments, has to provide reliable services for storing and managing data. Monitoring of this complicated system is mandatory in order to assure its stable operation and improve its future performance. This paper presents the new monitoring system of CASTOR, which provides operation and user request specific metrics. This system is built around a dedicated, optimized database schema. The schema is populated by PL/SQL procedures, which process a stream of incoming raw metadata from different CASTOR components, initially collected by the Distributed Logging Facility (DLF). A web interface has been developed for the visualization of the monitoring data. The different histograms and plots are created using PHP scripts which query the monitoring database.

  9. End-to-end image quality assessment

    NASA Astrophysics Data System (ADS)

    Raventos, Joaquin

    2012-05-01

    An innovative computerized benchmarking approach (US patent pending, September 2011), based on extensive application of photometry, geometrical optics, and digital media, uses a randomized target to let a standard observer assess the image quality of video imaging systems at daytime and low-light luminance levels. It takes into account the target's contrast and color characteristics, as well as the observer's visual acuity and dynamic response. By including human vision as part of the "extended video imaging system" (EVIS), it allows image quality assessment by several standard observers simultaneously.

  10. Pilot End-to-End Calibration Results

    NASA Astrophysics Data System (ADS)

    Misawa, R.; Bernard, J.-Ph.; Ade, P.; Andre, Y.; de Bernardis, P.; Bautista, L.; Boulade, O.; Bousquet, F.; Bouzit, M.; Bray, N.; Brysbaert, C.; Buttice, V.; Caillat, A.; Chaigneau, M.; Charra, M.; Crane, B.; Douchin, F.; Doumayrou, E.; Dubois, J. P.; Engel, C.; Etcheto, P.; Evrard, J.; Gelot, P.; Gomes, A.; Grabarnik, S.; Griffin, M.; Hargrave, P.; Jonathan, A.; Laureijs, R.; Laurens, A.; Lepennec, Y.; Leriche, B.; Longval, Y.; Martignac, J.; Marty, C.; Marty, W.; Maestre, S.; Masi, S.; Mirc, F.; Montel, J.; Motier, L.; Mot, B.; Narbonne, J.; Nicot, J. M.; Otrio, G.; Pajot, F.; Perot, E.; Pisano, G.; Ponthieu, N.; Ristorcelli, I.; Rodriquez, L.; Roudil, G.; Saccoccio, M.; Salatino, M.; Savini, G.; Simonella, O.; Tauber, J.; Tapie, P.; Tucker, C.; Versepuech, G.

    2015-09-01

    The Polarized Instrument for Long-wavelength Observation of the Tenuous interstellar medium (PILOT) is a balloon-borne astronomy experiment designed to study the linear polarization of the Far Infra-Red emission at 240 μm (1.2 THz) and 550 μm (545 GHz), with an angular resolution of a few minutes of arc, from dust grains present in the diffuse interstellar medium, in our Galaxy and nearby galaxies. The polarisation of light is measured using a half-wave plate (HWP). We performed the instrumental tests from 2012 to 2014 and are planning a first scientific flight in September 2015 from Timmins, Ontario, Canada. This paper describes the measurement principles of PILOT, the results of the laboratory tests and its sky coverage. These include defocus tests, transmission measurements using a Fourier Transform Spectrometer at various positions of the HWP, and identification of internal straylight.

  11. A comparative study of red and blue light-emitting diodes and low-level laser in regeneration of the transected sciatic nerve after an end to end neurorrhaphy in rabbits.

    PubMed

    Takhtfooladi, Mohammad Ashrafzadeh; Sharifi, Davood

    2015-12-01

    This study aimed at evaluating the effects of red and blue light-emitting diodes (LED) and low-level laser (LLL) on the regeneration of the transected sciatic nerve after an end-to-end neurorrhaphy in rabbits. Forty healthy mature male New Zealand rabbits were randomly assigned into four experimental groups: control, LLL (680 nm), red LED (650 nm), and blue LED (450 nm). All animals underwent the right sciatic nerve neurotmesis injury under general anesthesia and end-to-end anastomosis. The phototherapy was initiated on the first postoperative day and lasted for 14 consecutive days at the same time of the day. On the 30th day post-surgery, the animals were euthanized and their sciatic nerves harvested for histopathological analysis. The nerves were analyzed and the following findings quantified: Schwann cells, large myelinic axons, and neurons. In the LLL group, as compared to the other groups, a significant increase (P < 0.05) in the number of all analyzed aspects was observed. This finding suggests that postoperative LLL irradiation was able to accelerate and potentiate the peripheral nerve regeneration process in rabbits within 14 days of irradiation. PMID:26415928

  12. Development of Methodologies for Technology Deployment for Advanced Outage Control Centers that Improve Outage Coordination, Problem Resolution and Outage Risk Management

    SciTech Connect

    Shawn St. Germain; Ronald Farris; Heather Medeman

    2013-09-01

    This research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which is a research and development (R&D) program sponsored by the Department of Energy (DOE) and performed in close collaboration with industry R&D programs, and which provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRS program serves to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. The long term viability of existing nuclear power plants in the U.S. will depend upon maintaining high capacity factors, avoiding nuclear safety issues and reducing operating costs. The slow progress in the construction of new nuclear power plants has placed increased importance on maintaining the output of the current fleet of nuclear power plants. Recently expanded natural gas production has placed increased economic pressure on nuclear power plants due to lower cost competition. Until recently, power uprate projects had steadily increased the total output of the U.S. nuclear fleet. Errors made during power plant upgrade projects have now removed three nuclear power plants from the U.S. fleet, and economic considerations have caused the permanent shutdown of a fourth plant. Additionally, several utilities have cancelled power uprate projects citing economic concerns. For the past several years net electrical generation from U.S. nuclear power plants has been declining. One of the few remaining areas where significant improvements in plant capacity factors can be made is in minimizing the duration of refueling outages. Managing nuclear power plant outages is a complex and difficult task. Due to the large number of complex tasks and the uncertainty that accompanies them, outage durations routinely exceed the planned duration. The ability to complete an outage on or near

  13. Efficacy and safety of a NiTi CAR 27 compression ring for end-to-end anastomosis compared with conventional staplers: A real-world analysis in Chinese colorectal cancer patients

    PubMed Central

    Lu, Zhenhai; Peng, Jianhong; Li, Cong; Wang, Fulong; Jiang, Wu; Fan, Wenhua; Lin, Junzhong; Wu, Xiaojun; Wan, Desen; Pan, Zhizhong

    2016-01-01

    OBJECTIVES: This study aimed to evaluate the safety and efficacy of a new nickel-titanium shape memory alloy compression anastomosis ring, NiTi CAR 27, in constructing an anastomosis for colorectal cancer resection compared with conventional staplers. METHODS: In total, 234 consecutive patients diagnosed with colorectal cancer receiving sigmoidectomy and anterior resection for end-to-end anastomosis from May 2010 to June 2012 were retrospectively analyzed. The postoperative clinical parameters, postoperative complications and 3-year overall survival in 77 patients using a NiTi CAR 27 compression ring (CAR group) and 157 patients with conventional circular staplers (STA group) were compared. RESULTS: There were no statistically significant differences between the patients in the two groups in terms of general demographics and tumor features. A clinically apparent anastomotic leak occurred in 2 patients (2.6%) in the CAR group and in 5 patients (3.2%) in the STA group (p=0.804). These eight patients received a temporary diverting ileostomy. One patient (1.3%) in the CAR group was diagnosed with anastomotic stricture on electronic colonoscopy 3 months postoperatively. The incidence of postoperative intestinal obstruction was comparable between the two groups (p=0.192). With a median follow-up duration of 39.6 months, the 3-year overall survival rate was 83.1% in the CAR group and 89.0% in the STA group (p=0.152). CONCLUSIONS: NiTi CAR 27 is safe and effective for colorectal end-to-end anastomosis. Its use is equivalent to that of the conventional circular staplers. This study suggests that NiTi CAR 27 may be a beneficial alternative for colorectal anastomosis in Chinese colorectal cancer patients. PMID:27276395

  14. Experimental demonstration of a record high 11.25Gb/s real-time optical OFDM transceiver supporting 25km SMF end-to-end transmission in simple IMDD systems.

    PubMed

    Giddings, R P; Jin, X Q; Hugues-Salas, E; Giacoumidis, E; Wei, J L; Tang, J M

    2010-03-15

    The fastest ever 11.25Gb/s real-time FPGA-based optical orthogonal frequency division multiplexing (OOFDM) transceivers utilizing 64-QAM encoding/decoding and significantly improved variable power loading are experimentally demonstrated, for the first time, incorporating advanced functionalities of on-line performance monitoring, live system parameter optimization and channel estimation. Real-time end-to-end transmission of an 11.25Gb/s 64-QAM-encoded OOFDM signal with a high electrical spectral efficiency of 5.625bit/s/Hz over 25km of standard and MetroCor single-mode fibres is successfully achieved with respective power penalties of 0.3dB and -0.2dB at a BER of 1.0 x 10^-3 in a directly modulated DFB laser-based intensity modulation and direct detection system without in-line optical amplification and chromatic dispersion compensation. The impacts of variable power loading as well as electrical and optical components on the transmission performance of the demonstrated transceivers are experimentally explored in detail. In addition, numerical simulations also show that variable power loading is an extremely effective means of escalating system performance to its maximum potential. PMID:20389570
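
    As a toy illustration of what variable power loading does (illustrative only; the demonstrated transceivers adapt power on-line using live performance monitoring rather than this closed form), the snippet below distributes a fixed power budget so that every subcarrier of a frequency-selective channel reaches the same received SNR:

      import numpy as np

      # Distribute a fixed power budget so every subcarrier reaches the same
      # received SNR over a frequency-selective channel (toy rolloff model).
      n_sc = 16                                        # data-carrying subcarriers
      h = np.sinc(np.linspace(0.0, 0.8, n_sc)) + 0.2   # toy channel response |H|
      noise = 1e-3 * np.ones(n_sc)                     # per-subcarrier noise power
      budget = 1.0                                     # total transmit power

      w = noise / h ** 2                               # equal-SNR loading weights
      p = budget * w / w.sum()                         # per-subcarrier power
      snr_db = 10 * np.log10(p * h ** 2 / noise)
      print(np.round(snr_db, 2))                       # identical SNR on every tone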

  15. GUIDELINES FOR IMPLEMENTATION OF AN ADVANCED OUTAGE CONTROL CENTER TO IMPROVE OUTAGE COORDINATION, PROBLEM RESOLUTION, AND OUTAGE RISK MANAGEMENT

    SciTech Connect

    Germain, Shawn St; Farris, Ronald; Whaley, April M; Medema, Heather; Gertman, David

    2014-09-01

    This research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which is a research and development (R&D) program sponsored by the Department of Energy (DOE) and performed in close collaboration with industry R&D programs that provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRS program serves to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. Managing NPP outages is a complex and difficult task due to the large number of maintenance and repair activities that are accomplished in a short period of time. During an outage, the outage control center (OCC) is the temporary command center for outage managers and provides several critical functions for the successful execution of the outage schedule. Essentially, the OCC functions to facilitate information inflow, assist outage management in processing information, and facilitate the dissemination of information to stakeholders. Currently, outage management activities primarily rely on telephone communication, face-to-face reports of status, and periodic briefings in the OCC. It is difficult to keep the information related to outage progress and discovered conditions current. Several advanced communication and collaboration technologies have shown promise for facilitating the information flow into, across, and out of the OCC. The use of these technologies will allow information to be shared electronically, providing greater amounts of real-time information to the decision makers and allowing OCC coordinators to meet with supporting staff remotely. Passively monitoring status electronically through advances in the areas of mobile worker technologies, computer-based procedures, and automated work packages will reduce the current reliance on manually

  16. Experimental demonstrations of record high REAM intensity modulator-enabled 19.25Gb/s real-time end-to-end dual-band optical OFDM colorless transmissions over 25km SSMF IMDD systems.

    PubMed

    Zhang, Q W; Hugues-Salas, E; Giddings, R P; Wang, M; Tang, J M

    2013-04-01

    Record-high 19.25Gb/s real-time end-to-end dual-band optical OFDM (OOFDM) colorless transmissions across the entire C-band are experimentally demonstrated, for the first time, in reflective electro-absorption modulator (REAM)-based 25km standard SMF systems using intensity modulation and direct detection. Adaptively modulated baseband (0-2GHz) and passband (6.125 ± 2GHz) OFDM RF sub-bands, supporting signal line rates of 9.75Gb/s and 9.5Gb/s respectively, are independently generated and detected with FPGA-based DSP clocked at only 100MHz as well as DACs/ADCs operating at sampling speeds as low as 4GS/s. The two OFDM sub-bands are electrically multiplexed for intensity modulation of a single optical carrier by an 8GHz REAM. The REAM colorlessness is experimentally characterized, based on which optimum REAM operating conditions are identified. To maximize and balance the signal transmission performance of each sub-band, on-line adaptive transceiver optimization functions and live performance monitoring are fully exploited to optimize key OOFDM transceiver and system parameters. For different wavelengths within the C-band, corresponding minimum received optical powers at the FEC limit vary in a range of <0.5dB and bit error rate performances for both baseband and passband signals are almost identical. Furthermore, detailed investigations are also undertaken of the maximum aggregated signal line rate sensitivity to electrical sub-band power variation. It is shown that the aforementioned system has approximately 3dB tolerance to RF sub-band power variation. PMID:23572005

  17. Managing turbine-generator outages by computer

    SciTech Connect

    Reinhart, E.R.

    1997-09-01

    This article describes software being developed to address the need for computerized planning and documentation programs that can help manage outages. Downsized power-utility companies and the growing demand for independent, competitive engineering and maintenance services have created a need for a computer-assisted planning and technical-direction program for turbine-generator outages. To meet this need, a software tool is now under development that can run on a desktop or laptop personal computer to assist utility personnel and technical directors in outage planning. Total Outage Planning Software (TOPS), which runs on Windows, takes advantage of the mass data storage available with compact-disc technology by archiving the complete outage documentation on CD. Previous outage records can then be indexed, searched, and viewed on a computer with the click of a mouse. Critical-path schedules, parts lists, parts order tracking, work instructions and procedures, custom data sheets, and progress reports can be generated by computer on-site during an outage.

  18. Study and Implementation of the End-to-End Data Pipeline for the VIRTIS Imaging Spectrometer Onboard Venus Express: "From Science Operations Planning to Data Archiving and Higher Level Processing"

    NASA Astrophysics Data System (ADS)

    Cardesín Moinelo, Alejandro

    2010-04-01

    This PhD Thesis describes the activities performed during the research program undertaken for two years at the Istituto Nazionale di AstroFisica in Rome, Italy, as an active member of the VIRTIS Technical and Scientific Team, and one additional year at the European Space Astronomy Center in Madrid, Spain, as a member of the Mars Express Science Ground Segment. This document will show a study of all sections of the Science Ground Segment of the Venus Express mission, from the planning of the scientific operations, to the generation, calibration and archiving of the science data, including the production of valuable high level products. We will present and discuss here the end-to-end diagram of the ground segment from the technical and scientific point of view, in order to describe the overall flow of information: from the original scientific requests of the principal investigator and interdisciplinary teams, up to the spacecraft, and down again for the analysis of the measurements and interpretation of the scientific results. These scientific results lead to new and more elaborate scientific requests, which are used as feedback to the planning cycle, closing the circle. Special attention is given here to describe the implementation and development of the data pipeline for the VIRTIS instrument onboard Venus Express. During the research program, both the raw data generation pipeline and the data calibration pipeline were developed and automated in order to produce the final raw and calibrated data products from the input telemetry of the instrument. The final raw and calibrated products presented in this work are currently being used by the VIRTIS Science team for data analysis and are distributed to the whole scientific community via the Planetary Science Archive. More than 20,000 raw data files and 10,000 calibrated products have already been generated after almost 4 years of mission. In the final part of the Thesis, we will also present some high level data

  19. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    SciTech Connect

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM), from the buried nuclear device to the seismic sensor. The goal of this work is to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM, including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of a UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNEs in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak shear

  20. The Dosimetric Importance of Six Degree of Freedom Couch End to End Quality Assurance for SRS/SBRT Treatments when Comparing Intensity Modulated Radiation Therapy to Volumetric Modulated Arc Therapy

    NASA Astrophysics Data System (ADS)

    Ulizio, Vincent Michael

    Advances in technology increasingly allow lesions to be treated with higher radiation doses per fraction, which also permits hypofractionated treatments. Because the patient receives a higher dose of radiation per fraction, and because of the fast dose falloff around these targets, the delivery must be extremely accurate. The 6 DOF couch allows for extra rotational corrections and for a more accurate set-up. The movement of the couch needs to be verified to be accurate, and because of this, end to end quality assurance tests for the couch have been made. Once the set-up is known to be accurate, different treatment techniques can be studied. SBRT of the Spine has a very fast dose falloff near the spinal cord and was typically treated with IMRT. Treatment plans generated using this technique tend to have streaks of low dose radiation, so VMAT is being studied to determine whether this treatment technique can reduce the low dose radiation volume as well as improve OAR sparing. For the 6 DOF couch QA, graph paper is placed on the anterior and right lateral sides of the VisionRT OSMS Cube Phantom. Each rotational shift is then applied individually, with a 3 degree shift in the positive and negative directions for pitch and roll. A mark is drawn on the paper to record each shift. A CBCT is then taken of the Cube, known shifts are applied, and an additional CBCT is taken to return the Cube to isocenter. The original IMRT plans for SBRT of the Spine are evaluated and then a plan is made utilizing VMAT. These plans are then compared for low dose radiation, OAR sparing, and conformity. If the original IMRT plan is found to be inferior to what is acceptable, it will be re-planned and compared to the VMAT plan. The 6 DOF couch QA tests have proven to be accurate and reproducible. The average deviations in the 3 degree and -3 degree pitch and roll directions were 0.197, 0.068, 0.091, and 0.110 degrees

  1. Contingency Analysis of Cascading Line Outage Events

    SciTech Connect

    Thomas L Baldwin; Magdy S Tawfik; Miles McQueen

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.
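
    The sketch below shows the shape of such a sequential cascading-outage loop on a toy four-line system (all data hypothetical). A uniform flow-redistribution rule stands in for re-solving the power flow after each topology change; the loop collects the successive tiers of line trips that the paper's approach identifies and visualizes:

      import numpy as np

      def cascade(initial_outage, flows0, limits):
          """Toy cascade: a tripped line's flow is spread uniformly over the
          surviving lines, a deliberate stand-in for re-solving a DC power
          flow after each topology change."""
          f0 = np.abs(np.asarray(flows0, dtype=float))
          alive = np.ones(len(f0), dtype=bool)
          alive[list(initial_outage)] = False
          shed = f0[list(initial_outage)].sum()   # flow needing new paths
          tiers = []
          while alive.any():
              f_eff = f0 + shed / alive.sum()     # uniform redistribution
              over = np.where(alive & (f_eff > limits))[0]
              if over.size == 0:
                  break                           # cascade has stopped
              tiers.append(list(over))            # next cascading tier
              shed += f0[over].sum()
              alive[over] = False
          return tiers

      flows = [0.40, 0.50, 0.30, 0.45]             # pre-contingency loadings
      limits = np.array([0.50, 0.55, 0.45, 0.60])  # thermal limits

      # Rank N-1 initiating events by how many lines are eventually lost.
      for k in range(4):
          print(f"line {k} out -> cascading tiers: {cascade((k,), flows, limits)}")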

  2. Power Line Damage, Electrical Outages Reduced in the ''Sleet Belt'': NICE3 Steel Project Fact Sheet

    SciTech Connect

    2000-04-25

    The AR Windamper System was developed through a grant from the Inventions and Innovation Program to protect power transmission lines in sleet belt states and provinces by eliminating the "galloping" phenomenon. Wind damping products minimize power outages and reduce repair costs to transmission lines.

  3. Outage management and health physics issue, 2007

    SciTech Connect

    Agnihotri, Newal

    2007-05-15

    The focus of the May-June issue is on outage management and health physics. Major articles/reports in this issue include: India: a potential commercial opportunity, a U.S. Department of Commerce report, by Joe Neuhoff and Justin Rathke; The changing climate for nuclear energy, by Skip Bowman, Nuclear Energy Institute; Selecting protective clothing, by J. Mark Price, Southern California Edison; and Successful refurbishment outage, by Sudesh K. Gambhir, Omaha Public Power District. Industry innovation articles in this issue are: Containment radiation monitoring spiking, by Michael W. Lantz and Robert Routolo, Arizona Public Service Company; Improved outage performance, by Michael Powell and Troy Wilfong, Arizona Public Service Company, Palo Verde Nuclear Generating Station; Stop repacking valves and achieve leak-free performance, by Kenneth Hart, PPL Susquehanna LLC; and Head assembly upgrade package, by Timothy Petit, Dominion Nuclear.

  4. Benchmark Report on Key Outage Attributes: An Analysis of Outage Improvement Opportunities and Priorities

    SciTech Connect

    Germain, Shawn St.; Farris, Ronald

    2014-09-01

    The Advanced Outage Control Center (AOCC) is a multi-year pilot project targeted at Nuclear Power Plant (NPP) outage improvement. The purpose of this pilot project is to improve management of NPP outages through the development of an AOCC that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report documents the results of a benchmarking effort to evaluate the transferability of technologies demonstrated at Idaho National Laboratory and the primary pilot project partner, Palo Verde Nuclear Generating Station. The initial assumption for this pilot project was that NPPs generally do not take advantage of advanced technology to support outage management activities. Several researchers involved in this pilot project have commercial NPP experience and believed that very little technology has been applied toward outage communication and collaboration. To verify that the technology options researched and demonstrated through this pilot project would in fact have broad application for the US commercial nuclear fleet, and to look for additional outage management best practices, LWRS program researchers visited several additional nuclear facilities.

  5. Nuclear cost control focuses on refueling outages

    SciTech Connect

    Strauss, S.D.

    1995-12-01

    Extending operating cycles and shortening refueling outages are the mainstays of utility efforts to improve the economics of nuclear generation. Here are key management approaches that have contributed to recent successes. Improving operating efficiency remains the byword of nuclear power producers as they intensify their drive to reduce operation and maintenance (O and M) costs and survive, even thrive, in a competitive environment. Because replacement-power and other costs can incur penalties of $0.5 million or more for each day that a nuclear unit is inoperative (and almost $3 million/day for one utility), refueling outages are an obvious focal point for such efforts. By the same token, the impact on the bottom line is greater and more dramatic here than for other cost-saving activities.

  6. Outage management and health physics issue, 2006

    SciTech Connect

    Agnihotri, Newal

    2006-05-15

    The focus of the May-June issue is on outage management and health physics. Major articles/reports in this issue include: A design with experience for the U.S., by Michael J. Wallace, Constellation Generation Group; Hope to be among the first, by Randy Hutchinson, Entergy Nuclear; Plans to file COLs in 2008, by Garry Miller, Progress Energy; Evolution of ICRP's recommendations, by Lars-Erik Holm, ICRP; European network on education and training in radiological protection, by Michele Coeck, SCK-CEN, Belgium; Outage management: an important tool for improving nuclear power plant performance, by Thomas Mazour and Jiri Mandula, IAEA, Austria; and Plant profile: Exploring new paths to excellence, by Anne Thomas, Exelon Nuclear.

  7. Advanced Outage and Control Center: Strategies for Nuclear Plant Outage Work Status Capabilities

    SciTech Connect

    Gregory Weatherby

    2012-05-01

    The research effort is a part of the Light Water Reactor Sustainability (LWRS) Program. LWRS is a research and development program sponsored by the Department of Energy, performed in close collaboration with industry to provide the technical foundations for licensing and managing the long-term, safe and economical operation of current nuclear power plants. The LWRS Program serves to help the US nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. The Outage Control Center (OCC) Pilot Project was directed at carrying out the applied research for the development and piloting of technology designed to enhance safe outage and maintenance operations, improve human performance and reliability, increase overall operational efficiency, and improve plant status control. Plant outage management is a high priority concern for the nuclear industry from cost and safety perspectives. Unfortunately, many of the underlying technologies supporting outage control are the same as those used in the 1980s. They depend heavily upon large teams of staff, multiple work and coordination locations, and manual administrative actions that require large amounts of paper. Previous work in human reliability analysis suggests that many repetitive tasks, including paperwork tasks, may have a failure rate of 1.0E-3 or higher (Gertman, 1996). With between 10,000 and 45,000 subtasks being performed during an outage (Gomes, 1996), the opportunity for human error of some consequence is a realistic concern. Although a number of factors exist that can make these errors recoverable, reducing and effectively coordinating the sheer number of tasks to be performed, particularly those that are error prone, has the potential to enhance outage efficiency and safety. Additionally, outage management requires precise coordination of work groups that do not always share similar objectives. Outage

  8. How individual traces and interactive timelines could support outage execution - Toward an outage historian concept

    SciTech Connect

    Parfouru, S.; De-Beler, N.

    2012-07-01

    In the context of a project that is designing innovative ICT-based solutions for the organizational concept of outage management, we focus on the informational processes of the Outage Control Room (OCR) underlying the execution of outages. Informational processes are based on structured and unstructured documents that have a key role in the collaborative processes and management of the outage. We especially track the structured and unstructured documents, electronic or not, from creation to sharing. Our analysis allows us to consider that the individual traces produced by a participant with a specific role could be multi-purpose and support sharing between participants without creating duplication of work. The ultimate goal is to be able to generate an outage historian, one not just focused on highly structured information, which could be useful to improve the continuity of information between participants. We study the implementation of this approach through web technologies and social media tools to address this issue. We also investigate the issue of data access through interactive visualization timelines coupled with other modalities to assist users in the navigation and exploration of the proposed historian. (authors)

  9. Power outages, power externalities, and baby booms.

    PubMed

    Burlando, Alfredo

    2014-08-01

    Determining whether power outages have significant fertility effects is an important policy question in developing countries, where blackouts are common and modern forms of family planning are scarce. Using birth records from Zanzibar, this study shows that a month-long blackout in 2008 caused a significant increase in the number of births 8 to 10 months later. The increase was similar across villages that had electricity, regardless of the level of electrification; villages with no electricity connections saw no changes in birth numbers. The large fertility increase in communities with very low levels of electricity suggests that the outage affected the fertility of households not connected to the grid through some spillover effect. Whether the baby boom is likely to translate to a permanent increase in the population remains unclear, but this article highlights an important hidden consequence of power instability in developing countries. It also suggests that electricity imposes significant externality effects on rural populations that have little exposure to it. PMID:25007970

  10. End-to-End Performance Management for Large Distributed Storage

    SciTech Connect

    Almadena Chtchelkanova

    2012-03-18

    Storage systems for large distributed clusters of computer servers are themselves large and distributed. Their complexity and scale make it hard to ensure that applications using them get good, predictable performance. At the same time, shared access to the system from multiple applications, users, and internal system activities leads to a need for predictable performance. This research investigates mechanisms for improving storage system performance in large distributed storage systems by integrating the performance aspects of the path that I/O operations take through the system, from the application interface on the compute server, through the network, to the storage servers. The research focuses on five parts of the I/O path in a distributed storage system: I/O scheduling at the storage server, storage server cache management, client-to-server network flow control, client-to-server connection management, and client cache management.
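
    One of the listed mechanisms, I/O scheduling at the storage server, can be illustrated with a minimal start-time fair queueing sketch (illustrative only, not the project's actual scheduler): requests are dispatched in order of virtual start tags, so each client receives service in proportion to its weight.

      import heapq

      class FairIoScheduler:
          """Start-time fair queueing (SFQ-like) sketch for a storage server."""
          def __init__(self):
              self.queue = []        # (virtual start tag, seq, client, request)
              self.vtime = 0.0       # global virtual time
              self.finish = {}       # last virtual finish tag per client
              self.seq = 0           # tie-breaker for equal start tags

          def submit(self, client, cost, weight, req):
              start = max(self.vtime, self.finish.get(client, 0.0))
              self.finish[client] = start + cost / weight
              heapq.heappush(self.queue, (start, self.seq, client, req))
              self.seq += 1

          def dispatch(self):
              start, _, client, req = heapq.heappop(self.queue)
              self.vtime = max(self.vtime, start)
              return client, req

      sched = FairIoScheduler()
      for i in range(3):
          sched.submit("batch", cost=4, weight=1, req=f"b{i}")        # heavy client
          sched.submit("interactive", cost=4, weight=4, req=f"i{i}")  # favored client
      while sched.queue:
          print(sched.dispatch())   # interactive requests drain ~4x faster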

  11. End-to-end experiment management in HPC

    SciTech Connect

    Bent, John M; Kroiss, Ryan R; Torrez, Alfred; Wingate, Meghan

    2010-01-01

    Experiment management in any domain is challenging. There is a perpetual feedback loop cycling through planning, execution, measurement, and analysis. The lifetime of a particular experiment can be limited to a single cycle, although many require myriad more cycles before definite results can be obtained. Within each cycle, a large number of subexperiments may be executed in order to measure the effects of one or more independent variables. Experiment management in high performance computing (HPC) follows this general pattern but also has three unique characteristics. One, computational science applications running on large supercomputers must deal with frequent platform failures which can interrupt, perturb, or terminate running experiments. Two, these applications typically run in parallel using MPI as their communication medium. Three, there is typically a scheduling system (e.g. Condor, Moab, SGE, etc.) acting as a gate-keeper for the HPC resources. In this paper, we introduce LANL Experiment Management (LEM), an experiment management framework simplifying all four phases of experiment management. LEM simplifies experiment planning by allowing the user to describe their experimental goals without having to fully construct the individual parameters for each task. To simplify execution, LEM dispatches the subexperiments itself, thereby freeing the user from remembering the often arcane methods for interacting with the various scheduling systems. LEM provides transducers for experiments that automatically measure and record important information about each subexperiment; these transducers can easily be extended to collect additional measurements specific to each experiment. Finally, experiment analysis is simplified by providing a general database visualization framework that allows users to quickly and easily interact with their measured data.
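
    The planning simplification described above follows a familiar pattern: name the factors and levels once and let the framework materialize every subexperiment. The sketch below shows that general pattern (hypothetical factor names; LEM's actual interface is not described in the abstract):

      import itertools

      # General pattern behind experiment-plan expansion (hypothetical factor
      # names; this is not LEM's actual interface): name factors and levels
      # once, and the framework materializes every subexperiment.
      factors = {
          "nodes":      [16, 64, 256],
          "stripe_kb":  [64, 1024],
          "io_pattern": ["sequential", "random"],
      }

      def expand(factors):
          """Yield one parameter dict per point in the factor cross-product."""
          names = list(factors)
          for combo in itertools.product(*(factors[n] for n in names)):
              yield dict(zip(names, combo))

      for i, params in enumerate(expand(factors)):
          # A real framework would submit each task through the site scheduler
          # (Condor, Moab, SGE, ...) and attach measurement transducers; here
          # we only print the command line that would be dispatched.
          args = " ".join(f"--{k}={v}" for k, v in params.items())
          print(f"subexperiment {i:02d}: run_benchmark {args}")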

  12. Nursing studies laid end to end form a circle.

    PubMed

    Friss, L

    1994-01-01

    As early as 1915, leaders in the nursing profession were concerned with the "image problem of nurses," which they saw as needing improvement. Since then, countless studies, reports, and commissions have attempted to explain and solve perceived shortages of registered nurses, which have occurred regularly after brief periods of quiescence or oversupply. Usually, their recommendations have hinged on nurses changing their image. In fact, few of these studies have dealt with the real issues of nursing work, which are a narrow pay range, little extra pay for working on undesirable shifts, disincentives for full-time work, pay unrelated to education, and education unconnected to job level. The multiple studies and commissions do nothing more than recycle data and in the process obscure fundamental problems. Educational funding has been no more successful. Their ineffectiveness suggests the need for less "image enhancement" and more support from physicians and employers to bring about systemic reform. This includes licensing nurses according to their education, assigning them according to their competencies and education, and paying accordingly. These measures, and only these, will eventually curtail the cycles of nursing "shortages." PMID:7844324

  13. Kepler Mission: End-to-End System Demonstration

    NASA Technical Reports Server (NTRS)

    Borucki, William; Koch, D.; Dunham, E.; Jenkins, J.; Witteborn, F.; Updike, T.; DeVincenzi, Donald L. (Technical Monitor)

    2000-01-01

    A test facility has been constructed to demonstrate the capability of differential ensemble photometry to detect transits of Earth-size planets orbiting solar-like stars. The main objective is to determine the effects of various noise sources on the capability of a CCD photometer to maintain a system relative precision of 1 x 10^-5 for m_v = 12 stars in the presence of system-induced noise sources. The facility includes a simulated star field, fast optics to simulate the telescope, a thinned back-illuminated CCD similar to those to be used on the spacecraft and computers to perform the onboard control, data processing and extraction. The test structure is thermally and mechanically isolated so that each source of noise can be introduced in a controlled fashion and evaluated for its contribution to the total noise budget. The effects of pointing errors or a changing thermal environment are imposed by piezo-electric devices. Transits are injected by heating small wires crossing apertures in the star plate. Signals as small as those from terrestrial-size transits of solar-like stars are introduced to demonstrate that such planets can be detected under realistic noise conditions. Examples of imposing several noise sources and the resulting detectabilities are presented. These show that a differential ensemble photometric approach CCD photometer can readily detect signals associated with Earth-size transits.
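
    The 1 x 10^-5 requirement can be motivated with one line of arithmetic: the fractional flux dip of a central Earth-size transit of a Sun-like star is

      \frac{\Delta F}{F} = \left(\frac{R_\oplus}{R_\odot}\right)^{2}
                         = \left(\frac{6371\,\mathrm{km}}{6.96\times 10^{5}\,\mathrm{km}}\right)^{2}
                         \approx 8.4\times 10^{-5},

    so a relative precision of 1 x 10^-5 leaves roughly a factor-of-eight margin between an Earth-size transit signal and the system noise floor.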

  14. On Estimating End-to-End Network Path Properties

    NASA Technical Reports Server (NTRS)

    Allman, Mark; Paxson, Vern

    1999-01-01

    The more information about current network conditions available to a transport protocol, the more efficiently it can use the network to transfer its data. In networks such as the Internet, the transport protocol must often form its own estimates of network properties based on measurements performed by the connection endpoints. We consider two basic transport estimation problems: determining the setting of the retransmission timer (RTO) for a reliable protocol, and estimating the bandwidth available to a connection as it begins. We look at both of these problems in the context of TCP, using a large TCP measurement set [Pax97b] for trace-driven simulations. For RTO estimation, we evaluate a number of different algorithms, finding that the performance of the estimators is dominated by their minimum values, and to a lesser extent, the timer granularity, while being virtually unaffected by how often round-trip time measurements are made or the settings of the parameters in the exponentially-weighted moving average estimators commonly used. For bandwidth estimation, we explore techniques previously sketched in the literature [Hoe96, AD98] and find that in practice they perform less well than anticipated. We then develop a receiver-side algorithm that performs significantly better.
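
    For reference, the EWMA estimator family evaluated here is, in its standard TCP form (Jacobson's algorithm as later codified in RFC 6298), only a few lines; the granularity clamp and the minimum RTO below are precisely the quantities the study finds dominate estimator performance. A minimal sketch:

      # Standard TCP retransmission-timeout estimator (Jacobson-style EWMA,
      # as codified in RFC 6298). alpha/beta are the EWMA weights, g the
      # clock granularity, and min_rto the floor on the timer value.
      class RtoEstimator:
          def __init__(self, granularity=0.1, min_rto=1.0, alpha=1/8, beta=1/4):
              self.g, self.min_rto = granularity, min_rto
              self.alpha, self.beta = alpha, beta
              self.srtt = None

          def update(self, rtt_sample):
              if self.srtt is None:                 # first measurement
                  self.srtt = rtt_sample
                  self.rttvar = rtt_sample / 2
              else:
                  self.rttvar = ((1 - self.beta) * self.rttvar
                                 + self.beta * abs(self.srtt - rtt_sample))
                  self.srtt = (1 - self.alpha) * self.srtt + self.alpha * rtt_sample
              return self.rto()

          def rto(self):
              return max(self.min_rto, self.srtt + max(self.g, 4 * self.rttvar))

      est = RtoEstimator()
      for rtt in [0.20, 0.22, 0.35, 0.21, 0.19]:
          print(f"RTT {rtt:.2f}s -> RTO {est.update(rtt):.3f}s")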

  15. Proposal of an end-to-end emergency medical system.

    PubMed

    El-Masri, Samir; Saddik, Basema

    2011-01-01

    A new comprehensive emergency system has been proposed to facilitate and computerize all the processes involved in an emergency, from the initial contact to the ambulance emergency system, to finding the right and nearest available ambulance, and through to accessing a Smart Online Electronic Health Record (SOEHR). The proposed system will critically assist in pre-hospital treatments, identify the availability of the nearest specialized hospital and communicate with the Hospital Emergency Department System (HEDS) to provide early information about the incoming patient for preparation to receive and assist. PMID:21893771

  16. Going End to End to Deliver High-Speed Data

    NASA Technical Reports Server (NTRS)

    2005-01-01

    By the end of the 1990s, the optical fiber "backbone" of the telecommunication and data-communication networks had evolved from megabits-per-second transmission rates to gigabits-per-second transmission rates. Despite this boom in bandwidth, however, users at the end nodes were still not being reached on a consistent basis. (An end node is any device that does not behave like a router or a managed hub or switch. Examples of end node objects are computers, printers, serial interface processor phones, and unmanaged hubs and switches.) The primary reason that prevents bandwidth from reaching the end nodes is the complex local network topology that exists between the optical backbone and the end nodes. This complex network topology consists of several layers of routing and switch equipment which introduce potential congestion points and network latency. By breaking down the complex network topology, a true optical connection can be achieved. Access Optical Networks, Inc., is making this connection a reality with guidance from NASA s nondestructive evaluation experts.

  17. Scalable end-to-end ATM encryption test results

    SciTech Connect

    Pierson, L.G.

    1995-10-01

    Customers of Asynchronous Transfer Mode (ATM) services may need a variety of data authenticity and privacy assurances. Cryptographic methods can be used to assure authenticity and privacy, but are hard to scale for implementation at high speed. The incorporation of these methods into computer networks can severely impact functionality, reliability, and performance. To study these trade-offs, a prototype encryptor/decryptor was developed. This effort demonstrated the viability of implementing certain encryption techniques in high speed networks. The research prototype processes ATM cells in a SONET OC-3 payload. This paper describes the functionality, reliability, security, and performance design trade-offs investigated with the prototype.

  18. End-to-end modelling of He II flow systems

    NASA Technical Reports Server (NTRS)

    Mord, A. J.; Snyder, H. A.; Newell, D. A.

    1992-01-01

    A practical computer code has been developed which uses the accepted two-fluid model to simulate He II flow in complicated systems. The full set of equations are used, retaining the coupling between the pressure, temperature and velocity fields. This permits modeling He II flow over the full range of conditions, from strongly or weakly driven flow through large pipes, narrow channels and porous media. The system may include most of the components used in modern superfluid flow systems: non-ideal thermomechanical pumps, tapered sections, constrictions, lines with heated side walls and heat exchangers. The model is validated by comparison with published experimental data. It is applied to a complex system to show some of the non-intuitive feedback effects that can occur. This code is ready to be used as a design tool for practical applications of He II. It can also be used for the design of He II experiments and as a tool for comparison of experimental data with the standard two-fluid model.
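
    For orientation, the coupling that the code retains can be summarized by the basic two-fluid relations in their standard textbook form (the code's full equation set additionally couples the pressure and temperature fields):

      \rho = \rho_s + \rho_n, \qquad
      \rho\,\mathbf{v} = \rho_s\,\mathbf{v}_s + \rho_n\,\mathbf{v}_n, \qquad
      \mathbf{q} = \rho\, s\, T\, \mathbf{v}_n,

    with the specific entropy s carried entirely by the normal component; in pure counterflow (zero net mass flux) the components obey \rho_s\,\mathbf{v}_s = -\rho_n\,\mathbf{v}_n.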

  19. Outage capacity and outage rate performance of MIMO free-space optical system over strong turbulence channel

    NASA Astrophysics Data System (ADS)

    Hasan, Omar M.; Taha, Mohamed; Abu Sharkh, Osama

    2016-06-01

    In this paper, we investigate outage capacity, outage probability, and outage rate performance of a multiple-input multiple-output (MIMO) free-space optical system operating over strong turbulence channels. The MIMO optical system employs intensity modulation direct detection with on-off signaling, and an equal gain combining technique at the receiver. We derived novel closed-form expressions for three system metrics, namely, outage capacity, outage probability, and outage rate. The expressions derived here are based on the generalized Gamma-Gamma channel model, rooted in scintillation theory, in which the irradiance of the received optical wave is modeled as the product of small-scale and large-scale turbulence eddies. The results are evaluated for different values of received signal-to-noise ratios, strong turbulence conditions, and several values of transmit/receive diversity.
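
    For a single (SISO) link, the corresponding outage probability can also be obtained by direct numerical integration of the Gamma-Gamma irradiance pdf, a useful cross-check on closed-form results. The sketch below assumes unit mean irradiance and electrical SNR proportional to I^2 (IM/DD); the turbulence parameter values are illustrative:

      import numpy as np
      from scipy.integrate import quad
      from scipy.special import gamma, kv

      def gg_pdf(I, a, b):
          """Gamma-Gamma irradiance pdf with unit mean (a, b: effective
          numbers of large- and small-scale turbulence eddies)."""
          return (2.0 * (a * b) ** ((a + b) / 2) / (gamma(a) * gamma(b))
                  * I ** ((a + b) / 2 - 1) * kv(a - b, 2.0 * np.sqrt(a * b * I)))

      def outage_prob(snr_avg_db, snr_req_db, a, b):
          """P_out = Pr[SNR < SNR_req] with SNR ~ I^2 (IM/DD), i.e. the pdf
          integrated up to I_th = sqrt(SNR_req / SNR_avg)."""
          I_th = np.sqrt(10.0 ** ((snr_req_db - snr_avg_db) / 10.0))
          p, _ = quad(gg_pdf, 0.0, I_th, args=(a, b))
          return p

      # Strong-turbulence example: alpha = 4.2, beta = 1.4 (illustrative)
      for snr in (20, 30, 40):
          print(f"avg SNR {snr} dB -> P_out = {outage_prob(snr, 10, 4.2, 1.4):.3e}")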

  20. Development of Improved Graphical Displays for an Advanced Outage Control Center, Employing Human Factors Principles for Outage Schedule Management

    SciTech Connect

    St Germain, Shawn Walter; Farris, Ronald Keith; Thomas, Kenneth David

    2015-09-01

    The long-term viability of existing nuclear power plants in the United States (U.S.) is dependent upon a number of factors, including maintaining high capacity factors, maintaining nuclear safety, and reducing operating costs, particularly those associated with refueling outages. Refueling outages typically take 20-30 days, and for existing light water NPPs in the U.S., the reactor cannot be in operation during the outage. Furthermore, given that many NPPs generate between $1 million and $1.5 million per day in revenue when in operation, there is considerable interest in shortening the length of refueling outages. Yet refueling outages are highly complex operations, involving multiple concurrent and dependent activities that are challenging to coordinate; therefore, finding ways to improve refueling outage performance while maintaining nuclear safety has proven to be difficult. The Advanced Outage Control Center (AOCC) project is a research and development (R&D) demonstration activity under the LWRS Program. LWRS is an R&D program that works closely with industry R&D programs to establish technical foundations for the licensing and managing of long-term, safe, and economical operation of the current fleet of NPPs. As such, the LWRS Advanced Outage Control Center project has the goal of improving the management of commercial NPP refueling outages. To accomplish this goal, INL is developing an advanced outage control center (OCC) that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. The overall focus is on developing an AOCC with capabilities that enable plant and OCC staff to: collaborate in real time to address emergent issues; effectively communicate outage status to all workers involved in the outage; effectively communicate discovered conditions in the field to the OCC; provide real-time work status; and provide automatic pending support notifications

  1. Improved outage management techniques for better plant availability

    SciTech Connect

    Bemer, J.P.

    1989-01-01

    Maintaining high availability of nuclear generating units is one of the most important management objectives. The duration of outages, whether planned or unplanned, is the main parameter affecting plant availability, but planned outages, essentially refueling outages, are the most important in this respect, and they also have a heavy impact on the economics of plant operation. The following factors influence the duration of the outages: (1) modifications; (2) preventive maintenance operations; and (3) corrective maintenance operations for generic faults. In this paper, the authors examine how the outage management organization of Electricite de France (EdF) plants is tending to optimize the solutions to the above-mentioned points.

  2. Minimally Invasive Approach to Achilles Tendon Pathology.

    PubMed

    Hegewald, Kenneth W; Doyle, Matthew D; Todd, Nicholas W; Rush, Shannon M

    2016-01-01

    Many surgical procedures have been described for Achilles tendon pathology; however, no overwhelming consensus has been reached for surgical treatment. Open repair using a central or paramedian incision allows excellent visualization for end-to-end anastomosis in the case of a complete rupture, and detachment and reattachment for insertional pathologies. Postoperative wound dehiscence and infection in the Achilles tendon have considerable deleterious effects on overall functional recovery and outcome and sometimes require plastic surgery techniques to achieve coverage. With the aim of avoiding such complications, foot and ankle surgeons have studied less invasive techniques for repair. We describe a percutaneous approach to Achilles tendinopathy using a modification of the Bunnell suture weave technique combined with the use of interference screws. No direct end-to-end repair of the tendon is performed; rather, the proximal stump is brought into direct proximity with the distal stump, preventing overlengthening and proximal stump retraction. This technique also reduces the suture creep often seen with end-to-end tendon repair by providing a direct, rigid suture-to-bone interface. We have used the new technique to minimize dissection and exposure while restoring function and accelerating recovery postoperatively. PMID:26385574

  3. Technology Integration Initiative In Support of Outage Management

    SciTech Connect

    Gregory Weatherby; David Gertman

    2012-07-01

    Plant outage management is a high priority concern for the nuclear industry from cost and safety perspectives. Often, command and control during outages is maintained in the outage control center, where many of the underlying technologies supporting outage control are the same as those used in the 1980s. This research reports on the use of advanced integrating software technologies and handheld mobile devices as a means by which to reduce cycle time, improve accuracy, and enhance transparency among outage team members. This paper reports on the first phase of research supported by the DOE Light Water Reactor Sustainability (LWRS) Program and performed in close collaboration with industry to examine the introduction of newly available technology allowing for safe and efficient outage performance. It is thought that this research will result in improved resource management among various plant stakeholder groups, reduced paperwork, and enhanced overall situation awareness for the outage control center management team. Field data collection methods, including personnel interviews, success factors, end-user evaluation, and the integration of handheld devices into an integrated design, are also described. Finally, the necessity of obtaining operations cooperation and support in field studies and technology evaluation is acknowledged.

  4. Outages of electric power supply resulting from cable failures Boston Edison Company system

    SciTech Connect

    1980-07-01

    Factual data are provided regarding 5 electric power supply interruptions that occurred in the Boston Metropolitan area during April to June, 1979. Common to all of these outages was the failure of an underground cable as the initiating event, followed by multiple equipment failures. There was significant variation in the voltage ratings and types of cables which failed. The investigation was unable to delineate a single specific Boston Edison design, operating, or maintenance practice that could be cited as the cause of the outages. After reviewing the investigative report, the following actions were recommended: develop and implement a plan to eliminate the direct current cable network; develop a network outage restoration plan; regroup primary feeder cables wherever possible to minimize the number of circuits in manholes, and separate feeders to high load density areas; develop a program to detect incipient cable faults; evaluate the separation of the north and south sections of the Back Bay network into separate networks and, as a minimum, install the necessary facilities to make it possible to re-energize one section without interfering with the other; and re-evaluate the cathodic protection scheme where necessary. (LCL)

  5. SAMPLE RESULTS FROM MCU SOLIDS OUTAGE

    SciTech Connect

    Peters, T.; Washington, A.; Oji, L.; Coleman, C.; Poirier, M.

    2014-09-22

    Savannah River National Laboratory (SRNL) has received several solid and liquid samples from MCU in an effort to understand and recover from the system outage starting on April 6, 2014. SRNL concludes that the presence of solids in the Salt Solution Feed Tank (SSFT) is the likely root cause for the outage, based upon the following discoveries:
    - A solids sample from extraction contactor #1 proved to be mostly sodium oxalate
    - A solids sample from scrub contactor #1 proved to be mostly sodium oxalate
    - A solids sample from the Salt Solution Feed Tank (SSFT) proved to be mostly sodium oxalate
    - An archived sample from Tank 49H taken last year was shown to contain a fine precipitate of sodium oxalate
    - A solids sample from the extraction contactor #1 drain pipe proved to be mostly sodium aluminosilicate
    - A liquid sample from the SSFT was shown to have elevated levels of oxalate anion compared to the expected concentration in the feed
    Visual inspection of the SSFT indicated the presence of precipitated or transferred solids, which were likely also in the Salt Solution Receipt Tank (SSRT). The presence of the solids, coupled with agitation performed to maintain feed temperature, resulted in oxalate solids migration through the MCU system and caused hydraulic issues that resulted in unplanned phase carryover from the extraction into the scrub, and ultimately the strip, contactors. Not only did this carryover result in the Strip Effluent (SE) being pushed out of waste acceptance specification, but it also resulted in the deposition of solids into several of the contactors. At the same time, extensive deposits of aluminosilicates were found in the drain tube of extraction contactor #1; however, it is not known at this time how the aluminosilicate solids are related to the oxalate solids. The solids were successfully cleaned out of the MCU system. However, future consideration must be given to the exclusion of oxalate solids into

  6. Automatic outage reporting through in-home monitors

    SciTech Connect

    Wyse, G.D.

    1994-12-31

    Customer service and customer satisfaction initiatives take as many varied forms as there are perceptions of the words service and satisfaction. Too often utilities define service and satisfaction in ways that they understand but not in accordance with customers' expectations. Outage reporting and restoration of service is no exception to these expectations. Utilities expect customers to call when they are out of power, while many customers expect the utility to know when the power is off. Also, customers' expectations are changing in regard to momentary outages caused by automatic switching devices. Where momentary interruptions were once an inconvenience, some customers are including the resultant lost product and sales as a part of the total cost of energy. Redefining the interface between the customer and the utility from the customer's perspective is fundamental in beginning to serve customers on their terms and expectations. Questions that address one's vision of how outages are reported and the customer's role in the restoration process surface the need to reduce customer frustration and involvement in reporting outages while increasing the information needed to restore service efficiently and provide early detection of developing system problems. A customer-installed monitoring device has been developed that removes customers completely from the process of reporting outages while providing restoration and momentary interruption information. The following discusses the implementation and roll-out of an in-progress pilot to test this technology.

  7. Overview of Common Mode Outages in Power Systems

    SciTech Connect

    Papic, Milorad; Awodele, Kehinde; Billinton, Roy; Dent, Chris; Eager, Dan; Hamoud, Gomaa; Jirutitijaroen, Panida; Kumbale, Murali; Mitra, Joydeep; Samaan, Nader A.; Schneider, Alex; Singh, Chanan

    2012-11-10

    This paper is a result of ongoing activity carried out by the Probability Applications for Common Mode Events (PACME) Task Force under the Reliability Risk and Probability Applications (RRPA) Subcommittee. The paper is intended to constitute a valid source of information and references about dealing with common-mode outages in power systems reliability analysis. This effort involves reviewing published literature and presenting state-of-the-art research and practical applications in the area of common-mode outages. Evaluation of available outage statistics shows that there is a definite need for a collective effort from academia and industry not only to recommend procedures for data collection and monitoring but also to provide appropriate mathematical models to assess such events.

  8. Tritium Reduction and Control in the Vacuum Vessel during TFTR Outage and Decommissioning

    SciTech Connect

    Blanchard, W.; Camp, R.; Carnevale, H.; Casey, M.; Collins, J.; et al

    1997-11-01

    In the summer/fall of 1996 after nearly three years of D-T operations, TFTR underwent an extended outage during which large port covers were removed from the vacuum vessel in order to complete upgrades to the tokamak. Following the venting of the torus, a three tier system was developed for the outage in order to reduce and control the free tritium in the vacuum vessel so as to minimize the exposure to personnel during port cover removal and reinstallation. The first phase of the program to reduce the free tritium consisted of direct flowthrough of room air through the vacuum vessel to the molecular sieve beds using the Torus Cleanup System. Real time measurements of the effluent tritium concentration were used to derive the amount of tritium removed from the torus. Once the free tritium in the vessel had been reduced to approximately 50 Ci, a second phase was initiated using a 55 Gallon Drum Bubbler System for the direct processing of the vacuum vessel to further lower the tritium level in the torus. Tritium oxide is absorbed by the bubbler system with the exhaust vented to one of the tritium monitored HVAC ventilation stacks. To preclude the release of tritium to the Test Cell location of TFTR and to minimize the exposure of workers, a variable flow exhaust system was employed in order to maintain a negative pressure in the vacuum vessel between 0.05" and 1.5" w.c. during the removal of port covers ranging in size from approximately 5 to 1000 in². These systems were completely successful in reducing and controlling the free tritium in TFTR and were instrumental in maintaining ALARA (As Low As Reasonably Achievable) exposures to tritium during the 1996 outage. These systems are again being utilized during the safe shutdown and decommissioning of TFTR which commenced in April of 1997. This paper describes in detail the configuration of these systems and the data obtained during the outage and safe shutdown of TFTR.

  9. Using Predictive Analytics to Predict Power Outages from Severe Weather

    NASA Astrophysics Data System (ADS)

    Wanik, D. W.; Anagnostou, E. N.; Hartman, B.; Frediani, M. E.; Astitha, M.

    2015-12-01

    The distribution of reliable power is essential to businesses, public services, and our daily lives. With the growing abundance of data being collected and created by industry (e.g., outage data), government agencies (e.g., land cover), and academia (e.g., weather forecasts), we can begin to tackle problems that previously seemed too complex to solve. In this session, we will present newly developed tools to aid decision-support challenges at electric distribution utilities that must mitigate, prepare for, respond to and recover from severe weather. We will show a performance evaluation of outage predictive models built for Eversource Energy (formerly Connecticut Light & Power) for storms of all types (e.g., blizzards, thunderstorms and hurricanes) and magnitudes (from 20 to >15,000 outages). High-resolution weather simulations (generated with the Weather Research and Forecasting (WRF) Model) were joined with utility outage data to calibrate four types of models: a decision tree (DT), a random forest (RF), a boosted gradient tree (BT) and an ensemble (ENS) decision tree regression that combined predictions from the DT, RF and BT. The study shows that the ENS model forced with weather, infrastructure and land cover data was superior to the other models we evaluated, especially in terms of predicting the spatial distribution of outages. This research has the potential to be applied to other critical infrastructure systems (such as telecommunications, drinking water and gas distribution networks), and can be readily expanded to the entire New England region to facilitate better planning and coordination among decision-makers when severe weather strikes.
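
    As an illustration of the modeling approach described in this record, the sketch below combines decision tree, random forest, and boosted tree regressors into a stacked ensemble with scikit-learn. It is a minimal sketch only: the storm features, synthetic data, and model settings are hypothetical, and the paper's actual ENS combination and calibration are not reproduced.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.ensemble import (GradientBoostingRegressor,
                                  RandomForestRegressor, StackingRegressor)

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical storm features: peak gust (m/s), rain (mm),
    # tree-cover fraction, overhead assets per grid cell
    X = rng.random((n, 4)) * np.array([40.0, 100.0, 1.0, 5000.0])
    y = 0.02 * X[:, 0] ** 2 + 0.1 * X[:, 1] * X[:, 2] + rng.normal(0, 5, n)

    # DT, RF, and BT members combined by a tree-based final stage (ENS)
    ens = StackingRegressor(
        estimators=[("dt", DecisionTreeRegressor(max_depth=6)),
                    ("rf", RandomForestRegressor(n_estimators=200)),
                    ("bt", GradientBoostingRegressor())],
        final_estimator=DecisionTreeRegressor(max_depth=3),
    )
    ens.fit(X[:400], y[:400])
    print("held-out outage predictions:", ens.predict(X[400:405]).round(1))
    ```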

  10. Assessment of Critical Events Corridors through Multivariate Cascading Outages Analysis

    SciTech Connect

    Makarov, Yuri V.; Samaan, Nader A.; Diao, Ruisheng; Kumbale, Murali; Chen, Yousu; Singh, Ruchi; Green, Irina; Morgan, Mark P.

    2011-10-17

    Massive blackouts of electrical power systems in North America over the past decade have focused increasing attention on ways to identify and simulate network events that may potentially lead to widespread network collapse. This paper summarizes a method to simulate power-system vulnerability to cascading failures in response to a supplied set of initiating events, termed extreme events. The implemented simulation method is currently confined to simulating the steady-state power-system response to a set of extreme events. The outlined method of simulation is meant to augment and provide new insight into bulk power transmission network planning, which at present remains mainly confined to maintaining power system security for single and double component outages under a number of projected future network operating conditions. Although one of the aims of this paper is to demonstrate the feasibility of simulating network vulnerability to cascading outages, a more important goal has been to determine vulnerable parts of the network that may potentially be strengthened in practice so as to mitigate system susceptibility to cascading failures. This paper demonstrates a systematic approach to analyze extreme events and identify vulnerable system elements that may be contributing to cascading outages. The hypothesis of critical events corridors is proposed to represent repeating sequential outages that can occur in the system for multiple initiating events. The new concept helps to identify system reinforcements that planners could engineer in order to 'break' the critical event sequences and therefore lessen the likelihood of cascading outages. This hypothesis has been successfully validated with a California power system model.
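
    The following toy sketch illustrates the general shape of a cascading-outage simulation, though not the paper's steady-state power-flow method: line loads are proxied by edge betweenness (a Motter-Lai-style simplification), each line gets a capacity margin, and an initiating outage propagates by repeatedly tripping overloaded lines. The topology, margin, and load proxy are all hypothetical.

    ```python
    import networkx as nx

    def cascade(G, initiating_edge, margin=1.2):
        """Remove one line, then iteratively trip lines whose recomputed
        load (edge-betweenness proxy) exceeds their original capacity."""
        base_load = nx.edge_betweenness_centrality(G)
        capacity = {}
        for (u, v), load in base_load.items():
            capacity[(u, v)] = capacity[(v, u)] = margin * load
        G = G.copy()
        G.remove_edge(*initiating_edge)
        failed = [initiating_edge]
        while True:
            load = nx.edge_betweenness_centrality(G)
            over = [e for e, l in load.items() if l > capacity[e]]
            if not over:
                return failed
            G.remove_edges_from(over)
            failed.extend(over)

    G = nx.watts_strogatz_graph(30, 4, 0.1, seed=7)
    outaged = cascade(G, initiating_edge=list(G.edges())[0])
    print(f"{len(outaged)} lines out; sequence starts {outaged[:4]}")
    ```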

  11. A framework and review of customer outage costs: Integration and analysis of electric utility outage cost surveys

    SciTech Connect

    Lawton, Leora; Sullivan, Michael; Van Liere, Kent; Katz, Aaron; Eto, Joseph

    2003-11-01

    A clear understanding of the monetary value that customers place on reliability, and of the factors that give rise to higher and lower values, is an essential tool in determining investment in the grid. The recent National Transmission Grid Study recognizes the need for this information as one of growing importance for both public and private decision makers. In response, the U.S. Department of Energy has undertaken this study as a first step toward addressing the current absence of consistent data needed to support better estimates of the economic value of electricity reliability. Twenty-four studies, conducted by eight electric utilities between 1989 and 2002 and representing residential and commercial/industrial (small, medium and large) customer groups, were chosen for analysis. The studies cover virtually all of the Southeast, most of the western United States, including California, rural Washington and Oregon, and the Midwest south and east of Chicago. All variables were standardized to a consistent metric and dollar amounts were adjusted to the 2002 CPI. The data were then incorporated into a meta-database in which each outage scenario (e.g., the loss of electric service for one hour on a weekday summer afternoon) is treated as an independent case or record, both to permit comparisons between outage characteristics and to increase the statistical power of the analysis results. Unadjusted average outage costs and Tobit models that estimate customer damage functions are presented. The customer damage functions express customer outage costs for a given outage scenario and customer class as a function of location, time of day, consumption, and business type. One can use the damage functions to calculate outage costs for specific customer types. For example, using the customer damage functions, the cost experienced by an "average" customer resulting from a 1-hour summer afternoon outage is estimated to be approximately $3 for a residential customer, $1,200 for small
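
    To make the estimation step concrete, the sketch below fits a left-censored (at zero) Tobit model by maximum likelihood, the model family used here for customer damage functions. The covariates and synthetic survey data are hypothetical and are not drawn from the meta-database described above.

    ```python
    import numpy as np
    from scipy import optimize, stats

    rng = np.random.default_rng(1)
    n = 1000
    # Hypothetical covariates: intercept, duration (h), summer-afternoon flag, log(kWh)
    X = np.column_stack([np.ones(n), rng.uniform(0.5, 8, n),
                         rng.integers(0, 2, n), rng.normal(7, 1, n)])
    beta_true = np.array([-5.0, 1.2, 0.8, 0.5])
    y = np.maximum(X @ beta_true + rng.normal(0, 2, n), 0.0)  # costs censored at 0

    def negloglik(params):
        beta, sigma = params[:-1], np.exp(params[-1])
        mu = X @ beta
        pos = y > 0
        ll = stats.norm.logpdf(y[pos], mu[pos], sigma).sum()   # uncensored obs
        ll += stats.norm.logcdf(-mu[~pos] / sigma).sum()       # censored obs
        return -ll

    fit = optimize.minimize(negloglik, np.zeros(X.shape[1] + 1), method="BFGS")
    print("estimated damage-function coefficients:", fit.x[:-1].round(2))
    ```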

  12. Development and demonstration of techniques for reducing occupational radiation doses during refueling outages. Tasks 7A/7B. Advanced outage management and radiation exposure control

    SciTech Connect

    Not Available

    1985-03-01

    Objectives of Tasks 7A and 7B were to develop and demonstrate computer-based systems to assist plant management and staff in utilizing information more effectively to reduce occupational exposures received as a result of refueling outages, and to shorten the duration of the outage. The Advanced Outage Management (AOM) Tool (Task 7A) is an automated outage planning system specifically designed to meet the needs of nuclear plant outage management. The primary objective of the AOM tool is to provide a computerized system that can manipulate the information typically associated with outage planning and scheduling to furnish reports and schedules that more accurately project the future course of the outage. The Radiation Exposure Control (REC) Tool (Task 7B) is a computerized personnel radiation exposure accounting and management system designed to enable nuclear plant management to project and monitor total personnel radiation exposure on a real-time basis. The two systems were designed to operate on the same computer system and to interface through a common database that enables information sharing between plant organizations not typically interfaced. This interfacing provides outage planners with a means of incorporating occupational radiation exposure as a factor in decisions on the course of an outage.

  13. Risk Assessment of Cascading Outages: Methodologies and Challenges

    SciTech Connect

    Vaiman, Marianna; Bell, Keith; Chen, Yousu; Chowdhury, Badrul; Dobson, Ian; Hines, Paul; Papic, Milorad; Miller, Stephen; Zhang, Pei

    2012-05-31

    This paper is a result of ongoing activity carried out by the Understanding, Prediction, Mitigation and Restoration of Cascading Failures Task Force under the IEEE Computer Analytical Methods Subcommittee (CAMS). The task force's previous papers focused on general aspects of cascading outages such as understanding, prediction, prevention and restoration from cascading failures. This is the first of two new papers, which extend this previous work to summarize the state of the art in cascading failure risk analysis methodologies and modeling tools. This paper is intended to be a reference document that summarizes the state of the art in methodologies for performing risk assessment of cascading outages caused by some initiating event(s). A risk assessment should cover the entire potential chain of cascades starting with the initiating event(s) and ending with some final condition(s). However, this is a difficult task, and heuristic approaches and approximations have been suggested. This paper discusses different approaches to this and suggests directions for future development of methodologies. The second paper summarizes the state of the art in modeling tools for risk assessment of cascading outages.

  14. Design Concepts for an Outage Control Center Information Dashboard

    SciTech Connect

    Hugo, Jacques Victor; St Germain, Shawn Walter; Thompson, Cheradan Jo; Whitesides, McKenzie Jo; Farris, Ronald Keith

    2015-12-01

    The nuclear industry, and the business world in general, is facing a rapidly increasing amount of data to be dealt with on a daily basis. In the last two decades, the steady improvement of data storage devices and of the means to create and collect data has influenced the manner in which we deal with information. Most data is still stored without filtering and refinement for later use. Many functions at a nuclear power plant generate vast amounts of data, with scheduled and unscheduled outages being a prime example of a source of some of the most complex data sets at the plant. To make matters worse, modern information and communications technology is making it possible to collect and store data faster than we can use it for making decisions. However, in most applications, especially outages, raw data has no value in itself; instead, managers, engineers and other specialists want to extract the information contained in it. The complexity and sheer volume of data can lead to information overload, resulting in getting lost in data that may be irrelevant to the task at hand, processed in an inappropriate way, or presented in an ineffective way. To prevent information overload, many data sources are ignored, so production opportunities are lost because utilities lack the ability to deal with the enormous data volumes properly. Decision-makers are often confronted with large amounts of disparate, conflicting and dynamic information, which are available from multiple heterogeneous sources. Information and communication technologies alone will not solve this problem. Utilities need effective methods to exploit and use the hidden opportunities and knowledge residing in unexplored data resources. Superior performance before, during and after outages depends upon the right information being available at the right time to the right people. Acquisition of raw data is the easy part; instead, it is the ability to use advanced analytical, data processing and data

  15. Risk Assessment of Cascading Outages: Part I - Overview of Methodologies

    SciTech Connect

    Vaiman, Marianna; Bell, Keith; Chen, Yousu; Chowdhury, Badrul; Dobson, Ian; Hines, Paul; Papic, Milorad; Miller, Stephen; Zhang, Pei

    2011-07-31

    This paper is a result of ongoing activity carried out by the Understanding, Prediction, Mitigation and Restoration of Cascading Failures Task Force under the IEEE Computer Analytical Methods Subcommittee (CAMS). The task force's previous papers focused on general aspects of cascading outages such as understanding, prediction, prevention and restoration from cascading failures. This is the first of two new papers, which will extend this previous work to summarize the state of the art in cascading failure risk analysis methodologies and modeling tools. This paper is intended to be a reference document that summarizes the state of the art in methodologies for performing risk assessment of cascading outages caused by some initiating event(s). A risk assessment should cover the entire potential chain of cascades starting with the initiating event(s) and ending with some final condition(s). However, this is a difficult task, and heuristic approaches and approximations have been suggested. This paper discusses different approaches to this and suggests directions for future development of methodologies.

  16. The optical communication link outage probability in satellite formation flying

    NASA Astrophysics Data System (ADS)

    Arnon, Shlomi; Gill, Eberhard

    2014-02-01

    In recent years, several space systems consisting of multiple satellites flying in close formation have been proposed for various purposes, such as interferometric synthetic aperture radar measurement (TerraSAR-X and TanDEM-X), detecting extra-solar Earth-like planets (Terrestrial Planet Finder-TPF and Darwin), and demonstrating distributed space systems (the DARPA F6 project). Another important purpose, which is the concern of this paper, is improving radio frequency communication to mobile terrestrial and maritime subscribers. In this case, radio frequency signals from several satellites are coherently combined such that the received/transmitted signal strength increases proportionally with the number of satellites in the formation. This increase in signal strength makes it possible to enhance the communication data rate and/or to reduce the energy consumption and antenna size of terrestrial mobile users' equipment. However, combining signals without properly aligning the phases of the individual communication signals interrupts the communication and causes link outages between the satellites and the user. The accuracy of the phase estimation is a function of the inter-satellite laser ranging system performance. This paper derives an outage probability model for a coherent combination communication system as a function of the pointing vibration and jitter statistics of the inter-satellite laser ranging system. This outage probability model, which could be used to improve communication to mobile subscribers in the air, at sea, and on the ground, is the main contribution of this work.
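
    A Monte Carlo sketch of the kind of outage-probability calculation described here: residual phase errors, driven by ranging/pointing jitter, degrade the coherent combining gain, and an outage is declared when the gain drops below a threshold. The jitter level, threshold, and Gaussian phase-error model are assumptions for illustration, not the paper's derivation.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N = 4                # satellites in the formation
    sigma_phi = 0.4      # rad, residual phase error from ranging jitter (assumed)
    threshold_db = -1.0  # outage if combining loss exceeds 1 dB (assumed)
    trials = 200_000

    # Gaussian phase errors; gain of 1.0 means perfect coherent combining
    phi = rng.normal(0.0, sigma_phi, size=(trials, N))
    gain = np.abs(np.exp(1j * phi).sum(axis=1)) ** 2 / N ** 2
    outage = 10.0 * np.log10(gain) < threshold_db
    print(f"estimated outage probability: {outage.mean():.4f}")
    ```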

  17. Differentially Private Empirical Risk Minimization

    PubMed Central

    Chaudhuri, Kamalika; Monteleoni, Claire; Sarwate, Anand D.

    2011-01-01

    Privacy-preserving machine learning algorithms are crucial for the increasingly common setting in which personal data, such as medical or financial records, are analyzed. We provide general techniques to produce privacy-preserving approximations of classifiers learned via (regularized) empirical risk minimization (ERM). These algorithms are private under the ε-differential privacy definition due to Dwork et al. (2006). First we apply the output perturbation ideas of Dwork et al. (2006) to ERM classification. Then we propose a new method, objective perturbation, for privacy-preserving machine learning algorithm design. This method entails perturbing the objective function before optimizing over classifiers. If the loss and regularizer satisfy certain convexity and differentiability criteria, we prove theoretical results showing that our algorithms preserve privacy, and provide generalization bounds for linear and nonlinear kernels. We further present a privacy-preserving technique for tuning the parameters in general machine learning algorithms, thereby providing end-to-end privacy guarantees for the training process. We apply these results to produce privacy-preserving analogues of regularized logistic regression and support vector machines. We obtain encouraging results from evaluating their performance on real demographic and benchmark data sets. Our results show that both theoretically and empirically, objective perturbation is superior to the previous state-of-the-art, output perturbation, in managing the inherent tradeoff between privacy and learning performance. PMID:21892342
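
    The sketch below illustrates the output perturbation mechanism in the style of this paper for L2-regularized logistic regression: train normally, then add a noise vector whose norm follows a Gamma distribution calibrated to the sensitivity 2/(nλ) of the minimizer. The data, privacy budget, and regularization strength are hypothetical, and the paper's objective perturbation method is not shown.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n, d, lam, eps = 2000, 5, 0.1, 1.0
    X = rng.normal(size=(n, d))
    X /= np.maximum(np.linalg.norm(X, axis=1, keepdims=True), 1.0)  # enforce ||x|| <= 1
    y = (X @ rng.normal(size=d) + 0.3 * rng.normal(size=n) > 0).astype(int)

    # sklearn minimizes C*sum(loss) + 0.5*||w||^2, so C = 1/(lam*n) matches
    # the ERM objective (1/n)*sum(loss) + (lam/2)*||w||^2
    clf = LogisticRegression(C=1.0 / (lam * n), fit_intercept=False).fit(X, y)
    w = clf.coef_.ravel()

    # The minimizer has L2 sensitivity 2/(n*lam); noise with density
    # proportional to exp(-eps*||b||/sensitivity) yields eps-differential privacy
    sensitivity = 2.0 / (n * lam)
    norm = rng.gamma(shape=d, scale=sensitivity / eps)   # ||b|| ~ Gamma(d, ...)
    direction = rng.normal(size=d)
    direction /= np.linalg.norm(direction)               # uniformly random direction
    w_private = w + norm * direction
    print("non-private:", w.round(3))
    print("private:    ", w_private.round(3))
    ```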

  18. Synthesis of power plant outage schedules. Final technical report, April 1995-January 1996

    SciTech Connect

    Smith, D.R.

    1997-07-01

    This document provides a report on the creation of domain theories in the power plant outage domain. These were developed in conjunction with the creation of a demonstration system of advanced scheduling technology for the outage problem. In 1994 personnel from Rome Laboratory (RL), Kaman Science (KS), Kestrel Institute, and the Electric Power Research Institute (EPRI) began a joint project to develop scheduling tools for power plant outage activities. This report describes our support for this joint effort. The project uses KIDS (Kestrel Interactive Development System) to generate schedulers from formal specifications of the power plant domain outage activities.

  19. Braess's paradox in oscillator networks, desynchronization and power outage

    NASA Astrophysics Data System (ADS)

    Witthaut, Dirk; Timme, Marc

    2012-08-01

    Robust synchronization is essential to ensure the stable operation of many complex networked systems such as electric power grids. Increasing energy demands and increasingly distributed power sources raise the question of where to add new connection lines to the already existing grid. Here we study how the addition of individual links impacts the emergence of synchrony in oscillator networks that model power grids on coarse scales. We reveal that adding new links may not only promote but also destroy synchrony, and we link this counter-intuitive phenomenon to Braess's paradox known for traffic networks. We analytically uncover its underlying mechanism in an elementary grid example, trace its origin to geometric frustration in phase oscillators, and show that it generically occurs across a wide range of systems. As an important consequence, upgrading the grid requires particular care when adding new connections, because some may destabilize the synchronization of the grid and thus induce power outages.
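
    A minimal numerical sketch of the underlying question: a coarse-scale grid model has a synchronized steady state only if the power-balance equations P_i + K Σ_j A_ij sin(θ_j − θ_i) = 0 are solvable, and adding an edge changes A and can destroy solvability. The ring topology, injections, and coupling K below are hypothetical; whether an added line helps or harms depends on the topology, which is the paper's point.

    ```python
    import numpy as np
    from scipy.optimize import fsolve

    def synchronizes(A, P, K):
        """True if P_i + K*sum_j A_ij*sin(theta_j - theta_i) = 0 has a solution."""
        n = len(P)
        def residual(th_free):
            th = np.concatenate([[0.0], th_free])            # gauge-fix node 0
            flows = K * A * np.sin(th[None, :] - th[:, None])
            return (P + flows.sum(axis=1))[1:]
        th, _, ok, _ = fsolve(residual, np.zeros(n - 1), full_output=True)
        return ok == 1 and np.max(np.abs(residual(th))) < 1e-8

    n = 6
    P = np.array([1.0, -1.0, 1.0, -1.0, 1.0, -1.0])          # generators (+), loads (-)
    A = np.zeros((n, n))
    for i in range(n):                                       # ring-shaped grid
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

    print("ring synchronizes:        ", synchronizes(A, P, K=1.2))
    A2 = A.copy(); A2[0, 3] = A2[3, 0] = 1.0                 # add one new line
    print("ring + chord synchronizes:", synchronizes(A2, P, K=1.2))
    ```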

  20. 47 CFR 4.9 - Outage reporting requirements-threshold criteria.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Outage reporting requirements-threshold... they own, operate, lease, or otherwise utilize, an outage of at least 30 minutes duration that: (1) Potentially affects at least 900,000 user minutes of telephony service; (2) Affects at least 1,350 DS3...

  1. 47 CFR 4.9 - Outage reporting requirements-threshold criteria.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Outage reporting requirements-threshold criteria. 4.9 Section 4.9 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL DISRUPTIONS TO COMMUNICATIONS Reporting Requirements for Disruptions to Communications § 4.9 Outage reporting requirements—threshold criteria. (a) Cable. All...

  2. 47 CFR 4.9 - Outage reporting requirements-threshold criteria.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Outage reporting requirements-threshold criteria. 4.9 Section 4.9 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL DISRUPTIONS TO COMMUNICATIONS Reporting Requirements for Disruptions to Communications § 4.9 Outage reporting requirements—threshold criteria. (a) Cable. All...

  3. 47 CFR 4.9 - Outage reporting requirements-threshold criteria.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Outage reporting requirements-threshold... they own, operate, lease, or otherwise utilize, an outage of at least 30 minutes duration that: (1) Potentially affects at least 900,000 user minutes of telephony service; (2) Affects at least 1,350 DS3...

  4. Seasonal and Local Characteristics of Lightning Outages of Power Distribution Lines in Hokuriku Area

    NASA Astrophysics Data System (ADS)

    Sugimoto, Hitoshi; Shimasaki, Katsuhiko

    Lightning causes approximately 20 percent of all outages on Japanese 6.6 kV distribution lines, so lightning protection is very important for the supply reliability of 6.6 kV lines. Countermeasures are most effective when applied first in the areas where the largest numbers of lightning outages occur. Winter lightning occurs in the Hokuriku area, so it is also important to understand the seasonal characteristics of the lightning outages. In summer, 70 percent of the lightning outages on distribution lines in the Hokuriku area were due to sparkover, causing damage such as power wire breakage and failures of pole-mounted transformers. In winter, however, almost half of the lightning-damaged equipment consisted of failed surge arresters. The number of lightning outages per lightning stroke detected by the lightning location system (LLS) in winter was 4.4 times larger than that in summer. The authors estimated the occurrence of lightning outages from the lightning stroke density, the 50% value of lightning current, and the installation rates of lightning protection equipment and overhead ground wires by multiple regression analysis. The results suggest local differences in the lightning outages.
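
    A minimal sketch of the multiple-regression step described here, with outage counts regressed on lightning stroke density, the 50% value of lightning current, and installation rates of protection equipment. All variable names and data are hypothetical, not the Hokuriku observations.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 120  # hypothetical distribution-line areas
    stroke_density = rng.gamma(2.0, 2.0, n)      # lightning strokes / km^2
    median_current = rng.normal(30.0, 8.0, n)    # 50% value of lightning current (kA)
    arrester_rate = rng.uniform(0.0, 1.0, n)     # surge arrester installation rate
    ground_wire = rng.uniform(0.0, 1.0, n)       # overhead ground wire installation rate

    outages = (1.5 * stroke_density + 0.05 * median_current
               - 3.0 * arrester_rate - 2.0 * ground_wire + rng.normal(0, 1.5, n))

    X = sm.add_constant(np.column_stack(
        [stroke_density, median_current, arrester_rate, ground_wire]))
    fit = sm.OLS(outages, X).fit()
    print("regression coefficients:", fit.params.round(2))
    ```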

  5. 47 CFR 4.9 - Outage reporting requirements-threshold criteria.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Outage reporting requirements-threshold criteria. 4.9 Section 4.9 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL DISRUPTIONS TO COMMUNICATIONS Reporting Requirements for Disruptions to Communications § 4.9 Outage reporting requirements—threshold criteria. (a) Cable. All...

  6. Use of collaboration software to improve nuclear power plant outage management

    SciTech Connect

    Germain, Shawn

    2015-02-01

    Nuclear Power Plant (NPP) refueling outages create some of the most challenging activities the utilities face in both tracking and coordinating thousands of activities in a short period of time. Other challenges, including nuclear safety concerns arising from atypical system configurations and resource allocation issues, can create delays and schedule overruns, driving up outage costs. Today the majority of outage communication is done using processes that do not take advantage of advances in modern technologies that enable enhanced communication, collaboration and information sharing. Some of the common practices include runners that deliver paper-based requests for approval, radios, telephones, desktop computers, daily schedule printouts, and static whiteboards that are used to display information. Many gains have been made to reduce the challenges facing outage coordinators; however, new opportunities can be realized by utilizing modern technological advancements in communication and information tools that can enhance the collective situational awareness of plant personnel, leading to improved decision-making. Ongoing research as part of the Light Water Reactor Sustainability Program (LWRS) has been targeting NPP outage improvement. As part of this research, various applications of collaborative software have been demonstrated through pilot project utility partnerships. Collaboration software can be utilized as part of the larger concept of Computer-Supported Cooperative Work (CSCW). Collaborative software can be used for emergent issue resolution, Outage Control Center (OCC) displays, and schedule monitoring. Use of collaboration software enables outage staff and subject matter experts (SMEs) to view and update critical outage information from any location on site or off.

  7. Estimating Power Outage Cost based on a Survey for Industrial Customers

    NASA Astrophysics Data System (ADS)

    Yoshida, Yoshikuni; Matsuhashi, Ryuji

    A survey on power outage costs was conducted among industrial customers. 5139 factories, which are designated energy management factories in Japan, reported their power consumption and the production value lost due to a one-hour power outage on a summer weekday. The median unit cost of a power outage across all sectors is estimated at 672 yen/kWh. The sector of services for amusement and hobbies and the sector of manufacture of information and communication electronics equipment have relatively higher unit costs of power outage. The direct damage cost from a power outage across all sectors reaches 77 billion yen. Then, utilizing input-output analysis, we estimated the indirect damage cost caused by the knock-on effects of production halts. The indirect damage cost across all sectors reaches 91 billion yen. The sector of wholesale and retail trade has the largest direct damage cost. The sector of manufacture of transportation equipment has the largest indirect damage cost.
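
    The indirect-damage step can be illustrated with a standard Leontief input-output calculation: direct production losses are propagated through inter-industry linkages via the Leontief inverse. The 3-sector coefficient matrix and loss vector below are hypothetical, not the survey's figures.

    ```python
    import numpy as np

    # Hypothetical technical coefficients: input from sector i per unit output of sector j
    A = np.array([[0.10, 0.20, 0.05],
                  [0.15, 0.10, 0.20],
                  [0.05, 0.10, 0.10]])
    direct_loss = np.array([10.0, 5.0, 2.0])     # halted production, billion yen

    leontief_inverse = np.linalg.inv(np.eye(3) - A)
    total_loss = leontief_inverse @ direct_loss  # direct + indirect losses
    print("total loss by sector:   ", total_loss.round(2))
    print("indirect component only:", (total_loss - direct_loss).round(2))
    ```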

  8. Power Outages, Extreme Events and Health: a Systematic Review of the Literature from 2011-2012

    PubMed Central

    Klinger, Chaamala; Landeg, Owen; Murray, Virginia

    2014-01-01

    Background: Extreme events (e.g. flooding) threaten critical infrastructure including power supplies. Many interlinked systems in the modern world depend on a reliable power supply to function effectively. The health sector is no exception, but the impact of power outages on health is poorly understood. Greater understanding is essential so that adverse health impacts can be prevented and/or mitigated. Methods: We searched Medline, CINAHL and Scopus for papers about the health impacts of power outages during extreme events published in 2011-2012. A thematic analysis was undertaken on the extracted information. The Public Health England Extreme Events Bulletins between 01/01/2013 - 31/03/2013 were used to identify extreme events that led to power outages during this three-month period. Results: We identified 20 relevant articles. Power outages were found to impact health at many levels within diverse settings. Recurrent themes included the difficulties of accessing healthcare, maintaining frontline services and the challenges of community healthcare. We identified 52 power outages in 19 countries that were the direct consequence of extreme events during the first three months of 2013. Conclusions: To our knowledge, this is the first review of the health impacts of power outages. We found the current evidence and knowledge base to be poor. With scientific consensus predicting an increase in the frequency and magnitude of extreme events due to climate change, the gaps in knowledge need to be addressed in order to mitigate the impact of power outages on global health. PMID:24459613

  9. Minimal Reduplication

    ERIC Educational Resources Information Center

    Kirchner, Jesse Saba

    2010-01-01

    This dissertation introduces Minimal Reduplication, a new theory and framework within generative grammar for analyzing reduplication in human language. I argue that reduplication is an emergent property in multiple components of the grammar. In particular, reduplication occurs independently in the phonology and syntax components, and in both cases…

  10. Taxonomic minimalism.

    PubMed

    Beattle, A J; Oliver, I

    1994-12-01

    Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job, but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey, but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms. PMID:21236933

  11. An end-to-end assessment of extreme weather impacts on food security

    NASA Astrophysics Data System (ADS)

    Chavez, Erik; Conway, Gordon; Ghil, Michael; Sadler, Marc

    2015-11-01

    Both governments and the private sector urgently require better estimates of the likely incidence of extreme weather events, their impacts on food crop production and the potential consequent social and economic losses. Current assessments of climate change impacts on agriculture mostly focus on average crop yield vulnerability to climate and adaptation scenarios. Also, although new-generation climate models have improved and there has been an exponential increase in available data, the uncertainties in their projections over years and decades, and at regional and local scale, have not decreased. We need to understand and quantify the non-stationary, annual and decadal climate impacts using simple and communicable risk metrics that will help public and private stakeholders manage the hazards to food security. Here we present an 'end-to-end' methodological construct based on weather indices and machine learning that integrates current understanding of the various interacting systems of climate, crops and the economy to determine short- to long-term risk estimates of crop production loss, in different climate and adaptation scenarios. For provinces north and south of the Yangtze River in China, we have found that risk profiles for crop yields that translate climate into economic variability follow marked regional patterns, shaped by drivers of continental-scale climate. We conclude that to be cost-effective, region-specific policies have to be tailored to optimally combine different categories of risk management instruments.

  12. SciBox, an end-to-end automated science planning and commanding system

    NASA Astrophysics Data System (ADS)

    Choo, Teck H.; Murchie, Scott L.; Bedini, Peter D.; Steele, R. Josh; Skura, Joseph P.; Nguyen, Lillian; Nair, Hari; Lucks, Michael; Berman, Alice F.; McGovern, James A.; Turner, F. Scott

    2014-01-01

    SciBox is a new technology for planning and commanding science operations for Earth-orbital and planetary space missions. It has been incrementally developed since 2001 and demonstrated on several spaceflight projects. The technology has matured to the point that it is now being used to plan and command all orbital science operations for the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) mission to Mercury. SciBox encompasses the derivation of observing sequences from science objectives, the scheduling of those sequences, the generation of spacecraft and instrument commands, and the validation of those commands prior to uploading to the spacecraft. Although the process is automated, science and observing requirements are incorporated at each step by a series of rules and parameters to optimize observing opportunities, which are tested and validated through simulation and review. Except for limited special operations and tests, there is no manual scheduling of observations or construction of command sequences. SciBox reduces the lead time for operations planning by shortening the time-consuming coordination process, reduces cost by automating the labor-intensive processes of human-in-the-loop adjudication of observing priorities, reduces operations risk by systematically checking constraints, and maximizes science return by fully evaluating the trade space of observing opportunities to meet MESSENGER science priorities within spacecraft recorder, downlink, scheduling, and orbital-geometry constraints.

  13. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    DOE PAGES Beta

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-01-01

    The advance of the scientific discovery process is accomplished by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  14. End-to-End Data Movement Using MPI-IO Over Routed Terabits Infrastructures

    SciTech Connect

    Vallee, Geoffroy R; Atchley, Scott; Kim, Youngjae; Shipman, Galen M

    2013-01-01

    Scientific discovery is nowadays driven by large-scale simulations running on massively parallel high-performance computing (HPC) systems. These applications each generate a large amount of data, which then needs to be post-processed, for example for data mining or visualization. Unfortunately, the computing platform used for post-processing might be different from the one on which the data is initially generated, introducing the challenge of moving large amounts of data between computing platforms. This is especially challenging when the two platforms are geographically separated, since the data needs to be moved between computing facilities. It is even more critical when scientists tightly couple their domain-specific applications with a post-processing application. This paper presents a solution for data transfer between MPI applications using a dedicated wide area network (WAN) terabit infrastructure. The proposed solution is based on parallel access to data files and on the Message Passing Interface (MPI) over the Common Communication Infrastructure (CCI) for data transfer over a routed infrastructure. In the context of this research, the Energy Sciences Network (ESnet) of the U.S. Department of Energy (DOE) is targeted for the transfer of data between DOE national laboratories.
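
    A minimal mpi4py sketch of the collective MPI-IO pattern this work builds on: each rank writes its slice of an array into a disjoint region of one shared file with a collective call. The file name and sizes are hypothetical; the CCI routing and WAN transfer machinery of the paper are not shown.

    ```python
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    local = np.full(1024, rank, dtype=np.float64)   # this rank's slice of the dataset
    fh = MPI.File.Open(comm, "shared_output.bin",
                       MPI.MODE_CREATE | MPI.MODE_WRONLY)
    offset = rank * local.nbytes                    # disjoint file region per rank
    fh.Write_at_all(offset, local)                  # collective MPI-IO write
    fh.Close()
    ```

    Run under an MPI launcher, e.g. mpiexec -n 4 python write_shared.py.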

  15. End-to-end quality measure for transmission of compressed imagery over a noisy coded channel

    NASA Technical Reports Server (NTRS)

    Korwar, V. N.; Lee, P. J.

    1981-01-01

    For the transmission of imagery at high data rates over large distances with limited power and system gain, it is usually necessary to compress the data before transmitting it over a noisy channel that uses channel coding to reduce the effect of noise-induced errors. Both compression and channel noise introduce distortion into the imagery. In order to design a communication link that provides adequate quality of received images, it is necessary first to define a suitable distortion measure that accounts for both kinds of distortion, and then to perform various tradeoffs to arrive at system parameter values that provide a sufficiently low level of received image distortion. The overall mean square error is used as the distortion measure, and a description of how to perform these tradeoffs is included.
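
    A toy numerical illustration of the end-to-end distortion measure: quantization contributes compression distortion, random channel corruption adds error-induced distortion, and the overall mean square error captures both. The quantizer, corruption model, and parameters are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    x = rng.uniform(0, 255, 10_000)                  # image-like source samples

    step = 256 / 32                                  # "compression": 5-bit quantizer
    xq = np.floor(x / step) * step + step / 2

    p = 1e-2                                         # channel corruption probability
    corrupt = rng.random(x.size) < p
    xr = np.where(corrupt, rng.uniform(0, 255, x.size), xq)

    mse_comp = np.mean((x - xq) ** 2)                # compression-only distortion
    mse_e2e = np.mean((x - xr) ** 2)                 # end-to-end distortion
    print(f"compression MSE: {mse_comp:.1f}, end-to-end MSE: {mse_e2e:.1f}")
    ```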

  16. End-to-end design consideration of a radar altimeter for terrain-aided navigation

    NASA Astrophysics Data System (ADS)

    Chun, Joohwan; Choi, Sanghyouk; Paek, Inchan; Park, Dongmin; Yoo, Kyungju

    2013-10-01

    We present a preliminary simulation study of an interferometric SAR altimeter for the terrain-aided navigation application. Our simulation includes raw SAR data generation, azimuth compression, leading-edge detection of the echo signal, maximum likelihood angle estimation, and Bayesian state estimation. Our results show that radar altimeter performance can be improved with a feedback loop from the rear-end navigation part.

  17. Data compression: The end-to-end information systems perspective for NASA space science missions

    NASA Technical Reports Server (NTRS)

    Tai, Wallace

    1991-01-01

    The unique characteristics of compressed data have important implications for the design of space science data systems, science applications, and data compression techniques. The sequential nature of, or data dependence between, the sample values within a block of compressed data introduces an error multiplication or propagation factor that compounds the effects of communication errors. The data communication characteristics of the onboard data acquisition, storage, and telecommunication channels may influence the size of the compressed blocks and the frequency of included re-initialization points. The organization of the compressed data is continually changing depending on the entropy of the input data. This also results in a variable output rate from the instrument, which may require buffering to interface with the spacecraft data system. On the ground, there exist key tradeoff issues associated with the distribution and management of the science data products when data compression techniques are applied in order to alleviate the constraints imposed by ground communication bandwidth and data storage capacity.

  18. Independent SCPS-TP development for fault-tolerant, end-to-end communication architectures

    NASA Astrophysics Data System (ADS)

    Edwards, E.; Lamorie, J.; Younghusband, D.; Brunet, C.; Hartman, L.

    2002-07-01

    A fully networked architecture provides for the distribution of computing elements, of all mission components, through the spacecraft. Each node is individually addressable through the network, and behaves as an independent entity. This level of communication also supports individualized Command and Data Handling (C&DH), as well as one-to-one transactions between spacecraft nodes and individual ground segment users. To be effective, fault-tolerance must be applied at the network data transport level, as well as the supporting layers below it. If the network provides fail-safe characteristics independent of the mission applications being executed, then developers need not build in their own systems to ensure network reliability. The Space Communications Protocol Standards (SCPS) were developed to provide robust communications in a space environment, while retaining compatibility with Internet data transport at the ground segment. Although SCPS is a standard of the Consultative Committee for Space Data Systems (CCSDS), the adoption of SCPS was initially delayed by US export regulations that prevented the distribution of reference code. This paper describes the development and test of a fully independent implementation of the SCPS Transport Protocol, SCPS-TP, which has been derived directly from the CCSDS specification. The performance of the protocol is described for a set of geostationary satellite tests, and these results are compared with those derived from network simulation and laboratory emulation. The work is placed in the context of a comprehensive, fault-tolerant network that potentially surpasses the failsafe performance of a traditional spacecraft control system under similar circumstances.

  19. EQUIP: end-to-end quantification of uncertainty for impacts prediction

    NASA Astrophysics Data System (ADS)

    Morse, A. P.; Challinor, A. J.; Equip Consortium

    2010-12-01

    Inherent uncertainties in climate prediction present a serious challenge to attempts to assess future impacts and adaptation options. Such assessments are critical to any policy decisions regarding investment in resources to ensure human and environmental wellbeing in the face of environmental change and a growing population. Current methods for quantifying uncertainty in projections of climate and its impacts tend to focus first on taking full account of uncertainty, with a subsequent step assessing utility. We argue that a new approach is required, whereby climate and impacts models are used to develop risk-based prediction systems that focus on the information content of models and utility for decision-making. Preliminary steps in this direction are explored, principally using the example of climate-induced changes in crop yield. The relative contribution of uncertainty in crop and climate simulation to the total uncertainty in projected yield changes is examined. A focus on governing bio-physical processes across a number of crop models is used to characterise the robustness of the results. Further development of this approach relies on the development of decision-focussed techniques that analyse sources of uncertainty and assess and improve the information content of models of climate and its impacts. Such an approach is significantly different from tagging impacts models onto climate models. It implies substantial interaction with other organisations and stakeholders from development NGOs to the insurance sector and policy makers. These interactions should be aimed at ensuring that the principal lead-times, and formats, for the impact projections are those relevant to decision-making. The EQUIP project, and its associated open network of scientists, aims to develop the approach outlined above. The project is examining the cascade of uncertainty from climate to impacts by conducting integrated analyses of a range of sectors, principally crops, marine ecosystems, water management, heat waves and droughts. The research includes assessment of the information content of climate model projections, combination of climate models and data-driven models to support decisions, and evaluation of the quality of climate and impacts predictions.

  20. An end-to-end data system for the Gamma-Ray Observatory

    NASA Astrophysics Data System (ADS)

    Hrastar, J.

    A data system, which includes parts in the orbiting Gamma-Ray Observatory and in its associated ground system, has been designed to rapidly deliver autonomous, packetized data to the science users. Data autonomy means that all of the data, including auxiliary data, necessary for processing is included in the data packet that leaves the spacecraft. The data packets leaving the spacecraft remain unopened until they reach the user. Handling the data at a packet rather than a byte level allows simpler and generic software. The data goes through the system more quickly, which in turn reduces cost.

  1. Privacy in Pharmacogenetics: An End-to-End Case Study of Personalized Warfarin Dosing

    PubMed Central

    Fredrikson, Matthew; Lantz, Eric; Jha, Somesh; Lin, Simon; Page, David; Ristenpart, Thomas

    2014-01-01

    We initiate the study of privacy in pharmacogenetics, wherein machine learning models are used to guide medical treatments based on a patient’s genotype and background. Performing an in-depth case study on privacy in personalized warfarin dosing, we show that suggested models carry privacy risks, in particular because attackers can perform what we call model inversion: an attacker, given the model and some demographic information about a patient, can predict the patient’s genetic markers. As differential privacy (DP) is an oft-proposed solution for medical settings such as this, we evaluate its effectiveness for building private versions of pharmacogenetic models. We show that DP mechanisms prevent our model inversion attacks when the privacy budget is carefully selected. We go on to analyze the impact on utility by performing simulated clinical trials with DP dosing models. We find that for privacy budgets effective at preventing attacks, patients would be exposed to increased risk of stroke, bleeding events, and mortality. We conclude that current DP mechanisms do not simultaneously improve genomic privacy while retaining desirable clinical efficacy, highlighting the need for new mechanisms that should be evaluated in situ using the general methodology introduced by our work. PMID:27077138

  2. End-to-end information system concept for the Mars Telecommunications Orbiter

    NASA Technical Reports Server (NTRS)

    Breidenthal, Julian C.; Edwards, Charles D.; Greenberg, Edward; Kazz, Greg J.; Noreen, Gary K.

    2006-01-01

    The Mars Telecommunications Orbiter (MTO) was intended to provide high-performance deep space relay links to landers, orbiters, sample-return missions, and approaching spacecraft in the vicinity of Mars, to demonstrate interplanetary laser communications, to demonstrate autonomous navigation, and to carry out its own science investigations.

  3. End-to-end information system concept for the Mars Telecommunications Orbiter

    NASA Technical Reports Server (NTRS)

    Breidenthal, Julian C.; Edwards, Charles D.; Greenberg, Edward; Kazz, Greg J.; Noreen, Gary K.

    2006-01-01

    The Mars Telecommunications Orbiter (MTO) was intended to provide high-performance deep space relay links to landers, orbiters, sample-return missions, and approaching spacecraft in the vicinity of Mars, to demonstrate interplanetary laser communications, to demonstrate autonomous navigation, and to carry out its own science investigations.

  4. An end-to-end pointing budget approach to planetary observing systems with application to EOS

    NASA Technical Reports Server (NTRS)

    Throckmorton, A.

    1993-01-01

    Previously published error budgets have focused on spacecraft error sources for pointing error and have tended to include only spacecraft pointing rather than the ultimate geolocation of each pixel of data onto a well-defined spot on the surface of the Earth. A systematic approach to geolocation error budgeting, including all contributors in the geolocation process, is presented. Its structure allows simultaneous expression of the needs of instrument teams as well as spacecraft design teams. It allows explicit acknowledgement of approximations made for on-board control, as well as the ultimate geolocation accuracy achievable after ground processing, and exploits the commonality inherent in the on-board and post-processing error budgets. It also includes the uncontrolled and unmeasured spacecraft jitter. This approach can be used to investigate the mission-wide benefit of a variety of design choices (such as on-board sensing and ground correction of measurements, contrasted with on-board correction of measurements). Additionally, in light of increasing accuracy requirements as sensor resolution improves, numerous non-spacecraft contributors to geolocation error are quantified.
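
    The budgeting arithmetic implied by this approach can be sketched as a root-sum-square combination of independent 1-sigma contributors, evaluated both for on-board control and after ground processing. The contributor names and magnitudes below are hypothetical, not EOS values.

    ```python
    import math

    def rss(contributors):
        """Root-sum-square of independent 1-sigma error contributors."""
        return math.sqrt(sum(v ** 2 for v in contributors.values()))

    # 1-sigma ground-projected errors in meters (hypothetical values)
    onboard = {"attitude knowledge": 120.0, "uncompensated jitter": 40.0,
               "ephemeris": 60.0, "timing": 10.0, "terrain height": 80.0}
    print(f"on-board budget:        {rss(onboard):.0f} m (1-sigma)")

    # Ground processing tightens attitude and ephemeris knowledge
    ground = dict(onboard, **{"attitude knowledge": 25.0, "ephemeris": 15.0})
    print(f"post-processing budget: {rss(ground):.0f} m (1-sigma)")
    ```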

  5. The Kepler End-to-End Data Pipeline: From Photons to Far Away Worlds

    NASA Technical Reports Server (NTRS)

    Cooke, Brian; Thompson, Richard; Standley, Shaun

    2012-01-01

    The Kepler mission is described in overview and the Kepler technique for discovering exoplanets is discussed. The design and implementation of the Kepler spacecraft, tracing the data path from photons entering the telescope aperture through raw observation data transmitted to the ground operations team, is described. The technical challenges of operating a large-aperture photometer with an unprecedented 95 million pixel detector are addressed, as well as the onboard technique for processing and reducing the large volume of data produced by the Kepler photometer. The technique and challenge of day-to-day mission operations that result in a very high percentage of time on target are discussed. This includes the day-to-day process for monitoring and managing the health of the spacecraft, the annual process for maintaining sun on the solar arrays while still keeping the telescope pointed at the fixed science target, the process for safely but rapidly returning to science operations after a spacecraft-initiated safing event, and the long-term anomaly resolution process. The ground data processing pipeline, from the point that science data is received on the ground to the presentation of preliminary planetary candidates and supporting data to the science team for further evaluation, is discussed. Ground management, control, exchange and storage of Kepler's large and growing data set is discussed, as well as the process and techniques for removing noise sources and applying calibrations to intermediate data products.

  6. An end-to-end workflow for engineering of biological networks from high-level specifications.

    PubMed

    Beal, Jacob; Weiss, Ron; Densmore, Douglas; Adler, Aaron; Appleton, Evan; Babb, Jonathan; Bhatia, Swapnil; Davidsohn, Noah; Haddock, Traci; Loyall, Joseph; Schantz, Richard; Vasilev, Viktor; Yaman, Fusun

    2012-08-17

    We present a workflow for the design and production of biological networks from high-level program specifications. The workflow is based on a sequence of intermediate models that incrementally translate high-level specifications into DNA samples that implement them. We identify algorithms for translating between adjacent models and implement them as a set of software tools, organized into a four-stage toolchain: Specification, Compilation, Part Assignment, and Assembly. The specification stage begins with a Boolean logic computation specified in the Proto programming language. The compilation stage uses a library of network motifs and cellular platforms, also specified in Proto, to transform the program into an optimized Abstract Genetic Regulatory Network (AGRN) that implements the programmed behavior. The part assignment stage assigns DNA parts to the AGRN, drawing the parts from a database for the target cellular platform, to create a DNA sequence implementing the AGRN. Finally, the assembly stage computes an optimized assembly plan to create the DNA sequence from available part samples, yielding a protocol for producing a sample of engineered plasmids with robotics assistance. Our workflow is the first to automate the production of biological networks from a high-level program specification. Furthermore, the workflow's modular design allows the same program to be realized on different cellular platforms simply by swapping workflow configurations. We validated our workflow by specifying a small-molecule sensor-reporter program and verifying the resulting plasmids in both HEK 293 mammalian cells and in E. coli bacterial cells. PMID:23651286

  7. End-to-end observatory software modeling using domain specific languages

    NASA Astrophysics Data System (ADS)

    Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José

    2014-07-01

    The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSL) that supports a model driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, that facilitate the construction of technical specifications in a uniform way, that facilitate communication between developers and domain experts and that provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.

  8. End-to-end Encryption for SMS Messages in the Health Care Domain.

    PubMed

    Hassinen, Marko; Laitinen, Pertti

    2005-01-01

    The health care domain has a high level of expectation on security and privacy of patient information. The security, privacy, and confidentiality issues are consistent all over the domain. Technical development and the increasing use of mobile phones have led us to a situation in which SMS messages are used in electronic interactions between health care professionals and patients. We will show that it is possible to send, receive, and store text messages securely with a mobile phone, with no additional hardware required. More importantly, we will show that it is possible to obtain reliable user authentication in systems using text message communication. The Java programming language is used to realize our goals. This paper describes the general application structure, while details of the technical implementation and encryption methods are described in the referenced articles. We also propose some crucial areas where the implementation of encrypted SMS can remedy a previous lack of security. PMID:16160278
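
    The paper's implementation is in Java on the handset; purely as an illustration of the core idea, the sketch below encrypts a short message payload with an authenticated cipher in Python, assuming the third-party cryptography package. Key distribution and user authentication, which the system handles separately, are out of scope here.

        # Minimal sketch of authenticated encryption for an SMS payload (an
        # illustration only, not the paper's Java implementation). Assumes the
        # 'cryptography' package; the shared key is presumed pre-exchanged.
        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=128)   # shared secret
        aesgcm = AESGCM(key)

        nonce = os.urandom(12)                      # must be unique per message
        plaintext = "Appointment moved to 09:30".encode()
        ciphertext = aesgcm.encrypt(nonce, plaintext, None)

        # The SMS would carry nonce + ciphertext; the receiver reverses the step.
        assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext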

  9. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC): the first is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.
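
    As a toy illustration of the second (programming-language) approach, the sketch below expresses an analysis as Python steps with declared dependencies and a tiny driver that runs them in order; all step names and payloads are hypothetical, and the DESC framework itself is not shown.

        # Minimal sketch of a Python-expressed analysis workflow: each step declares
        # its inputs, and a tiny driver runs steps in dependency order.
        # Step names and payloads are hypothetical illustrations.
        steps = {
            "ingest": (lambda d: {"catalog": [1.2, 0.8, 2.4]}, []),
            "calibrate": (lambda d: {"cal": [x * 0.97 for x in d["ingest"]["catalog"]]}, ["ingest"]),
            "analyze": (lambda d: {"mean": sum(d["calibrate"]["cal"]) / len(d["calibrate"]["cal"])}, ["calibrate"]),
        }

        results = {}
        def run(name):
            if name in results:
                return results[name]
            func, requires = steps[name]
            deps = {r: run(r) for r in requires}   # recursively satisfy dependencies
            results[name] = func(deps)
            return results[name]

        print(run("analyze"))   # executes ingest -> calibrate -> analyze once each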

  10. Building an advanced wireless end-to-end emergency medical system.

    PubMed

    Saddik, Basema; El-Masri, Samir

    2011-01-01

    Effective communication in healthcare is important and especially critical in emergency situations. In this paper we propose a new comprehensive emergency system which will facilitate the communication process in emergency cases from ambulance dispatch to the patient's arrival and handover in the hospital. The proposed system has been designed to facilitate and computerise all the processes involved in an accident from finding the nearest ambulance through to accessing a patient's online health record which can assist in pre-hospital treatments. The proposed system will also locate the nearest hospital specialising in the patient's condition and will communicate patient identification to the emergency department. The components of the proposed system and the technologies used in building this system are outlined in this paper as well as the challenges expected and proposed solutions to these challenges. PMID:21893922

  11. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.

    2010-06-01

    The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.

  12. Potential end-to-end imaging information rate advantages of various alternative communication systems

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1978-01-01

    Various communication systems were considered which are required to transmit both imaging and a typically error-sensitive class of data called general science/engineering (gse) over a Gaussian channel. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an Advanced Imaging Communication System (AICS) which exhibits the rather significant potential advantages of sophisticated data compression coupled with powerful yet practical channel coding.
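
    The joint benefit of compression and channel coding can be pictured with a back-of-the-envelope calculation: the end-to-end imaging information rate scales with both the fraction of Shannon capacity the channel code achieves and the compression ratio. The sketch below uses purely illustrative numbers, not figures from the report.

        # Back-of-envelope comparison of end-to-end imaging information rate for an
        # uncoded/uncompressed system vs. a coded+compressed one over a Gaussian
        # channel. All numbers are illustrative, not from the report.
        import math

        def capacity_bits_per_symbol(snr_db: float) -> float:
            """Shannon capacity C = log2(1 + SNR) for a Gaussian channel."""
            return math.log2(1.0 + 10 ** (snr_db / 10.0))

        channel_rate = 1e5       # channel symbols per second (assumed)
        snr_db = 3.0

        # Baseline: no channel coding forces operation well below capacity, and
        # every image bit is sent raw (compression ratio 1).
        baseline_info_rate = 0.3 * capacity_bits_per_symbol(snr_db) * channel_rate

        # AICS-style: practical coding approaches capacity, and 4:1 compression
        # multiplies the imaging information carried per transmitted bit.
        coded_info_rate = 0.9 * capacity_bits_per_symbol(snr_db) * channel_rate * 4.0

        print(f"baseline: {baseline_info_rate:,.0f} image-info bits/s")
        print(f"coded+compressed: {coded_info_rate:,.0f} image-info bits/s")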

  13. Integrating end-to-end encryption and authentication technology into broadband networks

    SciTech Connect

    Pierson, L.G.

    1995-11-01

    BISDN services will involve the integration of high speed data, voice, and video functionality delivered via technology similar to Asynchronous Transfer Mode (ATM) switching and SONET optical transmission systems. Customers of BISDN services may need a variety of data authenticity and privacy assurances via ATM services. Cryptographic methods can be used to assure authenticity and privacy, but are hard to scale for implementation at high speed. The incorporation of these methods into computer networks can severely impact functionality, reliability, and performance. While there are many design issues associated with the serving of public keys for authenticated signaling and for establishment of session cryptovariables, this paper is concerned with the impact of encryption itself on such communications once the signaling and setup have been completed. Network security protections should be carefully matched to the threats against which protection is desired. Even after eliminating unnecessary protections, the remaining customer-required network security protections can impose severe performance penalties. These penalties (further discussed below) usually involve increased communication processing for authentication or encryption, increased error rate, increased communication delay, and decreased reliability/availability. Protection measures involving encryption should be carefully engineered so as to impose the least performance, reliability, and functionality penalties while achieving the required security protection. To study these trade-offs, a prototype encryptor/decryptor was developed. This effort demonstrated the viability of implementing certain encryption techniques in high speed networks. The research prototype processes ATM cells in a SONET OC-3 payload. This paper describes the functionality, reliability, security, and performance design trade-offs investigated with the prototype.

  14. End-to-end imaging information rate advantages of various alternative communication systems

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1982-01-01

    The efficiencies of various deep space communication systems which are required to transmit both imaging and a typically error-sensitive class of data called general science and engineering (gse) are compared. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an advanced imaging communication system (AICS) which exhibits the rather significant advantages of sophisticated data compression coupled with powerful yet practical channel coding. For example, under certain conditions the improved AICS efficiency could provide as much as a two-order-of-magnitude increase in imaging information rate compared to a single channel uncoded, uncompressed system while maintaining the same gse data rate in both systems. Additional details describing AICS compression and coding concepts, as well as efforts to apply them, are provided in support of the system analysis.

  15. A Novel Vascular Coupling System for End-to-End Anastomosis.

    PubMed

    Li, Huizhong; Gale, Bruce K; Sant, Himanshu; Shea, Jill; David Bell, E; Agarwal, Jay

    2015-09-01

    Vascular anastomosis is common during reconstructive surgeries. Traditional hand-suturing techniques are time consuming, subject to human error, and require high technical expertise and complex instruments. Prior attempts to replace the hand-suturing technique, including staples, ring-pin devices, cuffing devices, and clips, are either more cumbersome, unable to maintain a tight seal, or unsuitable for both arteries and veins. To provide a more efficient and reliable vessel anastomosis, a metal-free vascular coupling system that can be used for both arteries and veins was designed, fabricated, and tested. A set of corresponding instruments was developed to facilitate the anastomosis process. Evaluation of the anastomosis by scanning electron microscopy and magnetic resonance imaging demonstrated that the installation process does not damage the vessel intima and that the vascular coupling system is not exposed to the vessel lumen. Mechanical testing results showed that vessels reconnected with the vascular coupling system could withstand 12.7 ± 2.2 N tensile force and have superior leak profiles (0.049 ± 0.015, 0.078 ± 0.016, 0.089 ± 0.008 mL/s at 160, 260, 360 mmHg, respectively) compared to hand-sutured vessels (0.310 ± 0.014, 1.123 ± 0.033, 2.092 ± 0.072 mL/s at 160, 260, 360 mmHg, respectively). The anastomotic process was successfully demonstrated on both arteries and veins in cadaver pigs. PMID:26577362

  16. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    SciTech Connect

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-01-01

    The scientific discovery process is advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC): the first is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  17. Building an End-to-end System for Long Term Soil Monitoring

    NASA Astrophysics Data System (ADS)

    Szlavecz, K.; Terzis, A.; Musaloiu-E., R.; Cogan, J.; Szalay, A.; Gray, J.

    2006-05-01

    We have developed and deployed an experimental soil monitoring system in an urban forest. Wireless sensor nodes collect data on soil temperature, soil moisture, air temperature, and light. Data are uploaded into a SQL Server database, where they are calibrated and reorganized into an OLAP data cube. The data are accessible on-line using a web services interface with various visual tools. Our prototype system of ten nodes has been live since Sep 2005, and in 5 months of operation over 6 million measurements have been collected. At a high level, our experiment was a success: we detected variations in soil condition corresponding to topography and external environmental parameters as expected. However, we encountered a number of challenging technical problems: the need for low-level programming at multiple levels, calibration across space and time, and cross-referencing of measurements with external sources. Based upon the experience with this system we are now deploying 200 more nodes, with close to a thousand sensors, spread over multiple sites in the context of the Baltimore Ecosystem Study LTER.
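
    A minimal sketch of the calibrate-and-aggregate step described above, assuming hypothetical per-sensor gain/offset coefficients; the deployed system performs the equivalent inside its SQL Server/OLAP pipeline.

        # Minimal sketch of per-sensor calibration and hourly aggregation of raw
        # sensor counts. Coefficients and readings are hypothetical stand-ins.
        from collections import defaultdict
        from statistics import mean

        calib = {"node7_soilT": (0.05, -10.0)}   # raw -> degC: (gain, offset)

        readings = [  # (sensor, hour, raw_count)
            ("node7_soilT", 14, 512), ("node7_soilT", 14, 520), ("node7_soilT", 15, 498),
        ]

        by_hour = defaultdict(list)
        for sensor, hour, raw in readings:
            gain, offset = calib[sensor]
            by_hour[(sensor, hour)].append(gain * raw + offset)   # apply calibration

        for (sensor, hour), vals in sorted(by_hour.items()):
            print(sensor, f"{hour:02d}:00", f"{mean(vals):.2f} degC")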

  18. Experiments with Memory-to-Memory Coupling for End-to-End fusion Simulation Workflows

    SciTech Connect

    Docan, Ciprian; Zhang, Fan; Parashar, Manish; Cummings, Julian; Podhorszki, Norbert; Klasky, Scott A

    2010-01-01

    Scientific applications are striving to accurately simulate multiple interacting physical processes that comprise complex phenomena being modeled. Efficient and scalable parallel implementations of these coupled simulations present challenging interaction and coordination requirements, especially when the coupled physical processes are computationally heterogeneous and progress at different speeds. In this paper, we present the design, implementation and evaluation of a memory-to-memory coupling framework for coupled scientific simulations on high-performance parallel computing platforms. The framework is driven by the coupling requirements of the Center for Plasma Edge Simulation, and it provides simple coupling abstractions as well as efficient asynchronous (RDMA-based) memory-to-memory data transport mechanisms that complement existing parallel programming systems and data sharing frameworks. The framework enables flexible coupling behaviors that are asynchronous in time and space, and it supports dynamic coupling between heterogeneous simulation processes without enforcing any synchronization constraints. We evaluate the performance and scalability of the coupling framework using a specific coupling scenario, on the Jaguar Cray XT5 system at Oak Ridge National Laboratory.

  19. An end-to-end data system for the Gamma-Ray Observatory

    NASA Technical Reports Server (NTRS)

    Hrastar, J.

    1983-01-01

    A data system, which includes parts in the orbiting Gamma-Ray Observatory and in its associated ground system, has been designed to rapidly deliver autonomous, packetized data to the science users. Data autonomy means that all of the data, including auxiliary data, necessary for processing are included in the data packet that leaves the spacecraft. The data packets leaving the spacecraft remain unopened until they reach the user. Handling the data at the packet rather than the byte level allows simpler and generic software, and the data go through the system more quickly, which in turn reduces cost.

  20. Comparison of Reconstruction and Control algorithms on the ESO end-to-end simulator OCTOPUS

    NASA Astrophysics Data System (ADS)

    Montilla, I.; Béchet, C.; Lelouarn, M.; Correia, C.; Tallon, M.; Reyes, M.; Thiébaut, É.

    Extremely Large Telescopes are very challenging with respect to their Adaptive Optics requirements. Their diameters, the specifications demanded by the science for which they are being designed, and the planned use of Extreme Adaptive Optics systems imply a huge increase in the number of degrees of freedom in the deformable mirrors. It is necessary to study new reconstruction algorithms to implement the real-time control in Adaptive Optics at the required speed. We have studied the performance, applied to the case of the European ELT, of three different algorithms: the matrix-vector multiplication (MVM) algorithm, considered as a reference; the Fractal Iterative Method (FrIM); and the Fourier Transform Reconstructor (FTR). The algorithms have been tested on ESO's OCTOPUS software, which simulates the atmosphere, the deformable mirror, the sensor and the closed-loop control. The MVM is the default reconstruction and control method implemented in OCTOPUS, but it scales as O(N²) operations per loop, so it is not considered a fast algorithm for wave-front reconstruction and control on an Extremely Large Telescope. The two other methods are the fast algorithms studied in the E-ELT Design Study. The performance, as well as the response in the presence of noise and under various atmospheric conditions, has been compared using a Single Conjugate Adaptive Optics configuration for a 42 m diameter ELT, with a total of 5402 actuators. These comparisons, made on a common simulator, highlight the pros and cons of the various methods and give us a better understanding of the type of reconstruction algorithm that an ELT demands.
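
    The baseline MVM reconstructor is simply a precomputed reconstructor matrix applied to the slope vector on every loop iteration, which is exactly what makes its O(N²) per-loop cost prohibitive at ELT scale. A minimal numpy sketch with toy dimensions (far below the 5402 actuators above):

        # Minimal sketch of the baseline matrix-vector-multiply (MVM) reconstructor:
        # actuator commands = precomputed reconstructor matrix x measured slopes.
        # Dimensions and values are toy stand-ins, not the OCTOPUS configuration.
        import numpy as np

        rng = np.random.default_rng(0)
        n_slopes, n_actuators = 800, 400

        R = rng.standard_normal((n_actuators, n_slopes)) * 1e-3  # computed offline
        gain = 0.5                                               # integrator gain

        commands = np.zeros(n_actuators)
        for _ in range(10):                           # closed-loop iterations
            slopes = rng.standard_normal(n_slopes)    # WFS measurement (stand-in)
            commands += gain * (R @ slopes)           # the O(N^2) step per loop

        print(commands[:5])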

  1. Assessing Natural Product-Drug Interactions: An End-to-End Safety Framework.

    PubMed

    Roe, Amy L; Paine, Mary F; Gurley, Bill J; Brouwer, Kenneth R; Jordan, Scott; Griffiths, James C

    2016-04-01

    The use of natural products (NPs), including herbal medicines and other dietary supplements, by North Americans continues to increase across all age groups. This population has access to conventional medications, with significant polypharmacy observed in older adults. Thus, the safety of the interactions between multi-ingredient NPs and drugs is a topic of paramount importance. Considerations such as history of safe use, literature data from animal toxicity and human clinical studies, and NP constituent characterization would provide guidance on whether to assess NP-drug interactions experimentally. The literature is replete with reports of various NP extracts and constituents as potent inhibitors of drug-metabolizing enzymes and transporters. However, without standard methods for NP characterization or in vitro testing, extrapolating these reports to clinically-relevant NP-drug interactions is difficult. This lack of a clear definition of risk precludes clinicians and consumers from making informed decisions about the safety of taking NPs with conventional medications. A framework is needed that describes an integrated, robust approach for assessing NP-drug interactions, and for translating the data into formulation alterations, dose adjustment, labelling, and/or post-marketing surveillance strategies. A session was held at the 41st Annual Summer Meeting of the Toxicology Forum in Colorado Springs, CO, to highlight the challenges and critical components that should be included in a framework approach. PMID:26776752

  2. Assessing the Performance Limits of Internal Coronagraphs Through End-to-End Modeling

    NASA Technical Reports Server (NTRS)

    Krist, John E.; Belikov, Ruslan; Pueyo, Laurent; Mawet, Dimitri P.; Moody, Dwight; Trauger, John T.; Shaklan, Stuart B.

    2013-01-01

    As part of the NASA ROSES Technology Demonstrations for Exoplanet Missions (TDEM) program, we conducted a numerical modeling study of three internal coronagraphs (PIAA, vector vortex, hybrid bandlimited) to understand their behaviors in realistically-aberrated systems with wavefront control (deformable mirrors). This investigation consisted of two milestones: (1) develop wavefront propagation codes appropriate for each coronagraph that are accurate to 1% or better (compared to a reference algorithm) but are also time and memory efficient, and (2) use these codes to determine the wavefront control limits of each architecture. We discuss here how the milestones were met and identify some of the behaviors particular to each coronagraph. The codes developed in this study are being made available for community use. We discuss here results for the HBLC and VVC systems, with PIAA having been discussed in a previous proceeding.

  3. Science and Applications Space Platform (SASP) End-to-End Data System Study

    NASA Technical Reports Server (NTRS)

    Crawford, P. R.; Kasulka, L. H.

    1981-01-01

    The capability of present technology and the Tracking and Data Relay Satellite System (TDRSS) to accommodate Science and Applications Space Platforms (SASP) payload user's requirements, maximum service to the user through optimization of the SASP Onboard Command and Data Management System, and the ability and availability of new technology to accommodate the evolution of SASP payloads were assessed. Key technology items identified to accommodate payloads on a SASP were onboard storage devices, multiplexers, and onboard data processors. The primary driver is the limited access to TDRSS for single access channels due to sharing with all the low Earth orbit spacecraft plus shuttle. Advantages of onboard data processing include long term storage of processed data until TRDSS is accessible, thus reducing the loss of data, eliminating large data processing tasks at the ground stations, and providing a more timely access to the data.

  4. OTRA-THS MAC to reduce Power Outage Data Collection Latency in a smart meter network

    SciTech Connect

    Garlapati, Shravan K; Kuruganti, Phani Teja; Buehrer, Richard M; Reed, Jeffrey H

    2014-01-01

    The deployment of advanced metering infrastructure by the electric utilities poses unique communication challenges, particularly as the number of meters per aggregator increases. During a power outage, a smart meter tries to report it instantaneously to the electric utility. In a densely populated residential/industrial locality, it is possible that a large number of smart meters simultaneously try to access the communication network to report the power outage. If the number of smart meters is very high, on the order of tens of thousands (as in metropolitan areas), the power outage data flooding can lead to Random Access CHannel (RACH) congestion. Several utilities are considering the use of cellular networks for smart meter communications. In 3G/4G cellular networks, RACH congestion not only leads to collisions, retransmissions, and increased RACH delays, but also has the potential to disrupt the dedicated traffic flow by increasing the interference levels (3G CDMA). In order to overcome this problem, in this paper we propose a Time Hierarchical Scheme (THS) that reduces the intensity of power outage data flooding and the power outage reporting delay by 6/7th and 17/18th, respectively, compared to their values without THS. Also, we propose an Optimum Transmission Rate Adaptive (OTRA) MAC to optimize the latency in power outage data collection. The analysis and simulation results presented in this paper show that both the OTRA and THS features of the proposed MAC result in a Power Outage Data Collection Latency (PODCL) that is 1/10th of the 4G LTE PODCL.
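
    The time-hierarchical idea can be illustrated by staggering meters across reporting tiers so that only a fraction of them hit the RACH in any one slot. The sketch below is a minimal illustration with invented tier counts and slot lengths, not the paper's OTRA-THS MAC.

        # Minimal sketch of time-hierarchical outage reporting: meters are divided
        # into tiers, and each tier waits for its slot before attempting RACH
        # access, thinning the instantaneous load. All parameters are invented.
        import random

        n_meters = 30000
        n_tiers = 18               # spread outage reports over 18 slots
        slot_ms = 250

        def report_time_ms(meter_id: int) -> float:
            tier = meter_id % n_tiers                   # static tier assignment
            jitter = random.uniform(0, slot_ms)         # randomize within the slot
            return tier * slot_ms + jitter

        per_slot = [0] * n_tiers
        for m in range(n_meters):
            per_slot[int(report_time_ms(m) // slot_ms)] += 1

        print("max simultaneous attempts per slot:", max(per_slot),
              "vs", n_meters, "without tiering")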

  5. CernVM: Minimal maintenance approach to virtualization

    NASA Astrophysics Data System (ADS)

    Buncic, Predrag; Aguado-Sanchez, Carlos; Blomer, Jakob; Harutyunyan, Artem

    2011-12-01

    CernVM is a virtual software appliance designed to support the development cycle and provide a runtime environment for the LHC experiments. It consists of three key components that differentiate it from more traditional virtual machines: a minimal Linux Operating System (OS), a specially tuned file system designed to deliver application software on demand, and contextualization tools that provide a means to easily customize and configure CernVM instances for different tasks and user communities. In this contribution we briefly describe the most important use cases for virtualization in High Energy Physics (HEP), CernVM key components and discuss how end-to-end systems corresponding to these use cases can be realized using CernVM.

  6. Distributed Power-Line Outage Detection Based on Wide Area Measurement System

    PubMed Central

    Zhao, Liang; Song, Wen-Zhan

    2014-01-01

    In modern power grids, the fast and reliable detection of power-line outages is an important functionality, which prevents cascading failures and facilitates an accurate state estimation to monitor the real-time conditions of the grids. However, most of the existing approaches for outage detection suffer from two drawbacks, namely: (i) high computational complexity; and (ii) relying on a centralized means of implementation. The high computational complexity limits the practical usage of outage detection to the case of single-line or double-line outages. Meanwhile, the centralized means of implementation raises security and privacy issues. Considering these drawbacks, the present paper proposes a distributed framework, which carries out in-network information processing and only shares estimates on boundaries with the neighboring control areas. This novel framework relies on a convex-relaxed formulation of the line outage detection problem and leverages the alternating direction method of multipliers (ADMM) for its distributed solution. The proposed framework has low computational complexity, requiring only linear and simple matrix-vector operations. We also extend this framework to incorporate the sparse property of the measurement matrix and employ the LSQR algorithm to enable a warm start, which further accelerates the algorithm. Analysis and simulation tests validate the correctness and effectiveness of the proposed approaches. PMID:25051035
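
    The convex-relaxed formulation reduces outage detection to sparse recovery, which ADMM solves with exactly the kind of cheap matrix-vector operations the abstract highlights. Below is a minimal centralized lasso-style ADMM sketch; the measurement matrix is a synthetic stand-in for the grid model, and the paper's distributed, multi-area version is not reproduced.

        # Minimal ADMM sketch for a sparse (lasso-style) recovery problem of the
        # kind behind convex-relaxed line outage detection: find a sparse change
        # vector x from measurements b = A x + noise. A and b are synthetic.
        import numpy as np

        rng = np.random.default_rng(1)
        m, n = 60, 100
        A = rng.standard_normal((m, n))
        x_true = np.zeros(n); x_true[[7, 42]] = [1.5, -2.0]   # two outaged lines
        b = A @ x_true + 0.01 * rng.standard_normal(m)

        lam, rho = 0.1, 1.0
        x = z = u = np.zeros(n)
        AtA_inv = np.linalg.inv(A.T @ A + rho * np.eye(n))    # factor once
        Atb = A.T @ b

        for _ in range(200):
            x = AtA_inv @ (Atb + rho * (z - u))               # quadratic step
            z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0)  # shrinkage
            u = u + x - z                                     # dual update

        print("detected lines:", np.nonzero(np.abs(z) > 0.5)[0])   # expect [7 42]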

  7. Detecting Power Outages with the VIIRS DNB Images - potentials and challenges

    NASA Astrophysics Data System (ADS)

    Cao, C.; Uprety, S.; Shao, X.

    2012-12-01

    Power outages after a major storm or hurricane affect millions of people. The launch of the Suomi NPP with the VIIRS significantly enhances our capability to monitor and detect power outages on a daily basis with the Day Night Band (DNB), which outperforms the traditional OLS on DMSP satellites in both spatial and radiometric resolution. This study explores the use of the DNB for detecting power outages in the Washington DC metropolitan area in June 2012, the largest non-hurricane power outage in the region's history, with millions of people losing power and states of emergency declared in some states such as Virginia. The DNB data were analyzed for the period one week before and after the storm. The light loss is estimated through image differencing techniques for spatial patterns, as well as total radiance and irradiance changes as a time series. The effects of cloud absorption and scattering are evaluated using the cloud masks from VIIRS products, and the long wave thermal infrared images are also used to assist the assessment. The results show that the DNB data are very useful for both spatial and radiometric detection of light loss, but also present some challenges due to clouds and the known terminator straylight effect of the instrument for the region during summer solstice. It is expected that further refinements in the methodology will significantly reduce the uncertainties. A VIIRS Data Robotics system is also being developed which will allow the routine detection of power outages for any given location worldwide.
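
    The core detection step, image differencing of pre- and post-event DNB radiance with cloudy pixels excluded, can be sketched in a few lines of numpy; the arrays below are synthetic stand-ins for real DNB granules.

        # Minimal sketch of DNB image differencing for outage detection: compute
        # fractional light loss between pre- and post-storm radiance, skipping
        # cloud-contaminated pixels. All arrays are synthetic stand-ins.
        import numpy as np

        rng = np.random.default_rng(2)
        pre = rng.uniform(5, 50, size=(100, 100))     # pre-storm radiance composite
        post = pre.copy()
        post[30:60, 20:45] *= 0.2                     # simulated blacked-out district

        cloud_mask = rng.random((100, 100)) < 0.05    # True where cloud-contaminated

        loss = (pre - post) / pre                     # fractional light loss
        valid = ~cloud_mask                           # exclude cloudy pixels
        print("pixels with >50% light loss:", np.count_nonzero(loss[valid] > 0.5))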

  8. Status Report on the Development of Micro-Scheduling Software for the Advanced Outage Control Center Project

    SciTech Connect

    Shawn St. Germain; Kenneth Thomas; Ronald Farris; Jeffrey Joe

    2014-09-01

    The long-term viability of existing nuclear power plants (NPPs) in the United States (U.S.) is dependent upon a number of factors, including maintaining high capacity factors, maintaining nuclear safety, and reducing operating costs, particularly those associated with refueling outages. Refueling outages typically take 20-30 days, and for existing light water NPPs in the U.S., the reactor cannot be in operation during the outage. Furthermore, given that many NPPs generate between $1-1.5 million/day in revenue when in operation, there is considerable interest in shortening the length of refueling outages. Yet, refueling outages are highly complex operations, involving multiple concurrent and dependent activities that are difficult to coordinate. Finding ways to improve refueling outage performance while maintaining nuclear safety has proven to be difficult. The Advanced Outage Control Center project is a research and development (R&D) demonstration activity under the Light Water Reactor Sustainability (LWRS) Program. LWRS is an R&D program which works with industry R&D programs to establish technical foundations for the licensing and managing of long-term, safe, and economical operation of current NPPs. The Advanced Outage Control Center project has the goal of improving the management of commercial NPP refueling outages. To accomplish this goal, this INL R&D project is developing an advanced outage control center (OCC) that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report describes specific recent efforts to develop a capability called outage Micro-Scheduling. Micro-Scheduling is the ability to allocate and schedule outage support task resources on a sub-hour basis. Micro-Scheduling is the real-time fine-tuning of the outage schedule to react to the actual progress of the primary outage activities to ensure that support task resources are

  9. Minimal cosmography

    NASA Astrophysics Data System (ADS)

    Piazza, Federico; Schücker, Thomas

    2016-04-01

    The minimal requirement for cosmography—a non-dynamical description of the universe—is a prescription for calculating null geodesics, and time-like geodesics as a function of their proper time. In this paper, we consider the most general linear connection compatible with homogeneity and isotropy, but not necessarily with a metric. A light-cone structure is assigned by choosing a set of geodesics representing light rays. This defines a "scale factor" and a local notion of distance, as that travelled by light in a given proper time interval. We find that the velocities and relativistic energies of free-falling bodies decrease in time as a consequence of cosmic expansion, but at a rate that can be different than that dictated by the usual metric framework. By extrapolating this behavior to photons' redshift, we find that the latter is in principle independent of the "scale factor". Interestingly, redshift-distance relations and other standard geometric observables are modified in this extended framework, in a way that could be experimentally tested. An extremely tight constraint on the model, however, is represented by the blackbody-ness of the cosmic microwave background. Finally, as a check, we also consider the effects of a non-metric connection in a different set-up, namely, that of a static, spherically symmetric spacetime.

  10. Nuclear Safety Risk Management in Refueling Outage of Qinshan Nuclear Power Plant

    SciTech Connect

    Meijing Wu; Guozhang Shen

    2006-07-01

    An NPP typically plans maintenance, in-service inspection, surveillance testing, fuel handling, and design modifications during the refueling outage; at that time the operators' response capability is reduced, while some plant systems are out of service or without power. Based on experience from eight refueling outages of the Qinshan NPP, this article provides good practices and lessons learned for nuclear safety risk management, focused on four safety function areas: residual heat removal capability, inventory control, power availability, and reactivity control. (authors)

  11. 10 CFR 501.191 - Use of natural gas or petroleum for certain unanticipated equipment outages and emergencies...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 4 2014-01-01 2014-01-01 false Use of natural gas or petroleum for certain unanticipated... Natural Gas or Petroleum for Emergency and Unanticipated Equipment Outage Purposes § 501.191 Use of natural gas or petroleum for certain unanticipated equipment outages and emergencies defined in...

  12. 10 CFR 501.191 - Use of natural gas or petroleum for certain unanticipated equipment outages and emergencies...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Use of natural gas or petroleum for certain unanticipated... Natural Gas or Petroleum for Emergency and Unanticipated Equipment Outage Purposes § 501.191 Use of natural gas or petroleum for certain unanticipated equipment outages and emergencies defined in...

  13. 10 CFR 501.191 - Use of natural gas or petroleum for certain unanticipated equipment outages and emergencies...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 4 2013-01-01 2013-01-01 false Use of natural gas or petroleum for certain unanticipated... Natural Gas or Petroleum for Emergency and Unanticipated Equipment Outage Purposes § 501.191 Use of natural gas or petroleum for certain unanticipated equipment outages and emergencies defined in...

  14. 10 CFR 501.191 - Use of natural gas or petroleum for certain unanticipated equipment outages and emergencies...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 4 2012-01-01 2012-01-01 false Use of natural gas or petroleum for certain unanticipated... Natural Gas or Petroleum for Emergency and Unanticipated Equipment Outage Purposes § 501.191 Use of natural gas or petroleum for certain unanticipated equipment outages and emergencies defined in...

  15. 10 CFR 501.191 - Use of natural gas or petroleum for certain unanticipated equipment outages and emergencies...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 4 2011-01-01 2011-01-01 false Use of natural gas or petroleum for certain unanticipated... Natural Gas or Petroleum for Emergency and Unanticipated Equipment Outage Purposes § 501.191 Use of natural gas or petroleum for certain unanticipated equipment outages and emergencies defined in...

  16. 76 FR 33686 - Proposed Extension of Part 4 of the Commission's Rules Regarding Outage Reporting to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-09

    ...The purpose of this document is to seek comment on a proposal to extend the Commission's communications outage reporting requirements to interconnected Voice over Internet Protocol (VoIP) service providers and broadband Internet Service Providers (ISPs). This action will help ensure that our current and future 9-1-1 systems are as reliable and resilient as possible and assist our Nation's...

  17. MIMO capacities and outage probabilities in spatially multiplexed optical transport systems.

    PubMed

    Winzer, Peter J; Foschini, Gerard J

    2011-08-15

    With wavelength-division multiplexing (WDM) rapidly nearing its scalability limits, space-division multiplexing (SDM) seems the only option to further scale the capacity of optical transport networks. In order for SDM systems to continue the WDM trend of reducing energy and cost per bit with system capacity, integration will be key to SDM. Since integration is likely to introduce non-negligible crosstalk between multiple parallel transmission paths, multiple-input multiple-output (MIMO) signal processing techniques will have to be used. In this paper, we discuss MIMO capacities in optical SDM systems, including related outage considerations which are an important part in the design of such systems. In order to achieve the low-outage standards required for optical transport networks, SDM transponders should be capable of individually addressing, and preferably MIMO processing, all modes supported by the optical SDM waveguide. We then discuss the effect of distributed optical noise in MIMO SDM systems and focus on the impact of mode-dependent loss (MDL) on system capacity and system outage. Through extensive numerical simulations, we extract scaling rules for mode-average and mode-dependent loss and show that MIMO SDM systems composed of up to 128 segments and supporting up to 128 modes can tolerate up to 1 dB of per-segment MDL at 90% of the system's full capacity at an outage probability of 10⁻⁴. PMID:21935030
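
    Outage probability here is the probability that the random channel's capacity falls below a target rate. A minimal Monte Carlo sketch for an N×N Rayleigh channel, with illustrative sizes and SNR (the paper's segment-by-segment MDL model is not reproduced):

        # Minimal Monte Carlo sketch of MIMO capacity and outage probability for an
        # N x N Rayleigh-fading channel: C = log2 det(I + (SNR/N) H H^H).
        # Sizes, SNR, and target rate are illustrative stand-ins.
        import numpy as np

        rng = np.random.default_rng(3)
        N, snr, trials = 4, 10.0, 20000
        target_rate = 8.0                      # bits/s/Hz deemed "in service"

        caps = np.empty(trials)
        for t in range(trials):
            H = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(2)
            M = np.eye(N) + (snr / N) * (H @ H.conj().T)
            caps[t] = np.log2(np.linalg.det(M).real)

        print(f"mean capacity: {caps.mean():.2f} bit/s/Hz")
        print(f"outage P(C < {target_rate}): {(caps < target_rate).mean():.4f}")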

  18. 77 FR 63757 - Extension of the Commission's Rules Regarding Outage Reporting to Interconnected Voice Over...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-17

    ..., regarding Interconnected Voice over Internet Protocol (VoIP) outage reporting rules, published at 77 FR..., and 4.9 published at 77 FR 25088, April 27, 2012, are effective December 16, 2012. FOR FURTHER... Voice Over Internet Protocol Service Providers and Broadband Internet Service Providers AGENCY:...

  19. Application of Hybrid Geo-Spatially Granular Fragility Curves to Improve Power Outage Predictions

    SciTech Connect

    Fernandez, Steven J; Allen, Melissa R; Omitaomu, Olufemi A; Walker, Kimberly A

    2014-01-01

    Fragility curves depict the relationship between a weather variable (wind speed, gust speed, ice accumulation, precipitation rate) and the observed outages for a targeted infrastructure network. This paper describes an empirical study of the county-by-county distribution of power outages and one-minute weather variables during Hurricane Irene, with the objective of comparing 1) as-built fragility curves (statistical approach) to engineering as-designed (bottom-up) fragility curves for skill in forecasting outages during future hurricanes; 2) county-specific fragility curves to find examples of significant deviation from average behavior; and 3) the engineering practices of outlier counties to suggest future engineering studies of robustness. Outages in more than 90% of the impacted counties could be anticipated through an average or generic fragility curve. The remaining counties could be identified and handled as exceptions through geographic data sets. The counties with increased or decreased robustness were characterized by terrain more or less susceptible to persistent flooding in areas where above-ground poles located their foundations. Land use characteristics of the area served by the power distribution system can suggest trends in the as-built power grid's vulnerabilities to extreme weather events that would be subjects for site-specific studies.
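
    An "as built" fragility curve of this kind can be fit directly from observed (wind speed, outage fraction) pairs; the sketch below fits a logistic curve with scipy to made-up county observations standing in for the Hurricane Irene data.

        # Minimal sketch of fitting an empirical ("as built") fragility curve:
        # outage fraction vs. wind speed, modeled as a logistic function.
        # The observation pairs are made-up stand-ins for county-level data.
        import numpy as np
        from scipy.optimize import curve_fit

        def fragility(v, v50, k):
            """P(outage) as a logistic function of wind speed v (m/s)."""
            return 1.0 / (1.0 + np.exp(-k * (v - v50)))

        wind = np.array([10, 15, 20, 25, 30, 35, 40, 45], dtype=float)
        outage_frac = np.array([0.01, 0.03, 0.10, 0.25, 0.55, 0.75, 0.90, 0.96])

        (v50, k), _ = curve_fit(fragility, wind, outage_frac, p0=[30.0, 0.2])
        print(f"wind speed at 50% outage: {v50:.1f} m/s, steepness k={k:.2f}")
        print(f"predicted outage fraction at 38 m/s: {fragility(38.0, v50, k):.2f}")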

  20. 47 CFR 4.5 - Definitions of outage, special offices and facilities, and 911 special facilities.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Definitions of outage, special offices and facilities, and 911 special facilities. 4.5 Section 4.5 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL DISRUPTIONS TO COMMUNICATIONS Reporting Requirements for Disruptions to Communications §...

  1. 47 CFR 4.5 - Definitions of outage, special offices and facilities, and 911 special facilities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Definitions of outage, special offices and facilities, and 911 special facilities. 4.5 Section 4.5 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL DISRUPTIONS TO COMMUNICATIONS Reporting Requirements for Disruptions to Communications §...

  2. Effects of the April 1st, 2014 GLONASS Outage on GNSS Receivers

    NASA Astrophysics Data System (ADS)

    Blume, F.; Berglund, H. T.; Romero, I.; D'Anastasio, E.

    2014-12-01

    The use of multi-constellation GNSS receivers has been assumed to be a way to increase system integrity, both through increased coverage during normal operations and through failover redundancy in the event of a constellation failure. At approximately 21:00 UTC on April 1st the entire GLONASS constellation was disrupted as illegal ephemeris data uploaded to each satellite took effect simultaneously. The outage continued for more than 10 hours. While the ephemerides were incorrect, pseudoranges were correctly broadcast on both L1 and L2 and carrier phases were not affected; in the best case, GNSS receivers could be expected to continue to track all signals including GLONASS, and at the worst to continue to track GPS and other constellations. It became clear to operators of the GeoNet network in New Zealand that the majority of their 79 GLONASS-enabled receivers experienced total tracking failures. Further detailed analysis of data from these and 315 additional GLONASS-enabled stations worldwide showed that receiver tracking behavior was affected for most receiver brands and models, both for GLONASS and GPS. Findings regarding the impacts of the GLONASS outage on receiver behavior will be highlighted. We use data recorded by GLONASS-enabled global sites for the days during, preceding, and following the outage to evaluate the impact of the outage on tracking and positioning performance. We observe that for some receiver types the onboard receiver autonomous integrity monitoring (RAIM) failed to ignore the incorrect messages, resulting in degraded GLONASS and GPS tracking and in some cases complete tracking failures and significant data loss. In addition, many of the receivers with clock steering enabled showed outliers in their receiver clock bias estimates that also coincided with the outage. Our results show in detail how different brands, configurations, and distributions of receivers were affected to varying extents, but no common factors are apparent. This event shows that many manufacturers

  3. Method for estimating power outages and restoration during natural and man-made events

    DOEpatents

    Omitaomu, Olufemi A.; Fernandez, Steven J.

    2016-01-05

    A method of modeling electric supply and demand with a data processor in combination with a recordable medium, and for estimating spatial distribution of electric power outages and affected populations. A geographic area is divided into cells to form a matrix. Within the matrix, supply cells are identified as containing electric substations and demand cells are identified as including electricity customers. Demand cells of the matrix are associated with the supply cells as a function of the capacity of each of the supply cells and the proximity and/or electricity demand of each of the demand cells. The method includes estimating a power outage by applying disaster event prediction information to the matrix, and estimating power restoration using the supply and demand cell information of the matrix and standardized and historical restoration information.
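
    The association step can be pictured as assigning each demand cell to the nearest substation cell that still has capacity for its load; the sketch below uses made-up coordinates, capacities, and demands, and simplifies the patent's proximity/capacity weighting to a nearest-feasible rule.

        # Minimal sketch of supply/demand cell association: each demand cell is
        # assigned to the nearest substation cell with remaining capacity.
        # Coordinates, capacities, and loads are made-up toy values.
        import math

        supply = {"subA": {"xy": (2, 3), "capacity": 50.0},
                  "subB": {"xy": (8, 7), "capacity": 35.0}}
        demand = [{"xy": (3, 3), "load": 15.0}, {"xy": (7, 6), "load": 20.0},
                  {"xy": (2, 5), "load": 18.0}, {"xy": (9, 8), "load": 10.0}]

        assignment = []
        for cell in demand:
            ranked = sorted(supply, key=lambda s: math.dist(supply[s]["xy"], cell["xy"]))
            for s in ranked:                          # nearest feasible substation
                if supply[s]["capacity"] >= cell["load"]:
                    supply[s]["capacity"] -= cell["load"]
                    assignment.append((cell["xy"], s))
                    break

        print(assignment)
        # An outage estimate then removes a substation and sums the loads of the
        # demand cells that were associated with it.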

  4. Recent Performance of and Plasma Outage Studies with the SNS H- Source

    SciTech Connect

    Stockli, Martin P; Han, Baoxi; Murray Jr, S N; Pennisi, Terry R; Piller, Chip; Santana, Manuel; Welton, Robert F

    2016-01-01

    SNS ramps to higher power levels that can be sustained with high availability. The goal is 1.4 MW despite a compromised RFQ, which requires higher RF power than design levels to approach the nominal beam transmission. Unfortunately at higher power the RFQ often loses its thermal stability, a problem apparently enhanced by beam losses and high influxes of hydrogen. Delivering as much H- beam as possible with the least amount of hydrogen led to plasma outages. The root cause is the dense 1-ms long ~55-kW 2-MHz plasma pulses reflecting ~90% of the continuous ~300W, 13-MHz power, which was mitigated with a 4-ms filter for the reflected power signal and an outage resistant, slightly-detuned 13-MHz match. Lowering the H2 also increased the H- beam current to ~55 mA, and increased the transmission by ~7%.

  5. Recent performance of and plasma outage studies with the SNS H- source

    NASA Astrophysics Data System (ADS)

    Stockli, M. P.; Han, B.; Murray, S. N.; Pennisi, T. R.; Piller, C.; Santana, M.; Welton, R.

    2016-02-01

    Spallation Neutron Source ramps to higher power levels that can be sustained with high availability. The goal is 1.4 MW despite a compromised radio frequency quadrupole (RFQ), which requires higher radio frequency power than design levels to approach the nominal beam transmission. Unfortunately at higher power the RFQ often loses its thermal stability, a problem apparently enhanced by beam losses and high influxes of hydrogen. Delivering as much H- beam as possible with the least amount of hydrogen led to plasma outages. The root cause is the dense 1-ms long ˜55-kW 2-MHz plasma pulses reflecting ˜90% of the continuous ˜300 W, 13-MHz power, which was mitigated with a 4-ms filter for the reflected power signal and an outage resistant, slightly detuned 13-MHz match. Lowering the H2 gas also increased the H- beam current to ˜55 mA and increased the RFQ transmission by ˜7% (relative).

  6. Exact Outage Probability of Cognitive Underlay DF Relay Networks with Best Relay Selection

    NASA Astrophysics Data System (ADS)

    Bao, Vo Nguyen Quoc; Duong, Trung Quang

    In this letter, we address the performance analysis of underlay selective decode-and-forward (DF) relay networks in Rayleigh fading channels with not necessarily identical fading parameters. In particular, a novel result on the outage probability of the considered system is presented. Monte Carlo simulations are performed to verify the correctness of our exact closed-form expression. Our proposed analysis can be adopted for various underlay spectrum sharing applications of cognitive DF relay networks.
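
    A closed-form outage expression of this kind is typically verified exactly as the letter does, by Monte Carlo over Rayleigh fades. The sketch below simulates best-relay selective DF with illustrative parameters; the underlay interference constraint is omitted.

        # Minimal Monte Carlo sketch of outage probability for selective
        # decode-and-forward with best-relay selection over Rayleigh fading:
        # a relay qualifies if it decodes the source (SNR_sr > gamma_th), and the
        # best qualifying relay's SNR to destination must also exceed gamma_th.
        # Parameters are illustrative; the underlay power constraint is omitted.
        import numpy as np

        rng = np.random.default_rng(4)
        n_relays, mean_snr, gamma_th, trials = 3, 10.0, 4.0, 100000

        outages = 0
        for _ in range(trials):
            snr_sr = rng.exponential(mean_snr, n_relays)   # Rayleigh power fades
            snr_rd = rng.exponential(mean_snr, n_relays)
            decoded = snr_rd[snr_sr > gamma_th]            # relays that decoded
            if decoded.size == 0 or decoded.max() < gamma_th:
                outages += 1

        print(f"simulated outage probability: {outages / trials:.4f}")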

  7. Use of VIIRS DNB Data to Monitor Power Outages and Restoration for Significant Weather Events

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary; Molthan, Andrew

    2008-01-01

    NASA's Short-term Prediction Research and Transition (SPoRT) project operates from NASA's Marshall Space Flight Center in Huntsville, Alabama. The team provides unique satellite data to the National Weather Service (NWS) and other agencies and organizations for weather analysis. While much of its work is focused on improving short-term weather forecasting, the SPoRT team supported damage assessment and response to Superstorm Sandy by providing imagery that highlighted regions without power. The team used data from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the Suomi National Polar-orbiting Partnership (Suomi NPP) satellite. The VIIRS low-light sensor, known as the day-night band (DNB), can detect nighttime light from wildfires, urban and rural communities, and other human activity which emits light. It can also detect moonlight reflected from clouds and surface features. Using real-time VIIRS data collected by our collaborative partner at the Space Science and Engineering Center of the University of Wisconsin, the SPoRT team created composite imagery to help detect power outages and restoration. This blackout imagery allowed emergency response teams from a variety of agencies to better plan and marshal resources for recovery efforts. The blackout product identified large-scale outages, offering a comprehensive perspective beyond a patchwork GIS mapping of outages that utility companies provide based on customer complaints. To support the relief efforts, the team provided its imagery to the USGS data portal, which the Federal Emergency Management Agency (FEMA) and other agencies used in their relief efforts. The team's product helped FEMA, the U.S. Army Corps of Engineers, and U.S. Army monitor regions without power as part of their disaster response activities. Disaster responders used the images to identify possible outages and effectively distribute relief resources. An enhanced product is being developed and integrated into a web

  8. A novel fusion methodology to bridge GPS outages for land vehicle positioning

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Li, Xu; Song, Xiang; Li, Bin; Song, Xianghui; Xu, Qimin

    2015-07-01

    Many intelligent transportation system applications require accurate, reliable, and continuous vehicle position information, whether in open-sky environments or in Global Positioning System (GPS)-denied environments. However, it remains a challenging task for land vehicles to achieve such positioning performance using low-cost sensors, especially microelectromechanical system (MEMS) sensors. In this paper, a novel and cost-effective fusion methodology to bridge GPS outages is proposed and applied in the Inertial Navigation System (INS)/GPS/compass integrated positioning system. In the implementation of the proposed methodology, a key data preprocessing algorithm is first developed to eliminate the noise in inertial sensors in order to provide more accurate information for subsequent modeling. Then, a novel hybrid strategy incorporating the designed autoregressive model (AR model)-based forward estimator (ARFE) with a Kalman filter (KF) is presented to predict the INS position errors during GPS outages. To verify the feasibility and effectiveness of the proposed methodology, real road tests with various scenarios were performed. The proposed methodology illustrates significant improvement in positioning accuracy during GPS outages.
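
    The forward-estimator idea can be sketched as: fit AR coefficients to the INS position-error sequence while GPS is available, then extrapolate through the outage. Order, data, and horizon below are illustrative, and the Kalman-filter coupling from the paper is omitted.

        # Minimal sketch of an AR-based forward estimator: fit an AR(p) model to
        # the INS position-error sequence observed while GPS is available, then
        # predict the error forward through a GPS outage. Order, data, and
        # horizon are illustrative; the paper couples this with a Kalman filter.
        import numpy as np

        rng = np.random.default_rng(5)
        t = np.arange(300)
        err = 0.02 * t + 0.5 * np.sin(0.1 * t) + 0.05 * rng.standard_normal(300)

        p = 4                                        # AR order (assumed)
        X = np.column_stack([err[i:len(err) - p + i] for i in range(p)])
        y = err[p:]
        coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares AR fit

        history = list(err[-p:])                     # last observed errors
        predicted = []
        for _ in range(50):                          # 50-step GPS outage
            nxt = float(np.dot(coeffs, history[-p:]))
            predicted.append(nxt)
            history.append(nxt)

        print("predicted INS error at outage end:", round(predicted[-1], 3))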

  9. Lights out: Impact of the August 2003 power outage on mortality in New York, NY

    PubMed Central

    Anderson, G. Brooke; Bell, Michelle L.

    2012-01-01

    Background Little is known about how power outages affect health. We investigated mortality effects of the largest US blackout to date, August 14–15, 2003 in New York, NY. Methods We estimated mortality risk in New York, NY, using a generalized linear model with data from 1987–2005. We incorporated possible confounders, including weather and long-term and seasonal mortality trends. Results During the blackout, mortality increased for accidental deaths (122% [95% confidence interval = 28%–287%]) and non-accidental (i.e., disease-related) deaths (25% [12%–41%]), resulting in approximately 90 excess deaths. Increased mortality was not from deaths being advanced by a few days; rather, mortality risk remained slightly elevated through August 2003. Discussion To our knowledge, this is the first analysis of power outages and non-accidental mortality. Understanding the impact of power outages on human health is relevant, given that increased energy demand and climate change are likely to put added strain on power grids. PMID:22252408
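
    The risk estimates above come from a Poisson-family generalized linear model of daily death counts on a blackout indicator plus confounders. A minimal sketch on simulated data, assuming the statsmodels package (the published model also includes long-term and seasonal trend terms):

        # Minimal sketch of the study's style of model: a Poisson GLM of daily
        # deaths on a blackout indicator plus a temperature confounder. Data are
        # simulated; assumes the statsmodels package.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        n_days = 1000
        temp = 20 + 8 * np.sin(2 * np.pi * np.arange(n_days) / 365)
        blackout = np.zeros(n_days); blackout[500:502] = 1    # two-day outage

        log_mu = np.log(150) + 0.01 * (temp - 20) + np.log(1.25) * blackout
        deaths = rng.poisson(np.exp(log_mu))

        X = sm.add_constant(np.column_stack([blackout, temp]))
        fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
        rr = np.exp(fit.params[1])                            # blackout rate ratio
        print(f"estimated mortality rate ratio during blackout: {rr:.2f}")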

  10. Esophagectomy - minimally invasive

    MedlinePlus

    Minimally invasive esophagectomy; Robotic esophagectomy; Removal of the esophagus - minimally invasive; Achalasia - esophagectomy; Barrett esophagus - esophagectomy; Esophageal cancer - esophagectomy - laparoscopic; Cancer of the ...

  11. Power Outages

    MedlinePlus

    ... surge protectors. If you are considering purchasing a generator for your home, consult an electrician or engineer before purchasing and installing. Only use generators away from your home and NEVER run a ...

  12. Methodology to predict the number of forced outages due to creep failure

    SciTech Connect

    Palermo, J.V. Jr.

    1996-12-31

    All alloy metals at a temperature above 950 degrees Fahrenheit experience creep damage. Creep failures in boiler tubes usually begin after 25 to 40 years of operation. Since creep damage is irreversible, the only remedy is to replace the tube sections. By predicting the number of failures per year, the utility can make the best economic decision concerning tube replacement. This paper describes a methodology to calculate the number of forced outages per year due to creep failures. This methodology is particularly useful to utilities with boilers that have at least 25 years of operation.

  13. Feasibility Study for the K-Area Bingham Pump Outage Pit (643-1G)

    SciTech Connect

    Palmer, E.R.

    1997-05-01

    The K-Area Bingham Pump Outage Pit (KBPOP) is one of four BPOP areas at Savannah River Site (SRS), collectively referred to as the BPOP waste unit group. This Feasibility Study (FS) of Remedial Alternatives serves as the lead FS for the BPOP waste unit group. This section identifies the purpose and scope of the FS and presents site background information summarized from the Final Remedial Investigation Report with Baseline Risk Assessment (RI/BRA) WSRC-RP- 95-1555, Rev. 1.2 (WSRC 1997).

  14. Economic costs of electrical system instability and power outages caused by snakes on the Island of Guam

    USGS Publications Warehouse

    Fritts, T.H.

    2002-01-01

    The Brown Tree Snake, Boiga irregularis, is an introduced species on Guam where it causes frequent electrical power outages. The snake's high abundance, its propensity for climbing, and use of disturbed habitats all contribute to interruption of Guam's electrical service and the activities that depend on electrical power. Snakes have caused more than 1600 power outages in the 20-yr period of 1978-1997 and most recently nearly 200 outages per year. Single outages spanning the entire island and lasting 8 or more hours are estimated to cost in excess of $3,000,000 in lost productivity, but the costs of outages that involve only parts of the island or those of shorter durations are more difficult to quantify. Costs to the island's economy have exceeded $4.5 M per year over a 7-yr period without considering repair costs, damage to electrical equipment, and lost revenues. Snakes pose the greatest problem on high voltage transmission lines, on transformers, and inside electrical substations.

  15. Water outage increases the risk of gastroenteritis and eyes and skin diseases

    PubMed Central

    2011-01-01

    Background The present study used insurance claims data to investigate infections associated with short-term water outage because of constructions or pipe breaks. Methods The present study used medical claims of one million insured persons for 2004-2006. We estimated incidences of gastroenteritis and eye and skin complaints for 10 days before, during, and after 10 days of water supply restriction for outpatient visits and for emergency and in-patient care combined. Results There was an increase in medical services for these complaints in outpatient visits because of water outages. Poisson regression analyses showed that increased risks of medical services were significant for gastroenteritis (relative risk [RR] 1.31, 95% confidence interval [CI] 1.26-1.37), skin disease (RR 1.36, 95% CI 1.30-1.42), and eye disease patients (RR 1.34, 95% CI 1.26-1.44). Similar risks were observed during 10-day lag periods. Compared with those in cool days, risks of medical services are higher when average daily temperature is above 30°C for gastroenteritis (RR 12.1, 95% CI 6.17-23.7), skin diseases (RR 4.48, 95% CI 2.29-8.78), and eye diseases (RR 40.3, 95% CI 7.23-224). Conclusion We suggest promoting personal hygiene education during water supply shortages, particularly during the warm months. PMID:21943080

  16. Analytical Tools to Predict Distribution Outage Restoration Load. Final Project Report.

    SciTech Connect

    Law, John

    1994-11-14

    The main activity of this project has been twofold: (1) development of a computer model to predict CLPU (Cold Load Pickup) and (2) development of a field measurement and analysis method to obtain the input parameters of the CLPU model. The field measurement and analysis method is called the Step-Voltage-Test (STEPV). The Kootenai Electric Cooperative Appleway 51 feeder in Coeur d'Alene was selected for analysis in this project, and STEPV tests were performed in the winters of 1992 and 1993. The STEPV data were analyzed (method and results presented within this report) to obtain the Appleway 51 feeder parameters for prediction by the CLPU model. Only one CLPU record was obtained, in winter 1994. Unfortunately, the actual CLPU was not dramatic (short outage and moderate temperature) and did not display cyclic restoration current. A predicted Appleway 51 feeder CLPU was generated using the parameters obtained via the STEPV measurement/analysis/algorithm method at the same ambient temperature and outage duration as the measured actual CLPU. The predicted CLPU corresponds reasonably well with the single actual CLPU data set obtained in winter 1994 on the Appleway 51 feeder.

  17. Exploiting Outage and Error Probability of Cooperative Incremental Relaying in Underwater Wireless Sensor Networks.

    PubMed

    Nasir, Hina; Javaid, Nadeem; Sher, Muhammad; Qasim, Umar; Khan, Zahoor Ali; Alrajeh, Nabil; Niaz, Iftikhar Azim

    2016-01-01

    This paper makes a two-fold contribution to Underwater Wireless Sensor Networks (UWSNs): a performance analysis of incremental relaying in terms of outage and error probability, and, based on that analysis, the proposition of two new cooperative routing protocols. For the first contribution, a three-step procedure is carried out: a system model is presented, the number of available relays is determined, and, based on the cooperative incremental retransmission methodology, closed-form expressions for outage and error probability are derived. For the second contribution, Adaptive Cooperation in Energy (ACE) efficient depth based routing and Enhanced-ACE (E-ACE) are presented. In the proposed model, a feedback mechanism indicates the success or failure of data transmission. If direct transmission is successful, there is no need for relaying by cooperative relay nodes. In case of failure, the available relays retransmit the data one by one until the desired signal quality is achieved at the destination. Simulation results show that ACE and E-ACE significantly improve network performance, i.e., throughput, when compared with other incremental relaying protocols like Cooperative Automatic Repeat reQuest (CARQ). E-ACE and ACE achieve 69% and 63% more throughput, respectively, than CARQ in a harsh underwater environment. PMID:27420061
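
    The closed-form expressions themselves are not given in this abstract, but the incremental-relaying mechanism is easy to simulate. The sketch below estimates outage probability over Rayleigh fading (exponentially distributed SNR) with a single retransmitting relay; the average SNRs are illustrative assumptions, and combining at the destination is ignored, so this shows only the mechanism, not the paper's analysis.

        import numpy as np

        rng = np.random.default_rng(1)
        n, gamma_th = 1_000_000, 3.0      # trials and decoding SNR threshold
        gbar_sd, gbar_rd = 5.0, 8.0       # average SNRs (assumed values)

        g_sd = rng.exponential(gbar_sd, n)   # Rayleigh fading -> exponential SNR
        g_rd = rng.exponential(gbar_rd, n)

        direct_fail = g_sd < gamma_th
        # Incremental relaying: the relay retransmits only when the direct
        # link fails; outage occurs when the relayed copy also fails.
        outage = direct_fail & (g_rd < gamma_th)

        print("P(direct fail):", direct_fail.mean(),
              "closed form:", 1 - np.exp(-gamma_th / gbar_sd))
        print("P(outage, incremental):", outage.mean())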

  18. Recent performance of and plasma outage studies with the SNS H⁻ source.

    PubMed

    Stockli, M P; Han, B; Murray, S N; Pennisi, T R; Piller, C; Santana, M; Welton, R

    2016-02-01

    The Spallation Neutron Source is ramping to higher power levels that can be sustained with high availability. The goal is 1.4 MW despite a compromised radio frequency quadrupole (RFQ), which requires higher radio-frequency power than design levels to approach the nominal beam transmission. Unfortunately, at higher power the RFQ often loses its thermal stability, a problem apparently enhanced by beam losses and high influxes of hydrogen. Delivering as much H(-) beam as possible with the least amount of hydrogen led to plasma outages. The root cause is the dense 1-ms-long ∼55-kW 2-MHz plasma pulses reflecting ∼90% of the continuous ∼300-W, 13-MHz power, which was mitigated with a 4-ms filter for the reflected power signal and an outage-resistant, slightly detuned 13-MHz match. Lowering the H2 gas input also increased the H(-) beam current to ∼55 mA and increased the RFQ transmission by ∼7% (relative). PMID:26932022

  20. Minimal change disease

    MedlinePlus

    Minimal change nephrotic syndrome; Nil disease; Lipoid nephrosis; Idiopathic nephrotic syndrome of childhood ... which filter blood and produce urine. In minimal change disease, there is damage to the glomeruli. These ...

  1. Minimal change disease

    MedlinePlus

    ... seen under a very powerful microscope called an electron microscope. Minimal change disease is the most common ... biopsy and examination of the tissue with an electron microscope can show signs of minimal change disease.

  2. ORNL IntelligentFreight Initiative:Enhanced End-to-End Supply Chain Visibility of Security Sensitive Hazardous Materials

    SciTech Connect

    Walker, Randy M.; Shankar, Mallikarjun; Gorman, Bryan L.

    2009-01-01

    In the post September 11, 2001 (9/11) world the federal government has increased its focus on the manufacturing, distributing, warehousing, and transporting of hazardous materials. In 2002, Congress mandated that the Transportation Security Agency (TSA) designate a subset of hazardous materials that could pose a threat to the American public when transported in sufficiently large quantities. This subset of hazardous materials, which could be weaponized or subjected to a nefarious terrorist act, was designated as Security Sensitive Hazardous Materials (SSHM). Radioactive materials (RAM) were of special concern because actionable intelligence had revealed that Al Qaeda desired to develop a homemade nuclear device or a dirty bomb to use against the United States (US) or its allies [1]. Because of this clear and present danger, it is today a national priority to develop and deploy technologies that will provide for visibility and real-time exception notification of SSHM and Radioactive Materials in Quantities of Concern (RAMQC) in international commerce. Over the past eight years Oak Ridge National Laboratory (ORNL) has been developing, implementing, and deploying sensor-based technologies to enhance supply chain visibility. ORNL's research into creating a model for shipments, known as IntelligentFreight, has investigated sensors and sensor integration methods at numerous testbeds throughout the national supply chain. As a result of our research, ORNL believes that most of the information needed by supply chain partners to provide shipment visibility and exceptions-based reporting already exists but is trapped in numerous proprietary or agency-centric databases.

  3. End-to-end 9-D polarized bunch transport in eRHIC energy-recovery recirculator, some aspects

    SciTech Connect

    Meot, F.; Meot, F.; Brooks, S.; Ptitsyn, V.; Trbojevic, D.; Tsoupas, N.

    2015-05-03

    This paper is a brief overview of some of the numerous beam and spin dynamics investigations undertaken in the framework of the design of the FFAG-based electron energy recovery re-circulator ring of the eRHIC electron-ion collider project.

  4. Caching of a chameleon segment facilitates folding of a protein with end-to-end beta-sheet.

    PubMed

    Mohanty, Sandipan; Hansmann, Ulrich H E

    2008-11-27

    We report results from all-atom simulations of a 49-residue C-terminal fragment of TOP7 in implicit solvent. Using parallel tempering simulations with high statistics, we probe the thermodynamic properties of the protein over a large range of temperatures and evaluate its free energy landscape at room temperature. Our results confirm that the protein folds by a caching mechanism that relies on a chameleon segment. This mechanism differs from the one seen in high-temperature unfolding simulations. Finally, we discuss a possible mechanism for dimerization of the protein. PMID:18956901

  5. Portable air quality sensor unit for participatory monitoring: an end-to-end VESNA-AQ based prototype

    NASA Astrophysics Data System (ADS)

    Vucnik, Matevz; Robinson, Johanna; Smolnikar, Miha; Kocman, David; Horvat, Milena; Mohorcic, Mihael

    2015-04-01

    Key words: portable air quality sensor, CITI-SENSE, participatory monitoring, VESNA-AQ The emergence of low-cost, easy-to-use portable air quality sensor units is opening new possibilities for individuals to assess their exposure to air pollutants at a specific place and time, and to share this information through an Internet connection. Such portable sensor units are being used in an ongoing citizen science project called CITI-SENSE, which enables citizens to measure and share the data. The project aims, through creating citizens' observatories, to empower citizens to contribute to and participate in environmental governance, enabling them to support and influence community and societal priorities as well as associated decision making. An air quality measurement system based on the VESNA sensor platform was primarily designed within the project for use as a portable sensor unit in selected pilot cities (Belgrade, Ljubljana and Vienna) for monitoring outdoor exposure to pollutants. However, functionally the same unit with a different set of sensors could be used, for example, as an indoor platform. The version designed for the pilot studies was equipped with the following sensors: NO2, O3, CO, temperature, relative humidity, pressure and accelerometer. The personal sensor unit is battery powered and housed in a plastic box. The VESNA-based air quality (AQ) monitoring system comprises the VESNA-AQ portable sensor unit, a smartphone app and the remote server. The personal sensor unit supports wireless connection to an Android smartphone via built-in Wi-Fi. The smartphone in turn also serves as the communication gateway towards the remote server using any of the available data connections. Besides the gateway functionality, the role of the smartphone is to enrich data coming from the personal sensor unit with the GPS location, timestamps and user-defined context. This, together with an accelerometer, enables the user to better estimate one's exposure in relation to physical activities, time and location. The end user can monitor the measured parameters through a smartphone application. The smartphone app implements a custom-developed LCSP (Lightweight Client Server Protocol), which is used to send requests to the VESNA-AQ unit and to exchange information. When the data is obtained from the VESNA-AQ unit, the mobile application visualizes the data. It also has an option to forward the data to the remote server in a custom JSON structure over an HTTP POST request. The server stores the data in the database and in parallel translates the data to WFS and forwards it to the main CITI-SENSE platform over WFS-T in a common XML format over an HTTP POST request. From there the data can be accessed through the Internet and visualised in different forms and web applications developed by the CITI-SENSE project. In the course of the project, the collected data will be made publicly available, enabling the citizens to participate in environmental governance. Acknowledgements: CITI-SENSE is a Collaborative Project partly funded by the EU FP7-ENV-2012 under grant agreement no 308524 (www.citi-sense.eu).
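
    The abstract describes the smartphone forwarding measurements to the remote server as custom JSON over an HTTP POST request. The sketch below shows that general pattern; the endpoint URL and all field names are hypothetical placeholders, not the CITI-SENSE schema.

        import json
        import urllib.request

        # Hypothetical payload; CITI-SENSE field names and endpoint differ.
        sample = {
            "unit_id": "vesna-aq-042",
            "timestamp": "2015-04-01T10:15:00Z",
            "location": {"lat": 46.05, "lon": 14.51},   # from the phone's GPS
            "no2_ppb": 18.4, "o3_ppb": 31.2, "co_ppm": 0.4,
            "temp_c": 12.8, "rh_pct": 55.0, "pressure_hpa": 1012.3,
        }

        req = urllib.request.Request(
            "https://example.org/citi-sense/upload",    # placeholder URL
            data=json.dumps(sample).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print(resp.status)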

  6. Scalability Analysis and Use of Compression at the Goddard DAAC and End-to-End MODIS Transfers

    NASA Technical Reports Server (NTRS)

    Menasce, Daniel A.

    1998-01-01

    The goal of this task is to analyze the performance of single and multiple FTP transfers between SCFs and the Goddard DAAC. We developed an analytic model to compute the performance of FTP sessions as a function of various key parameters, implemented the model as a program called FTP Analyzer, and carried out validations with real data obtained by running single and multiple FTP transfers between GSFC and the Miami SCF. The input parameters to the model include the mix of FTP sessions (scenario) and, for each FTP session, the file size. The network parameters include the round-trip time, packet loss rate, the limiting bandwidth of the network connecting the SCF to a DAAC, TCP's basic timeout, TCP's Maximum Segment Size, and TCP's Maximum Receiver Window Size. The modeling approach consisted of modeling TCP's overall throughput, computing TCP's delay per FTP transfer, and then solving a queuing network model that includes the FTP clients and servers.
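
    The abstract does not reproduce the model itself. A standard ingredient of such models is the Mathis et al. steady-state TCP throughput approximation, throughput ≈ MSS / (RTT · sqrt(2p/3)), capped by the receiver-window limit and the bottleneck bandwidth; the sketch below computes that estimate under those assumptions and is far simpler than the FTP Analyzer model described above.

        import math

        def tcp_throughput_bps(mss_bytes, rtt_s, loss_rate, rwnd_bytes, link_bps):
            """Rough steady-state TCP throughput estimate (Mathis
            approximation), capped by the receiver-window limit and the
            bottleneck bandwidth. Illustrative only."""
            if loss_rate > 0:
                loss_limited = (mss_bytes * 8) / (rtt_s * math.sqrt(2 * loss_rate / 3))
            else:
                loss_limited = float("inf")
            window_limited = rwnd_bytes * 8 / rtt_s
            return min(loss_limited, window_limited, link_bps)

        # Example: 1460-byte MSS, 80 ms RTT, 0.5% loss, 64 KiB window, 45 Mb/s link
        print(tcp_throughput_bps(1460, 0.080, 0.005, 65536, 45e6) / 1e6, "Mb/s")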

  7. Parameterizations of truncated food web models from the perspective of an end-to-end model approach

    NASA Astrophysics Data System (ADS)

    Fennel, Wolfgang

    2009-02-01

    Modeling of marine ecosystems is broadly divided into two branches: biogeochemical processes and fish production. The biogeochemical models see the fish only implicitly by mortality rates, while fish production models see the lower food web basically through prescribed food, e.g., copepod biomass. The skill assessment of ecological models, which are usually truncated biogeochemical models, also involves the question of how the effects of the missing higher food web are parameterized. This paper contributes to the goal of bridging biogeochemical models and fish-production models by employing a recently developed coupled NPZDF-model, Fennel [Fennel, W., 2007. Towards bridging biogeochemical and fish production models. Journal of Marine Systems, doi:10.1016/j.jmarsys.2007.06.008]. Here we study parameterizations of truncated NPZD-models from the viewpoint of a complete model. The effects of the higher food web on the cycling of the state variables in a truncated NPZD-model cannot be unambiguously imitated. For example, one can mimic effects of fishery by export fluxes of one of the state variables. It is shown that the mass fluxes between the lower and upper part of the full model food web are significantly smaller than the fluxes within the NPZD-model. However, over longer time scales, relatively small changes can accumulate and eventually become important.
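
    The parameterization issue described above can be made concrete with a toy truncated NPZD model: the missing upper food web enters only through a closure term on zooplankton (here a quadratic mortality) plus an export flux standing in for fishery removal, as the abstract suggests. The functional forms and parameter values below are generic illustrations, not Fennel's formulation.

        from scipy.integrate import solve_ivp

        def npzd(t, y, mu=1.0, kN=0.5, g=0.6, kP=0.4, mP=0.05, mZ2=0.2,
                 remin=0.1, export=0.01):
            """Toy truncated NPZD model. The quadratic closure mZ2*Z**2
            stands in for predation by the missing upper food web; 'export'
            mimics a fishery-like removal flux. Illustrative values only."""
            N, P, Z, D = y
            uptake = mu * N / (kN + N) * P          # nutrient-limited growth
            grazing = g * P / (kP + P) * Z          # zooplankton grazing
            closure = mZ2 * Z ** 2                  # implicit higher trophic levels
            dN = -uptake + remin * D
            dP = uptake - grazing - mP * P
            dZ = grazing - closure - export * Z     # export leaves the system
            dD = mP * P + closure - remin * D
            return [dN, dP, dZ, dD]

        sol = solve_ivp(npzd, (0.0, 365.0), [8.0, 0.5, 0.3, 0.2])
        print(sol.y[:, -1])   # N, P, Z, D after one year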

  8. End-to-end remote sensing at the Science and Technology Laboratory of John C. Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Kelly, Patrick; Rickman, Douglas; Smith, Eric

    1991-01-01

    The Science and Technology Laboratory (STL) of Stennis Space Center (SSC) had been developing expertise in remote sensing for more than a decade. Capabilities at SSC/STL include all major areas of the field. STL includes the Sensor Development Laboratory (SDL), an Image Processing Center, a Learjet 23 flight platform, and on-staff scientific investigators.

  9. SPAN: A Network Providing Integrated, End-to-End, Sensor-to-Database Solutions for Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Benzel, T.; Cho, Y. H.; Deschon, A.; Gullapalli, S.; Silva, F.

    2009-12-01

    In recent years, advances in sensor network technology have shown great promise to revolutionize environmental data collection. Still, widespread adoption of these systems by domain experts has been lacking, and they have remained the purview of the engineers who design them. While there are many data-logging options for basic data collection in the field, scientists are often required to visit the deployment sites to retrieve their data and manually import it into spreadsheets. Some advanced commercial software systems do allow scientists to collect data remotely, but most of these systems only allow point-to-point access and require proprietary hardware. Furthermore, these commercial solutions preclude the use of sensors from other manufacturers or integration with Internet-based database repositories and compute engines. Therefore, scientists often must download and manually reformat their data before uploading it to the repositories if they wish to share their data. We present an open-source, low-cost, extensible, turnkey solution called the Sensor Processing and Acquisition Network (SPAN), which provides a robust and flexible sensor network service. At the deployment site, SPAN leverages low-power generic embedded processors to integrate a variety of commercially available sensor hardware into the network of environmental observation systems. By bringing intelligence close to the sensed phenomena, we can remotely control configuration and re-use, establish rules to trigger sensor activity, manage power requirements, and control the two-way flow of sensed data as well as control information to the sensors. Key features of our design include (1) a hardware-agnostic architecture: our solutions are compatible with several programmable platforms, sensor systems, communication devices and protocols; (2) information standardization: our system supports several popular communication protocols and data formats; and (3) extensible data support: our system works with several existing data storage systems, data models and web-based services as needed by the domain experts; examples include standard MySQL databases, Sensorbase (from UCLA), as well as SPAN Cloud, a system built using Google's Application Engine that allows scientists to use Google's cloud computing cyber-infrastructure. We provide a simple, yet flexible data access control mechanism that allows groups of researchers to share their data in SPAN Cloud. In this talk, we will describe the SPAN architecture, its components, our development plans, our vision for the future, and results from current deployments that continue to drive the design of our system.

  10. The NOAO Data Products Program: Developing an End-to-End Data Management System in Support of the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Smith, R. C.; Boroson, T.; Seaman, R.

    2007-10-01

    The NOAO Data Products Program (DPP) is responsible for the development and operation of the data management system for NOAO and affiliated observatories, and for the scientific support of users accessing our data holdings and using our tools and services. At the core of this mission is the capture of data from instruments at these observatories and the delivery of that content to both the Principal Investigators (PIs) who proposed for the observations and, after an appropriate proprietary period, to users worldwide who are interested in using the data for their own (often very different) scientific projects. However, delivery of raw and/or reduced images to users only scratches the surface of the extensive potential which the international Virtual Observatory (VO) initiative has to offer. By designing the whole NOAO/DPP program around not only VO standards, but more importantly around VO principles, the program becomes not an exercise in data management and NOAO user support, but rather a VO-centric program which serves the growing world-wide VO community. It is this more global aspect that drives NOAO/DPP planning, as well as more specifically the design, development, and operations of the various components of our system. In the following sections we discuss these components and how they work together to form our VO-centric program.

  11. An integrated healthcare information system for end-to-end standardized exchange and homogeneous management of digital ECG formats.

    PubMed

    Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José

    2012-07-01

    This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, simple object access protocol, extensible markup language, or business process execution language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility. PMID:22453644

  12. Towards a Software Framework to Support Deployment of Low Cost End-to-End Hydroclimatological Sensor Network

    NASA Astrophysics Data System (ADS)

    Celicourt, P.; Piasecki, M.

    2015-12-01

    Deployment of environmental sensor assemblies based on cheap platforms such as the Raspberry Pi and Arduino has gained much attention over the past few years. While these platforms are attractive because they can be controlled with a choice of several programming languages, the configuration task can become quite complex because of the need to learn several different proprietary data formats and protocols, which constitutes a bottleneck for the expansion of sensor networks. In response to this rising complexity the Institute of Electrical and Electronics Engineers (IEEE) has sponsored the development of the IEEE 1451 standard in an attempt to introduce a common standard. The most innovative concept of the standard is the Transducer Electronic Data Sheet (TEDS), which enables transducers to self-identify, self-describe, self-calibrate, to exhibit plug-and-play functionality, etc. We used Python to develop an IEEE 1451.0 platform-independent graphical user interface to generate and provide sufficient information about almost ANY sensor and sensor platform for sensor programming purposes, automatic calibration of sensor data, and incorporation of back-end demands on data management in TEDS for automatic standards-based data storage, search and discovery purposes. These features are paramount to making data management much less onerous in large-scale sensor networks. Along with the TEDS Creator, we developed a tool named HydroUnits for three specific purposes: encoding of physical units in the TEDS, dimensional analysis, and on-the-fly conversion of time series, allowing users to retrieve data in a desired equivalent unit while accommodating unforeseen and user-defined units. In addition, our back-end data management comprises the Python/Django equivalent of the CUAHSI Observations Data Model (ODM), namely DjangODM, which will be hosted by a MongoDB database server that offers more convenience for our application. We are also developing a data loader which will be paired with the data autoloading capability of Django and a TEDS processing script to populate the database with the incoming data. The Python WaterOneFlow Web Services developed by the Texas Water Development Board will be used to publish the data. The software suite is being tested on the Raspberry Pi as the end node and a laptop PC as the base station in a wireless setting.
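
    As a rough illustration of the TEDS idea, the sketch below (in Python, the language the abstract itself uses) models a transducer's self-describing metadata as a small record carrying identification and a linear calibration that can be applied to raw counts. This is a simplified stand-in, not the IEEE 1451.0 binary TEDS layout and not the tool described above; all field names and values are hypothetical.

        from dataclasses import dataclass

        @dataclass
        class TransducerTEDS:
            """Illustrative TEDS-like metadata record (hypothetical fields)."""
            manufacturer_id: int
            model_number: int
            serial_number: int
            calibration_slope: float      # engineering units per raw count
            calibration_offset: float
            si_unit: str                  # e.g., "degC"

            def to_engineering(self, raw: int) -> float:
                """Apply the stored linear calibration to a raw ADC count."""
                return self.calibration_slope * raw + self.calibration_offset

        teds = TransducerTEDS(17, 4, 12345, 0.0625, -40.0, "degC")
        print(teds.to_engineering(800))   # -> 10.0 degC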

  13. Investigating end-to-end security in the fifth generation wireless capabilities and IoT extensions

    NASA Astrophysics Data System (ADS)

    Uher, J.; Harper, J.; Mennecke, R. G.; Patton, P.; Farroha, B.

    2016-05-01

    The emerging 5th generation wireless network will be architected and specified to meet the vision of allowing the billions of devices and millions of human users to share spectrum to communicate and deliver services. The expansion of wireless networks from its current role to serve these diverse communities of interest introduces new paradigms that require multi-tiered approaches. The introduction of inherently low security components, like IoT devices, necessitates that critical data be better secured to protect the networks and users. Moreover high-speed communications that are meant to enable the autonomous vehicles require ultra reliable and low latency paths. This research explores security within the proposed new architectures and the cross interconnection of the highly protected assets with low cost/low security components forming the overarching 5th generation wireless infrastructure.

  14. SU-E-T-268: Proton Radiosurgery End-To-End Testing Using Lucy 3D QA Phantom

    SciTech Connect

    Choi, D; Gordon, I; Ghebremedhin, A; Wroe, A; Schulte, R; Bush, D; Slater, J; Patyal, B

    2014-06-01

    Purpose: To check the overall accuracy of proton radiosurgery treatment delivery using ready-made circular collimator inserts and fixed-thickness compensating boluses. Methods: A Lucy 3D QA phantom (Standard Imaging Inc., WI, USA) loaded with GaFchromic™ film was irradiated with laterally scattered and longitudinally spread-out 126.8 MeV proton beams. The tests followed every step in the proton radiosurgery treatment delivery process: CT scan (GE Lightspeed VCT), target contouring, treatment planning (Odyssey 5.0, Optivus, CA), portal calibration, target localization using a robotic couch with image guidance, and dose delivery at planned gantry angles. A 2 cm diameter collimator insert in a 4 cm diameter radiosurgery cone and a 1.2 cm thick compensating flat bolus were used for all beams. Film dosimetry (RIT114 v5.0, Radiological Imaging Technology, CO, USA) was used to evaluate the accuracy of target localization and relative dose distributions compared to those calculated by the treatment planning system. Results: The localization accuracy was estimated by analyzing the GaFchromic films irradiated at gantry 0, 90 and 270 degrees. We observed a 0.5 mm shift in the lateral direction (patient left), a ±0.9 mm shift in the AP direction and a ±1.0 mm shift in the vertical direction (gantry dependent). The isodose overlays showed good agreement (<2 mm, 50% isodose lines) between measured and calculated doses. Conclusion: Localization accuracy depends on gantry sag, CT resolution and distortion, DRRs from the treatment planning computer, localization accuracy of the image guidance system, and fabrication of the ready-made aperture and cone housing. The total deviation from the isocenter was 1.4 mm. Dose distribution uncertainty comes from distal-end error due to bolus and CT density, in addition to localization error. The planned dose distribution matched the measured values well (>90% passing at 2%/2 mm criteria). Our test showed the robustness of our proton radiosurgery treatment delivery system using ready-made collimator inserts and fixed-thickness compensating boluses.

  15. Evaluation of allowed outage times (AOTs) from a risk and reliability standpoint

    SciTech Connect

    Vesely, W.E. )

    1989-08-01

    This report describes the basic risks which are associated with allowed outage times (AOTs), defines strategies for selecting the risks to be quantified, and describes how the risks can be quantified. The report furthermore describes criteria considerations in determining the acceptability of calculated AOT risks, and discusses the merits of relative risk criteria versus absolute risk criteria. The detailed evaluations involved in calculating AOT risks, including uncertainty considerations, are also discussed. The report also describes the proper ways that risks from multiple AOTs should be considered, so that risks are properly accumulated from proposed multiple AOT changes but are not double-counted. Generally, average AOT risks, which include the frequency of occurrence of the AOT, need to be accumulated, but single downtime risks do not, since they apply to individual AOTs. 8 refs., 22 tabs.
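
    In generic notation (not necessarily the report's), the single-downtime risk of one AOT of duration d and the corresponding yearly average AOT risk can be written as

        r_{\mathrm{AOT}} = \left(R_{\mathrm{down}} - R_{\mathrm{base}}\right) d,
        \qquad
        \bar{r}_{\mathrm{AOT}} = f \, r_{\mathrm{AOT}},

    where R_down is the risk level (e.g., core damage frequency per year) with the equipment out of service, R_base is the baseline level, d is the downtime in years, and f is the expected number of such downtimes per year. This mirrors the report's point that average risks, which include the frequency f, are the quantities to accumulate across multiple proposed AOT changes, while single-downtime risks apply to individual events.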

  16. Analysis of 12 electric power system outages/disturbances impacting the Florida Peninsula

    SciTech Connect

    1980-12-01

    Between January 3 and August 3, 1979, there were 12 occasions on which electric power was curtailed or public appeals were made to customers to reduce their load in Peninsular Florida due to bulk electric power supply problems. The Economic Regulatory Administration (ERA) of the US DOE, pursuant to its electric power supply adequacy and reliability responsibilities, initiated a twofold analysis of the bulk power supply situation in Florida. The first phase of the analysis evaluated the technical and engineering aspects of the overall Florida bulk power supply system with special attention given to the City of Jacksonville's electric system. The second phase evaluated the socioeconomic impacts of the bulk power supply outages on residential customers in Jacksonville. The 12 disturbances are described and a comparative analysis of the causes is presented. (LCL)

  17. A field test for extremity dose assessment during outages at Korean nuclear power plants.

    PubMed

    Kim, Hee Geun; Kong, Tae Young

    2013-05-01

    During maintenance on the water chamber of a steam generator, the pressuriser heater and the pressure tube feeder in nuclear power plants, workers are likely to receive high radiation doses due to the severe workplace conditions. In particular, it is expected that workers' hands would receive the highest radiation doses because of their contact with the radioactive materials. In this study, field tests for extremity dose assessments in radiation workers undertaking contact tasks with high radiation doses were conducted during outages at pressurised water reactors and pressurised heavy water reactors in Korea. In the test, the radiation workers were required to wear additional thermoluminescent dosemeters (TLDs) on their backs and wrists and an extremity dosemeter on the finger, as well as a main TLD on the chest while performing the maintenance tasks. PMID:23091221

  18. A low-cost evolutionary algorithm for the unit commitment problem considering probabilistic unit outages

    NASA Astrophysics Data System (ADS)

    Asouti, V. G.; Giannakoglou, K. C.

    2012-07-01

    This article presents a solution method to the unit commitment problem with probabilistic unit failures and repairs, which is based on evolutionary algorithms and Monte Carlo simulations. Regarding the latter, thousands of availability-unavailability trial time patterns along the scheduling horizon are generated. The objective function to be minimised is the expected total operating cost, computed after adapting any candidate solution, i.e. any series of generating/non-generating (ON/OFF) unit states, to the availability-unavailability patterns and performing evaluations by considering fuel, start-up and shutdown costs as well as the cost for buying electricity from external resources, if necessary. The proposed method introduces a new efficient chromosome representation: the decision variables are integer IDs corresponding to the binary-to-decimal converted ON/OFF (1/0) scenarios that cover the demand in each hour. In contrast to previous methods using binary strings as chromosomes, the new chromosome must be penalised only if any of the constraints regarding start-up, shutdown and ramp times cannot be met, chromosome repair is avoided and, consequently, the dispatch problems are solved once in the preparatory phase instead of during the evolution. For all these reasons, with or without probabilistic outages, the proposed algorithm has much lower CPU cost. In addition, if probabilistic outages are taken into account, a hierarchical evaluation scheme offers extra noticeable gain in CPU cost: the population members are approximately pre-evaluated using a small 'representative' set of the Monte Carlo simulations and only a few top population members undergo evaluations through the full Monte Carlo simulations. The hierarchical scheme makes the proposed method about one order of magnitude faster than its conventional counterpart.
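
    The article's key representational idea, that each hour's gene is an integer ID obtained by binary-to-decimal conversion of a demand-covering ON/OFF unit combination, can be sketched as follows. Unit capacities and demands are made up, and the cost/dispatch evaluation and start-up, shutdown and ramp constraints are omitted.

        from itertools import product

        capacities = [100, 60, 40]          # MW per unit (illustrative)
        demand_by_hour = [120, 150, 90]     # MW demand per hour (illustrative)

        def feasible_ids(demand):
            """All ON/OFF unit combinations covering demand, each encoded as
            the decimal value of its binary ON/OFF string (the gene IDs)."""
            ids = []
            for bits in product((0, 1), repeat=len(capacities)):
                if sum(c for c, b in zip(capacities, bits) if b) >= demand:
                    ids.append(int("".join(map(str, bits)), 2))
            return ids

        # A chromosome holds one feasible scenario ID per hour; decoding
        # recovers the ON/OFF states, so infeasible bit strings never arise
        # and no chromosome repair is needed.
        per_hour = [feasible_ids(d) for d in demand_by_hour]
        chromosome = [ids[0] for ids in per_hour]
        states = [[(gene >> k) & 1 for k in reversed(range(len(capacities)))]
                  for gene in chromosome]
        print(per_hour, chromosome, states)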

  19. Are Older Adults Prepared to Ensure Food Safety during Extended Power Outages and Other Emergencies?: Findings from a National Survey

    ERIC Educational Resources Information Center

    Kosa, Katherine M.; Cates, Sheryl C.; Karns, Shawn; Godwin, Sandria L.; Coppings, Richard J.

    2012-01-01

    Natural disasters and other emergencies can cause an increased risk of foodborne illness. We conducted a nationally representative survey to understand consumers' knowledge and use of recommended practices during/after extended power outages and other emergencies. Because older adults are at an increased risk for foodborne illness, this paper…

  20. Emergency preparedness for power outages and wi-fi loss: tips for students and educators of online courses.

    PubMed

    Heithaus, Teresa

    2015-01-01

    Severe weather can impact online education due to a loss of power and Internet access that can last hours or weeks. Planning for such losses is essential to enable participation in the online classroom. This article discusses measures that can be used to maintain an online presence in the event of a power outage or loss of Wi-Fi. PMID:25647316

  1. Olkiluoto 1 and 2 - Plant efficiency improvement and lifetime extension-project (PELE) implemented during outages 2010 and 2011

    SciTech Connect

    Kosonen, M.; Hakola, M.

    2012-07-01

    Teollisuuden Voima Oyj (TVO) is a non-listed public company founded in 1969 to produce electricity for its stakeholders. TVO is the operator of the Olkiluoto nuclear power plant and follows the principle of continuous improvement in the operation and maintenance of the Olkiluoto plant units. The PELE project (Plant Efficiency Improvement and Lifetime Extension), mainly completed during the annual outages in 2010 and 2011, forms one part of the systematic development of the Olkiluoto units. TVO maintains a long-term development program that aims at systematically modernizing the plant unit systems and equipment based on the latest technology. According to the program, the Olkiluoto 1 and Olkiluoto 2 plant units are constantly renovated with the intention of keeping them safe and reliable. The aim of the modernization projects is to improve the safety, reliability, and performance of the plant units. The PELE project was carried out at Olkiluoto 1 in 2010 and at Olkiluoto 2 in 2011. The outage length at Olkiluoto 1 was 26 d 12 h 4 min and at Olkiluoto 2 was 28 d 23 h 46 min. (A normal service outage is about 14 days including refueling; a refueling-only outage is about seven days. See figure 1.) The PELE project consisted of several single projects collected into one for coordinated project management. Some of the main projects were as follows: - Low-pressure turbines: rotor, stator vane, casing and turbine instrumentation replacement. - Replacement of the Condenser Cooling Water pumps (later called seawater pumps). - Replacement of the inner isolation valves on the main steam lines. - Generator and generator cooling system replacement. - Low-voltage switchgear replacement; this project will continue during future outages. PELE was a success: 100 TVO employees and 1500 subcontractor employees participated in the project, and execution went extremely well during the outages. The replacement of the low pressure turbines and seawater pumps improved the

  2. Minimally Invasive Valve Surgery

    PubMed Central

    Pope, Nicolas H.; Ailawadi, Gorav

    2014-01-01

    Cardiac valve surgery is life saving for many patients. The advent of minimally invasive surgical techniques has historically allowed for improvement in both post-operative convalescence and important clinical outcomes. The development of minimally invasive cardiac valve repair and replacement surgery over the past decade is poised to revolutionize the care of cardiac valve patients. Here, we present a review of the history and current trends in minimally invasive aortic and mitral valve repair and replacement, including the development of sutureless bioprosthetic valves. PMID:24797148

  3. Prostate resection - minimally invasive

    MedlinePlus

    ... are: Erection problems (impotence) No symptom improvement Passing semen back into your bladder instead of out through ... Whelan JP, Goeree L. Systematic review and meta-analysis of transurethral resection of the prostate versus minimally ...

  4. Minimizing Shortness of Breath

    MedlinePlus


  5. Minimally invasive hip replacement

    MedlinePlus

    ... Smits SA, Swinford RR, Bahamonde RE. A randomized, prospective study of 3 minimally invasive surgical approaches in total hip arthroplasty: comprehensive gait analysis. J Arthroplasty . 2008;23:68-73. PMID: 18722305 ...

  6. Minimal Orderings Revisited

    SciTech Connect

    Peyton, B.W.

    1999-07-01

    When minimum orderings proved too difficult to deal with, Rose, Tarjan, and Lueker instead studied minimal orderings and how to compute them (Algorithmic aspects of vertex elimination on graphs, SIAM J. Comput., 5:266-283, 1976). This paper introduces an algorithm that is capable of computing much better minimal orderings much more efficiently than the algorithm in Rose et al. The new insight is a way to use certain structures and concepts from modern sparse Cholesky solvers to re-express one of the basic results in Rose et al. The new algorithm begins with any initial ordering and then refines it until a minimal ordering is obtained. It is simple to obtain high-quality low-cost minimal orderings by using fill-reducing heuristic orderings as initial orderings for the algorithm. We examine several such initial orderings in some detail.
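
    For reference, the fill that such orderings aim to reduce can be computed with the classic elimination game; an ordering is minimal when no other ordering produces a proper subset of its fill edges. The sketch below is a straightforward small-graph implementation of the elimination game, not the paper's efficient algorithm.

        def elimination_fill(adj, order):
            """Run the elimination game: process vertices in 'order',
            pairwise-connecting each vertex's remaining higher-ordered
            neighbors; return the set of fill edges created."""
            adj = {v: set(nbrs) for v, nbrs in adj.items()}   # local copy
            pos = {v: i for i, v in enumerate(order)}
            fill = set()
            for v in order:
                later = [u for u in adj[v] if pos[u] > pos[v]]
                for i, a in enumerate(later):
                    for b in later[i + 1:]:
                        if b not in adj[a]:
                            adj[a].add(b)
                            adj[b].add(a)
                            fill.add(frozenset((a, b)))
            return fill

        # 4-cycle a-b-c-d: ordering (a, c, b, d) creates one fill edge {b, d}.
        cycle = {"a": {"b", "d"}, "b": {"a", "c"},
                 "c": {"b", "d"}, "d": {"c", "a"}}
        print(elimination_fill(cycle, ("a", "c", "b", "d")))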

  7. Minimalism. Clip and Save.

    ERIC Educational Resources Information Center

    Hubbard, Guy

    2002-01-01

    Provides background information on the art movement called "Minimalism" discussing why it started and its characteristics. Includes learning activities and information on the artist, Donald Judd. Includes a reproduction of one of his art works and discusses its content. (CMK)

  8. Risk-based evaluation of Allowed Outage Times (AOTs) considering risk of shutdown

    SciTech Connect

    Mankamo, T.; Kim, I.S.; Samanta, P.K.

    1992-12-31

    When safety systems fail during power operation, Technical Specifications (TS) usually limit the repair within the Allowed Outage Time (AOT). If the repair cannot be completed within the AOT, or no AOT is allowed, the plant is required to be shut down for the repair. However, if the capability to remove decay heat is degraded, shutting down the plant with the need to operate the affected decay-heat removal systems may impose a substantial risk compared to continued power operation over a usual repair time. Thus, defining a proper AOT in such situations can be considered as a risk comparison between the repair in full power state, with a temporarily increased level of risk, and the alternative of shutting down the plant for the repair in zero power state, with a specific associated risk. The methodology of the risk-comparison approach, with due consideration of the shutdown risk, has been further developed and applied to the AOT considerations of residual heat removal and standby service water systems of a boiling water reactor (BWR) plant. Based on the completed work, several improvements to the TS requirements for the systems studied can be suggested.

  10. Method for Evaluation of Outage Probability on Random Access Channel in Mobile Communication Systems

    NASA Astrophysics Data System (ADS)

    Kollár, Martin

    2012-05-01

    In order to access the cell, all mobile communication technologies use a so-called random-access procedure. For example, in GSM this is represented by sending the CHANNEL REQUEST message from the Mobile Station (MS) to the Base Transceiver Station (BTS), which is consequently forwarded as a CHANNEL REQUIRED message to the Base Station Controller (BSC). If the BTS decodes some noise on the Random Access Channel (RACH) as a random access by mistake (a so-called 'phantom RACH'), then it is a question of pure coincidence which 'establishment cause' the BTS thinks it has recognized. A typical invalid channel access request, or phantom RACH, is characterized by an IMMEDIATE ASSIGNMENT procedure (assignment of an SDCCH or TCH) which is not followed by the sending of an ESTABLISH INDICATION from MS to BTS. In this paper a mathematical model for evaluation of the Power RACH Busy Threshold (RACHBT), in order to guarantee a predetermined outage probability on the RACH, is described and discussed. It focuses on the Global System for Mobile Communications (GSM); however, the obtained results can be generalized to the remaining mobile technologies (i.e., WCDMA and LTE).
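
    The paper's closed-form model is not reproduced in this abstract, but the basic thresholding logic can be sketched: if any RACH power sample above the busy threshold is treated as an access attempt, then for a given noise-power distribution the threshold that keeps the phantom-RACH (false-alarm) probability at a target level is the corresponding upper quantile of that distribution. The sketch below assumes an exponential noise-power model (Rayleigh envelope); the paper's actual model may differ.

        import numpy as np

        rng = np.random.default_rng(7)
        # Assumed model: exponentially distributed noise power on the RACH.
        noise_power = rng.exponential(scale=1.0, size=1_000_000)

        target_false_alarm = 1e-3    # allowed phantom-RACH probability per sample
        threshold = np.quantile(noise_power, 1.0 - target_false_alarm)

        # Closed form for the exponential model: P(X > t) = exp(-t / scale)
        print(threshold, -np.log(target_false_alarm) * 1.0)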

  11. Minimally invasive procedures

    PubMed Central

    Baltayiannis, Nikolaos; Michail, Chandrinos; Lazaridis, George; Anagnostopoulos, Dimitrios; Baka, Sofia; Mpoukovinas, Ioannis; Karavasilis, Vasilis; Lampaki, Sofia; Papaiwannou, Antonis; Karavergou, Anastasia; Kioumis, Ioannis; Pitsiou, Georgia; Katsikogiannis, Nikolaos; Tsakiridis, Kosmas; Rapti, Aggeliki; Trakada, Georgia; Zissimopoulos, Athanasios; Zarogoulidis, Konstantinos

    2015-01-01

    Minimally invasive procedures, which include laparoscopic surgery, use state-of-the-art technology to reduce the damage to human tissue when performing surgery. Minimally invasive procedures require small “ports” through which the surgeon inserts thin tubes called trocars. Carbon dioxide gas may be used to inflate the area, creating a space between the internal organs and the skin. Then a miniature camera (usually a laparoscope or endoscope) is placed through one of the trocars so the surgical team can view the procedure as a magnified image on video monitors in the operating room. Specialized equipment is inserted through the trocars based on the type of surgery. Some advanced minimally invasive surgical procedures can be performed almost exclusively through a single point of entry, meaning only one small incision, like the “uniport” video-assisted thoracoscopic surgery (VATS). Not only do these procedures usually provide outcomes equivalent to traditional “open” surgery (which sometimes requires a large incision), but minimally invasive procedures (using small incisions) may offer significant benefits as well: (I) faster recovery; (II) fewer days of hospitalization; (III) less scarring; and (IV) less pain. In our current mini-review we present the minimally invasive procedures for thoracic surgery. PMID:25861610

  12. Automatic scheduling of outages of nuclear power plants with time windows. Final report, January-December 1995

    SciTech Connect

    Gomes, C.

    1996-10-01

    This report describes a successful project for transference of advanced AI technology into the domain of planning of outages of nuclear power plants as part of DOD's dual-use program. ROMAN (Rome Lab Outage Manager) is the prototype system that was developed as a result of this project. ROMAN's main innovation compared to the current state-of-the-art of outage management tools is its capability to automatically enforce safety constraints during the planning and scheduling phase. Another innovative aspect of ROMAN is the generation of more robust schedules that are feasible over time windows. In other words, ROMAN generates a family of schedules by assigning time intervals as start times to activities rather than single start times, without affecting the overall duration of the project. ROMAN uses a constraint satisfaction paradigm combining a global search tactic with constraint propagation. The derivation of very specialized representations for the constraints to perform efficient propagation is a key aspect for the generation of very fast schedules; constraints are compiled into the code, which is a novel aspect of our work using an automatic programming system, KIDS.

  13. Outage Performance Analysis of Relay Selection Schemes in Wireless Energy Harvesting Cooperative Networks over Non-Identical Rayleigh Fading Channels.

    PubMed

    Do, Nhu Tri; Bao, Vo Nguyen Quoc; An, Beongku

    2016-01-01

    In this paper, we study relay selection in decode-and-forward wireless energy harvesting cooperative networks. In contrast to conventional cooperative networks, the relays harvest energy from the source's radio-frequency radiation and then use that energy to forward the source information. Considering the power-splitting receiver architecture used at the relays to harvest energy, we are concerned with the performance of two popular relay selection schemes, namely the partial relay selection (PRS) scheme and the optimal relay selection (ORS) scheme. In particular, we analyze the system performance in terms of outage probability (OP) over independent and non-identical (i.n.i.d.) Rayleigh fading channels. We derive closed-form approximations for the system outage probabilities of both schemes and validate the analysis by Monte Carlo simulation. The numerical results provide a comprehensive performance comparison between the PRS and ORS schemes and reveal the effect of wireless energy harvesting on the outage performance of both schemes. Additionally, we show the advantages and drawbacks of wireless energy harvesting cooperative networks in comparison with conventional cooperative networks. PMID:26927119
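
    A Monte Carlo comparison of the two selection rules over i.n.i.d. Rayleigh fading can be sketched as follows. The average SNRs are made-up values, and the energy harvesting and power-splitting receiver are deliberately omitted, so this reproduces only the selection logic, not the paper's full system model.

        import numpy as np

        rng = np.random.default_rng(0)
        n, gamma_th = 500_000, 2.0
        # i.n.i.d. average SNRs for 3 relays (source->relay, relay->dest),
        # illustrative assumptions only
        gbar_sr = np.array([4.0, 6.0, 8.0])
        gbar_rd = np.array([5.0, 5.0, 7.0])

        g_sr = rng.exponential(gbar_sr, size=(n, 3))   # Rayleigh -> exp. SNR
        g_rd = rng.exponential(gbar_rd, size=(n, 3))

        # DF end-to-end SNR of a relay path is the weaker of its two hops.
        e2e = np.minimum(g_sr, g_rd)

        prs_pick = np.argmax(g_sr, axis=1)     # partial: best first hop only
        ors_pick = np.argmax(e2e, axis=1)      # optimal: best end-to-end path

        rows = np.arange(n)
        p_out_prs = np.mean(e2e[rows, prs_pick] < gamma_th)
        p_out_ors = np.mean(e2e[rows, ors_pick] < gamma_th)
        print(f"PRS outage ~ {p_out_prs:.4f}, ORS outage ~ {p_out_ors:.4f}")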

  14. Minimally Invasive Radiofrequency Devices.

    PubMed

    Sadick, Neil; Rothaus, Kenneth O

    2016-07-01

    This article reviews minimally invasive radiofrequency options for skin tightening, focusing on describing their mechanism of action and clinical profile in terms of safety and efficacy and presenting peer-reviewed articles associated with the specific technologies. Treatments offered by minimally invasive radiofrequency devices (fractional, microneedling, temperature-controlled) are increasing in popularity due to the dramatic effects they can have without requiring skin excision, downtime, or even extreme financial burden from the patient's perspective. Clinical applications thus far have yielded impressive results in treating signs of the aging face and neck, either as stand-alone or as postoperative maintenance treatments. PMID:27363771

  15. Down-select ion specific media (ISM) utilization in upset and outage conditions

    SciTech Connect

    Denton, Mark S.; Bostick, William D.

    2007-07-01

    This paper presents a process that has been used to help nuclear power plant (NPP) clients resolve some of their more challenging waste water processing issues. These treatment issues may become even more evident during outage conditions, due (in part) to associated decontamination activities that may cause off-normal chemical conditions, which may subsequently change both the peak levels of activities for radionuclides introduced into the collected waste water and also the chemical forms in which they may exist (e.g., formation of colloids or soluble chelates). In one NPP waste processing example, a large proportion of soluble Co-58, which is normally present as a soluble cationic species or an uncharged colloidal solid, was found to behave like an anion; formation of an anionic chelation complex was implicated, possibly due to suspect EDTA, or similar additive, in a proprietary decontamination soap formulation. Antimony-125 (Sb-125), normally present as a weakly anionic (Sb(OH)₆⁻) or even neutral (Sb(OH)₃⁰) species, was being displaced from previously-loaded media by other, more strongly bound species, causing an unacceptable peak activity in water intended for discharge. A quick resolution of the existing waste processing limitations was required, due to limited waste water holding capacity. Samples of the authentic NPP waste water containing the recalcitrant radionuclides were sent to our licensed off-site laboratory (MCLinc), where small-scale batch-equilibrium testing was used to down-select, from a large number (36) of candidate media (both commercially available and developed internally), those that were relatively effective and economical for use in achieving the required discharge criteria. Batch equilibrium testing is very efficient for use in screening the relative effectiveness of contaminant removal by candidate media in a select waste water composition, and can also provide an estimate of the ultimate contaminant loading

  16. Space-based Scintillation Nowcasting with the Communications/Navigation Outage Forecast System

    NASA Astrophysics Data System (ADS)

    Groves, K.; Starks, M.; Beach, T.; Basu, S.

    2008-12-01

    The Air Force Research Laboratory's Communication/Navigation Outage Forecast System (C/NOFS) fuses ground- and space-based data in a near real-time physics-based model aimed at forecasting and nowcasting equatorial scintillations and their impacts on satellite communications and navigation. A key component of the system is the C/NOFS satellite that was launched into a low-inclination (13°) elliptical orbit (400 km x 850 km) in April 2008. The satellite contains six sensors to measure space environment parameters including electron density and temperature, ion density and drift, electric and magnetic fields and neutral wind, as well as a tri-band radio beacon transmitting at 150 MHz, 400 MHz and 1067 MHz. Scintillation nowcasts are derived from measuring the one-dimensional in situ electron density fluctuations and subsequently modeling the propagation environment for satellite-to-ground radio links. The modeling process requires a number of simplifying assumptions regarding the three-dimensional structure of the ionosphere and the results are readily validated by comparisons with ground-based measurements of the satellite's tri-band beacon signals. In mid-September 2008 a campaign to perform detailed analyses of space-based scintillation nowcasts with numerous ground observations was conducted in the vicinity of Kwajalein Atoll, Marshall Islands. To maximize the collection of ground-truth data, the ALTAIR radar was employed to obtain detailed information on the spatial structure of the ionosphere during the campaign and to aid the improvement of space-based nowcasting algorithms. A comparison of these results will be presented; it appears that detailed information on the electron density structure is a limiting factor in modeling the scintillation environment from in situ observations.

  17. Ways To Minimize Bullying.

    ERIC Educational Resources Information Center

    Mueller, Mary Ellen; Parisi, Mary Joy

    This report delineates a series of interventions aimed at minimizing incidences of bullying in a suburban elementary school. The social services staff was scheduled to initiate an anti-bullying incentive in fall 2001 due to the increased occurrences of bullying during the prior year. The target population consisted of third- and fourth-grade…

  18. Periodic minimal surfaces

    NASA Astrophysics Data System (ADS)

    Mackay, Alan L.

    1985-04-01

    A minimal surface is one for which, like a soap film with the same pressure on each side, the mean curvature is zero and, thus, is one where the two principal curvatures are equal and opposite at every point. For every closed circuit in the surface, the area is a minimum. Schwarz [1] and Neovius [2] showed that elements of such surfaces could be put together to give surfaces periodic in three dimensions. These periodic minimal surfaces are geometrical invariants, as are the regular polyhedra, but the former are curved. Minimal surfaces are appropriate for the description of various structures where internal surfaces are prominent and seek to adopt a minimum area or a zero mean curvature subject to their topology; thus they merit more complete numerical characterization. There seem to be at least 18 such surfaces [3], with various symmetries and topologies, related to the crystallographic space groups. Recently, glyceryl mono-oleate (GMO) was shown by Longley and McIntosh [4] to take the shape of the F-surface. The structure postulated is shown here to be in good agreement with an analysis of the fundamental geometry of periodic minimal surfaces.
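
    In symbols, the defining condition quoted above, written in terms of the mean curvature H and the principal curvatures, is

        H = \frac{1}{2}\left(\kappa_1 + \kappa_2\right) = 0,
        \qquad
        \kappa_1 = -\kappa_2 \quad \text{at every point.}

    As a concrete example, the triply periodic Schwarz P-surface is commonly approximated by the nodal surface \cos x + \cos y + \cos z = 0; the nodal equation is only an approximation to, not an exact description of, the minimal surface.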

  19. Minimally invasive pancreatic surgery.

    PubMed

    Yiannakopoulou, E

    2015-12-01

    Minimally invasive pancreatic surgery is feasible and safe. Laparoscopic distal pancreatectomy should be widely adopted for benign lesions of the pancreas. Laparoscopic pancreaticoduodenectomy, although technically demanding, has a number of advantages in the setting of pancreatic ductal adenocarcinoma, including shorter hospital stay and faster recovery, allowing patients to recover in a timelier manner and pursue adjuvant treatment options. Furthermore, it seems that progression-free survival is longer in patients undergoing laparoscopic pancreaticoduodenectomy than in those undergoing open pancreaticoduodenectomy. Minimally invasive middle pancreatectomy seems appropriate for benign or borderline tumors of the neck of the pancreas. Technological advances, including intraoperative ultrasound and intraoperative fluorescence imaging systems, are expected to facilitate the wide adoption of minimally invasive pancreatic surgery. Although the oncological outcome seems similar to that of open surgery, there are still concerns, as the majority of relevant evidence comes from retrospective studies. Large multicenter randomized studies comparing laparoscopic with open pancreatectomy, as well as robot-assisted approaches with both open and laparoscopic approaches, are needed. The robotic approach could possibly be shown to be less invasive than the conventional laparoscopic approach through less traumatic intra-abdominal handling of tissues. In addition, the robotic approach could enable wide adoption of the technique by surgeons who are not extensively trained in advanced laparoscopic surgery. A putative clinical benefit of minimally invasive pancreatic surgery could be an attenuated surgical stress response leading to reduced morbidity and mortality, as well as the lack of a detrimental immunosuppressive effect, especially for oncological patients. PMID:26530291

  20. The Minimal Era

    ERIC Educational Resources Information Center

    Van Ness, Wilhelmina

    1974-01-01

    Described the development of Minimal Art, a composite name that has been applied to the scattering of bland, bleak, non-objective fine arts painting and sculpture forms that proliferated slightly mysteriously in the middle 1960's as Pop Art began to decline. (Author/RK)

  1. Waste Minimization Crosscut Plan

    SciTech Connect

    Not Available

    1992-05-13

    On November 27, 1991, the Secretary of Energy directed that a Department of Energy (DOE) crosscut plan for waste minimization (WMin) be prepared and submitted by March 1, 1992. This Waste Minimization Crosscut Plan responds to the Secretary's direction and supports the National Energy Strategy (NES) goals of achieving greater energy security, increasing energy and economic efficiency, and enhancing environmental quality. It provides a DOE-wide planning framework for effective coordination of all DOE WMin activities. This Plan was jointly prepared by the following Program Secretarial Officer (PSO) organizations: Civilian Radioactive Waste Management (RW); Conservation and Renewable Energy (CE); Defense Programs (DP); Environmental Restoration and Waste Management (EM), lead; Energy Research (ER); Fossil Energy (FE); Nuclear Energy (NE); and New Production Reactors (NP). Assistance and guidance was provided by the offices of Policy, Planning, and Analysis (PE) and Environment, Safety and Health (EH). Comprehensive application of waste minimization within the Department and in both the public and private sectors will provide significant benefits and support National Energy Strategy goals. These benefits include conservation of a substantial proportion of the energy now used by industry and Government, improved environmental quality, reduced health risks, improved production efficiencies, and longer useful life of disposal capacity. Taken together, these benefits will mean improved US global competitiveness, expanded job opportunities, and a better quality of life for all citizens.

  2. Minimally invasive radioguided parathyroidectomy.

    PubMed

    Costello, D; Norman, J

    1999-07-01

    The last decade has been characterized by an emphasis on minimizing interventional techniques, hospital stays, and overall costs of patient care. It is clear that most patients with sporadic HPT do not require a complete neck exploration. We now know that a minimal approach is appropriate for this disease. Importantly, the MIRP technique can be applied to most patients with sporadic HPT and can be performed by surgeons with modest advanced training. The use of a gamma probe as a surgical tool converts the sestamibi to a functional and anatomical scan, eliminating the need for any other preoperative localizing study. Quantification of the radioactivity within the removed gland eliminates the need for routine frozen section histologic examination and obviates the need for costly intraoperative parathyroid hormone measurements. This radioguided technique allows the benefit of local anesthesia, dramatically reduces operative times, eliminates postoperative blood tests, provides a smaller scar, requires minimal time spent in the hospital, and almost assures a rapid, near pain-free recovery. This combination is beneficial to the patient while helping achieve a reduction in overall costs. PMID:10448697

  3. Minimally invasive mediastinal surgery

    PubMed Central

    Melfi, Franca M. A.; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, improved cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instrumentation, and are being increasingly used. With regard to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has been shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with a low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a “no-touch” technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with long follow-up periods are necessary in order to assess oncologic and neurologic results through minimally invasive approaches.

  4. Minimally invasive mediastinal surgery.

    PubMed

    Melfi, Franca M A; Fanucchi, Olivia; Mussi, Alfredo

    2016-01-01

    In the past, mediastinal surgery was associated with the necessity of a maximum exposure, which was accomplished through various approaches. In the early 1990s, many surgical fields, including thoracic surgery, observed the development of minimally invasive techniques. These included video-assisted thoracic surgery (VATS), which confers clear advantages over an open approach, such as less trauma, short hospital stay, improved cosmetic results and preservation of lung function. However, VATS is associated with several disadvantages. For this reason, it is not routinely performed for resection of mediastinal mass lesions, especially those located in the anterior mediastinum, a tiny and remote space that contains vital structures at risk of injury. Robotic systems can overcome the limits of VATS, offering three-dimensional (3D) vision and wristed instrumentation, and are being increasingly used. With regard to thymectomy for myasthenia gravis (MG), unilateral and bilateral VATS approaches have demonstrated good long-term neurologic results with low complication rates. Nevertheless, some authors still advocate the necessity of maximum exposure, especially when considering the distribution of normal and ectopic thymic tissue. In recent studies, the robotic approach has been shown to provide similar neurological outcomes when compared to transsternal and VATS approaches, and is associated with a low morbidity. Importantly, through a unilateral robotic technique, it is possible to dissect and remove at least the same amount of mediastinal fat tissue. Preliminary results on early-stage thymomatous disease indicated that minimally invasive approaches are safe and feasible, with a low rate of pleural recurrence, underlining the necessity of a "no-touch" technique. However, especially for thymomatous disease characterized by an indolent nature, further studies with long follow-up periods are necessary in order to assess oncologic and neurologic results through minimally invasive approaches.

  5. Minimally refined biomass fuel

    DOEpatents

    Pearson, Richard K.; Hirschfeld, Tomas B.

    1984-01-01

    A minimally refined fluid composition, suitable as a fuel mixture and derived from biomass material, is comprised of one or more water-soluble carbohydrates such as sucrose, one or more alcohols having less than four carbons, and water. The carbohydrate provides the fuel source; water solubilizes the carbohydrates; and the alcohol aids in the combustion of the carbohydrate and reduces the viscosity of the carbohydrate/water solution. Because less energy is required to obtain the carbohydrate from the raw biomass than to obtain alcohol, an overall energy savings is realized compared to fuels employing alcohol as the primary fuel.

  6. Wake Vortex Minimization

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A status report is presented on research directed at reducing the vortex disturbances of aircraft wakes. The objective of such a reduction is to minimize the hazard to smaller aircraft that might encounter these wakes. Inviscid modeling was used to study trailing vortices, and viscous effects were investigated. Laser velocimeters were utilized in the measurement of aircraft wakes. Flight and wind tunnel tests were performed on model-scale and full-scale aircraft of various designs. Parameters investigated included the effect of wing span, wing flaps, spoilers, splines and engine thrust on vortex attenuation. Results indicate that vortices may be alleviated through aerodynamic means.

  7. The ZOOM minimization package

    SciTech Connect

    Fischler, Mark S.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  8. Minimally Invasive Parathyroidectomy

    PubMed Central

    Starker, Lee F.; Fonseca, Annabelle L.; Carling, Tobias; Udelsman, Robert

    2011-01-01

    Minimally invasive parathyroidectomy (MIP) is an operative approach for the treatment of primary hyperparathyroidism (pHPT). Currently, routine use of improved preoperative localization studies, cervical block anesthesia in the conscious patient, and intraoperative parathyroid hormone analyses aid in guiding surgical therapy. MIP requires less surgical dissection causing decreased trauma to tissues, can be performed safely in the ambulatory setting, and is at least as effective as standard cervical exploration. This paper reviews advances in preoperative localization, anesthetic techniques, and intraoperative management of patients undergoing MIP for the treatment of pHPT. PMID:21747851

  9. Minimizing hazardous waste

    SciTech Connect

    DeClue, S.C.

    1996-06-01

    Hazardous waste minimization is a broad term often associated with pollution prevention, saving the environment or protecting Mother Earth. Some associate hazardous waste minimization with saving money. Thousands of hazardous materials are used in processes every day, but when these hazardous materials become hazardous wastes, dollars must be spent for disposal. When hazardous waste is reduced, an organization will spend less money on hazardous waste disposal. In 1993, Fort Bragg reduced its hazardous waste generation by over 100,000 pounds and spent nearly $90,000 less on hazardous waste disposal costs than in 1992. Fort Bragg generates a variety of wastes: vehicle maintenance wastes such as antifreeze, oil, grease and solvents; helicopter maintenance wastes, including solvents, adhesives, lubricants and paints; communication operation wastes such as lithium, magnesium, mercury and nickel-cadmium batteries; and chemical defense wastes such as detection and decontamination materials and protective mask filters. The Hazardous Waste Office has the responsibility to properly identify, characterize, classify and dispose of these waste items in accordance with US Environmental Protection Agency (EPA) and US Department of Transportation (DOT) regulations.

  10. High-Rate Communications Outage Recorder Operations for Optimal Payload and Science Telemetry Management Onboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Shell, Michael T.; McElyea, Richard M. (Technical Monitor)

    2002-01-01

    All International Space Station (ISS) Ku-band telemetry transmits through the High-Rate Communications Outage Recorder (HCOR). The HCOR provides the recording and playback capability for all payload, science, and International Partner data streams transmitting through NASA's Ku-band antenna system. The HCOR is a solid-state memory recorder that provides the capability to record all eight ISS high-rate data streams during ISS Loss-of-Signal periods. NASA payloads in the Destiny module are prime users of the HCOR; however, NASDA and ESA will also utilize the HCOR for data capture and playback of their high-data-rate links from the Kibo and Columbus modules. Marshall Space Flight Center's Payload Operations Integration Center manages the HCOR for nominal functions, including system configurations and playback operations. The purpose of this paper is to present the nominal operations plan for the HCOR and the plans for handling contingency operations affecting payload operations. In addition, the paper will address HCOR operation limitations and the expected effects on payload operations. The HCOR is manifested for ISS delivery on flight 9A with the HCOR backup manifested on flight 11A. The HCOR replaces the Medium-Rate Communications Outage Recorder (MCOR), which has supported payloads since flight 5A.1.

  11. Minimal noise subsystems

    NASA Astrophysics Data System (ADS)

    Wang, Xiaoting; Byrd, Mark; Jacobs, Kurt

    2016-03-01

    A system subjected to noise contains a decoherence-free subspace or subsystem (DFS) only if the noise possesses an exact symmetry. Here we consider noise models in which a perturbation breaks a symmetry of the noise, so that if S is a DFS under a given noise process it is no longer so under the new perturbed noise process. We ask whether there is a subspace or subsystem that is more robust to the perturbed noise than S. To answer this question we develop a numerical method that allows us to search for subspaces or subsystems that are maximally robust to arbitrary noise processes. We apply this method to a number of examples, and find that a subsystem that is a DFS is often not the subsystem that experiences minimal noise when the symmetry of the noise is broken by a perturbation. We discuss which classes of noise have this property.
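
    The effect described here can be made concrete with a small numerical check. The sketch below is a toy illustration, not the authors' numerical search method: the leakage score and the collective-dephasing example are assumptions chosen for clarity. The score is zero exactly when every noise operator maps the candidate subspace into itself (an exact DFS) and grows once a perturbation breaks the symmetry of the noise.

    ```python
    import numpy as np

    def leakage_score(V, noise_ops):
        """Sum of squared norms of the off-subspace blocks (I - V V^dag) L V.

        V is an isometry whose columns span the candidate subspace; the score
        vanishes exactly when each noise operator maps the subspace into itself."""
        P_out = np.eye(V.shape[0]) - V @ V.conj().T
        return sum(np.linalg.norm(P_out @ L @ V) ** 2 for L in noise_ops)

    # Collective dephasing on two qubits: L = Z(x)I + I(x)Z.
    Z = np.diag([1.0, -1.0])
    I2 = np.eye(2)
    L_sym = np.kron(Z, I2) + np.kron(I2, Z)

    # The span of |01> and |10> is an exact DFS for collective dephasing...
    V_dfs = np.zeros((4, 2))
    V_dfs[1, 0] = 1.0  # |01>
    V_dfs[2, 1] = 1.0  # |10>

    # ...but a small symmetry-breaking perturbation (here X(x)I) makes it leak.
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    L_pert = L_sym + 0.05 * np.kron(X, I2)

    print(leakage_score(V_dfs, [L_sym]))   # 0.0 (exact DFS)
    print(leakage_score(V_dfs, [L_pert]))  # > 0 (leakage under perturbation)
    ```

    Minimizing such a score over isometries V is one simple way to pose the search for a maximally robust subspace; under the perturbed noise, the minimizer need not coincide with the original DFS, which is the behavior the paper reports.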

  12. Minimal quiver standard model

    SciTech Connect

    Berenstein, David; Pinansky, Samuel

    2007-05-01

    This paper discusses the minimal quiver gauge theory embedding of the standard model that could arise from brane world type string theory constructions. It is based on the low energy effective field theory of D branes in the perturbative regime. The model differs from the standard model by the addition of one extra massive gauge boson, and contains only one parameter beyond those of the standard model: the mass of this new particle. The coupling of this new particle to the standard model is uniquely determined by input from the standard model and consistency conditions of perturbative string theory. We also study some aspects of the phenomenology of this model and bounds on its possible observation at the Large Hadron Collider.

  13. [Minimally invasive breast surgery].

    PubMed

    Mátrai, Zoltán; Gulyás, Gusztáv; Kunos, Csaba; Sávolt, Akos; Farkas, Emil; Szollár, András; Kásler, Miklós

    2014-02-01

    Due to developments in medical science and industrial technology, minimally invasive procedures have appeared in the surgery of benign and malignant breast diseases. In general, such interventions result in significantly reduced breast and chest wall scars, shorter hospitalization and less pain, but they require specific, expensive devices and longer surgical times compared to open surgery. Furthermore, indications and oncological safety have not yet been established. It is quite likely that minimally invasive surgical procedures with high-tech devices, similar to other surgical subspecialties, will gradually become popular and may even form part of routine breast surgery. Vacuum-assisted core biopsy with a therapeutic indication is suitable for the removal of benign fibroadenomas, leaving behind an almost invisible scar, while endoscopically assisted skin-sparing and nipple-sparing mastectomy, axillary staging and reconstruction with latissimus dorsi muscle flap are all feasible through the same short axillary incision. Endoscopic techniques are also suitable for the diagnostics and treatment of intracapsular complications of implant-based breast reconstructions (intracapsular fluid, implant rupture, capsular contracture) and for the biopsy of intracapsular lesions with uncertain pathology. Assessment of the role of radiofrequency ablation of breast tumors requires further hands-on experience, but it is likely that it can serve as a replacement for surgical removal of a portion of primary tumors in the future, due to developments in functional imaging and anticancer drugs. With the reduction of the price of ductoscopes, routine examination of the ductal branch system, guided microdochectomy and targeted surgical removal of terminal ducto-lobular units or a "sick lobe" as an anatomical unit may become feasible. The paper presents the experience of the authors and provides a literature review, for the first time in Hungarian, on the subject. Orv. Hetil.

  14. Minimally invasive parathyroid surgery

    PubMed Central

    Noureldine, Salem I.; Gooi, Zhen

    2015-01-01

    Traditionally, bilateral cervical exploration for localization of all four parathyroid glands and removal of any that are grossly enlarged has been the standard surgical treatment for primary hyperparathyroidism (PHPT). With the advances in preoperative localization studies and greater public demand for less invasive procedures, novel targeted, minimally invasive techniques to the parathyroid glands have been described and practiced over the past 2 decades. Minimally invasive parathyroidectomy (MIP) can be done either through the standard Kocher incision, a smaller midline incision, with video assistance (purely endoscopic and video-assisted techniques), or through an ectopically placed, extracervical incision. In current practice, once PHPT is diagnosed, preoperative evaluation using high-resolution radiographic imaging to localize the offending parathyroid gland is essential if MIP is to be considered. The imaging study results suggest where the surgeon should begin the focused procedure and serve as a road map to allow tailoring of an efficient, imaging-guided dissection while eliminating the unnecessary dissection of multiple glands or a bilateral exploration. Intraoperative parathyroid hormone (IOPTH) levels may be measured during the procedure, or a gamma probe used during radioguided parathyroidectomy, to ascertain that the correct gland has been excised and that no other hyperfunctional tissue is present. MIP has many advantages over the traditional bilateral, four-gland exploration. MIP can be performed using local anesthesia, requires less operative time, results in fewer complications, and offers an improved cosmetic result and greater patient satisfaction. Additional advantages of MIP are earlier hospital discharge and decreased overall associated costs. This article aims to address the considerations for accomplishing MIP, including the role of preoperative imaging studies, intraoperative adjuncts, and surgical techniques. PMID:26425454

  15. Minimal Marking: A Success Story

    ERIC Educational Resources Information Center

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  16. Minimal distances between SCFTs

    NASA Astrophysics Data System (ADS)

    Buican, Matthew

    2014-01-01

    We study lower bounds on the minimal distance in theory space between four-dimensional superconformal field theories (SCFTs) connected via broad classes of renormalization group (RG) flows preserving various amounts of supersymmetry (SUSY). For N = 1 RG flows, the ultraviolet (UV) and infrared (IR) endpoints of the flow can be parametrically close. On the other hand, for RG flows emanating from a maximally supersymmetric SCFT, the distance to the IR theory cannot be arbitrarily small regardless of the amount of (non-trivial) SUSY preserved along the flow. The case of RG flows from N = 2 UV SCFTs is more subtle. We argue that for RG flows preserving the full N = 2 SUSY, there are various obstructions to finding examples with parametrically close UV and IR endpoints. Under reasonable assumptions, these obstructions include: unitarity, known bounds on the c central charge derived from associativity of the operator product expansion, and the central charge bounds of Hofman and Maldacena. On the other hand, for RG flows that break N = 2 → N = 1, it is possible to find IR fixed points that are parametrically close to the UV ones. In this case, we argue that if the UV SCFT possesses a single stress tensor, then such RG flows excite of order all the degrees of freedom of the UV theory. Furthermore, if the UV theory has some flavor symmetry, we argue that the UV central charges should not be too large relative to certain parameters in the theory.

  17. Swarm robotics and minimalism

    NASA Astrophysics Data System (ADS)

    Sharkey, Amanda J. C.

    2007-09-01

    Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.

  18. Minimal Higgs inflation

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Oda, Kin-ya

    2014-02-01

    We consider a possibility that the Higgs field in the Standard Model (SM) serves as an inflaton when its value is around the Planck scale. We assume that the SM is valid up to an ultraviolet cutoff scale Λ, which is slightly below the Planck scale, and that the Higgs potential becomes almost flat above Λ. Contrary to the ordinary Higgs inflation scenario, we do not assume the huge non-minimal coupling, of O(10^4), of the Higgs field to the Ricci scalar. We find that Λ must be less than 5 × 10^17 GeV in order to explain the observed fluctuation of the cosmic microwave background, no matter how we extrapolate the Higgs potential above Λ. The scale 10^17 GeV coincides with the perturbative string scale, which suggests that the SM is directly connected with string theory. For this to be true, the top quark mass is restricted to around 171 GeV, with which Λ can exceed 10^17 GeV. As a concrete example of the potential above Λ, we propose a simple log-type potential. The predictions of this specific model for the e-foldings N_* = 50-60 are consistent with the current observation, namely, the scalar spectral index is n_s = 0.977-0.983 and the tensor-to-scalar ratio 0

  19. Lightning-Generated Whistler Waves Observed by Probes On The Communication/Navigation Outage Forecast System Satellite at Low Latitudes

    NASA Technical Reports Server (NTRS)

    Holzworth, R. H.; McCarthy, M. P.; Pfaff, R. F.; Jacobson, A. R.; Willcockson, W. L.; Rowland, D. E.

    2011-01-01

    Direct evidence is presented for a causal relationship between lightning and strong electric field transients inside equatorial ionospheric density depletions. In fact, these whistler mode plasma waves may be the dominant electric field signal within such depletions. Optical lightning data from the Communication/Navigation Outage Forecast System (C/NOFS) satellite and global lightning location information from the World Wide Lightning Location Network are presented as independent verification that these electric field transients are caused by lightning. The electric field instrument on C/NOFS routinely measures lightning-related electric field wave packets or sferics, associated with simultaneous measurements of optical flashes at all altitudes encountered by the satellite (401-867 km). Lightning-generated whistler waves have abundant access to the topside ionosphere, even close to the magnetic equator.

  20. Outage Performance and Average Symbol Error Rate of M-QAM for Maximum Ratio Combining with Multiple Interferers

    NASA Astrophysics Data System (ADS)

    Ahn, Kyung Seung

    In this paper, we investigate the performance of maximum ratio combining (MRC) in the presence of multiple cochannel interferers over a flat Rayleigh fading channel. Closed-form expressions for the signal-to-interference-plus-noise ratio (SINR), outage probability, and average symbol error rate (SER) of quadrature amplitude modulation (QAM) with M-ary signaling are obtained for unequal-power interference-to-noise ratios (INRs). We also provide an upper bound for the average SER using the moment generating function (MGF) of the SINR. Moreover, we quantify the array gain loss between pure MRC (an MRC system in the absence of CCI) and an MRC system in the presence of CCI. Finally, we verify our analytical results by numerical simulations.
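
    Such analytical results are typically cross-checked by direct simulation of the same system model. The following Monte Carlo sketch is an illustrative stand-in, not the paper's closed-form derivation; it assumes unit noise power, unit-mean Rayleigh branches, and MRC weights matched to the desired channel, and estimates the outage probability P(SINR < γ_th) for L-branch MRC with unequal-power interferers.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def outage_mrc_cci(snr_db, inr_db, L=4, gamma_th_db=5.0, trials=200_000):
        """Monte Carlo estimate of P(SINR < gamma_th) for L-branch MRC with
        unequal-power cochannel interferers in flat Rayleigh fading."""
        snr = 10.0 ** (snr_db / 10.0)
        inr = 10.0 ** (np.asarray(inr_db, dtype=float) / 10.0)
        gamma_th = 10.0 ** (gamma_th_db / 10.0)
        # Unit-variance complex Gaussian channels => Rayleigh envelopes.
        h = (rng.standard_normal((trials, L)) +
             1j * rng.standard_normal((trials, L))) / np.sqrt(2)
        g = (rng.standard_normal((trials, L, inr.size)) +
             1j * rng.standard_normal((trials, L, inr.size))) / np.sqrt(2)
        hh = np.sum(np.abs(h) ** 2, axis=1)
        # MRC weights w = conj(h); per-branch noise power normalized to 1.
        desired = snr * hh ** 2
        noise = hh
        cross = np.einsum('tl,tlk->tk', h.conj(), g)   # w·g_k per interferer
        interference = np.sum(inr * np.abs(cross) ** 2, axis=1)
        sinr = desired / (noise + interference)
        return float(np.mean(sinr < gamma_th))

    # 4-branch MRC at 10 dB SNR, three unequal-power interferers (3, 0, -3 dB INR)
    print(outage_mrc_cci(10.0, [3.0, 0.0, -3.0]))
    ```

    Sweeping snr_db and overlaying the empirical curve on a closed-form expression is the usual way such derivations are validated.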

  1. The Fixed-bias Langmuir Probe on the Communication-navigation Outage Forecast System Satellite: Calibration and Validation

    NASA Technical Reports Server (NTRS)

    Klenzing, Jeffrey H.; Rowland, Douglas E.

    2012-01-01

    A fixed-bias spherical Langmuir probe is included as part of the Vector Electric Field Instrument (VEFI) suite on the Communication Navigation Outage Forecast System (CNOFS) satellite. CNOFS gathers data in the equatorial ionosphere between 400 and 860 km, where the primary constituent ions are H+ and O+. The ion current collected by the probe surface per unit plasma density is found to be a strong function of ion composition. The calibration of the collected current to an absolute density is discussed, and the performance of the spherical probe is compared to other in situ instruments on board the CNOFS satellite. The application of the calibration is discussed with respect to future fixed-bias probes; in particular, it is demonstrated that some density fluctuations will be suppressed in the collected current if the plasma composition rapidly changes along with density. This is illustrated in the observation of plasma density enhancements on CNOFS.

  2. The Fixed-Bias Langmuir Probe on the Communication-Navigation Outage Forecast System Satellite: Calibration and Validation

    NASA Technical Reports Server (NTRS)

    Klenzing, J.; Rowland, D.

    2012-01-01

    A fixed-bias spherical Langmuir probe is included as part of the Vector Electric Field Instrument (VEFI) suite on the Communication Navigation Outage Forecast System (CNOFS) satellite. CNOFS gathers data in the equatorial ionosphere between 400 and 860 km, where the primary constituent ions are H+ and O+. The ion current collected by the probe surface per unit plasma density is found to be a strong function of ion composition. The calibration of the collected current to an absolute density is discussed, and the performance of the spherical probe is compared to other in situ instruments on board the CNOFS satellite. The application of the calibration is discussed with respect to future fixed-bias probes; in particular, it is demonstrated that some density fluctuations will be suppressed in the collected current if the plasma composition rapidly changes along with density. This is illustrated in the observation of plasma density enhancements on CNOFS.

  3. Evaluation of the resilience of a full-scale down-flow hanging sponge reactor to long-term outages at a sewage treatment plant in India.

    PubMed

    Onodera, Takashi; Takayama, Daisuke; Ohashi, Akiyoshi; Yamaguchi, Takashi; Uemura, Shigeki; Harada, Hideki

    2016-10-01

    Resilience to process outages is an essential requirement for sustainable wastewater treatment systems in developing countries. In this study, we evaluated the ability of a full-scale down-flow hanging sponge (DHS) reactor to recover after a 10-day outage. The DHS tested in this study uses polyurethane sponge as packing material. This full-scale DHS reactor has been tested over a period of about 4 years in India with a flow rate of 500 m³/day. Water was not supplied to the DHS reactor that was subjected to the 10-day outage; however, the biomass did not dry out because the sponge was able to retain enough water. Soon after the reactor was restarted, a small quantity of biomass, amounting to only 0.1% of the total retained biomass, was eluted. The DHS effluent achieved satisfactory removal of suspended solids, chemical oxygen demand, and ammonium nitrogen within 90, 45, and 90 min, respectively. Conversely, fecal coliforms in the DHS effluent did not reach satisfactory levels within 540 min; instead, the normal levels of fecal coliforms were achieved within 3 days. Overall, the tests demonstrated that the DHS reactor was sufficiently robust to withstand long-term outages and achieved steady state soon after restart. This reinforces the suitability of this technology for developing countries. PMID:27450993

  4. Influenza SIRS with Minimal Pneumonitis

    PubMed Central

    Erramilli, Shruti; Mannam, Praveen; Manthous, Constantine A.

    2016-01-01

    Although systemic inflammatory response syndrome (SIRS) is a known complication of severe influenza pneumonia, it has been reported very rarely in patients with minimal parenchymal lung disease. We here report a case of severe SIRS, anasarca, and marked vascular phenomena with minimal or no pneumonitis. This case highlights that viruses, including influenza, may cause vascular dysregulation causing SIRS, even without substantial visceral organ involvement.

  5. Guidelines for mixed waste minimization

    SciTech Connect

    Owens, C.

    1992-02-01

    Currently, there is no commercial mixed waste disposal available in the United States. Storage and treatment for commercial mixed waste is limited. Host state and compact region officials are encouraging their mixed waste generators to minimize their mixed wastes because of management limitations. This document provides a guide to mixed waste minimization.

  6. Waste minimization handbook, Volume 1

    SciTech Connect

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  7. Minimizing waste in environmental restoration

    SciTech Connect

    Moos, L.; Thuot, J.R.

    1996-07-01

    Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized. In fact, however, there are three significant areas where waste and cost can be reduced. Waste reduction can occur in three ways: beneficial reuse or recycling; segregation of waste types; and reducing generation of secondary waste. This paper will discuss several examples of reuse, recycling, segregation, and secondary waste reduction at ANL restoration programs.

  8. Mitral valve surgery - minimally invasive

    MedlinePlus

    ... that does many of these procedures. Minimally invasive heart valve surgery has improved greatly in recent years. These ... WT, Mack MJ. Transcatheter cardiac valve interventions. Surg Clin North Am. 2009;89:951-66. ...

  9. Heart bypass surgery - minimally invasive

    MedlinePlus

    ... in 30-day outcomes in high-risk patients randomized to off-pump versus on-pump coronary bypass ... Thiele H, Neumann-Schniedewind P, Jacobs S, et al. Randomized comparison of minimally invasive direct coronary artery bypass ...

  10. An Overview of Scientific and Space Weather Results from the Communication/Navigation Outage Forecasting System (C/NOFS) Mission

    NASA Technical Reports Server (NTRS)

    Pfaff, R.; de la Beaujardiere, O.; Hunton, D.; Heelis, R.; Earle, G.; Strauss, P.; Bernhardt, P.

    2012-01-01

    The Communication/Navigation Outage Forecasting System (C/NOFS) Mission of the Air Force Research Laboratory is described. C/NOFS science objectives may be organized into three categories: (1) to understand physical processes active in the background ionosphere and thermosphere in which plasma instabilities grow; (2) to identify mechanisms that trigger or quench the plasma irregularities responsible for signal degradation; and (3) to determine how the plasma irregularities affect the propagation of electromagnetic waves. The satellite was launched in April, 2008 into a low inclination (13 deg), elliptical (400 x 850 km) orbit. The satellite sensors measure the following parameters in situ: ambient and fluctuating electron densities, AC and DC electric and magnetic fields, ion drifts and large scale ion composition, ion and electron temperatures, and neutral winds. C/NOFS is also equipped with a GPS occultation receiver and a radio beacon. In addition to the satellite sensors, complementary ground-based measurements, theory, and advanced modeling techniques are also important parts of the mission. We report scientific and space weather highlights of the mission after nearly four years in orbit.

  11. The fixed-bias Langmuir probe on the Communication/Navigation Outage Forecast System satellite: calibration and validation.

    PubMed

    Klenzing, J; Rowland, D

    2012-11-01

    A fixed-bias spherical Langmuir probe is included as part of the Vector Electric Field Instrument (VEFI) suite on the Communication/Navigation Outage Forecast System (C/NOFS) satellite. C/NOFS gathers data in the equatorial ionosphere between 400 and 860 km, where the primary constituent ions are H+ and O+. The ion current collected by the probe surface per unit plasma density is found to be a strong function of ion composition. The calibration of the collected current to an absolute density is discussed, and the performance of the spherical probe is compared to other in situ instruments on board the C/NOFS satellite. The application of the calibration is discussed with respect to future fixed-bias probes; in particular, it is demonstrated that some density fluctuations will be suppressed in the collected current if the plasma composition rapidly changes along with density. This is illustrated in the observation of plasma density enhancements on C/NOFS. PMID:23206077

  12. Minimizing pollutants with multimedia strategies

    SciTech Connect

    Phillips, J.B.; Hindawi, M.A.

    1997-01-01

    A multimedia approach to pollution prevention that focuses on minimizing or eliminating production of pollutants is one of the most advantageous strategies to adopt in preparing an overall facility environmental plan. If processes are optimized to preclude or minimize the manufacture of streams containing pollutants, or to reduce the levels of pollutants in waste streams, then the task of multimedia pollution prevention becomes more manageable simply as a result of a smaller problem needing to be addressed. An orderly and systematic approach to waste minimization can result in a comprehensive strategy to reduce the production of waste streams and simultaneously improve the profitability of a process or industrial operation. There are a number of miscellaneous strategies for waste minimization that attack the problem via process chemistry or engineering. Examples include installation of low-NOx burners, selection of valves that minimize fugitive emissions, high-level switches on storage tanks, the use of in-plant stills for recycling and reusing solvents, and using water-based products instead of hydrocarbon-based products wherever possible. Other waste minimization countermeasures can focus on operations and maintenance (O&M) issues.

  13. Pinnacle3 modeling and end-to-end dosimetric testing of a Versa HD linear accelerator with the Agility head and flattening filter-free modes.

    PubMed

    Saenz, Daniel L; Narayanasamy, Ganesh; Cruz, Wilbert; Papanikolaou, Nikos; Stathakis, Sotirios

    2016-01-01

    The Elekta Versa HD incorporates a variety of upgrades to the line of Elekta linear accelerators, primarily including the Agility head and flattening filter-free (FFF) photon beam delivery. The dosimetric output of the head, completely distinct from that of its predecessors, combined with the FFF beams, requires a new investigation of modeling in treatment planning systems. A model was created in Pinnacle3 v9.8 with the commissioned beam data. A phantom consisting of several plastic water and Styrofoam slabs was scanned and imported into Pinnacle3, where beams of different field sizes, source-to-surface distances (SSDs), wedges, and gantry angles were devised. Beams included all of the available photon energies (6, 10, 18, 6 FFF, and 10 FFF MV), as well as the four electron energies commissioned for clinical use (6, 9, 12, and 15 MeV). The plans were verified at calculation points by measurement with a calibrated ionization chamber. Homogeneous and heterogeneous point-dose measurements agreed within 2% relative to maximum dose for all photon and electron beams. AP photon open field measurements along the central axis at 100 cm SSD passed within 1%. In addition, IMRT testing was performed with three standard plans (step and shoot IMRT, as well as a small- and large-field VMAT plan). The IMRT plans were delivered on the Delta4 IMRT QA phantom, for which the gamma passing rate was > 99.5% for all plans with a 3% dose deviation, 3 mm distance-to-agreement, and 10% dose threshold. The IMRT QA results for the first 23 patients yielded gamma passing rates of 97.4% ± 2.3%. Such testing ensures confidence in the ability of Pinnacle3 to model photon and electron beams with the Agility head. PMID:26894352
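
    The 3%/3 mm gamma criterion quoted above combines a dose-difference test with a distance-to-agreement (DTA) test. The sketch below is a minimal one-dimensional illustration of that metric on toy profiles, assuming global normalization and a 10% low-dose threshold as in the text; it is a generic implementation, not the Delta4 vendor algorithm.

    ```python
    import numpy as np

    def gamma_1d(dose_eval, dose_ref, spacing_mm, dd=0.03, dta_mm=3.0, cutoff=0.10):
        """Global-normalization 1D gamma: for each evaluated point, minimize the
        combined dose-difference / distance-to-agreement metric over the whole
        reference profile. Points below the low-dose cutoff are excluded."""
        x = np.arange(dose_ref.size) * spacing_mm
        ref_max = dose_ref.max()
        gamma = np.full(dose_eval.size, np.nan)
        for i in range(dose_eval.size):
            if dose_eval[i] < cutoff * ref_max:
                continue
            dist_term = ((x - x[i]) / dta_mm) ** 2
            dose_term = ((dose_ref - dose_eval[i]) / (dd * ref_max)) ** 2
            gamma[i] = np.sqrt(np.min(dist_term + dose_term))
        return gamma

    # Toy test: a Gaussian 'measured' profile shifted 0.7 mm against the reference.
    x = np.linspace(-50.0, 50.0, 201)              # 0.5 mm grid
    ref = 100.0 * np.exp(-x**2 / (2 * 15.0**2))
    meas = 100.0 * np.exp(-(x - 0.7)**2 / (2 * 15.0**2))
    g = gamma_1d(meas, ref, spacing_mm=0.5)
    valid = g[~np.isnan(g)]
    print('gamma pass rate: %.1f%%' % (100.0 * np.mean(valid <= 1.0)))
    ```

    A point passes when its gamma value is at most 1, and the pass rate is the fraction of above-threshold points that pass; here the 0.7 mm shift is well inside the 3 mm DTA, so the toy profiles pass everywhere.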

  14. EXSdetect: an end-to-end software for extended source detection in X-ray images: application to Swift-XRT data

    NASA Astrophysics Data System (ADS)

    Liu, T.; Tozzi, P.; Tundo, E.; Moretti, A.; Wang, J.-X.; Rosati, P.; Guglielmetti, F.

    2013-01-01

    Aims: We present a stand-alone software package (named EXSdetect) for the detection of extended sources in X-ray images. Our goal is to provide a flexible tool capable of detecting extended sources down to the lowest flux levels attainable within instrumental limitations, while maintaining robust photometry, high completeness, and low contamination, regardless of source morphology. EXSdetect was developed mainly to exploit the ever-increasing wealth of archival X-ray data, but is also ideally suited to explore the scientific capabilities of future X-ray facilities, with a strong focus on investigations of distant groups and clusters of galaxies. Methods: EXSdetect combines a fast Voronoi tessellation code with a friends-of-friends algorithm and an automated deblending procedure. The values of key parameters are matched to fundamental telescope properties such as angular resolution and instrumental background. In addition, the software is designed to permit extensive tests of its performance via simulations of a wide range of observational scenarios. Results: We applied EXSdetect to simulated data fields modeled to realistically represent the Swift X-ray Cluster Survey (SXCS), which is based on archival data obtained by the X-ray telescope onboard the Swift satellite. We achieve more than 90% completeness for extended sources comprising at least 80 photons in the 0.5-2 keV band, a limit that corresponds to 10^-14 erg cm^-2 s^-1 for the deepest SXCS fields. This detection limit is comparable to the one attained by the most sensitive cluster surveys conducted with much larger X-ray telescopes. While evaluating the performance of EXSdetect, we also explored the impact of improved angular resolution and discussed the ideal properties of the next generation of X-ray survey missions. The Python code EXSdetect is available on the SXCS website http://adlibitum.oats.inaf.it/sxcs
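
    Of the ingredients listed above, the friends-of-friends step is the easiest to illustrate in isolation. The sketch below shows only the generic FoF linking idea on toy 2D photon positions, with an assumed linking length; it is not the EXSdetect implementation, which couples the linking to a Voronoi tessellation and instrument-specific parameters.

    ```python
    import numpy as np

    def friends_of_friends(points, link_length):
        """Label 2D points so that any two points closer than link_length end up
        in the same group (transitively). A simple O(N^2) flood fill."""
        n = len(points)
        labels = np.full(n, -1, dtype=int)
        group = 0
        for seed in range(n):
            if labels[seed] >= 0:
                continue
            labels[seed] = group
            stack = [seed]
            while stack:
                j = stack.pop()
                dists = np.linalg.norm(points - points[j], axis=1)
                friends = np.where((dists < link_length) & (labels < 0))[0]
                labels[friends] = group
                stack.extend(friends.tolist())
            group += 1
        return labels

    # Toy photon field: two clumps ('extended sources') over a sparse background.
    rng = np.random.default_rng(0)
    pts = np.vstack([rng.normal(0.0, 1.0, (60, 2)),
                     rng.normal(10.0, 1.0, (60, 2)),
                     rng.uniform(-5.0, 15.0, (30, 2))])
    labels = friends_of_friends(pts, link_length=1.0)
    sizes = np.bincount(labels)
    print('groups with >= 20 photons:', np.sum(sizes >= 20))
    ```

    In a real detection pipeline the linking scale would be tied to the angular resolution and background level, exactly the telescope properties the authors use to set EXSdetect's key parameters.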

  15. 'End to end' planktonic trophic web and its implications for the mussel farms in the Mar Piccolo of Taranto (Ionian Sea, Italy).

    PubMed

    Karuza, Ana; Caroppo, Carmela; Monti, Marina; Camatti, Elisa; Di Poi, Elena; Stabili, Loredana; Auriemma, Rocco; Pansera, Marco; Cibic, Tamara; Del Negro, Paola

    2016-07-01

    The Mar Piccolo is a semi-enclosed basin subject to different natural and anthropogenic stressors. In order to better understand plankton dynamics and preferential carbon pathways within the planktonic trophic web, an integrated approach was adopted for the first time by examining all trophic levels (virioplankton, the heterotrophic and phototrophic fractions of pico-, nano- and microplankton, as well as mesozooplankton). Plankton abundance and biomass were investigated during four surveys in the period 2013-2014. Besides unveiling the dynamics of different plankton groups in the Mar Piccolo, the study revealed that a high portion of the plankton carbon (C) pool was constituted by small-sized (<2 μm) planktonic fractions. The prevalence of small-sized species within micro- and mesozooplankton communities was observed as well. The succession of planktonic communities was clearly driven by seasonality, i.e., by nutrient availability and the physical features of the water column. Our hypothesis is that, besides the 'bottom-up' control and the grazing pressure inferred from the C pools of different plankton groups, the presence of mussel farms in the Mar Piccolo exerts a profound impact on plankton communities, not only through the substantial sequestration of plankton biomass but also by strongly influencing its structure. PMID:26498814

  16. End-to-End System Test of the Relative Precision and Stability of the Photometric Method for Detecting Earth-Size Extrasolar Planets

    NASA Technical Reports Server (NTRS)

    Dunham, Edward W.

    2000-01-01

    We developed the CCD camera system for the laboratory test demonstration and designed the optical system for this test. The camera system was delivered to Ames in April, 1999 with continuing support mostly in the software area as the test progressed. The camera system has been operating successfully since delivery. The optical system performed well during the test. The laboratory demonstration activity is now nearly complete and is considered to be successful by the Technical Advisory Group, which met on 8 February, 2000 at the SETI Institute. A final report for the Technical Advisory Group and NASA Headquarters will be produced in the next few months. This report will be a comprehensive report on all facets of the test including those covered under this grant. A copy will be forwarded, if desired, when it is complete.

  17. SU-E-T-508: End to End Testing of a Prototype Eclipse Module for Planning Modulated Arc Therapy On the Siemens Platform

    SciTech Connect

    Huang, L; Sarkar, V; Spiessens, S; Rassiah-Szegedi, P; Huang, Y; Salter, B; Zhao, H; Szegedi, M

    2014-06-01

    Purpose: The latest clinical implementation of the Siemens Artiste linac allows for delivery of modulated arcs (mARC) using full-field flattening filter-free (FFF) photon beams. The maximum dose rate of 2000 MU/min is well suited for high-dose treatments such as SBRT. We tested and report on the performance of a prototype Eclipse TPS module supporting mARC capability on the Artiste platform. Method: Our spine SBRT patients originally treated with 12/13-field static-gantry IMRT (SGIMRT) were chosen for this study. These plans were designed to satisfy RTOG0631 guidelines with a prescription of 16 Gy in a single fraction. The cases were re-planned as mARC plans in the prototype Eclipse module using the 7 MV FFF beam and required to satisfy RTOG0631 requirements. All plans were transferred from Eclipse, delivered on a Siemens Artiste linac and dose-validated using the Delta4 system. Results: All treatment plans were straightforwardly developed, in a timely fashion, without challenge or inefficiency, using the prototype module. Due to the limited number of segments in a single arc, mARC plans required 2-3 full arcs to yield plan quality comparable to SGIMRT plans containing over 250 total segments. The average (3%/3 mm) gamma pass-rate for all arcs was 98.5±1.1%, thus demonstrating both excellent dose prediction by the AAA dose algorithm and excellent delivery fidelity. Mean delivery times for the mARC plans (10.5±1.7 min) were 50-70% lower than for the SGIMRT plans (26±2 min), with both delivered at 2000 MU/min. Conclusion: A prototype Eclipse module capable of planning for Burst Mode modulated arc delivery on the Artiste platform has been tested and found to perform efficiently and accurately for treatment plan development and delivered-dose prediction. Further investigation of more treatment sites is being carried out and data will be presented.

  18. End-to-end connectivity utility-based two-way network offers wireless option for home and business information services

    SciTech Connect

    Barth, S.

    1997-06-01

    Faster. Cheaper. Better. That's the battle cry for utility and communications industries as information technologies merge and a new laissez faire attitude toward competition emerges from regulatory agencies. Telecommunications and cable companies loom as likely contenders to capitalize on these changes. However, with equal, perhaps better access to consumer bases, utility companies are positioning themselves as competitors on this newly leveled playing field. The playing field has not only been leveled, but is now vastly expanded. The utility company used to view its principal requirement as the ability to meet the needs of its existing service territory. Now to survive in the age of a newly competitive arena, the utility company must not only look at methods for maintaining its existing customers, but also at how to acquire customers outside its service territory. The deregulation of the telecom industry has provided the utility company with some additional breathing room in establishing strategies and methods for how it will maintain its customers. While the major telecommunications players continue to battle each other for the opportunity to provide both local and long distance service, utility companies have gained a bit of time to enable them to develop the services that can be offered directly to their existing customer base via power lines or other existing utility infrastructure. By integrating commercial wireless communications services with their existing power lines, electric companies are insulating themselves against competitors in their own field, as well as making new services available to their customers. With applications such as automatic meter reading and service monitoring capabilities for themselves and value-added services from outside vendors for their consumers, utilities can carve out a strong niche in their existing marketplace while moving into new territory.

  19. SU-E-J-55: End-To-End Effectiveness Analysis of 3D Surface Image Guided Voluntary Breath-Holding Radiotherapy for Left Breast

    SciTech Connect

    Lin, M; Feigenberg, S

    2015-06-15

    Purpose: To evaluate the effectiveness of using 3D surface imaging to guide breath-holding (BH) left-side breast treatment. Methods: Two 3D-surface-image-guided BH procedures were implemented and evaluated: normal-BH, taking BH at a comfortable level, and deep-inspiration breath-holding (DIBH). A total of 20 patients (10 normal-BH and 10 DIBH) were recruited. Patients received a BH evaluation using a commercial 3D surface tracking system (VisionRT, London, UK) to quantify the reproducibility of BH positions prior to CT scan. Tangential 3D/IMRT plans were generated. Patients were initially set up under free-breathing (FB) conditions using the FB surface obtained from the untagged CT to ensure a correct patient position. Patients were then guided to reach the planned BH position using the BH surface obtained from the BH CT. Action levels were set at each phase of the treatment process based on the information provided by the 3D surface tracking system for proper interventions (eliminate/re-setup/re-coaching). We reviewed the frequency of interventions to evaluate its effectiveness. The FB-CBCT and port film were utilized to evaluate the accuracy of 3D-surface-guided setups. Results: 25% of BH candidates with BH positioning uncertainty > 2 mm were eliminated prior to CT scan. For >90% of fractions, based on the setup deltas from the 3D surface tracking system, adjustments of patient setup were needed after the initial laser-based setup. 3D-surface-guided setup accuracy is comparable to CBCT. For the BH guidance, the frequency of interventions (re-coaching/re-setup) was 40% (normal-BH)/91% (DIBH) of treatments for the first 5 fractions and then dropped to 16% (normal-BH)/46% (DIBH). The necessity of re-setup was highly patient-specific for normal-BH but highly random among patients for DIBH. Overall, a −0.8±2.4 mm accuracy of the anterior pericardial shadow position was achieved. Conclusion: 3D surface imaging provides effective intervention in the treatment process and ensures favorable day-to-day setup accuracy. DIBH setup appears to be more uncertain, and this would be the patient group that will definitely benefit from the extra information of 3D surface setup.

  20. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems II. Extension to the thermal infrared: equations and methods

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Lomheim, Terrence S.; Florio, Christopher J.; Harbold, Jeffrey M.; Muto, B. Michael; Schoolar, Richard B.; Wintz, Daniel T.; Keller, Robert A.

    2011-10-01

    In a previous paper in this series, we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) tool may be used to model space and airborne imaging systems operating in the visible to near-infrared (VISNIR). PICASSO is a systems-level tool, representative of a class of such tools used throughout the remote sensing community. It is capable of modeling systems over a wide range of fidelity, anywhere from conceptual design level (where it can serve as an integral part of the systems engineering process) to as-built hardware (where it can serve as part of the verification process). In the present paper, we extend the discussion of PICASSO to the modeling of Thermal Infrared (TIR) remote sensing systems, presenting the equations and methods necessary to modeling in that regime.

  1. Orthogonal labeling of M13 minor capsid proteins with DNA to self-assemble end-to-end multi-phage structures

    PubMed Central

    Hess, Gaelen T.; Guimaraes, Carla P.; Spooner, Eric; Ploegh, Hidde L.; Belcher, Angela M.

    2014-01-01

    M13 bacteriophage has been used as a scaffold to organize materials for various applications. Building more complex multi-phage devices requires precise control of interactions between the M13 capsid proteins. Towards this end, we engineered a loop structure onto the pIII capsid protein of M13 bacteriophage to enable sortase-mediated labeling reactions for C-terminal display. Combining this with N-terminal sortase-mediated labeling, we thus created a phage scaffold that can be labeled orthogonally on three capsid proteins: the body and both ends. We show that covalent attachment of different DNA oligonucleotides at the ends of the new phage structure enables formation of multi-phage particles oriented in a specific order. These have potential as nanoscale scaffolds for multi-material devices. PMID:23713956

  2. Niti CAR 27 Versus a Conventional End-to-End Anastomosis Stapler in a Laparoscopic Anterior Resection for Sigmoid Colon Cancer

    PubMed Central

    Kwag, Seung-Jin; Kim, Jun-Gi; Kang, Won-Kyung; Lee, Jin-Kwon

    2014-01-01

    Purpose The Niti CAR 27 (ColonRing) uses compression to create an anastomosis. This study aimed to investigate the safety and the effectiveness of the anastomosis created with the Niti CAR 27 in a laparoscopic anterior resection for sigmoid colon cancer. Methods In a single-center study, 157 consecutive patients who received an operation between March 2010 and December 2011 were retrospectively assessed. The Niti CAR 27 (CAR group, 63 patients) colorectal anastomoses were compared with the conventional double-stapled (CDS group, 94 patients) colorectal anastomoses. Intraoperative, immediate postoperative and 6-month follow-up data were recorded. Results There were no statistically significant differences between the two groups in terms of age, gender, tumor location and other clinical characteristics. One patient (1.6%) in the CAR group and 2 patients (2.1%) in the CDS group experienced complications of anastomotic leakage (P = 0.647). These three patients underwent a diverting loop ileostomy. There were 2 cases (2.1%) of bleeding at the anastomosis site in the CDS group. All patients underwent a follow-up colonoscopy (median, 6 months). One patient in the CAR group experienced anastomotic stricture (1.6% vs. 0%; P = 0.401). This complication was solved by using balloon dilatation. Conclusion Anastomosis using the Niti CAR 27 device in a laparoscopic anterior resection for sigmoid colon cancer is safe and feasible. Its use is equivalent to that of the conventional double-stapler. PMID:24851217

  3. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management †

    PubMed Central

    Marie, Pierrick; Desprats, Thierry; Chabridon, Sophie; Sibilla, Michelle; Taconet, Chantal

    2015-01-01

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At a time when the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms together offers new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario. PMID:26087372

  4. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management.

    PubMed

    Marie, Pierrick; Desprats, Thierry; Chabridon, Sophie; Sibilla, Michelle; Taconet, Chantal

    2015-01-01

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At a time when the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms together offers new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario. PMID:26087372

  5. Outage Performance Analysis of Relay Selection Schemes in Wireless Energy Harvesting Cooperative Networks over Non-Identical Rayleigh Fading Channels †

    PubMed Central

    Do, Nhu Tri; Bao, Vo Nguyen Quoc; An, Beongku

    2016-01-01

    In this paper, we study relay selection in decode-and-forward wireless energy harvesting cooperative networks. In contrast to conventional cooperative networks, the relays harvest energy from the source's radio-frequency radiation and then use that energy to forward the source information. Considering the power-splitting receiver architecture used at the relays to harvest energy, we are concerned with the performance of two popular relay selection schemes, namely, the partial relay selection (PRS) scheme and the optimal relay selection (ORS) scheme. In particular, we analyze the system performance in terms of outage probability (OP) over independent and non-identical (i.n.i.d.) Rayleigh fading channels. We derive closed-form approximations for the system outage probabilities of both schemes and validate the analysis by Monte Carlo simulation. The numerical results provide a comprehensive performance comparison between the PRS and ORS schemes and reveal the effect of wireless energy harvesting on the outage performance of both schemes. Additionally, we show the advantages and drawbacks of wireless energy harvesting cooperative networks compared to conventional cooperative networks. PMID:26927119
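
    The qualitative gap between the two schemes is easy to reproduce numerically. The sketch below is a simplified Monte Carlo model with assumed per-relay mean channel gains; it takes the decode-and-forward end-to-end SNR as the minimum of the two hops and deliberately omits the energy-harvesting power split analyzed in the paper. PRS selects the relay with the best first hop only, while ORS selects on the end-to-end SNR.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def outage_prs_ors(n_relays=4, snr_db=10.0, rate_bps=1.0, trials=200_000):
        """Monte Carlo outage of partial (PRS) vs. optimal (ORS) relay selection
        for two-hop decode-and-forward over i.n.i.d. Rayleigh fading."""
        snr = 10.0 ** (snr_db / 10.0)
        gamma_th = 2.0 ** (2.0 * rate_bps) - 1.0   # half-duplex, two time slots
        # i.n.i.d.: each relay has its own mean gain on each hop.
        mean_sr = np.linspace(0.5, 2.0, n_relays)  # source -> relay
        mean_rd = np.linspace(2.0, 0.5, n_relays)  # relay -> destination
        g_sr = snr * rng.exponential(mean_sr, size=(trials, n_relays))
        g_rd = snr * rng.exponential(mean_rd, size=(trials, n_relays))
        e2e = np.minimum(g_sr, g_rd)               # DF end-to-end SNR per relay
        prs = e2e[np.arange(trials), np.argmax(g_sr, axis=1)]  # first hop only
        ors = e2e.max(axis=1)                      # full end-to-end selection
        return float(np.mean(prs < gamma_th)), float(np.mean(ors < gamma_th))

    p_prs, p_ors = outage_prs_ors()
    print('PRS outage: %.4f  ORS outage: %.4f' % (p_prs, p_ors))
    ```

    Because ORS uses full end-to-end channel knowledge, its outage curve lower-bounds that of PRS; the paper's closed-form approximations quantify the same ordering with the harvesting power split included.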

  6. Minimally invasive video-assisted versus minimally invasive nonendoscopic thyroidectomy.

    PubMed

    Fík, Zdeněk; Astl, Jaromír; Zábrodský, Michal; Lukeš, Petr; Merunka, Ilja; Betka, Jan; Chovanec, Martin

    2014-01-01

    Minimally invasive video-assisted thyroidectomy (MIVAT) and minimally invasive nonendoscopic thyroidectomy (MINET) represent well-accepted and reproducible techniques developed with the main goals of improving cosmetic outcome, accelerating healing, and increasing patient comfort following thyroid surgery. Between 2007 and 2011, a prospective nonrandomized study of patients undergoing minimally invasive thyroid surgery was performed to compare the advantages and disadvantages of the two techniques. There were no significant differences in the length of the incision needed to perform the surgical procedures. Mean duration of hemithyroidectomy was comparable in both groups, but total thyroidectomy was more time consuming when performed by MIVAT. More patients underwent MIVAT procedures without active drainage in the postoperative course, and we also observed a trend toward less pain in the same group. This was paralleled by a statistically significant decrease in the administration of both opiate and nonopiate analgesics. We encountered two cases of recurrent laryngeal nerve palsy, in the MIVAT group only. MIVAT and MINET represent a safe and feasible alternative to conventional thyroid surgery in selected cases, and this prospective study has shown minimal differences between the two techniques. PMID:24800227

  7. The New Minimal Standard Model

    SciTech Connect

    Davoudiasl, Hooman; Kitano, Ryuichiro; Li, Tianjun; Murayama, Hitoshi

    2005-01-13

    We construct the New Minimal Standard Model that incorporates the new discoveries of physics beyond the Minimal Standard Model (MSM): Dark Energy, non-baryonic Dark Matter, neutrino masses, as well as baryon asymmetry and cosmic inflation, adopting the principle of minimal particle content and the most general renormalizable Lagrangian. We base the model purely on empirical facts rather than aesthetics. We need only six new degrees of freedom beyond the MSM. It is free from excessive flavor-changing effects, CP violation, too-rapid proton decay, problems with electroweak precision data, and unwanted cosmological relics. Any model of physics beyond the MSM should be measured against the phenomenological success of this model.

  8. Minimally invasive aortic valve surgery.

    PubMed

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-09-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients. This treatment has demonstrably proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, either with a partial upper sternotomy or a right anterior minithoracotomy provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  9. LLNL Waste Minimization Program Plan

    SciTech Connect

    Not Available

    1990-02-14

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The Waste Minimization Policy field has undergone continuous change since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions were made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs.

  10. WASTE MINIMIZATION OPPORTUNITY ASSESSMENT MANUAL

    EPA Science Inventory

    Waste minimization (WM) is a policy specifically mandated by the U.S. Congress in the 1984 Hazardous and Solid Wastes Amendments to the Resource Conservation and Recovery Act (RCRA). The RCRA regulations require that generators of hazardous waste have a program in place to reduce...

  11. Assembly of a minimal protocell

    NASA Astrophysics Data System (ADS)

    Rasmussen, Steen

    2007-03-01

    What is minimal life, how can we make it, and how can it be useful? We present experimental and computational results towards bridging nonliving and living matter, which results in life that is different and much simpler than contemporary life. A simple yet tightly coupled catalytic cooperation between genes, metabolism, and container forms the design underpinnings of our protocell, which is a minimal self-replicating molecular machine. Experimentally, we have recently demonstrated this coupling by having an informational molecule (8-oxoguanine) catalytically control the light driven metabolic (Ru-bpy based) production of container materials (fatty acids). This is a significant milestone towards assembling a minimal self-replicating molecular machine. Recent theoretical investigations indicate that coordinated exponential component growth should naturally emerge as a result from such a catalytic coupling between the main protocellular components. A 3-D dissipative particle simulation (DPD) study of the full protocell life-cycle exposes a number of anticipated systemic issues associated with the remaining experimental challenges for the implementation of the minimal protocell. Finally we outline how more general self-replicating materials could be useful.

  12. A Defense of Semantic Minimalism

    ERIC Educational Resources Information Center

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  13. Minimally invasive aortic valve surgery

    PubMed Central

    Castrovinci, Sebastiano; Emmanuel, Sam; Moscarelli, Marco; Murana, Giacomo; Caccamo, Giuseppa; Bertolino, Emanuela Clara; Nasso, Giuseppe; Speziale, Giuseppe; Fattouch, Khalil

    2016-01-01

    Aortic valve disease is a prevalent disorder that affects approximately 2% of the general adult population. Surgical aortic valve replacement is the gold standard treatment for symptomatic patients. This treatment has demonstrably proven to be both safe and effective. Over the last few decades, in an attempt to reduce surgical trauma, different minimally invasive approaches for aortic valve replacement have been developed and are now being increasingly utilized. A narrative review of the literature was carried out to describe the surgical techniques for minimally invasive aortic valve surgery and report the results from different experienced centers. Minimally invasive aortic valve replacement is associated with low perioperative morbidity, mortality and a low conversion rate to full sternotomy. Long-term survival appears to be at least comparable to that reported for conventional full sternotomy. Minimally invasive aortic valve surgery, either with a partial upper sternotomy or a right anterior minithoracotomy provides early- and long-term benefits. Given these benefits, it may be considered the standard of care for isolated aortic valve disease. PMID:27582764

  14. Toward a Minimal Artificial Axon.

    PubMed

    Ariyaratne, Amila; Zocchi, Giovanni

    2016-07-01

    The electrophysiology of action potentials is usually studied in neurons, through relatively demanding experiments which are difficult to scale up to a defined network. Here we pursue instead the minimal artificial system based on the essential biological components-ion channels and lipid bilayers-where action potentials can be generated, propagated, and eventually networked. The fundamental unit is the classic supported bilayer: a planar bilayer patch with embedded ion channels in a fluidic environment where an ionic gradient is imposed across the bilayer. Two such units electrically connected form the basic building block for a network. The system is minimal in that we demonstrate that one kind of ion channel and correspondingly a gradient of only one ionic species is sufficient to generate an excitable system which shows amplification and threshold behavior. PMID:27049652

  15. Minimal Doubling and Point Splitting

    SciTech Connect

    Creutz, M.

    2010-06-14

    Minimally-doubled chiral fermions have the unusual property of a single local field creating two fermionic species. Spreading the field over hypercubes allows construction of combinations that isolate specific modes. Combining these fields into bilinears produces meson fields of specific quantum numbers. Minimally-doubled fermion actions present the possibility of fast simulations while maintaining one exact chiral symmetry. They do, however, introduce some peculiar aspects. An explicit breaking of hyper-cubic symmetry allows additional counter-terms to appear in the renormalization. While a single field creates two different species, spreading this field over nearby sites allows isolation of specific states and the construction of physical meson operators. Finally, lattice artifacts break isospin and give two of the three pseudoscalar mesons an additional contribution to their mass. Depending on the sign of this mass splitting, one can either have a traditional Goldstone pseudoscalar meson or a parity breaking Aoki-like phase.

  16. Anaesthesia for minimally invasive surgery

    PubMed Central

    Dec, Marta

    2015-01-01

    Minimally invasive surgery (MIS) is rising in popularity. It offers well-known benefits to the patient. However, restricted access to the surgical site and gas insufflation into the body cavities may result in severe complications. From the anaesthetic point of view MIS poses unique challenges associated with creation of pneumoperitoneum, carbon dioxide absorption, specific positioning and monitoring a patient to whom the anaesthetist has often restricted access, in a poorly lit environment. Moreover, with refinement of surgical procedures and growing experience the anaesthetist is presented with patients from high-risk groups (obese, elderly, with advanced cardiac and respiratory disease) who once were deemed unsuitable for the laparoscopic technique. Anaesthetic management is aimed at getting the patient safely through the procedure, minimizing the specific risks arising from laparoscopy and the patient's coexisting medical problems, ensuring quick recovery and a relatively pain-free postoperative course with early return to normal function. PMID:26865885

  17. Minimal universal quantum heat machine.

    PubMed

    Gelbwaser-Klimovsky, D; Alicki, R; Kurizki, G

    2013-01-01

    In traditional thermodynamics the Carnot cycle yields the ideal performance bound of heat engines and refrigerators. We propose and analyze a minimal model of a heat machine that can play a similar role in quantum regimes. The minimal model consists of a single two-level system with periodically modulated energy splitting that is permanently, weakly, coupled to two spectrally separated heat baths at different temperatures. The equation of motion allows us to compute the stationary power and heat currents in the machine consistent with the second law of thermodynamics. This dual-purpose machine can act as either an engine or a refrigerator (heat pump) depending on the modulation rate. In both modes of operation, the maximal Carnot efficiency is reached at zero power. We study the conditions for finite-time optimal performance for several variants of the model. Possible realizations of the model are discussed. PMID:23410316

  18. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)]. PMID:26382367
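
    A minimal numerical illustration of the principle, using a classical harmonic oscillator with a frequency quench rather than the paper's Landau-Zener model (all parameters assumed): both a sudden and an adiabatic protocol satisfy the Jarzynski equality, with ⟨e^(-βW)⟩ = e^(-βΔF) = ω0/ω1 for the classical oscillator, but the adiabatic protocol yields visibly smaller fluctuations in e^(-βW).

      import numpy as np

      # Classical harmonic oscillator (m = 1) with frequency quench w0 -> w1.
      # Both protocols satisfy Jarzynski; the adiabatic one minimizes Var(e^{-bW}).
      rng = np.random.default_rng(1)
      beta, w0, w1, n = 1.0, 1.0, 2.0, 1_000_000

      # Sample the initial canonical ensemble.
      x = rng.normal(0.0, 1.0 / np.sqrt(beta * w0**2), n)
      p = rng.normal(0.0, 1.0 / np.sqrt(beta), n)
      E0 = 0.5 * p**2 + 0.5 * w0**2 * x**2

      W_sudden = 0.5 * (w1**2 - w0**2) * x**2      # instantaneous quench
      W_adiabatic = (w1 / w0 - 1.0) * E0           # action-conserving slow ramp

      target = w0 / w1                             # e^{-beta*dF} for classical HO
      for name, W in [("sudden", W_sudden), ("adiabatic", W_adiabatic)]:
          y = np.exp(-beta * W)
          print(f"{name:9s} <e^(-bW)> = {y.mean():.4f} (target {target:.4f}), "
                f"Var = {y.var():.4f}")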

  19. Principle of minimal work fluctuations

    NASA Astrophysics Data System (ADS)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^(-βW)⟩ = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014), 10.1103/PhysRevE.90.052132].

  20. Minimizing liability during internal investigations.

    PubMed

    Morris, Cole

    2010-01-01

    Today's security professional must appreciate the potential landmines in any investigative effort and work collaboratively with others to minimize liability risks, the author points out. In this article he examines six civil torts that commonly arise from unprofessionally planned or poorly executed internal investigations: defamation, false imprisonment, intentional infliction of emotional distress, assault and battery, invasion of privacy, and malicious prosecution and abuse of process. PMID:20873494

  1. Risk minimization through portfolio replication

    NASA Astrophysics Data System (ADS)

    Ciliberti, S.; Mézard, M.

    2007-05-01

    We use a replica approach to deal with portfolio optimization problems. A given risk measure is minimized using empirical estimates of asset value correlations. We study the phase transition which happens when the time series is too short with respect to the size of the portfolio. We also study the noise sensitivity of portfolio allocation when this transition is approached. We consider explicitly the cases where the absolute deviation and the conditional value-at-risk are chosen as the risk measure. We show how the replica method can be used to study a wide range of risk measures, and deal with various types of time series correlations, including realistic ones with volatility clustering.
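
    For the conditional value-at-risk case mentioned above, the minimization can be written as the standard Rockafellar-Uryasev linear program. The sketch below (simulated Gaussian returns and illustrative parameters, not the paper's replica calculation) solves it with scipy:

      import numpy as np
      from scipy.optimize import linprog

      # Minimize CVaR of a long-only portfolio on T return scenarios.
      rng = np.random.default_rng(2)
      T, N, q = 500, 10, 0.95                    # scenarios, assets, CVaR level
      r = rng.normal(0.001, 0.02, size=(T, N))   # simulated asset returns

      # Variables z = (w_1..w_N, alpha, u_1..u_T);
      # minimize alpha + sum(u) / ((1 - q) * T).
      c = np.concatenate([np.zeros(N), [1.0], np.full(T, 1.0 / ((1 - q) * T))])
      # Scenario constraints: -r_t . w - alpha - u_t <= 0.
      A_ub = np.hstack([-r, -np.ones((T, 1)), -np.eye(T)])
      b_ub = np.zeros(T)
      # Budget constraint: weights sum to 1.
      A_eq = np.concatenate([np.ones(N), [0.0], np.zeros(T)])[None, :]
      b_eq = np.array([1.0])
      bounds = [(0, 1)] * N + [(None, None)] + [(0, None)] * T

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
      print("optimal CVaR:", res.fun)
      print("weights:", np.round(res.x[:N], 3))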

  2. Diagnosis of minimal hepatic encephalopathy.

    PubMed

    Weissenborn, Karin

    2015-03-01

    Minimal hepatic encephalopathy (mHE) has a significant impact upon a liver patient's daily living and health-related quality of life. Therefore a majority of clinicians agree that mHE should be diagnosed and treated. The optimal means for diagnosing mHE, however, is controversial. This paper describes the most frequently used methods - EEG, critical flicker frequency, the Continuous Reaction Time Test, the Inhibitory Control Test, computerized test batteries such as the Cognitive Drug Research test battery, the psychometric hepatic encephalopathy score (PHES) and the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) - and their pros and cons. PMID:26041959

  3. About the ZOOM minimization package

    SciTech Connect

    Fischler, M.; Sachs, D.; /Fermilab

    2004-11-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete.

  4. Prepulse minimization in KALI-5000.

    PubMed

    Kumar, D Durga Praveen; Mitra, S; Senthil, K; Sharma, Vishnu K; Singh, S K; Roy, A; Sharma, Archana; Nagesh, K V; Chakravarthy, D P

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given. PMID:19655979

  5. Prepulse minimization in KALI-5000

    NASA Astrophysics Data System (ADS)

    Kumar, D. Durga Praveen; Mitra, S.; Senthil, K.; Sharma, Vishnu K.; Singh, S. K.; Roy, A.; Sharma, Archana; Nagesh, K. V.; Chakravarthy, D. P.

    2009-07-01

    A pulse power system (1 MV, 50 kA, and 100 ns) based on Marx generator and Blumlein pulse forming line has been built for generating high power microwaves. The Blumlein configuration poses a prepulse problem and hence the diode gap had to be increased to match the diode impedance to the Blumlein impedance during the main pulse. A simple method to eliminate prepulse voltage using a vacuum sparkgap and a resistor is given. Another fundamental approach of increasing the inductance of Marx generator to minimize the prepulse voltage is also presented. Experimental results for both of these configurations are given.

  6. Minimizing medical litigation, part 2.

    PubMed

    Harold, Tan Keng Boon

    2006-01-01

    Provider-patient disputes are inevitable in the healthcare sector. Healthcare providers and regulators should recognize this and plan opportunities to employ alternative dispute resolution (ADR) as early as possible in the care delivery process. Negotiation is often the main dispute resolution method used by local healthcare providers, failing which litigation usually follows. The role of mediation in resolving malpractice disputes has been minimal. Healthcare providers, administrators, and regulators should therefore look toward a post-event communication-cum-mediation framework as the key national strategy for resolving malpractice disputes. PMID:16711089

  7. The minimal scenario of leptogenesis

    NASA Astrophysics Data System (ADS)

    Blanchet, Steve; Di Bari, Pasquale

    2012-12-01

    We review the main features and results of thermal leptogenesis within the type I seesaw mechanism, the minimal extension of the Standard Model explaining neutrino masses and mixing. After presenting the simplest approach, the vanilla scenario, we discuss various important developments of recent years, such as the inclusion of lepton and heavy neutrino flavour effects, a description beyond a hierarchical heavy neutrino mass spectrum and an improved kinetic description within the density matrix and the closed-time-path formalisms. We also discuss how leptogenesis can ultimately represent an important phenomenological tool to test the seesaw mechanism and the underlying model of new physics.

  8. Minimizing travel claims cost with minimal-spanning tree model

    NASA Astrophysics Data System (ADS)

    Jamalluddin, Mohd Helmi; Jaafar, Mohd Azrul; Amran, Mohd Iskandar; Ainul, Mohd Sharizal; Hamid, Aqmar; Mansor, Zafirah Mohd; Nopiah, Zulkifli Mohd

    2014-06-01

    Official travel necessitates substantial expenditure, as has been shown by the National Audit Department (NAD). Every year an auditing process covering official travel claims is carried out throughout the country. This study focuses on the use of the minimal-spanning-tree model to determine the shortest network and thereby minimize the cost of the NAD's official travel claims. The objective is to study the possibility of running a network based at the Kluang District Health Office to eight rural clinics in Johor state, using spanning-tree applications to optimize travelling distances, and to make recommendations to the senior management of the Audit Department on analyzing travelling details before an audit is conducted. The results of this study reveal savings of up to 47.4% of the original claims over the travel distances involved.
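
    A minimal sketch of the underlying computation: Prim's algorithm on a small distance graph connecting a district office to rural clinics. The distances are hypothetical placeholders, not the NAD data.

      import heapq

      # Hypothetical pairwise road distances (km) between an office and clinics.
      dist = {
          ("Office", "Clinic1"): 12, ("Office", "Clinic2"): 19,
          ("Clinic1", "Clinic2"): 7, ("Clinic1", "Clinic3"): 15,
          ("Clinic2", "Clinic3"): 9, ("Clinic2", "Clinic4"): 21,
          ("Clinic3", "Clinic4"): 6,
      }

      # Build an undirected adjacency list.
      adj = {}
      for (u, v), d in dist.items():
          adj.setdefault(u, []).append((d, v))
          adj.setdefault(v, []).append((d, u))

      def prim(start):
          """Prim's algorithm: returns MST edges and their total length."""
          seen, edges, total = {start}, [], 0
          heap = list(adj[start])
          heapq.heapify(heap)
          while heap and len(seen) < len(adj):
              d, v = heapq.heappop(heap)
              if v in seen:
                  continue            # stale entry; node already connected
              seen.add(v)
              edges.append((v, d))
              total += d
              for nd, nv in adj[v]:
                  if nv not in seen:
                      heapq.heappush(heap, (nd, nv))
          return edges, total

      edges, total = prim("Office")
      print(edges, "total km:", total)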

  9. Annual Waste Minimization Summary Report

    SciTech Connect

    Alfred J. Karns

    2007-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U. S. Department of Energy (DOE) National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during CY06. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (No. NEV HW0021) and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the DOE, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by the NNSA/NSO and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO.

  10. Less minimal supersymmetric standard model

    SciTech Connect

    de Gouvea, Andre; Friedland, Alexander; Murayama, Hitoshi

    1998-03-28

    Most of the phenomenological studies of supersymmetry have been carried out using the so-called minimal supergravity scenario, where one assumes a universal scalar mass, gaugino mass, and trilinear coupling at M{sub GUT}. Even though this is a useful simplifying assumption for phenomenological analyses, it is rather too restrictive to accommodate a large variety of phenomenological possibilities. It predicts, among other things, that the lightest supersymmetric particle (LSP) is an almost pure B-ino, and that the {mu}-parameter is larger than the masses of the SU(2){sub L} and U(1){sub Y} gauginos. We extend the minimal supergravity framework by introducing one extra parameter: the Fayet-Iliopoulos D-term for the hypercharge U(1), D{sub Y}. Allowing for this extra parameter, we find a much more diverse phenomenology, where the LSP is {tilde {nu}}{sub {tau}}, {tilde {tau}} or a neutralino with a large higgsino content. We discuss the relevance of the different possibilities to collider signatures. The same type of extension can be done to models with the gauge mediation of supersymmetry breaking. We argue that it is not wise to impose cosmological constraints on the parameter space.

  11. Symmetry breaking for drag minimization

    NASA Astrophysics Data System (ADS)

    Roper, Marcus; Squires, Todd M.; Brenner, Michael P.

    2005-11-01

    For locomotion at high Reynolds numbers drag minimization favors fore-aft asymmetric slender shapes with blunt noses and sharp trailing edges. On the other hand, in an inertialess fluid the drag experienced by a body is independent of whether it travels forward or backward through the fluid, so there is no advantage to having a single preferred swimming direction. In fact numerically determined minimum drag shapes are known to exhibit almost no fore-aft asymmetry even at moderate Re. We show that asymmetry persists, albeit extremely weakly, down to vanishingly small Re, scaling asymptotically as Re^3. The need to minimize drag to maximize speed for a given propulsive capacity gives one possible mechanism for the increasing asymmetry in the body plans seen in nature, as organisms increase in size and swimming speed from bacteria like E. coli up to pursuit predator fish such as tuna. If it is the dominant mechanism, then this signature scaling will be observed in the shapes of motile micro-organisms.

  12. Next-to-minimal SOFTSUSY

    NASA Astrophysics Data System (ADS)

    Allanach, B. C.; Athron, P.; Tunstall, Lewis C.; Voigt, A.; Williams, A. G.

    2014-09-01

    We describe an extension to the SOFTSUSY program that provides for the calculation of the sparticle spectrum in the Next-to-Minimal Supersymmetric Standard Model (NMSSM), where a chiral superfield that is a singlet of the Standard Model gauge group is added to the Minimal Supersymmetric Standard Model (MSSM) fields. Often, a Z3 symmetry is imposed upon the model. SOFTSUSY can calculate the spectrum in this case, as well as in the case where general Z3-violating terms are added to the soft supersymmetry breaking terms and the superpotential. The user provides a theoretical boundary condition for the couplings and mass terms of the singlet. Radiative electroweak symmetry breaking data along with electroweak and CKM matrix data are used as weak-scale boundary conditions. The renormalisation group equations are solved numerically between the weak scale and a high energy scale using a nested iterative algorithm. This paper serves as a manual to the NMSSM mode of the program, detailing the approximations and conventions used.
    Catalogue identifier: ADPM_v4_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADPM_v4_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 154886
    No. of bytes in distributed program, including test data, etc.: 1870890
    Distribution format: tar.gz
    Programming language: C++, Fortran
    Computer: Personal computer
    Operating system: Tested on Linux 3.x
    Word size: 64 bits
    Classification: 11.1, 11.6
    Does the new version supersede the previous version?: Yes
    Catalogue identifier of previous version: ADPM_v3_0
    Journal reference of previous version: Comput. Phys. Comm. 183 (2012) 785
    Nature of problem: Calculating supersymmetric particle spectrum and mixing parameters in the next-to-minimal supersymmetric standard model. The solution to the

  13. Proxies to GNSS signal outages from irregularity dynamics around the northern crest of the Equatorial Ionization Anomaly

    NASA Astrophysics Data System (ADS)

    Das, Tanmay; Paul, Ashik

    proportion of local post-sunset to midnight hours during equinoctial months. After the unusually prolonged bottom of the solar cycle spanning 2006-2010, scintillation activity picked up dramatically during the equinoxes of 2011. While the autumnal equinox of 2010 witnessed only a few cases of intense VHF scintillations, 47 cases of intense (S4 > 0.6) scintillations were recorded at VHF during the vernal equinox of 2011, along with 38 cases of intense (S4 > 0.6) L-band scintillations. Dual-frequency GPS data recorded at Calcutta and at another station, Siliguri (26.72°N, 88.39°E geographic; 40°N magnetic dip), situated beyond the northern crest of the EIA, have been used. For the analysis, GPS satellites observed from Calcutta with 350-km subionospheric points lying within the latitude swath 20.12°-22.12°N and longitude swath 86.25°-88.25°E were selected around that of FSC. It was found that periods of intense L-band scintillations corresponded to low decorrelation times at VHF and abrupt fluctuations in PDOP, indicating significant compromise of navigational accuracy. A causal understanding of GNSS tracking errors is being attempted using VHF measurements, and efforts are being made to identify a threshold decorrelation time which results in GNSS signal outages exceeding the specifications of the International Civil Aviation Organization (ICAO).
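
    The S4 index used above to classify intense events (S4 > 0.6) is the normalized standard deviation of received signal intensity over a short interval (commonly 60 s). A minimal sketch with synthetic intensity samples:

      import numpy as np

      # S4 = sqrt((<I^2> - <I>^2) / <I>^2), computed over an averaging interval.
      def s4_index(intensity):
          i = np.asarray(intensity, dtype=float)
          return np.sqrt((np.mean(i**2) - np.mean(i)**2) / np.mean(i)**2)

      rng = np.random.default_rng(3)
      quiet = rng.gamma(shape=50.0, scale=1.0, size=3000)    # weak fluctuations
      disturbed = rng.exponential(1.0, size=3000)            # Rayleigh-fading-like
      print("quiet S4:", round(s4_index(quiet), 2))          # well below 0.6
      print("disturbed S4:", round(s4_index(disturbed), 2))  # saturates near 1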

  14. Update on designing and building minimal cells

    PubMed Central

    Jewett, Michael C.; Forster, Anthony C.

    2010-01-01

    Minimal cells comprise only the genes and biomolecular machinery necessary for basic life. Synthesizing minimal and minimized cells will improve understanding of core biology, enhance development of biotechnology strains of bacteria, and enable evolutionary optimization of natural and unnatural biopolymers. Design and construction of minimal cells is proceeding in two different directions: “top-down” reduction of bacterial genomes in vivo and “bottom-up” integration of DNA/RNA/protein/membrane syntheses in vitro. Major progress in the last 5 years has occurred in synthetic genomics, minimization of the Escherichia coli genome, sequencing of minimal bacterial endosymbionts, identification of essential genes, and integration of biochemical systems. PMID:20638265

  15. [MINIMALLY INVASIVE AORTIC VALVE REPLACEMENT].

    PubMed

    Tabata, Minoru

    2016-03-01

    Minimally invasive aortic valve replacement (MIAVR) is defined as aortic valve replacement avoiding full sternotomy. Common approaches include a partial sternotomy, a right thoracotomy, and a parasternal approach. MIAVR has been shown to have advantages over conventional AVR, such as shorter length of stay, less blood transfusion, and better cosmesis. However, it is also known to have disadvantages, such as longer cardiopulmonary bypass and aortic cross-clamp times and potential complications related to peripheral cannulation. Appropriate patient selection is very important. Since the procedure is more complex than conventional AVR, more intensive teamwork in the operating room is essential. Additionally, a team approach during postoperative management is critical to maximize the benefits of MIAVR. PMID:27295772

  16. Non-minimal Inflationary Attractors

    SciTech Connect

    Kallosh, Renata; Linde, Andrei E-mail: alinde@stanford.edu

    2013-10-01

    Recently we identified a new class of (super)conformally invariant theories which allow inflation even if the scalar potential is very steep in terms of the original conformal variables. Observational predictions of a broad class of such theories are nearly model-independent. In this paper we consider generalized versions of these models where the inflaton has a non-minimal coupling to gravity with a negative parameter ξ different from its conformal value -1/6. We show that these models exhibit attractor behavior. With even a slight increase of |ξ| from |ξ| = 0, predictions of these models for n{sub s} and r rapidly converge to their universal model-independent values corresponding to conformal coupling ξ = −1/6. These values of n{sub s} and r practically coincide with the corresponding values in the limit ξ → −∞.

  17. Strategies to Minimize Antibiotic Resistance

    PubMed Central

    Lee, Chang-Ro; Cho, Ill Hwan; Jeong, Byeong Chul; Lee, Sang Hee

    2013-01-01

    Antibiotic resistance can be reduced by using antibiotics prudently based on guidelines of antimicrobial stewardship programs (ASPs) and various data such as pharmacokinetic (PK) and pharmacodynamic (PD) properties of antibiotics, diagnostic testing, antimicrobial susceptibility testing (AST), clinical response, and effects on the microbiota, as well as by the development of new antibiotics. The controlled use of antibiotics in food animals is another cornerstone among efforts to reduce antibiotic resistance. All major resistance-control strategies recommend education for patients, children (e.g., through schools and day care), the public, and relevant healthcare professionals (e.g., primary-care physicians, pharmacists, and medical students) regarding the unique features of bacterial infections and antibiotics, prudent antibiotic prescribing as a positive construct, and personal hygiene (e.g., handwashing). The problem of antibiotic resistance can be minimized only by concerted efforts of all members of society to ensure the continued efficacy of antibiotics. PMID:24036486

  18. Minimizing the pain on burnout

    SciTech Connect

    Billings, A.

    1985-03-01

    An investment in an oil and gas shelter warrants an additional investment to fund tax liability on burnout. A relatively liquid and low-risk investment is preferable so as to assure timely satisfaction of tax liability when burnout occurs. If an investor decides to allow the shelter to die a timely death, the investment funds could be used to fund annual tax liability. In situations where a leak develops, the fund will once again be invaluable. When a leak or burnout occurs, investors may be able to do no more than minimize their maximum losses. Relief of debt on most dispositions will be deemed receipt of cash, thus triggering gains. Ordinary income will result by operation of Code Sections 1245, 1250, and 1254. Bankruptcy or a charitable contribution will grant limited reprieve from tax losses; however, economic losses will still result.

  19. Minimal unitary (covariant) scattering theory

    SciTech Connect

    Lindesay, J.V.; Markevich, A.

    1983-06-01

    In the minimal three particle equations developed by Lindesay, the two body input amplitude was an on-shell relativistic generalization of the non-relativistic scattering model characterized by a single mass parameter μ which in the two body (m + m) system looks like an s-channel bound state (μ < 2m) or virtual state (μ > 2m). Using this driving term in covariant Faddeev equations generates a rich covariant and unitary three particle dynamics. However, the simplest way of writing the relativistic generalization of the Faddeev equations can take the on-shell Mandelstam parameter s = 4(q² + m²), in terms of which the two particle input is expressed, to negative values in the range of integration required by the dynamics. This problem was met in the original treatment by multiplying the two particle input amplitude by Θ(s). This paper provides what we hope to be a more direct way of meeting the problem.

  20. Minimally packed phases in holography

    NASA Astrophysics Data System (ADS)

    Donos, Aristomenis; Gauntlett, Jerome P.

    2016-03-01

    We numerically construct asymptotically AdS black brane solutions of D = 4 Einstein-Maxwell theory coupled to a pseudoscalar. The solutions are holographically dual to d = 3 CFTs at finite chemical potential and in a constant magnetic field, which spontaneously break translation invariance leading to the spontaneous formation of abelian and momentum magnetisation currents flowing around the plaquettes of a periodic Bravais lattice. We analyse the three-dimensional moduli space of lattice solutions, which are generically oblique, and show, for a specific value of the magnetic field, that the free energy is minimised by the triangular lattice, associated with minimal packing of circles in the plane. We show that the average stress tensor for the thermodynamically preferred phase is that of a perfect fluid and that this result applies more generally to spontaneously generated periodic phases. The triangular structure persists at low temperatures indicating the existence of novel crystalline ground states.

  1. The minimal composite Higgs model

    NASA Astrophysics Data System (ADS)

    Agashe, Kaustubh; Contino, Roberto; Pomarol, Alex

    2005-07-01

    We study the idea of a composite Higgs in the framework of a five-dimensional AdS theory. We present the minimal model of the Higgs as a pseudo-Goldstone boson in which electroweak symmetry is broken dynamically via top loop effects, all flavour problems are solved, and contributions to electroweak precision observables are below experimental bounds. Since the 5D theory is weakly coupled, we are able to fully determine the Higgs potential and other physical quantities. The lightest resonances are expected to have a mass around 2 TeV and should be discovered at the LHC. The top sector is mostly composite and deviations from Standard Model couplings are expected.

  2. Minimally invasive posterior hamstring harvest.

    PubMed

    Wilson, Trent J; Lubowitz, James H

    2013-01-01

    Autogenous hamstring harvesting for knee ligament reconstruction is a well-established standard. Minimally invasive posterior hamstring harvest is a simple, efficient, reproducible technique for harvest of the semitendinosus or gracilis tendon or both medial hamstring tendons. A 2- to 3-cm longitudinal incision from the popliteal crease proximally, in line with the semitendinosus tendon, is sufficient. The deep fascia is bluntly penetrated, and the tendon or tendons are identified. Adhesions are dissected. Then, an open tendon stripper is used to release the tendon or tendons proximally; a closed, sharp tendon stripper is used to release the tendon or tendons from the pes. Layered, absorbable skin closure is performed, and the skin is covered with a skin sealant, bolster dressing, and plastic adhesive bandage for 2 weeks. PMID:24266003

  3. Minimally Invasive Spigelian Hernia Repair

    PubMed Central

    Baucom, Catherine; Nguyen, Quan D.; Hidalgo, Marco

    2009-01-01

    Introduction: Spigelian hernia is an uncommon ventral hernia characterized by a defect in the linea semilunaris. Repair of spigelian hernia has traditionally been accomplished via an open transverse incision and primary repair. The purpose of this article is to present 2 case reports of incarcerated spigelian hernia that were successfully repaired laparoscopically using Gortex mesh and to present a review of the literature regarding laparoscopic repair of spigelian hernias. Methods: Retrospective chart review and Medline literature search. Results: Two patients underwent laparoscopic mesh repair of incarcerated spigelian hernias. Both were started on a regular diet on postoperative day 1 and discharged on postoperative days 2 and 3. One patient developed a seroma that resolved without intervention. There was complete resolution of preoperative symptoms at the 12-month follow-up. Conclusion: Minimally invasive repair of spigelian hernias is an alternative to the traditional open surgical technique. Further studies are needed to directly compare the open and the laparoscopic repair. PMID:19660230

  4. A minimal little Higgs model

    NASA Astrophysics Data System (ADS)

    Barceló, Roberto; Masip, Manuel

    2008-11-01

    We discuss a little Higgs scenario that introduces below the TeV scale just the two minimal ingredients of these models, a vectorlike T quark and a singlet component (implying anomalous couplings) in the Higgs field, together with a pseudoscalar singlet η. In the model, which is a variation of Schmaltz’s simplest little Higgs model, all the extra vector bosons are much heavier than the T quark. In the Yukawa sector the global symmetry is approximate, implying a single large coupling per flavor, whereas in the scalar sector it is only broken at the loop level. We obtain the one-loop effective potential and show that it provides acceptable masses for the Higgs h and for the singlet η with no need for an extra μ term. We find that mη can be larger than mh/2, which would forbid the (otherwise dominant) decay mode h→ηη.

  5. Natural supersymmetric minimal dark matter

    NASA Astrophysics Data System (ADS)

    Fabbrichesi, Marco; Urbano, Alfredo

    2016-03-01

    We show how the Higgs boson mass is protected from the potentially large corrections due to the introduction of minimal dark matter if the new physics sector is made supersymmetric. The fermionic dark matter candidate (a 5-plet of S U (2 )L) is accompanied by a scalar state. The weak gauge sector is made supersymmetric, and the Higgs boson is embedded in a supersymmetric multiplet. The remaining standard model states are nonsupersymmetric. Nonvanishing corrections to the Higgs boson mass only appear at three-loop level, and the model is natural for dark matter masses up to 15 TeV—a value larger than the one required by the cosmological relic density. The construction presented stands as an example of a general approach to naturalness that solves the little hierarchy problem which arises when new physics is added beyond the standard model at an energy scale around 10 TeV.

  6. Chemical basis for minimal cognition.

    PubMed

    Hanczyc, Martin M; Ikegami, Takashi

    2010-01-01

    We have developed a simple chemical system capable of self-movement in order to study the physicochemical origins of movement. We propose how this system may be useful in the study of minimal perception and cognition. The system consists simply of an oil droplet in an aqueous environment. A chemical reaction within the oil droplet induces an instability, the symmetry of the oil droplet breaks, and the droplet begins to move through the aqueous phase. The complement of physical phenomena that is then generated indicates the presence of feedback cycles that, as will be argued, form the basis for self-regulation, homeostasis, and perhaps an extended form of autopoiesis. We discuss the result that simple chemical systems are capable of sensory-motor coupling and possess a homeodynamic state from which cognitive processes may emerge. PMID:20586578

  7. Minimally invasive radioguided parathyroidectomy (MIRP).

    PubMed

    Goldstein, R E; Martin, W H; Richards, K

    2003-06-01

    The technique of parathyroidectomy has traditionally involved a bilateral exploration of the neck with the intent of visualizing 4 parathyroid glands and resecting pathologically enlarged glands. Parathyroid scanning using technetium-99m sestamibi has evolved and can now localize 80% to 90% of parathyroid adenomas. The technique of minimally invasive radioguided parathyroidectomy (MIRP) is a surgical option for most patients with primary hyperparathyroidism and a positive preoperative parathyroid scan. The technique makes use of a hand-held gamma probe that is used intraoperatively to guide the dissection in a highly directed manner with the procedure often performed under local anesthesia. The technique results in excellent cure rates while allowing most patients to leave the hospital within a few hours after the completion of the procedure. Current data also suggest the procedure can decrease hospital charges by approximately 50%. This technique may significantly change the management of primary hyperparathyroidism. PMID:12955045

  8. Waste minimization in analytical methods

    SciTech Connect

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-05-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and the volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision.

  9. Minimal length uncertainty and accelerating universe

    NASA Astrophysics Data System (ADS)

    Farmany, A.; Mortazavi, S. S.

    2016-06-01

    In this paper, minimal length uncertainty is used as a constraint to solve the Friedmann equation. It is shown that, based on the minimal length uncertainty principle, the Hubble scale is decreasing, which corresponds to an accelerating universe.
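
    The record does not reproduce the derivation; assuming the usual one-parameter generalized uncertainty principle (with β here denoting the GUP parameter, and the dot a time derivative), the constraint and the equation being solved are of the form:

      % Standard GUP form and flat-space Friedmann equation (assumed forms,
      % not taken from the paper itself).
      \begin{align}
        \Delta x \, \Delta p &\geq \frac{\hbar}{2}\bigl(1 + \beta\,(\Delta p)^{2}\bigr),
        \qquad \Delta x_{\min} = \hbar\sqrt{\beta}, \\
        H^{2} &= \Bigl(\frac{\dot{a}}{a}\Bigr)^{2} = \frac{8\pi G}{3}\,\rho .
      \end{align}

    Imposing Δx ≥ ħ√β as a short-distance cutoff while solving for the Hubble rate H is the type of constraint the abstract describes.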

  10. Closed locally minimal nets on tetrahedra

    SciTech Connect

    Strelkova, Nataliya P

    2011-01-31

    Closed locally minimal networks are in a sense a generalization of closed geodesics. A complete classification is known of closed locally minimal networks on regular (and generally any equihedral) tetrahedra. In the present paper certain necessary and certain sufficient conditions are given for at least one closed locally minimal network to exist on a given non-equihedral tetrahedron. Bibliography: 6 titles.

  11. Mini-Med School Planning Guide

    ERIC Educational Resources Information Center

    National Institutes of Health, Office of Science Education, 2008

    2008-01-01

    Mini-Med Schools are public education programs now offered by more than 70 medical schools, universities, research institutions, and hospitals across the nation. There are even Mini-Med Schools in Ireland, Malta, and Canada! The program is typically a lecture series that meets once a week and provides "mini-med students" information on some of the…

  12. Against Explanatory Minimalism in Psychiatry

    PubMed Central

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell’s criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein’s Zettel. But attention to the context of Wittgenstein’s remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  13. Against Explanatory Minimalism in Psychiatry.

    PubMed

    Thornton, Tim

    2015-01-01

    The idea that psychiatry contains, in principle, a series of levels of explanation has been criticized not only as empirically false but also, by Campbell, as unintelligible because it presupposes a discredited pre-Humean view of causation. Campbell's criticism is based on an interventionist-inspired denial that mechanisms and rational connections underpin physical and mental causation, respectively, and hence underpin levels of explanation. These claims echo some superficially similar remarks in Wittgenstein's Zettel. But attention to the context of Wittgenstein's remarks suggests a reason to reject explanatory minimalism in psychiatry and reinstate a Wittgensteinian notion of levels of explanation. Only in a context broader than the one provided by interventionism is the ascription of propositional attitudes, even in the puzzling case of delusions, justified. Such a view, informed by Wittgenstein, can reconcile the idea that the ascription of mental phenomena presupposes a particular level of explanation with the rejection of an a priori claim about its connection to a neurological level of explanation. PMID:26696908

  14. Minimalism through intraoperative functional mapping.

    PubMed

    Berger, M S

    1996-01-01

    Intraoperative stimulation mapping may be used to avoid unnecessary risk to functional regions subserving language and sensori-motor pathways. Based on the data presented here, language localization is variable across the population, with certainty existing only for the inferior frontal region responsible for motor speech. Anatomical landmarks such as the anterior temporal tip for temporal lobe language sites and the posterior aspect of the lateral sphenoid wing for the frontal lobe language zones are unreliable in avoiding postoperative aphasias. Thus, individual mapping to identify essential language sites has the greatest likelihood of avoiding permanent deficits in naming, reading, and motor speech. In a similar approach, motor and sensory pathways from the cortex and underlying white matter may be reliably stimulated and mapped in both awake and asleep patients. Although these techniques require additional operative time and nominally priced equipment, the result is often gratifying, as postoperative morbidity has been greatly reduced in the process of incorporating these surgical strategies. The patient's quality of life is improved in terms of seizure control, with or without antiepileptic drugs. This avoids having to perform a second costly operative procedure, which is routinely done when extraoperative stimulation and recording is performed via subdural grids. In addition, an aggressive tumor resection at the initial operation lengthens the time to tumor recurrence and often obviates the need for a subsequent reoperation. Thus, intraoperative functional mapping may best be described as a surgical technique that results in "minimalism in the long term". PMID:9247814

  15. Minimally invasive medial hip approach.

    PubMed

    Chiron, P; Murgier, J; Cavaignac, E; Pailhé, R; Reina, N

    2014-10-01

    The medial approach to the hip via the adductors, as described by Ludloff or Ferguson, provides restricted visualization and incurs a risk of neurovascular lesion. We describe a minimally invasive medial hip approach providing broader exposure of extra- and intra-articular elements in a space free of neurovascular structures. With the lower limb in a "frog-leg" position, the skin incision follows the adductor longus for 6 cm and then the aponeurosis is incised. A sliding plane between all the adductors and the aponeurosis is easily released by blunt dissection, with no interposed neurovascular elements. This gives access to the lesser trochanter, the psoas tendon, the inferior sides of the femoral neck and head, the anterior wall of the acetabulum, and the labrum. We report a series of 56 cases with no major complications: this approach allows treatment of iliopsoas muscle lesions and resection or filling of benign tumors of the cervical region, and enables intra-articular surgery (arthrolysis, resection of osteophytes or foreign bodies, labral suture). PMID:25164350

  16. Minimal complexity control law synthesis

    NASA Technical Reports Server (NTRS)

    Bernstein, Dennis S.; Haddad, Wassim M.; Nett, Carl N.

    1989-01-01

    A paradigm for control law design for modern engineering systems is proposed: Minimize control law complexity subject to the achievement of a specified accuracy in the face of a specified level of uncertainty. Correspondingly, the overall goal is to make progress towards the development of a control law design methodology which supports this paradigm. Researchers achieve this goal by developing a general theory of optimal constrained-structure dynamic output feedback compensation, where here constrained-structure means that the dynamic-structure (e.g., dynamic order, pole locations, zero locations, etc.) of the output feedback compensation is constrained in some way. By applying this theory in an innovative fashion, where here the indicated iteration occurs over the choice of the compensator dynamic-structure, the paradigm stated above can, in principle, be realized. The optimal constrained-structure dynamic output feedback problem is formulated in general terms. An elegant method for reducing optimal constrained-structure dynamic output feedback problems to optimal static output feedback problems is then developed. This reduction procedure makes use of star products, linear fractional transformations, and linear fractional decompositions, and yields as a byproduct a complete characterization of the class of optimal constrained-structure dynamic output feedback problems which can be reduced to optimal static output feedback problems. Issues such as operational/physical constraints, operating-point variations, and processor throughput/memory limitations are considered, and it is shown how anti-windup/bumpless transfer, gain-scheduling, and digital processor implementation can be facilitated by constraining the controller dynamic-structure in an appropriate fashion.

  17. On Modelling Minimal Disease Activity

    PubMed Central

    Jackson, Christopher H.; Su, Li; Gladman, Dafna D.

    2016-01-01

    Objective: To explore methods for statistical modelling of minimal disease activity (MDA) based on data from intermittent clinic visits. Methods: The analysis was based on a 2-state model. Comparisons were made between analyses based on "complete case" data from visits at which MDA status was known and the use of hidden model methodology that incorporated information from visits at which only some MDA-defining criteria could be established. Analyses were based on an observational psoriatic arthritis cohort. Results: With data from 856 patients and 7,024 clinic visits, analysis was based on virtually all visits, although only 62.6% provided enough information to determine MDA status. Estimated mean times for an episode of MDA varied from 4.18 to 3.10 years, with the smaller estimates derived from the hidden 2-state model analysis. Over a 10-year period, the estimated expected time spent in MDA episodes of longer than 1 year was 3.90 to 4.22 years, and the probability of having such an MDA episode was estimated to be 0.85 to 0.91, with longer times and greater probabilities seen with the hidden 2-state model analysis. Conclusion: A 2-state model provides a useful framework for the analysis of MDA. Use of data from visits at which MDA status cannot be determined provides more precision, and notable differences are seen in estimated quantities related to MDA episodes based on complete case and hidden 2-state model analyses. The possibility of bias, as well as loss of precision, should be recognized when complete case analyses are used. PMID:26315478
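
    As a minimal sketch of the 2-state framework (the transition intensities below are hypothetical placeholders, not the paper's estimates), the following Python fragment computes the mean length of an MDA episode and the expected time spent in MDA over a 10-year horizon for a two-state continuous-time Markov model.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical transition intensities (per year), chosen for illustration.
q12 = 1 / 3.5   # MDA -> active disease
q21 = 1 / 2.0   # active disease -> MDA

Q = np.array([[-q12,  q12],
              [ q21, -q21]])

# In a 2-state CTMC the sojourn time in MDA is exponential with rate q12,
# so the mean MDA episode length is simply 1/q12.
print("mean MDA episode:", 1 / q12, "years")

# Expected total time in MDA over 10 years, starting in MDA: integrate
# the occupancy probability P(state = MDA at time t) = [expm(Q t)]_{0,0}.
ts = np.linspace(0.0, 10.0, 1001)
p_mda = np.array([expm(Q * t)[0, 0] for t in ts])
dt = ts[1] - ts[0]
total = dt * (p_mda.sum() - 0.5 * (p_mda[0] + p_mda[-1]))  # trapezoid rule
print("expected years in MDA over 10 years: %.2f" % total)
```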

  18. Power allocation strategies to minimize energy consumption in wireless body area networks.

    PubMed

    Kailas, Aravind

    2011-01-01

    The wide-scale deployment of wireless body area networks (WBANs) hinges on designing energy-efficient communication protocols to support reliable communication and to prolong the network lifetime. Cooperative communications, a relatively new idea in wireless communications, offers the benefits of multi-antenna systems, thereby improving link reliability and boosting energy efficiency. In this short paper, the advantages of resorting to cooperative communications for WBANs in terms of minimized energy consumption are investigated. Adopting an energy model that encompasses energy consumption in the transmitter and receiver circuits as well as the transmit energy per bit, it is seen that cooperative transmission can improve the energy efficiency of the wireless network. In particular, the problem of optimal power allocation is studied under a targeted outage probability constraint. Two power allocation strategies are considered: with and without posture state information. Using analysis and simulation-based results, two key points are demonstrated: (i) allocating power to the on-body sensors using posture information can reduce the total energy consumption of the WBAN; and (ii) when the channel condition is good, it is better to recruit fewer relays for cooperation to enhance energy efficiency. PMID:22254777
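
    As an illustrative sketch only (the Rayleigh-fading link model, decode-and-forward relaying with selection combining, and the channel gains below are our assumptions, not the paper's model, and posture information is ignored), the following Python fragment grid-searches the source/relay power split that minimizes total transmit power subject to a target outage probability.

```python
import numpy as np

N0, gamma_th, eps = 1.0, 2.0, 0.05   # noise power, SNR threshold, outage target
g_sd, g_sr, g_rd = 1.0, 2.0, 2.0     # hypothetical mean channel gains

def link_outage(P, g):
    """P(SNR < gamma_th) for Rayleigh fading: SNR ~ Exp with mean P*g/N0."""
    return 1.0 - np.exp(-gamma_th * N0 / (P * g))

def coop_outage(Ps, Pr):
    """Outage of direct plus relayed branches with selection combining."""
    p_direct = link_outage(Ps, g_sd)
    # The relayed branch fails if either hop fails (decode-and-forward);
    # the source broadcast (power Ps) also feeds the source-relay hop.
    p_relay = 1.0 - (1.0 - link_outage(Ps, g_sr)) * (1.0 - link_outage(Pr, g_rd))
    return p_direct * p_relay        # both branches must fail simultaneously

# Grid search for the minimum-total-power allocation meeting the target.
best = None
grid = np.linspace(0.2, 20.0, 200)
for Ps in grid:
    for Pr in grid:
        if coop_outage(Ps, Pr) <= eps and (best is None or Ps + Pr < best[0]):
            best = (Ps + Pr, Ps, Pr)

print("total power %.2f (Ps=%.2f, Pr=%.2f)" % best)
```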

  19. Heart bypass surgery - minimally invasive - discharge

    MedlinePlus

    ... Thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Chest . ... bypass surgery - minimally invasive Heart failure - overview High blood cholesterol ...

  20. WASTE MINIMIZATION ASSESSMENT FOR A DAIRY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has funded a pilot project to assist small- and medium-size manufacturers who want to minimize their generation of waste but who lack the expertise to do so. In an effort to assist these manufacturers, Waste Minimization Assessment Ce...