Science.gov

Sample records for end-to-end spectrum reconstruction

  1. Left circumflex artery aneurysm: the end-to-end reconstruction.

    PubMed

    Cuttone, Fabio; Guilbeau-Frugier, Céline; Roncalli, Jérome; Glock, Yves

    2015-08-01

    This report describes a surgical myocardial revascularization procedure for a huge, atherosclerotic left circumflex coronary artery aneurysm. The technique proposed in this paper is based on the isolation of the aneurysm followed by the aneurysmectomy and a complete reconstruction of the circumflex artery by an end-to-end anastomosis.

  2. End-to-end ductal anastomosis in biliary reconstruction: indications and limitations.

    PubMed

    Jabłonska, Beata

    2014-08-01

    End-to-end ductal anastomosis is a physiologic biliary reconstruction that is commonly used in liver transplantation and less frequently in the surgical treatment of iatrogenic bile duct injuries. Currently, end-to-end ductal anastomosis is the biliary reconstruction of choice for liver transplantation in most adult patients. In recent years, it has also been performed for liver transplantation in children and in select patients with primary sclerosing cholangitis. The procedure is also performed in some patients with iatrogenic bile duct injuries, as it establishes physiologic bile flow. Proper digestion and absorption as well as postoperative endoscopic access are possible in patients who undergo end-to-end ductal anastomosis. It allows endoscopic diagnostic and therapeutic procedures in patients following surgery. This anastomosis is technically simple and associated with fewer early postoperative complications than the Roux-en-Y hepaticojejunostomy; however, end-to-end ductal anastomosis is not possible to perform in all patients. This review discusses the indications for and limitations of this biliary reconstruction, the technique used in liver transplantation and surgical repair of injured bile ducts, suture types and use of a T-tube.

  3. Comparison of Reconstruction and Control algorithms on the ESO end-to-end simulator OCTOPUS

    NASA Astrophysics Data System (ADS)

    Montilla, I.; Béchet, C.; Lelouarn, M.; Correia, C.; Tallon, M.; Reyes, M.; Thiébaut, É.

Extremely Large Telescopes are very challenging concerning their Adaptive Optics requirements. Their diameters, the specifications demanded by the science for which they are being designed, and the planned use of Extreme Adaptive Optics systems imply a huge increase in the number of degrees of freedom in the deformable mirrors. New reconstruction algorithms must be studied in order to implement real-time Adaptive Optics control at the required speed. We have studied the performance, applied to the case of the European ELT, of three different algorithms: the matrix-vector multiplication (MVM) algorithm, considered as a reference; the Fractal Iterative Method (FrIM); and the Fourier Transform Reconstructor (FTR). The algorithms have been tested on ESO's OCTOPUS software, which simulates the atmosphere, the deformable mirror, the sensor and the closed-loop control. The MVM is the default reconstruction and control method implemented in OCTOPUS, but it scales as O(N²) operations per loop, so it is not considered a fast algorithm for wave-front reconstruction and control on an Extremely Large Telescope. The two other methods are the fast algorithms studied in the E-ELT Design Study. Their performance, as well as their response in the presence of noise and under various atmospheric conditions, has been compared using a Single Conjugate Adaptive Optics configuration for a 42 m diameter ELT with a total of 5402 actuators. These comparisons on a common simulator highlight the pros and cons of the various methods and give us a better understanding of the type of reconstruction algorithm that an ELT demands.
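As a rough illustration of the MVM baseline described above, the sketch below applies a least-squares reconstructor (the pseudo-inverse of a synthetic interaction matrix) to simulated wave-front slopes. The matrix sizes and data are invented for illustration; this is not the OCTOPUS implementation:

```python
import numpy as np

# Synthetic interaction matrix D: maps actuator commands to sensor slopes.
rng = np.random.default_rng(0)
n_slopes, n_act = 40, 10
D = rng.standard_normal((n_slopes, n_act))

# Least-squares reconstructor R (Moore-Penrose pseudo-inverse of D).
R = np.linalg.pinv(D)

# Each closed-loop iteration is one matrix-vector multiply,
# hence the O(N^2) per-loop cost noted in the abstract.
true_cmd = rng.standard_normal(n_act)
slopes = D @ true_cmd            # simulated wave-front slope measurements
est_cmd = R @ slopes             # reconstructed actuator commands

print(np.allclose(est_cmd, true_cmd))  # noiseless case: commands recovered
```

In the noiseless, well-conditioned case the pseudo-inverse recovers the commands exactly; the fast methods (FrIM, FTR) exist because forming and applying R becomes prohibitive at ELT scales.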

  4. End-to-End Commitment

    NASA Technical Reports Server (NTRS)

    Newcomb, John

    2004-01-01

The end-to-end test would verify the complex sequence of events from lander separation to landing. Due to the large distances involved and the significant delay in sending a command and receiving verification, the lander needed to operate autonomously after it separated from the orbiter. It had to sense conditions, make decisions, and act accordingly. We were flying into a relatively unknown set of conditions: a Martian atmosphere of unknown pressure, density, and consistency, to land on a surface of unknown altitude and unknown bearing strength.

  5. TROPOMI end-to-end performance studies

    NASA Astrophysics Data System (ADS)

    Voors, Robert; de Vries, Johan; Veefkind, Pepijn; Gloudemans, Annemieke; Mika, Àgnes; Levelt, Pieternel

    2008-10-01

The TROPOspheric Monitoring Instrument (TROPOMI) is a UV/VIS/NIR/SWIR non-scanning nadir-viewing imaging spectrometer that combines a wide swath (110°) with high spatial resolution (8 x 8 km). Its main heritage is from the Ozone Monitoring Instrument (OMI), a scanning spectrometer launched in 2004, and from SCIAMACHY. Since its launch, OMI has been providing, on a daily basis and on a global scale, a wealth of data on ozone, NO2 and minor trace gases, aerosols and local pollution. In the framework of development programs for a follow-up mission to the successful Ozone Monitoring Instrument, we have developed the so-called TROPOMI Integrated Development Environment (TIDE). This is a GRID-based software simulation tool for OMI follow-up missions. It includes scene generation, an instrument simulator, a level 0-1b processing chain, as well as several level 1b-2 processing chains. In addition it contains an error analyzer, i.e. a tool to feed the level 2 results back to the input of the scene generator. The paper gives a description of the TROPOMI instrument and focuses on design aspects as well as on the performance, as tested in the end-to-end development environment TIDE.

  6. Applying Trustworthy Computing to End-to-End Electronic Voting

    ERIC Educational Resources Information Center

    Fink, Russell A.

    2010-01-01

    "End-to-End (E2E)" voting systems provide cryptographic proof that the voter's intention is captured, cast, and tallied correctly. While E2E systems guarantee integrity independent of software, most E2E systems rely on software to provide confidentiality, availability, authentication, and access control; thus, end-to-end integrity is not…

  7. Standardizing an End-to-end Accounting Service

    NASA Technical Reports Server (NTRS)

    Greenberg, Edward; Kazz, Greg

    2006-01-01

Currently there are no space system standards available for space agencies to accomplish end-to-end accounting. Such a standard exists neither for spacecraft operations nor for tracing the relationships among the mission planning activities, the command sequences designed to perform those activities, the commands formulated to initiate those activities, and the mission data products created by those activities. In order for space agencies to cross-support one another for data accountability/data tracing, and for interagency spacecraft to interoperate with each other, an international CCSDS standard for end-to-end data accountability/tracing needs to be developed. We first describe the end-to-end accounting service model and the functionality that supports the service. This model describes how science plans that are ultimately transformed into commands can be associated with the telemetry products generated as a result of their execution. Moreover, the interaction between end-to-end accounting and service management is explored. Finally, we show how the standard end-to-end accounting service can be applied to a real-life flight project, i.e., the Mars Reconnaissance Orbiter project.

  8. End-to-end azido-pinned interlocking lanthanide squares.

    PubMed

    Li, Xiao-Lei; Wu, Jianfeng; Zhao, Lang; Shi, Wei; Cheng, Peng; Tang, Jinkui

    2017-03-09

    A rare end-to-end azido-pinned interlocking lanthanide square was self-assembled using a ditopic Schiff-base (H2L) and NaN3 as ligands. Obvious ferromagnetic interaction and a record anisotropy barrier of 152(4) K among lanthanide azido-bridged SMMs in a zero dc field were observed.

  9. Combining Simulation Tools for End-to-End Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min

    2015-01-01

    Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable to assess individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS's ascent tool (POST2) and a new tool to optimize the full problem by operating both simulations simultaneously was born.

  10. End-to-end simulations for the LISA Technology Package

    NASA Astrophysics Data System (ADS)

    Hannen, V. M.; Smit, M.; Hoyng, P.; Selig, A.; Schleicher, A.

    2003-05-01

    We present an end-to-end simulation facility which has been developed in the framework of the LISA Technology Package (LTP) architect study for SMART-2, the technology demonstration mission that precedes LISA. The simulator evolves positions and orientations of the spacecraft and two test masses contained in the inertial sensors of LTP under the influence of external and internal forces and torques and under the influence of control loops for satellite drag-free control (DFC) and electrostatic test mass suspension. Altogether, a coupled system with 18 degrees of freedom is solved numerically. A series of test runs has been performed to verify the correct functioning of the various models contained in the end-to-end simulator and to provide a preliminary assessment of the performance of DFC algorithms and control laws for test mass suspension, which are currently foreseen for use in the basic operation mode of LTP.

  11. Measurements and analysis of end-to-end Internet dynamics

    SciTech Connect

    Paxson, Vern

    1997-04-01

Accurately characterizing end-to-end Internet dynamics - the performance that a user actually obtains from the lengthy series of network links that comprise a path through the Internet - is exceptionally difficult, due to the network's immense heterogeneity. At the heart of this work is a 'measurement framework' in which a number of sites around the Internet host a specialized measurement service. By coordinating 'probes' between pairs of these sites one can measure end-to-end behavior along O(N²) paths for a framework consisting of N sites. Consequently, one obtains a superlinear scaling that allows measuring a rich cross-section of Internet behavior without requiring huge numbers of observation points. 37 sites participated in this study, allowing the author to measure more than 1,000 distinct Internet paths. The first part of this work looks at the behavior of end-to-end routing: the series of routers over which a connection's packets travel. Based on 40,000 measurements made using this framework, the author analyzes: routing 'pathologies' such as loops, outages, and flutter; the stability of routes over time; and the symmetry of routing along the two directions of an end-to-end path. The author finds that pathologies increased significantly over the course of 1995 and that Internet paths are heavily dominated by a single route. The second part of this work studies end-to-end Internet packet dynamics. The author analyzes 20,000 TCP transfers of 100 Kbyte each to investigate the performance of both the TCP endpoints and the Internet paths. The measurements used for this part of the study are much richer than those for the first part, but require a great degree of attention to issues of calibration, which are addressed by applying self-consistency checks to the measurements whenever possible. The author finds that packet filters are capable of a wide range of measurement errors, some of which, if undetected, can significantly taint subsequent analysis.
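The O(N²) path scaling mentioned above follows from counting ordered pairs of sites; a trivial sketch (the 37-site figure is the one reported in the abstract):

```python
def measurable_paths(n_sites: int) -> int:
    """Number of end-to-end paths: one per ordered pair of distinct sites."""
    return n_sites * (n_sites - 1)

# The study's framework had 37 participating sites:
print(measurable_paths(37))  # 1332 candidate source-destination pairs
```

This superlinear growth is why a modest number of cooperating measurement sites suffices to cover more than 1,000 distinct Internet paths.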

  12. Miniature modular microwave end-to-end receiver

    NASA Technical Reports Server (NTRS)

    Sukamto, Lin M. (Inventor); Cooley, Thomas W. (Inventor); Janssen, Michael A. (Inventor); Parks, Gary S. (Inventor)

    1993-01-01

    An end-to-end microwave receiver system contained in a single miniature hybrid package mounted on a single heatsink is presented. It includes an input end connected to a microwave receiver antenna and an output end which produces a digital count proportional to the amplitude of a signal of a selected microwave frequency band received at the antenna and corresponding to one of the water vapor absorption lines near frequencies of 20 GHz or 30 GHz. The hybrid package is on the order of several centimeters in length and a few centimeters in height and width. The package includes an L-shaped carrier having a base surface, a vertical wall extending up from the base surface and forming a corner therewith, and connection pins extending through the vertical wall. Modular blocks rest on the base surface against the vertical wall and support microwave monolithic integrated circuits on top surfaces thereof connected to the external connection pins. The modular blocks lie end-to-end on the base surface so as to be modularly removable by sliding along the base surface beneath the external connection pins away from the vertical wall.

  13. The CarbonSat End-to-End Simulator

    NASA Astrophysics Data System (ADS)

    Bramstedt, Klaus; Noel, Stefan; Bovensmann, Heinrich; Reuter, Max; Burrows, John P.; Jurado Lozano, Pedro Jose; Meijer, Yasjka; Loescher, Armin; Acarreta, Juan R.; Sturm, Philipp; Tesmer, Volker; Sanchez Monero, Ana Maria; Atapuerca Rodreiguez de Dios, Francisco Javier; Toledano Sanchez, Daniel; Boesch, Hartmut

    2016-08-01

The objective of the CarbonSat mission is to improve our knowledge of natural and anthropogenic sources and sinks of CO2 and CH4. CarbonSat was one of the two candidate missions selected for definition studies for Earth Explorer 8 (EE8). The CarbonSat End-to-End Simulator (CSE2ES) simulates the full data flow of the mission with a set of modules embedded in ESA's generic simulation framework OpenSF. A Geometry Module (GM) defines the orbital geometry and related parameters. A Scene Generation Module (SGM) provides simulated radiances and irradiances for the selected scenes. The Level 1 Module (L1M) comprises the instrument simulator and the Level 1b processor, and provides calibrated spectra as its main output. The L1M is implemented in two versions, reflecting the instrument concepts from the two competing industrial system studies. The Level 2 Retrieval Module (L2M) performs the retrieval from the input level 1b spectra to the atmospheric parameters (CO2 and CH4). In this paper, we show sensitivity studies with respect to atmospheric parameters, simulations along the orbit and a case study for the detection of a point source emitting carbon dioxide. In summary, the end-to-end simulation with CSE2ES demonstrates the capability of the CarbonSat concept to meet its requirements.

  14. Constructing end-to-end models using ECOPATH data

    NASA Astrophysics Data System (ADS)

    Steele, John H.; Ruzicka, James J.

    2011-09-01

The wide availability of ECOPATH data sets provides a valuable resource for the comparative analysis of marine ecosystems. We show how to derive a bottom-up transform from the top-down ECOPATH; couple this to a simple microbial web with physical forcing; and use the end-to-end model (E2E) for scenario construction. This steady-state format also provides a framework and initial conditions for different dynamic simulations. The model can be applied to shelf ecosystems with a wide range of physical forcing, coupled benthic/pelagic food webs, and nutrient recycling. We illustrate the general application and the specific problems by transforming an ECOPATH model for the Northern California Current (NCC). We adapt results on the upwelling regime to provide estimates of physical fluxes and use these to show the consequences of different upwelling rates, combined with variable retention mechanisms for plankton, for the productivity of fish and other top predators and for the resilience of the ecosystem. Finally we show how the effects of inter-annual to decadal variations in upwelling on fishery yields can be studied using dynamic simulations with different prey-predator relations. The general conclusion is that the nature of the physical regimes of shelf ecosystems cannot be ignored in comparing end-to-end representations of these food webs.

  15. On routing algorithms with end-to-end delay guarantees

    SciTech Connect

    Rao, N.S.V.; Batsell, S.G.

    1998-11-01

The authors consider the transmission of a message of size r from a source to a destination with guarantees on the end-to-end delay over a computer network with n nodes and m links. There are three sources of delays: (a) propagation delays along the links, (b) delays due to bandwidth availability on the links, and (c) queuing delays at the intermediate nodes. First, the authors consider that delays on various links and nodes are given as functions of the message size. If the delay in (b) is a non-increasing function of the bandwidth, they propose an O(m² + mn log n) time algorithm to compute a path with the minimum end-to-end delay for any given message size r. They then consider that the queuing delay in (c) is a random variable correlated with the message size according to an unknown distribution. At each node, measurements of queuing delays and message sizes are available. They propose two algorithms to compute paths whose delays are close to optimal with high probability, irrespective of the distribution of the delays, based entirely on measurements of sufficient size.
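For a fixed message size r, minimizing end-to-end delay over sources (a) and (b) reduces to a shortest-path computation. The sketch below is a plain Dijkstra illustration under the assumed link-delay model propagation + r/bandwidth (non-increasing in bandwidth, matching assumption (b)); the toy network and its parameters are invented, and this is not the paper's more general O(m² + mn log n) algorithm:

```python
import heapq

def min_delay_path(links, source, dest, r):
    """Dijkstra on end-to-end delay for a fixed message size r.

    links: dict node -> list of (neighbor, propagation_delay, bandwidth).
    Per-link delay model (an assumption): propagation + r / bandwidth.
    """
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dest:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, prop, bw in links.get(u, []):
            nd = d + prop + r / bw
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

# Toy network: a slow direct link vs. a two-hop high-bandwidth detour.
net = {
    "s": [("d", 1.0, 10.0), ("m", 0.5, 100.0)],
    "m": [("d", 0.5, 100.0)],
}
# Direct: 1.0 + 100/10 = 11.0; via m: (0.5 + 1.0) + (0.5 + 1.0) = 3.0
print(min_delay_path(net, "s", "d", r=100.0))  # 3.0
```

For a different message size r the optimal path can change, which is why the paper treats delay explicitly as a function of r.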

  16. End-to-end network/application performance troubleshooting methodology

    SciTech Connect

    Wu, Wenji; Bobyshev, Andrey; Bowden, Mark; Crawford, Matt; Demar, Phil; Grigaliunas, Vyto; Grigoriev, Maxim; Petravick, Don; /Fermilab

    2007-09-01

    The computing models for HEP experiments are globally distributed and grid-based. Obstacles to good network performance arise from many causes and can be a major impediment to the success of the computing models for HEP experiments. Factors that affect overall network/application performance exist on the hosts themselves (application software, operating system, hardware), in the local area networks that support the end systems, and within the wide area networks. Since the computer and network systems are globally distributed, it can be very difficult to locate and identify the factors that are hurting application performance. In this paper, we present an end-to-end network/application performance troubleshooting methodology developed and in use at Fermilab. The core of our approach is to narrow down the problem scope with a divide and conquer strategy. The overall complex problem is split into two distinct sub-problems: host diagnosis and tuning, and network path analysis. After satisfactorily evaluating, and if necessary resolving, each sub-problem, we conduct end-to-end performance analysis and diagnosis. The paper will discuss tools we use as part of the methodology. The long term objective of the effort is to enable site administrators and end users to conduct much of the troubleshooting themselves, before (or instead of) calling upon network and operating system 'wizards,' who are always in short supply.

  17. Euclid end-to-end straylight performance assessment

    NASA Astrophysics Data System (ADS)

    Gaspar Venancio, Luis M.; Pachot, Charlotte; Carminati, Lionel; Lorenzo Alvarez, Jose; Amiaux, Jérôme; Prieto, Eric; Bonino, Luciana; Salvignol, Jean-Christophe; Short, Alex; Boenke, Tobias; Strada, Paulo; Laureijs, Rene

    2016-07-01

In the Euclid mission, straylight was identified at an early stage as the main driver of the final imaging quality of the telescope. Assessing by simulation the final straylight in the focal planes of both instruments in Euclid's payload has required a complex workflow involving all stakeholders in the mission, from industry to the scientific community. The straylight is defined as a Normalized Detector Irradiance (NDI), a convenient definition that separates the contributions of the telescope and of the instruments. The end-to-end straylight of the payload is then simply the sum of the NDIs of the telescope and of each instrument. The NDIs for both instruments are presented in this paper for photometry and spectrometry.
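The additive NDI definition above can be sketched directly; the numeric values below are placeholders, not Euclid's actual NDIs:

```python
def payload_ndi(telescope_ndi: float, instrument_ndis: list[float]) -> float:
    """End-to-end straylight of the payload: the sum of the telescope NDI
    and the NDI of each instrument (additive by definition)."""
    return telescope_ndi + sum(instrument_ndis)

# Hypothetical NDIs for the telescope and two instruments:
total = payload_ndi(1e-6, [2e-7, 3e-7])
print(total)
```

The convenience of the definition is exactly this separability: telescope and instrument contributions can be computed by different stakeholders and combined afterwards.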

  18. Response to MRO's end-to-end data accountability challenges

    NASA Technical Reports Server (NTRS)

    Lee, Young H.

    2005-01-01

The Mars Reconnaissance Orbiter (MRO) launched on August 12, 2005. It carries six science instruments and three engineering payloads. Because MRO will produce an unprecedented number of science products, it will transmit a much higher data volume than any other deep space mission to date. Keeping track of MRO products as well as relay products would be a daunting, expensive task without a well-planned data-product tracking strategy; a capability to perform first-order problem diagnosis is essential in order for MRO to answer the questions "Where is my data?" and "When will my data be available?" To respond to this challenge, the MRO project developed the End-to-End Data Accountability System by utilizing existing information available from both ground and flight elements. This paper details the approaches taken and the design and implementation of the tools, procedures and teams that track data products from the time they are predicted until they arrive in the hands of the end users.

  19. Key management for large scale end-to-end encryption

    SciTech Connect

    Witzke, E.L.

    1994-07-01

Symmetric end-to-end encryption requires separate keys for each pair of communicating confidants. This is a problem of order N². Other factors, such as multiple sessions per pair of confidants and multiple encryption points in the ISO Reference Model, complicate key management by linear factors. Public-key encryption can reduce the number of keys managed to a linear problem, which is good for scalability of key management, but comes with complicating issues and performance penalties. Authenticity is the primary ingredient of key management. If each potential pair of communicating confidants can authenticate data from each other, then any number of public encryption keys of any type can be communicated with requisite integrity. These public encryption keys can be used with the corresponding private keys to exchange symmetric cryptovariables for high-data-rate privacy protection. The Digital Signature Standard (DSS), which has been adopted by the United States Government, has both public and private components, similar to a public-key cryptosystem. The Digital Signature Algorithm of the DSS is intended for authenticity but not for secrecy. In this paper, the authors show how the Digital Signature Algorithm, combined with both symmetric and asymmetric (public-key) encryption techniques, can provide a practical solution to key management scalability problems by reducing the key management complexity to a problem of order N, without sacrificing the encryption speed necessary to operate in high performance networks.
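The order-N² versus order-N key counts discussed above are easy to make concrete; a minimal sketch:

```python
def symmetric_keys(n: int) -> int:
    """Pairwise symmetric keys: one per unordered pair of confidants, O(n^2)."""
    return n * (n - 1) // 2

def public_keys(n: int) -> int:
    """Public-key approach: one key pair per confidant, O(n)."""
    return n

# Growth of the key-management burden with community size:
for n in (10, 100, 1000):
    print(n, symmetric_keys(n), public_keys(n))
```

At 1000 confidants the pairwise scheme needs 499,500 keys versus 1000 key pairs, which is the scalability gap the paper's DSA-based design addresses.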

  20. OGC standards for end-to-end sensor network integration

    NASA Astrophysics Data System (ADS)

    Headley, K. L.; Broering, A.; O'Reilly, T. C.; Toma, D.; Del Rio, J.; Bermudez, L. E.; Zedlitz, J.; Johnson, G.; Edgington, D.

    2010-12-01

technology, and can communicate with any sensor whose protocol can be described by a SID. The SID interpreter transfers retrieved sensor data to a Sensor Observation Service, and transforms tasks submitted to a Sensor Planning Service into actual sensor commands. The proposed SWE PUCK protocol complements SID by providing a standard way to associate a sensor with a SID, thereby completely automating the sensor integration process. PUCK protocol is implemented in sensor firmware, and provides a means to retrieve a universally unique identifier, metadata and other information from the device itself through its communication interface. Thus the SID interpreter can retrieve a SID directly from the sensor through PUCK protocol. Alternatively the interpreter can retrieve the sensor's SID from an external source, based on the unique sensor ID provided by PUCK protocol. In this presentation, we describe the end-to-end integration of several commercial oceanographic instruments into a sensor network using PUCK, SID and SWE services. We also present a user-friendly graphical tool to generate SIDs, and tools to visualize sensor data.

  1. Data analysis pipeline for EChO end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Waldmann, Ingo P.; Pascale, E.

    2015-12-01

Atmospheric spectroscopy of extrasolar planets is an intricate business. Atmospheric signatures typically require a photometric precision of 1×10⁻⁴ in flux over several hours. Such precision demands high instrument stability as well as an understanding of stellar variability and an optimal data reduction and removal of systematic noise. In the context of the EChO mission concept, we here discuss the data reduction and analysis pipeline developed for the EChO end-to-end simulator EChOSim. We present and discuss the step-by-step procedures required to obtain the final exoplanetary spectrum from the EChOSim 'raw data', using a simulated observation of the secondary eclipse of the hot Neptune 55 Cnc e.

  2. Vascular Coupling System for End-to-End Anastomosis: An In Vivo Pilot Case Report.

    PubMed

    Li, Huizhong; Gale, Bruce; Shea, Jill; Sant, Himanshu; Terry, Christi M; Agarwal, Jay

    2017-03-01

    This paper presents the latest in vivo findings of a novel vascular coupling system. Vascular anastomosis is a common procedure in reconstructive surgeries and traditional hand suturing is very time consuming. The vascular coupling system described herein was designed to be used on arteries for a rapid and error-free anastomosis. The system consists of an engaging ring made from high density polyethylene using computer numerical control machining and a back ring made from polymethylmethacrylate using laser cutting. The vascular coupling system and its corresponding installation tools were tested in a pilot animal study to evaluate their efficacy in completing arterial anastomosis. A segment of expanded polytetrafluoroethylene (ePTFE) tubing was interposed into a transected carotid artery by anastomosis using two couplers in a pig. Two end-to-end anastomoses were accomplished. Ultrasound images were obtained to evaluate the blood flow at the anastomotic site immediately after the surgery. MRI was also performed 2 weeks after the surgery to evaluate vessel and ePTFE graft patency. This anastomotic system demonstrated high efficacy and easy usability, which should facilitate vascular anastomosis procedures in trauma and reconstructive surgeries.

  3. End-to-End Optimization of High-Throughput DNA Sequencing.

    PubMed

    O'Reilly, Eliza; Baccelli, Francois; De Veciana, Gustavo; Vikalo, Haris

    2016-10-01

At the core of Illumina's high-throughput DNA sequencing platforms lies a biophysical surface process, bridge amplification, that results in a random geometry of clusters of homogeneous short DNA fragments, typically hundreds of base pairs long. The statistical properties of this random process and the lengths of the fragments are critical, as they affect the information that can be subsequently extracted, that is, the density of successfully inferred DNA fragment reads. The ensembles of overlapping DNA fragment reads are then used to computationally reconstruct the much longer target genome sequence. The success of the reconstruction in turn depends on having a sufficiently large ensemble of sufficiently long DNA fragments. In this article, using stochastic geometry, we model and optimize the end-to-end flow cell synthesis and target genome sequencing process, linking and partially controlling the statistics of the physical processes to the success of the final computational step. Based on a rough calibration of our model, we provide, for the first time, a mathematical framework capturing the salient features of the sequencing platform that serves as a basis for optimizing cost, performance, and/or sensitivity analysis to various parameters.
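The dependence of reconstruction success on having a large enough read ensemble can be illustrated with the classical Lander-Waterman coverage estimate. Note this is a standard textbook model, not the stochastic-geometry model developed in the paper, and the numbers below are hypothetical:

```python
import math

def expected_covered_fraction(n_reads: int, read_len: int, genome_len: int) -> float:
    """Lander-Waterman estimate: expected fraction of the genome covered
    by at least one read, assuming uniformly random read placement.
    Mean coverage c = n*L/G; an uncovered base has probability exp(-c)."""
    coverage = n_reads * read_len / genome_len
    return 1.0 - math.exp(-coverage)

# Hypothetical run: 300k reads of 100 bp against a 1 Mbp target (30x coverage).
print(expected_covered_fraction(300_000, 100, 1_000_000))
```

At 30x mean coverage the expected uncovered fraction is vanishingly small, which is the intuition behind requiring "a sufficiently large ensemble" of reads.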

  4. Experimental demonstration of software defined data center optical networks with Tbps end-to-end tunability

    NASA Astrophysics Data System (ADS)

    Zhao, Yongli; Zhang, Jie; Ji, Yuefeng; Li, Hui; Wang, Huitao; Ge, Chao

    2015-10-01

End-to-end tunability is important for provisioning elastic channels for the bursty traffic of data center optical networks. How, then, can end-to-end tunability be achieved on elastic optical networks? A software defined networking (SDN) based end-to-end tunability solution is proposed for software defined data center optical networks, and the protocol extension and implementation procedure are designed accordingly. For the first time, flexible grid all-optical networks with a Tbps end-to-end tunable transport and switch system have been demonstrated online for data center interconnection, controlled by an OpenDaylight (ODL) based controller. The performance of the end-to-end tunable transport and switch system has been evaluated with wavelength number tuning, bit rate tuning, and transmit power tuning procedures.

  5. Direct end-to-end repair of flexor pollicis longus tendon lacerations.

    PubMed

    Nunley, J A; Levin, L S; Devito, D; Goldner, R D; Urbaniak, J R

    1992-01-01

    Between 1976 and 1986, 38 consecutive acute isolated flexor pollicis longus lacerations were repaired. This study excluded all replanted or mutilated digits and all lacerations with associated fracture. Average follow-up was 26 months. Tendon rehabilitation was standardized. Range of motion and pinch strength were measured postoperatively. Seventy-four percent (28/38) of the flexor pollicis longus injuries occurred in zone II. Neurovascular injury occurred in 82% of the lacerations, and this correlated with the zone of tendon injury. In 21% of the patients (8/38) both digital nerves and arteries were transected. Postoperative thumb interphalangeal motion averaged 35 degrees and key pinch strength was 81% that of the uninjured thumb. One rupture occurred in a child. Laceration of the flexor pollicis longus is likely to involve damage to neurovascular structures, and repair may be necessary. Direct end-to-end repairs within the pulley system do at least as well as delayed tendon reconstruction and do not require additional procedures.

  6. MRI simulation: end-to-end testing for prostate radiation therapy using geometric pelvic MRI phantoms

    NASA Astrophysics Data System (ADS)

    Sun, Jidi; Dowling, Jason; Pichler, Peter; Menk, Fred; Rivest-Henault, David; Lambert, Jonathan; Parker, Joel; Arm, Jameen; Best, Leah; Martin, Jarad; Denham, James W.; Greer, Peter B.

    2015-04-01

    To clinically implement MRI simulation or MRI-alone treatment planning requires comprehensive end-to-end testing to ensure an accurate process. The purpose of this study was to design and build a geometric phantom simulating a human male pelvis that is suitable for both CT and MRI scanning and use it to test geometric and dosimetric aspects of MRI simulation including treatment planning and digitally reconstructed radiograph (DRR) generation. A liquid filled pelvic shaped phantom with simulated pelvic organs was scanned in a 3T MRI simulator with dedicated radiotherapy couch-top, laser bridge and pelvic coil mounts. A second phantom with the same external shape but with an internal distortion grid was used to quantify the distortion of the MR image. Both phantoms were also CT scanned as the gold-standard for both geometry and dosimetry. Deformable image registration was used to quantify the MR distortion. Dose comparison was made using a seven-field IMRT plan developed on the CT scan with the fluences copied to the MR image and recalculated using bulk electron densities. Without correction the maximum distortion of the MR compared with the CT scan was 7.5 mm across the pelvis, while this was reduced to 2.6 and 1.7 mm by the vendor’s 2D and 3D correction algorithms, respectively. Within the locations of the internal organs of interest, the distortion was <1.5 and <1 mm with 2D and 3D correction algorithms, respectively. The dose at the prostate isocentre calculated on CT and MRI images differed by 0.01% (1.1 cGy). Positioning shifts were within 1 mm when setup was performed using MRI generated DRRs compared to setup using CT DRRs. The MRI pelvic phantom allows end-to-end testing of the MRI simulation workflow with comparison to the gold-standard CT based process. MRI simulation was found to be geometrically accurate with organ dimensions, dose distributions and DRR based setup within acceptable limits compared to CT.
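    The distortion check at the heart of such a phantom study reduces to computing displacement magnitudes between matched MR and CT grid points. A minimal sketch with made-up displacement values, using the 1.5 mm organ-level figure from the abstract as an illustrative tolerance:

```python
import math

# Each entry is the (dx, dy, dz) displacement in mm between the MR and CT
# positions of one grid intersection of the distortion phantom (toy values).
def max_distortion_mm(displacements):
    return max(math.sqrt(dx * dx + dy * dy + dz * dz) for dx, dy, dz in displacements)

field = [(0.5, 0.2, 0.1), (1.2, 0.9, 0.3), (0.1, 0.1, 0.0)]  # illustrative
worst = max_distortion_mm(field)

# Flag grid points exceeding a 1.5 mm tolerance at organ locations.
failures = [d for d in field if math.sqrt(sum(c * c for c in d)) > 1.5]
```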

  7. Effects of collagen membranes enriched with in vitro-differentiated N1E-115 cells on rat sciatic nerve regeneration after end-to-end repair

    PubMed Central

    2010-01-01

    Peripheral nerves possess the capacity of self-regeneration after traumatic injury, but the extent of regeneration is often poor and may benefit from exogenous factors that enhance growth. The use of cellular systems is a rational approach for delivering neurotrophic factors at the nerve lesion site, and in the present study we investigated the effects of enwrapping the site of end-to-end rat sciatic nerve repair with an equine type III collagen membrane enriched or not with N1E-115 pre-differentiated neural cells. After neurotmesis, the sciatic nerve was repaired by end-to-end suture (End-to-End group), end-to-end suture enwrapped with an equine collagen type III membrane (End-to-EndMemb group), or end-to-end suture enwrapped with an equine collagen type III membrane previously covered with neural cells pre-differentiated in vitro from N1E-115 cells (End-to-EndMembCell group). Throughout the postoperative period, motor and sensory functional recovery was evaluated using extensor postural thrust (EPT), withdrawal reflex latency (WRL) and ankle kinematics. After 20 weeks animals were sacrificed and the repaired sciatic nerves were processed for histological and stereological analysis. Results showed that enwrapment of the repair site with a collagen membrane, with or without neural cell enrichment, did not lead to any significant improvement in most of the functional and stereological predictors of nerve regeneration that we assessed, with the exception of EPT, which recovered significantly better when the neural cell-enriched membrane was employed. It can thus be concluded that this particular type of nerve tissue engineering approach has very limited effects on nerve regeneration after sciatic end-to-end nerve reconstruction in the rat. PMID:20149260

  8. End-to-End Models for Effects of System Noise on LIMS Analysis of Igneous Rocks

    SciTech Connect

    Clegg, Samuel M; Bender, Steven; Wiens, R. C.; Carmosino, Marco L; Speicher, Elly A; Dyar, M. D.

    2010-12-23

    The ChemCam instrument on the Mars Science Laboratory will be the first extraterrestrial deployment of laser-induced breakdown spectroscopy (LIBS) for remote geochemical analysis. LIBS instruments are also being proposed for future NASA missions. In quantitative LIBS applications using multivariate analysis techniques, it is essential to understand the effects of key instrument parameters and their variability on the elemental predictions. Baseline experiments were run on a laboratory instrument in conditions reproducing ChemCam performance on Mars. These experiments employed a Nd:YAG laser producing 17 mJ/pulse on target with a 200 µm FWHM spot size on the surface of a sample. The emission is collected by a telescope, imaged onto a fiber optic and then interfaced to a demultiplexer capable of >40% transmission into each spectrometer. We report here on an integrated end-to-end system performance model that simulates the effects of output signal degradation that might result from the input signal chain and the impact on multivariate model predictions. There are two approaches to modifying signal-to-noise ratio (SNR): degrade the signal and/or increase the noise. Ishibashi used a much smaller data set to show that the addition of noise had significant impact while degradation of spectral resolution had much less impact on accuracy and precision. Here, we specifically focus on aspects of remote LIBS instrument performance as they relate to various types of signal degradation. To assess the sensitivity of LIBS analysis to SNR and spectral resolution, the signal in each spectrum from a suite of 50 laboratory spectra of igneous rocks was variably degraded by increasing the peak widths (simulating misalignment) and decreasing the spectral amplitude (simulating decreases in SNR).
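    The two degradation modes described, peak broadening to simulate misalignment and amplitude scaling to simulate reduced SNR, can be sketched on a synthetic spectrum. This toy Python version (not the authors' actual model) convolves the signal with a normalized Gaussian and then rescales it:

```python
import math

def gaussian_kernel(sigma, radius):
    # Discrete Gaussian, normalized to unit sum so total signal is preserved.
    k = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def degrade(spectrum, sigma=1.5, amplitude=0.5):
    """Broaden peaks (convolution), then scale the amplitude down."""
    radius = int(3 * sigma)
    kern = gaussian_kernel(sigma, radius)
    out = []
    for i in range(len(spectrum)):
        acc = 0.0
        for j, w in enumerate(kern):
            idx = i + j - radius
            if 0 <= idx < len(spectrum):
                acc += w * spectrum[idx]
        out.append(amplitude * acc)
    return out

# A single narrow emission line: broadening spreads it, scaling lowers it.
line = [0.0] * 20
line[10] = 1.0
degraded = degrade(line)
```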

  9. A NASA Climate Model Data Services (CDS) End-to-End System to Support Reanalysis Intercomparison

    NASA Astrophysics Data System (ADS)

    Carriere, L.; Potter, G. L.; McInerney, M.; Nadeau, D.; Shen, Y.; Duffy, D.; Schnase, J. L.; Maxwell, T. P.; Huffer, E.

    2014-12-01

    The NASA Climate Model Data Service (CDS) and the NASA Center for Climate Simulation (NCCS) are collaborating to provide an end-to-end system for the comparative study of the major Reanalysis projects: currently ECMWF ERA-Interim, NASA/GMAO MERRA, NOAA/NCEP CFSR, NOAA/ESRL 20CR, and JMA JRA25. Components of the system span the full spectrum of Climate Model Data Services: Data, Compute Services, Data Services, Analytic Services and Knowledge Services. The Data includes standard Reanalysis model output and will be expanded to include gridded observations and gridded innovations (O-A and O-F). The NCCS High Performance Science Cloud provides the compute environment (storage, servers, and network). Data Services are provided through an Earth System Grid Federation (ESGF) data node complete with Live Access Server (LAS), Web Map Service (WMS) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) for visualization, as well as a collaborative interface through the Earth System CoG. Analytic Services include UV-CDAT for analysis and MERRA/AS, accessed via the CDS API, for computation services, both part of the CDS Climate Analytics as a Service (CAaaS). Knowledge Services include access to an ontology browser, ODISEES, for metadata search and data retrieval. The result is a system that enables both reanalysis scientists and those in need of reanalysis output to identify the data of interest, compare, compute, visualize, and research without the need to transfer large volumes of data, perform time-consuming format conversions, or write code for frequently run computations and visualizations.

  10. An end-to-end communications architecture for condition-based maintenance applications

    NASA Astrophysics Data System (ADS)

    Kroculick, Joseph

    2014-06-01

    This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission that aligns with the Army's Network Modernization Strategy. The Army's Network Modernization Strategy is based on rolling out network capabilities that connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that allows data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from available network capabilities.

  11. End-to-End Information System design at the NASA Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1978-01-01

    Recognizing a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote space-based sensor, an end-to-end approach to the design of information systems has been adopted at the Jet Propulsion Laboratory. The objectives of this effort are to ensure that all flight projects adequately cope with information flow problems at an early stage of system design, and that cost-effective, multi-mission capabilities are developed when capital investments are made in supporting elements. The paper reviews the End-to-End Information System (EEIS) activity at the Laboratory, and notes the ties to the NASA End-to-End Data System program.

  12. A Robust Method to Integrate End-to-End Mission Architecture Optimization Tools

    NASA Technical Reports Server (NTRS)

    Lugo, Rafael; Litton, Daniel; Qu, Min; Shidner, Jeremy; Powell, Richard

    2016-01-01

    End-to-end mission simulations include multiple phases of flight. For example, an end-to-end Mars mission simulation may include launch from Earth, interplanetary transit to Mars and entry, descent and landing. Each phase of flight is optimized to meet specified constraints and often depends on and impacts subsequent phases. The design and optimization tools and methodologies used to combine the different aspects of an end-to-end framework, and their impact on mission planning, are presented. This work focuses on a robust implementation of a Multidisciplinary Design Analysis and Optimization (MDAO) method that offers the flexibility to quickly adapt to changing mission design requirements. Different simulations tailored to the liftoff, ascent, and atmospheric entry phases of a trajectory are integrated and optimized in the MDAO program Isight, which provides the user a graphical interface to link simulation inputs and outputs. This approach provides many advantages to mission planners, as it is easily adapted to different mission scenarios and can improve the understanding of the integrated system performance within a particular mission configuration. A Mars direct entry mission using the Space Launch System (SLS) is presented as a generic end-to-end case study. For the given launch period, the SLS launch performance is traded for improved orbit geometry alignment, resulting in an optimized net payload that is comparable to that in the SLS Mission Planner's Guide.

  13. Scalable end-to-end encryption technology for supra-gigabit/second networking

    SciTech Connect

    Pierson, L.G.; Tarman, T.D.; Witzke, E.L.

    1997-05-01

    End-to-end encryption can protect proprietary information as it passes through a complex inter-city computer network, even if the intermediate systems are untrusted. This technique involves encrypting the body of computer messages while leaving network addressing and control information unencrypted for processing by intermediate network nodes. Because high speed implementations of end-to-end encryption with easy key management for standard network protocols are unavailable, this technique is not widely used today. Specifically, no end-to-end encryptors exist to protect Asynchronous Transfer Mode (ATM) traffic, nor to protect Switched Multi-megabit Data Service (SMDS), which is the first "Broadband Integrated Services Digital Network" (BISDN) service now being used by long distance telephone companies. This encryption technology is required for the protection of data in transit between industrial sites and central Massively Parallel Supercomputing Centers over high bandwidth, variable bit rate (BISDN) services. This research effort investigated techniques to scale end-to-end encryption technology from today's state of the art (approximately 0.001 Gb/s) to 2.4 Gb/s and higher. A cryptosystem design has been developed which scales for implementation beyond SONET OC-48 (2.4 Gb/s) data rates. A prototype for use with OC-3 (0.155 Gb/s) ATM variable bit rate services was developed.
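    The core idea, encrypting only the message body while leaving addressing in the clear, is easy to illustrate. The sketch below uses a toy SHA-256 counter-mode keystream purely for demonstration; it is not the high-speed hardware cryptosystem the report describes and should not be taken as vetted cryptography.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 over key || nonce || counter, illustration only.
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt_body(header: bytes, body: bytes, key: bytes, nonce: bytes) -> bytes:
    """XOR the body with the keystream; the header stays readable for routing."""
    ks = keystream(key, nonce, len(body))
    cipher_body = bytes(b ^ k for b, k in zip(body, ks))
    return header + cipher_body

packet = encrypt_body(b"DST=node7;", b"proprietary payload", b"secret", b"n0")
```

    Intermediate nodes can still parse the `DST=` header as usual, while only endpoints holding the key can recover the payload.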

  14. An end-to-end vehicle classification pipeline using vibrometry data

    NASA Astrophysics Data System (ADS)

    Smith, Ashley; Mendoza-Schrock, Olga; Kangas, Scott; Dierking, Matthew; Shaw, Arnab

    2014-06-01

    This paper evaluates and expands upon the existing end-to-end process used for vibrometry target classification and identification. A fundamental challenge in vehicle classification using vibrometry signature data is the determination of robust signal features. The methodology used in this paper involves comparing the performance of features taken from automatic speech recognition, seismology, and structural analysis work. These features provide a means to reduce the dimensionality of the data for the possibility of improved separability. The performances of different groups of features are compared to determine the best feature set for vehicle classification. Standard performance metrics are implemented to provide a method of evaluation. The contributions of this paper are to (1) thoroughly explain the time-domain and frequency-domain features that have recently been applied to vehicle classification using laser-vibrometry data, (2) build an end-to-end classification pipeline for Aided Target Recognition (ATR) with common and easily accessible tools, and (3) apply feature selection methods to the end-to-end pipeline. The end-to-end process used here provides a structured path for accomplishing vibrometry-based target identification. Results are compared with two studies in the public domain, and the techniques described were also used to analyze a small in-house database of several different vehicles.
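    Features of the kind the paper compares, drawn from speech recognition and seismology, are typically simple time- and frequency-domain statistics. A hedged, stdlib-only sketch (the feature choices here are illustrative, not the paper's actual set):

```python
import math

def rms(x):
    # Time-domain energy feature.
    return math.sqrt(sum(v * v for v in x) / len(x))

def zero_crossing_rate(x):
    # Fraction of adjacent sample pairs that change sign.
    return sum(1 for a, b in zip(x, x[1:]) if a * b < 0) / (len(x) - 1)

def dominant_frequency(x, sample_rate):
    """Naive DFT magnitude scan; fine for short illustrative signals."""
    n = len(x)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):
        re = sum(v * math.cos(2 * math.pi * k * i / n) for i, v in enumerate(x))
        im = sum(-v * math.sin(2 * math.pi * k * i / n) for i, v in enumerate(x))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * sample_rate / n

# A 50 Hz vibration tone sampled at 1 kHz for 0.2 s.
signal = [math.sin(2 * math.pi * 50 * t / 1000) for t in range(200)]
features = (rms(signal), zero_crossing_rate(signal), dominant_frequency(signal, 1000))
```

    Stacking such features per signature yields the reduced-dimension vectors fed to a classifier.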

  15. Development and evaluation of an end-to-end test for head and neck IMRT with a novel multiple-dosimetric modality phantom.

    PubMed

    Zakjevskii, Viatcheslav V; Knill, Cory S; Rakowski, Joseph T; Snyder, Michael G

    2016-03-08

    A comprehensive end-to-end test for head and neck IMRT treatments was developed using a custom phantom designed to utilize multiple dosimetry devices. The initial end-to-end test and custom H&N phantom were designed to yield maximum information in anatomical regions significant to H&N plans with respect to: (i) geometric accuracy, (ii) dosimetric accuracy, and (iii) treatment reproducibility. The phantom was designed in collaboration with Integrated Medical Technologies. The phantom was imaged on a CT simulator and the CT was reconstructed with 1 mm slice thickness and imported into Varian's Eclipse treatment planning system. OARs and the PTV were contoured with the aid of Smart Segmentation. A clinical template was used to create an eight-field IMRT plan, and dose was calculated with heterogeneity correction on. Plans were delivered with a TrueBeam equipped with a high definition MLC. Preliminary end-to-end results were measured using film, ion chambers, and optically stimulated luminescent dosimeters (OSLDs). Ion chamber dose measurements were compared to the treatment planning system. Films were analyzed with FilmQA Pro using the composite gamma index. OSLDs were read with a MicroStar reader using a custom calibration curve. The final phantom design incorporated two axial and one coronal film planes with 18 OSLD locations adjacent to those planes, as well as four locations for IMRT ionization chambers below the inferior film plane. The end-to-end test was consistently reproducible, resulting in an average gamma pass rate greater than 99% using 3%/3 mm analysis criteria, and average OSLD and ion chamber measurements within 1% of the planned dose. After initial calibration of the OSLD and film systems, the end-to-end test provides next-day results, allowing for integration into routine clinical QA. Preliminary trials have demonstrated that our end-to-end test is a reproducible QA tool that enables the ongoing evaluation of dosimetric and geometric accuracy of clinical head and neck treatments.
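    The film analysis step rests on the gamma index, which blends dose difference and distance-to-agreement into one pass/fail score per point. A minimal 1-D sketch with the 3%/3 mm criteria from the abstract (clinical tools such as FilmQA Pro work on full 2-D dose planes):

```python
import math

def gamma_pass_rate(planned, measured, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Fraction of measured points whose best gamma over all plan points is <= 1."""
    max_dose = max(planned)
    passed = 0
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dp in enumerate(planned):
            dd = (dm - dp) / (dose_tol * max_dose)   # dose-difference term
            dr = (i - j) * spacing_mm / dist_tol_mm  # distance-to-agreement term
            best = min(best, math.sqrt(dd * dd + dr * dr))
        passed += best <= 1.0
    return passed / len(measured)

plan = [0.0, 0.5, 1.0, 1.0, 0.5, 0.0]    # illustrative relative dose profile
meas = [0.0, 0.52, 1.01, 0.99, 0.5, 0.0]
rate = gamma_pass_rate(plan, meas, spacing_mm=1.0)
```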

  16. End-to-end RMS error testing on a constant bandwidth FM/FM system

    NASA Technical Reports Server (NTRS)

    Wallace, G. R.; Salter, W. E.

    1972-01-01

    End-to-end root-mean-square (rms) tests performed on a constant bandwidth FM/FM system with various settings of system parameters are reported. The testing technique employed is that of sampling, digitizing, delaying, and comparing the analog input against the sampled and digitized corresponding output. Total system error is determined by fully loading all channels with band-limited noise and conducting end-to-end rms error tests on one channel. Tests are also conducted with and without a transmission link and plots of rms errors versus receiver signal-to-noise (S/N) values are obtained. The combined effects of intermodulation, adjacent channel crosstalk, and residual system noise are determined as well as the single channel distortion of the system.
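    The sample-delay-compare technique reduces to aligning the digitized output with a delayed copy of the input and taking the rms of the residual. A toy sketch with an idealized channel (a 2-sample delay plus a small fixed offset standing in for system error):

```python
import math

def rms_error(input_samples, output_samples, delay):
    """RMS of the residual after aligning the output with the delayed input."""
    pairs = list(zip(input_samples, output_samples[delay:]))
    return math.sqrt(sum((a - b) ** 2 for a, b in pairs) / len(pairs))

# Simulated channel: 2-sample transport delay plus a 0.01 offset error.
x = [math.sin(0.1 * i) for i in range(100)]
y = [0.0, 0.0] + [v + 0.01 for v in x]
err = rms_error(x, y, delay=2)
```

    Sweeping receiver S/N in such a simulation would trace out the error-versus-S/N curves reported above.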

  17. Laser welding with an albumin stent: experimental ureteral end-to-end anastomosis

    NASA Astrophysics Data System (ADS)

    Xie, Hua; Shaffer, Brian S.; Prahl, Scott A.; Gregory, Kenton W.

    2000-05-01

    Porcine ureters were anastomosed using an albumin stent and diode laser in vitro. The albumin stent provided precise apposition for an end-to-end anastomosis and enhanced welding strength. The anastomosis seam was lased with an 810 nm diode laser using continuous-wave and pulsed light delivered through a hand-held 600 µm noncontact optical fiber. Tensile strength, burst pressures, operative times, total energy, and thermal damage were measured in this study. The results demonstrated that using an albumin stent to laser weld ureteral anastomoses produces strong welds. The liquid albumin solder also provided satisfactory welding strength. There were no significant differences in tissue thermal damage among the albumin stent alone, liquid solder alone, and combination groups. Thermal damage to tissue depended on laser settings and energy. This study determined the appropriate laser setting parameters to perform in vivo ureteral end-to-end anastomosis.

  18. Sutureless end-to-end ureteral anastomosis using a new albumin stent and diode laser

    NASA Astrophysics Data System (ADS)

    Xie, Hua; Shaffer, Brian S.; Prahl, Scott A.; Gregory, Kenton W.

    1999-09-01

    Sutureless end-to-end ureteral anastomoses were successfully constructed in acute and chronic experiments. A photothermally sensitive hydrolyzable (PSH) albumin stent served as both solder and intraluminal support to appose and position the anastomosed ureter in an end-to-end fashion. The anastomosis seam was lased with 810 nm diode laser energy delivered through a hand-held 600 µm noncontact optical fiber. A continuous wave at 1 W of power was applied for the laser anastomosis. Integrity, welding strength, bursting pressure of the anastomosis, histological reaction, and radiological findings were compared with those of anastomoses constructed using a liquid soldering technique. The acute results of the two methods were equivalent in welding strength, but liquid soldering consumed more energy. In the chronic study, radiological and histological examinations were performed to evaluate complications of the anastomosis. Excellent healing and varying degrees of complications were observed. We conclude that the PSH stent shows great promise for ureteral anastomosis using laser welding.

  19. Providing end-to-end QoS for multimedia applications in 3G wireless networks

    NASA Astrophysics Data System (ADS)

    Guo, Katherine; Rangarajan, Sampath; Siddiqui, M. A.; Paul, Sanjoy

    2003-11-01

    As the usage of wireless packet data services increases, wireless carriers today are faced with the challenge of offering multimedia applications with QoS requirements within current 3G data networks. End-to-end QoS requires support at the application, network, link and medium access control (MAC) layers. We discuss existing CDMA2000 network architecture and show its shortcomings that prevent supporting multiple classes of traffic at the Radio Access Network (RAN). We then propose changes in RAN within the standards framework that enable support for multiple traffic classes. In addition, we discuss how Session Initiation Protocol (SIP) can be augmented with QoS signaling for supporting end-to-end QoS. We also review state of the art scheduling algorithms at the base station and provide possible extensions to these algorithms to support different classes of traffic as well as different classes of users.
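    Base-station support for multiple traffic classes often comes down to a class-aware scheduler. A toy weighted round-robin sketch (the weights and class names are illustrative, not taken from the CDMA2000 standard):

```python
from collections import deque

def schedule(queues, weights, slots):
    """Serve each class up to its weight per round, until `slots` are used."""
    served = []
    names = list(queues)
    while len(served) < slots and any(queues[n] for n in names):
        for name in names:
            for _ in range(weights[name]):
                if queues[name] and len(served) < slots:
                    served.append(queues[name].popleft())
    return served

# Voice gets twice the per-round service of best-effort data.
q = {"voice": deque(["v1", "v2"]), "data": deque(["d1", "d2", "d3"])}
order = schedule(q, {"voice": 2, "data": 1}, slots=5)
```

    Classes with empty queues simply yield their slots each round, so capacity is not wasted.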

  20. End-to-end network models encompassing terrestrial, wireless, and satellite components

    NASA Astrophysics Data System (ADS)

    Boyarko, Chandler L.; Britton, John S.; Flores, Phil E.; Lambert, Charles B.; Pendzick, John M.; Ryan, Christopher M.; Shankman, Gordon L.; Williams, Ramon P.

    2004-08-01

    Development of network models that reflect true end-to-end architectures, such as the Transformational Communications Architecture, needs to encompass terrestrial, wireless, and satellite components to truly represent all of the complexities in a worldwide communications network. Use of best-in-class tools including OPNET, Satellite Tool Kit (STK), Popkin System Architect and their well known XML-friendly definitions, such as OPNET Modeler's Data Type Description (DTD), or socket-based data transfer modules, such as STK/Connect, enables the sharing of data between applications for more rapid development of end-to-end system architectures and a more complete system design. By sharing the results of and integrating best-in-class tools we are able to (1) promote sharing of data, (2) enhance the fidelity of our results and (3) allow network and application performance to be viewed in the context of the entire enterprise and its processes.

  1. Severing and end-to-end annealing of neurofilaments in neurons

    PubMed Central

    Uchida, Atsuko; Çolakoğlu, Gülsen; Wang, Lina; Monsma, Paula C.; Brown, Anthony

    2013-01-01

    We have shown previously that neurofilaments and vimentin filaments expressed in nonneuronal cell lines can lengthen by joining ends in a process known as “end-to-end annealing.” To test if this also occurs for neurofilaments in neurons, we transfected cultured rat cortical neurons with fluorescent neurofilament fusion proteins and then used photoconversion or photoactivation strategies to create distinct populations of red and green fluorescent filaments. Within several hours we observed the appearance of chimeric filaments consisting of alternating red and green segments, which is indicative of end-to-end annealing of red and green filaments. However, the appearance of these chimeric filaments was accompanied by a gradual fragmentation of the red and green filament segments, which is indicative of severing. Over time we observed a progressive increase in the number of red–green junctions along the filaments accompanied by a progressive decrease in the average length of the alternating red and green fluorescent segments that comprised those filaments, suggesting a dynamic cycle of severing and end-to-end-annealing. Time-lapse imaging of the axonal transport of chimeric filaments demonstrated that the red and green segments moved together, confirming that they were indeed part of the same filament. Moreover, in several instances, we also were able to capture annealing and severing events live in time-lapse movies. We propose that the length of intermediate filaments in cells is regulated by the opposing actions of severing and end-to-end annealing, and we speculate that this regulatory mechanism may influence neurofilament transport within axons. PMID:23821747
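    The proposed regulatory cycle can be caricatured with a tiny stochastic simulation in which severing splits a filament at a random point and annealing joins two filaments end to end. This is an illustrative model only, not the authors' analysis; note that total polymer length is conserved while the length distribution evolves.

```python
import random

def step(lengths, p_sever, rng):
    """One event: sever a random filament, otherwise anneal two filaments."""
    if rng.random() < p_sever and lengths:
        i = rng.randrange(len(lengths))
        L = lengths.pop(i)
        cut = rng.uniform(0, L)
        lengths += [cut, L - cut]      # severing: one filament becomes two
    elif len(lengths) >= 2:
        a = lengths.pop(rng.randrange(len(lengths)))
        b = lengths.pop(rng.randrange(len(lengths)))
        lengths.append(a + b)          # end-to-end annealing: two become one
    return lengths

rng = random.Random(1)
filaments = [10.0] * 20
for _ in range(1000):
    step(filaments, p_sever=0.5, rng=rng)
total = sum(filaments)  # conserved across severing/annealing events
```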

  2. The International Space Station Alpha (ISSA) End-to-End On-Orbit Maintenance Process Flow

    NASA Technical Reports Server (NTRS)

    Zingrebe, Kenneth W., II

    1995-01-01

    As a tool for construction and refinement of the on-orbit maintenance system to sustain the International Space Station Alpha (ISSA), the Mission Operations Directorate (MOD) developed an end-to-end on-orbit maintenance process flow. This paper discusses and demonstrates that process flow. This tool is being used by MOD to identify areas that require further work in preparation for MOD's role in the conduct of on-orbit maintenance operations.

  3. End-to-end System Performance Simulation: A Data-Centric Approach

    NASA Astrophysics Data System (ADS)

    Guillaume, Arnaud; Laffitte de Petit, Jean-Luc; Auberger, Xavier

    2013-08-01

    In the early days of the space industry, the feasibility of Earth observation missions was directly driven by what could be achieved by the satellite. It was clear to everyone that the ground segment would be able to deal with the small amount of data sent by the payload. Over the years, the amounts of data processed by spacecraft have increased drastically, putting more and more constraints on ground segment performance, in particular on timeliness. Nowadays, many space systems require high data throughputs and short response times, with information coming from multiple sources and involving complex algorithms. It has become necessary to perform thorough end-to-end analyses of the full system in order to optimise its cost and efficiency, and sometimes even to assess the feasibility of the mission. This paper presents a novel framework developed by Astrium Satellites in order to meet these needs of timeliness evaluation and optimisation. This framework, named ETOS (for “End-to-end Timeliness Optimisation of Space systems”), provides a modelling process with associated tools, models and GUIs. These are integrated thanks to a common data model and suitable adapters, with the aim of building space system simulators of the full end-to-end chain. A big challenge of such an environment is to integrate heterogeneous tools (each one being well adapted to part of the chain) into a relevant timeliness simulation.

  4. CHEETAH: circuit-switched high-speed end-to-end transport architecture

    NASA Astrophysics Data System (ADS)

    Veeraraghavan, Malathi; Zheng, Xuan; Lee, Hyuk; Gardner, M.; Feng, Wuchun

    2003-10-01

    Leveraging the dominance of Ethernet in LANs and SONET/SDH in MANs and WANs, we propose a service called CHEETAH (Circuit-switched High-speed End-to-End Transport ArcHitecture). The service concept is to provide end hosts with high-speed, end-to-end circuit connectivity on a call-by-call shared basis, where a "circuit" consists of Ethernet segments at the ends that are mapped into Ethernet-over-SONET long-distance circuits. This paper focuses on the file-transfer application for such circuits. For this application, the CHEETAH service is proposed as an add-on to the primary Internet access service already in place for enterprise hosts. This allows an end host that is sending a file to first attempt setting up an end-to-end Ethernet/EoS circuit, and if rejected, fall back to the TCP/IP path. If the circuit setup is successful, the end host will enjoy a much shorter file-transfer delay than on the TCP/IP path. To determine the conditions under which an end host with access to the CHEETAH service should attempt circuit setup, we analyze mean file-transfer delays as a function of call blocking probability in the circuit-switched network, probability of packet loss in the IP network, round-trip times, link rates, and so on.
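    The delay trade-off described can be captured in one expected-value formula: attempt circuit setup, and fall back to TCP/IP when the call is blocked. A back-of-envelope sketch (the numbers are illustrative):

```python
def mean_delay(p_block, t_circuit, t_tcp, t_setup):
    """Expected file-transfer delay with circuit attempt plus TCP/IP fallback."""
    return (1 - p_block) * (t_setup + t_circuit) + p_block * (t_setup + t_tcp)

# The circuit wins when blocking is rare and the circuit path is much faster.
with_cheetah = mean_delay(p_block=0.05, t_circuit=1.0, t_tcp=10.0, t_setup=0.1)
tcp_only = 10.0
```

    Comparing `with_cheetah` against `tcp_only` across blocking probabilities gives the kind of threshold condition the paper analyzes.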

  5. Sulfate-Mediated End-to-End Assembly of Gold Nanorods.

    PubMed

    Abtahi, S M H; Burrows, Nathan D; Idesis, Fred A; Murphy, Catherine J; Saleh, Navid B; Vikesland, Peter J

    2017-02-14

    There is interest in the controlled aggregation of gold nanorods (GNRs) for the production of extended nanoassemblies. Prior studies have relied upon chemical modification of the GNR surface to achieve a desired final aggregate structure. Herein we illustrate that control of electrolyte composition can facilitate end-to-end assembly of cetyltrimethylammonium bromide (CTAB)-coated GNRs. By adjusting either the sulfate anion concentration or the exposure time it is possible to connect GNRs in chain-like assemblies. In contrast, end-to-end assembly was not observed in control experiments using monovalent chloride salts. We attribute the end-to-end assembly to the localized association of sulfate with exposed quaternary ammonium head groups of CTAB at the nanorod tip. To quantify the assembly kinetics, visible-near-infrared extinction spectra were collected over a predetermined time period, and the colloidal behavior of the GNR suspensions was interpreted using plasmon band analysis. Transmission electron microscopy and atomic force microscopy results support the conclusions reached via plasmon band analysis, and the colloidal behavior is consistent with Derjaguin-Landau-Verwey-Overbeek theory.

  6. Ocean Acidification Scientific Data Stewardship: An approach for end-to-end data management and integration

    NASA Astrophysics Data System (ADS)

    Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.

    2012-12-01

    As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the

  7. End-to-end modeling: a new modular and flexible approach

    NASA Astrophysics Data System (ADS)

    Genoni, M.; Riva, M.; Landoni, M.; Pariani, G.

    2016-08-01

    In this paper we present an innovative philosophy to develop the End-to-End model for astronomical observation projects, i.e. the architecture which allows physical modeling of the whole system from the light source to the reduced data. This alternative philosophy foresees the development of the physical model of the different modules, which compose the entire End-to-End system, directly during the project design phase. This approach is strongly characterized by modularity and flexibility; these aspects will be of relevant importance in next-generation astronomical observation projects like the E-ELT (European Extremely Large Telescope) because of their high complexity and long design and development times. With this approach it will be possible to keep the whole system and its different modules efficiently under control during every project phase and to exploit a reliable tool at the system engineering level to evaluate the effects on the final performance both of the main parameters and of different instrument architectures and technologies. This philosophy will be important to allow the scientific community to perform simulations and tests on the scientific drivers in advance. This will translate into continuous feedback to the (system) design process, improving the effectively achievable scientific goals, and into a consistent tool for efficiently planning observation proposals and programs. We present the application case for this End-to-End modeling technique, which is the high resolution spectrograph at the E-ELT (E-ELT HIRES). In particular we present the definition of the system modular architecture, describing the interface parameters of the modules.
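
A minimal sketch of the modular idea described above: each module exposes its interface parameters and a transfer function, and the end-to-end model chains them so the whole system can be evaluated (and modules swapped) at design time. The module names, parameters, and throughput values are purely illustrative, not E-ELT HIRES figures.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Module:
    """One physical module of the end-to-end model (e.g. telescope, spectrograph)."""
    name: str
    params: Dict[str, float]            # interface parameters of the module
    transfer: Callable[[float], float]  # toy transfer function on a scalar signal

class EndToEndModel:
    """Chains modules so the whole system can be evaluated during design."""
    def __init__(self):
        self.modules: List[Module] = []

    def add(self, module: Module):
        self.modules.append(module)
        return self

    def propagate(self, source_flux: float) -> float:
        signal = source_flux
        for m in self.modules:
            signal = m.transfer(signal)
        return signal

# Hypothetical throughput chain: telescope -> fibre link -> spectrograph -> detector
model = (EndToEndModel()
         .add(Module("telescope",    {"throughput": 0.90}, lambda s: s * 0.90))
         .add(Module("fibre",        {"throughput": 0.80}, lambda s: s * 0.80))
         .add(Module("spectrograph", {"throughput": 0.55}, lambda s: s * 0.55))
         .add(Module("detector",     {"qe": 0.92},         lambda s: s * 0.92)))

print(model.propagate(1.0))  # end-to-end throughput of the full chain
```

Because each stage is a self-contained object, replacing one instrument architecture with another is a one-line change, which is the flexibility the abstract argues for.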

  8. Satellite/Terrestrial Networks: End-to-End Communication Interoperability Quality of Service Experiments

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    1998-01-01

    Various issues associated with satellite/terrestrial end-to-end communication interoperability are presented in viewgraph form. Specific topics include: 1) Quality of service; 2) ATM performance characteristics; 3) MPEG-2 transport stream mapping to AAL-5; 4) Observation and discussion of compressed video tests over ATM; 5) Digital video over satellites status; 6) Satellite link configurations; 7) MPEG-2 over ATM with binomial errors; 8) MPEG-2 over ATM channel characteristics; 8) MPEG-2 over ATM over emulated satellites; 9) MPEG-2 transport stream with errors; and a 10) Dual decoder test.

  9. The first sutureless, laser-welded, end-to-end bowel anastomosis.

    PubMed

    Sauer, J S; Hinshaw, J R; McGuire, K P

    1989-01-01

    The use of laser energy to weld together tissue offers great promise in the expanding field of laser surgery. The published results of laser welding intestinal tissue have, to date, been limited to the successful laser closures of small enterotomies. This is the first report of using laser energy alone to create an end-to-end small bowel anastomosis. A biocompatible, water-soluble, intraluminal stent was employed during the laser welding of this sutureless, stapleless ileal anastomosis in a rabbit model. Excellent recovery and healing were observed. The rapidity, ease, and potential for full precision automation of laser welding mandates further research.

  10. Screening California Current fishery management scenarios using the Atlantis end-to-end ecosystem model

    NASA Astrophysics Data System (ADS)

    Kaplan, Isaac C.; Horne, Peter J.; Levin, Phillip S.

    2012-09-01

    End-to-end marine ecosystem models link climate and oceanography to the food web and human activities. These models can be used as forecasting tools, to strategically evaluate management options and to support ecosystem-based management. Here we report the results of such forecasts in the California Current, using an Atlantis end-to-end model. We worked collaboratively with fishery managers at NOAA’s regional offices and staff at the National Marine Sanctuaries (NMS) to explore the impact of fishery policies on management objectives at different spatial scales, from single Marine Sanctuaries to the entire Northern California Current. In addition to examining Status Quo management, we explored the consequences of several gear switching and spatial management scenarios. Of the scenarios that involved large scale management changes, no single scenario maximized all performance metrics. Any policy choice would involve trade-offs between stakeholder groups and policy goals. For example, a coast-wide 25% gear shift from trawl to pot or longline appeared to be one possible compromise between an increase in spatial management (which sacrificed revenue) and scenarios such as the one consolidating bottom impacts to deeper areas (which did not perform substantially differently from Status Quo). Judged on a coast-wide scale, most of the scenarios that involved minor or local management changes (e.g. within Monterey Bay NMS only) yielded results similar to Status Quo. When impacts did occur in these cases, they often involved local interactions that were difficult to predict a priori based solely on fishing patterns. However, judged on the local scale, deviation from Status Quo did emerge, particularly for metrics related to stationary species or variables (i.e. habitat and local metrics of landed value or bycatch). We also found that isolated management actions within Monterey Bay NMS would cause local fishers to pay a cost for conservation, in terms of reductions in landed

  11. Deriving comprehensive error breakdown for wide field adaptive optics systems using end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Ferreira, F.; Gendron, E.; Rousset, G.; Gratadour, D.

    2016-07-01

    The future European Extremely Large Telescope (E-ELT) adaptive optics (AO) systems will aim at wide field correction and large sky coverage. Their performance will be improved by using post processing techniques, such as point spread function (PSF) deconvolution. The PSF estimation involves characterization of the different error sources in the AO system. Such error contributors are difficult to estimate: simulation tools are a good way to do that. We have developed, within COMPASS (COMputing Platform for Adaptive opticS Systems), an end-to-end simulation tool with GPU (Graphics Processing Unit) acceleration, an estimation tool that provides a comprehensive error budget from the outputs of a single simulation run.
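
An AO error budget of the kind the abstract describes is conventionally assembled by adding statistically independent residual terms in quadrature and converting the total to a Strehl ratio via the extended Marechal approximation. The contributor names and nanometre values below are illustrative placeholders, not COMPASS outputs.

```python
import math

# Hypothetical AO error contributors (nm RMS wavefront error); values illustrative only.
contributors = {
    "fitting": 90.0,
    "temporal (servo-lag)": 60.0,
    "aliasing": 40.0,
    "noise": 50.0,
    "anisoplanatism": 70.0,
}

def total_wfe(terms):
    """Independent error terms add in quadrature."""
    return math.sqrt(sum(v ** 2 for v in terms.values()))

def strehl(wfe_nm, wavelength_nm=1650.0):
    """Extended Marechal approximation: S = exp(-(2*pi*sigma/lambda)^2)."""
    phase_rms = 2.0 * math.pi * wfe_nm / wavelength_nm
    return math.exp(-phase_rms ** 2)

wfe = total_wfe(contributors)
print(f"total WFE = {wfe:.1f} nm RMS, Strehl(H) = {strehl(wfe):.2f}")
```

The value of a per-contributor breakdown is exactly that it feeds this kind of roll-up: one simulation run yields every term, and the quadrature sum predicts the delivered PSF quality.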

  12. LWIR scene simulator developed for end-to-end performance evaluation of focal planes

    NASA Technical Reports Server (NTRS)

    Thompson, Niels A.; Bowser, William M.; Song, Sung H.; Skiff, Laura T.; Powell, William W.; Romero, Charles

    1992-01-01

    The development of a long-wave infrared optical simulator facilitates evaluation of the end-to-end performance of long wavelength infrared (LWIR) focal plane arrays (FPAs) in a system-like environment. This simulator provides selectable structured scene inputs to a focal plane module or array. Background irradiances as low as 10^10 photons/cm^2/s are achievable when the simulator is cooled with liquid helium. The optical simulator can generate single or multiple targets of controllable intensities, and uniform or structured background irradiances. The infrared scenes can be viewed in a stationary mode or dynamically scanned across the focal plane.

  13. End-to-end science from major facilities: does the VO have a role?

    NASA Astrophysics Data System (ADS)

    Gilmore, Gerard F.

    2007-08-01

    The Virtual Observatory provides a natural solution to the existence problem in communications: how can one ask a question of another unless one knows the other exists? Many would say that e-mail from apparent strangers, e-blogs, etc. suggest there is no shortage of possible solutions. In that context, is the Virtual Observatory in fact the necessary and desirable part of the solution? Specifically, does the VO necessarily play a critical role in delivering end-to-end facility science, from ideas, through proposals, resources/facilities, to distributed, reviewed knowledge? If not, what else needs to be added?

  14. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end to end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  15. An End-To-End Test of A Simulated Nuclear Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    VanDyke, Melissa; Hrbud, Ivana; Goddfellow, Keith; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    The Safe Affordable Fission Engine (SAFE) test series addresses Phase I Space Fission Systems issues, in particular non-nuclear testing and system integration issues, leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shut-downs), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, a conversion system, and a thruster, in which the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  16. An End-To-End Near-Earth Asteroid Resource Exploitation Plan

    NASA Astrophysics Data System (ADS)

    Reed, K. L.

    2000-01-01

    The possible end result of the utilization of raw materials garnered from near-Earth asteroids (NEAs) has been well documented, if often a bit fancifully. Very few have put forward an end-to-end plan, from prospecting to mine closure, for any specific asteroid or for any particular asteroid resource. There are many aspects of planning for the mining of raw materials from asteroids that have never been encountered in terrestrial resource exploitation, due to the dispersed nature of the asteroids. As an example from petroleum exploration, if a dry hole is drilled in a large geologic setting indicative of petroleum deposits, one need only pack the drill rig up and move it to a new spot. In asteroid exploitation, the problem of "moving to a new spot" is complicated, as the "new spot" is moving constantly and may be many millions of kilometers distant, at great cost in time and rocket fuel. This paper will outline a relatively low-risk, probably high-return, end-to-end plan for the exploitation and utilization of asteroid raw materials. It will attempt to address all aspects of exploration and mining, from prospecting, exploration, and evaluation of possible resources to initialization, industrialization, and closure of the mine. It will plan for the acquisition not just of the needed scientific knowledge, but also of the engineering and geotechnical knowledge needed for effective mining of a small planetary object.

  17. Micro-Precision Interferometer Testbed: end-to-end system integration of control structure interaction technologies

    NASA Astrophysics Data System (ADS)

    Neat, Gregory W.; Sword, Lee F.; Hines, Braden E.; Calvet, Robert J.

    1993-09-01

    This paper describes the overall design and planned phased delivery of the ground-based Micro-Precision Interferometer (MPI) Testbed. The testbed is a half scale replica of a future space-based interferometer containing all the spacecraft subsystems necessary to perform an astrometric measurement. Appropriately sized reaction wheels will regulate the testbed attitude as well as provide a flight-like disturbance source. The optical system will consist of two complete Michelson interferometers. Successful interferometric measurements require controlling the positional stabilities of these optical elements to the nanometer level. The primary objective of the testbed is to perform a system integration of Control Structure Interaction (CSI) technologies necessary to demonstrate the end-to-end operation of a space-based interferometer, ultimately proving to flight mission planners that the necessary control technology exists to meet the challenging requirements of future space-based interferometry missions. These technologies form a multi-layered vibration attenuation architecture to achieve the necessary quiet environment. This three-layered methodology blends disturbance isolation, structural quieting and active optical control techniques. The paper describes all the testbed subsystems in this end-to-end ground-based system as well as the present capabilities of the evolving testbed.

  18. End-to-End Rate-Distortion Optimized MD Mode Selection for Multiple Description Video Coding

    NASA Astrophysics Data System (ADS)

    Heng, Brian A.; Apostolopoulos, John G.; Lim, Jae S.

    2006-12-01

    Multiple description (MD) video coding can be used to reduce the detrimental effects caused by transmission over lossy packet networks. A number of approaches have been proposed for MD coding, where each provides a different tradeoff between compression efficiency and error resilience. How effectively each method achieves this tradeoff depends on the network conditions as well as on the characteristics of the video itself. This paper proposes an adaptive MD coding approach which adapts to these conditions through the use of adaptive MD mode selection. The encoder in this system is able to accurately estimate the expected end-to-end distortion, accounting for both compression and packet loss-induced distortions, as well as for the bursty nature of channel losses and the effective use of multiple transmission paths. With this model of the expected end-to-end distortion, the encoder selects between MD coding modes in a rate-distortion (R-D) optimized manner to most effectively trade off compression efficiency for error resilience. We show how this approach adapts to both the local characteristics of the video and network conditions and demonstrate the resulting gains in performance using an H.264-based adaptive MD video coder.
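
The selection step the abstract describes can be illustrated with a toy model: assume (hypothetically) two MD modes with known rates and distortions, descriptions sent on independent paths with i.i.d. loss probability p, and pick the mode minimizing the Lagrangian J = E[D] + lambda * R. The numbers below are invented for illustration, not from the paper's coder.

```python
def expected_distortion(d_both, d_one, p):
    """Expected end-to-end distortion with loss probability p per description.
    Mixes the 'both arrive', 'one arrives', and 'none arrive' cases."""
    p_both = (1 - p) ** 2
    p_one = 2 * p * (1 - p)
    p_none = p ** 2
    d_none = 1.0  # normalized distortion when everything is lost
    return p_both * d_both + p_one * d_one + p_none * d_none

def select_mode(modes, p, lam):
    """Pick the MD mode minimizing the Lagrangian J = E[D] + lambda * R."""
    return min(modes,
               key=lambda m: expected_distortion(m["d_both"], m["d_one"], p)
                             + lam * m["rate"])

modes = [
    {"name": "high-redundancy", "rate": 1.4, "d_both": 0.05, "d_one": 0.10},
    {"name": "low-redundancy",  "rate": 1.0, "d_both": 0.04, "d_one": 0.40},
]

for p in (0.01, 0.20):
    print(p, select_mode(modes, p, lam=0.1)["name"])
```

At low loss the cheaper, less redundant mode wins; at high loss the extra rate spent on redundancy pays for itself in expected distortion, which is exactly the adaptivity the paper exploits (its actual model additionally handles burst losses and path correlation).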

  19. A real-time 3D end-to-end augmented reality system (and its representation transformations)

    NASA Astrophysics Data System (ADS)

    Tytgat, Donny; Aerts, Maarten; De Busser, Jeroen; Lievens, Sammy; Rondao Alface, Patrice; Macq, Jean-Francois

    2016-09-01

    The new generation of HMDs coming to the market is expected to enable many new applications that allow free viewpoint experiences with captured video objects. Current applications usually rely on 3D content that is manually created or captured in an offline manner. In contrast, this paper focuses on augmented reality applications that use live captured 3D objects while maintaining free viewpoint interaction. We present a system that allows live dynamic 3D objects (e.g. a person who is talking) to be captured in real-time. Real-time performance is achieved by traversing a number of representation formats and exploiting their specific benefits. For instance, depth images are maintained for fast neighborhood retrieval and occlusion determination, while implicit surfaces are used to facilitate multi-source aggregation for both geometry and texture. The result is a 3D reconstruction system that outputs multi-textured triangle meshes at real-time rates. An end-to-end system is presented that captures and reconstructs live 3D data and allows for this data to be used on a networked (AR) device. For allocating the different functional blocks onto the available physical devices, a number of alternatives are proposed considering the available computational power and bandwidth for each of the components. As we will show, the representation format can play an important role in this functional allocation and allows for a flexible system that can support a highly heterogeneous infrastructure.

  20. Cyberinfrastructure to support Real-time, End-to-End, High Resolution, Localized Forecasting

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.; Lindholm, D.; Baltzer, T.; Domenico, B.

    2004-12-01

    From natural disasters such as flooding and forest fires to man-made disasters such as toxic gas releases, the impact of weather-influenced severe events on society can be profound. Understanding, predicting, and mitigating such local, mesoscale events calls for a cyberinfrastructure to integrate multidisciplinary data, tools, and services as well as the capability to generate and use high resolution data (such as wind and precipitation) from localized models. The need for such end to end systems -- including data collection, distribution, integration, assimilation, regionalized mesoscale modeling, analysis, and visualization -- has been realized to some extent in many academic and quasi-operational environments, especially for atmospheric sciences data. However, many challenges still remain in the integration and synthesis of data from multiple sources and the development of interoperable data systems and services across those disciplines. Over the years, the Unidata Program Center has developed several tools that have either directly or indirectly facilitated these local modeling activities. For example, the community is using Unidata technologies such as the Internet Data Distribution (IDD) system, Local Data Manager (LDM), decoders, netCDF libraries, Thematic Realtime Environmental Distributed Data Services (THREDDS), and the Integrated Data Viewer (IDV) in their real-time prediction efforts. In essence, these technologies for data reception and processing, local and remote access, cataloging, and analysis and visualization coupled with technologies from others in the community are becoming the foundation of a cyberinfrastructure to support an end-to-end regional forecasting system. To build on these capabilities, the Unidata Program Center is pleased to be a significant contributor to the Linked Environments for Atmospheric Discovery (LEAD) project, a NSF-funded multi-institutional large Information Technology Research effort. 
The goal of LEAD is to create an
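
The "end-to-end" chain the abstract enumerates (collection, decoding, assimilation, modeling, visualization) is, structurally, a composition of stages. The sketch below shows that pattern in miniature; the stage names and numerical operations are illustrative stand-ins, not actual Unidata or LEAD APIs.

```python
from functools import reduce

# Toy forecast pipeline stages (names illustrative, not real Unidata/LEAD APIs).
def ingest(obs):       # data collection, e.g. from an IDD/LDM-style feed
    return {"raw": obs}

def decode(data):      # decode raw reports into usable numeric fields
    data["decoded"] = [o * 0.1 for o in data["raw"]]
    return data

def assimilate(data):  # blend observations into a single analysis value
    data["analysis"] = sum(data["decoded"]) / len(data["decoded"])
    return data

def forecast(data):    # run the (toy) regional model forward one step
    data["forecast"] = data["analysis"] * 1.05
    return data

def run_pipeline(obs, stages):
    """Thread the data dictionary through each stage in order."""
    return reduce(lambda d, stage: stage(d), stages, obs)

result = run_pipeline([280, 282, 284], [ingest, decode, assimilate, forecast])
print(result["forecast"])
```

Keeping each stage independent and composable is what lets real systems swap a local model or a visualization client in and out, which is the interoperability goal the abstract describes.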

  1. End-to-End Beam Simulations for the New Muon G-2 Experiment at Fermilab

    SciTech Connect

    Korostelev, Maxim; Bailey, Ian; Herrod, Alexander; Morgan, James; Morse, William; Stratakis, Diktys; Tishchenko, Vladimir; Wolski, Andrzej

    2016-06-01

    The aim of the new muon g-2 experiment at Fermilab is to measure the anomalous magnetic moment of the muon with an unprecedented uncertainty of 140 ppb. A beam of positive muons required for the experiment is created by pion decay. Detailed studies of the beam dynamics and spin polarization of the muons are important to predict systematic uncertainties in the experiment. In this paper, we present the results of beam simulations and spin tracking from the pion production target to the muon storage ring. The end-to-end beam simulations are developed in Bmad and include the processes of particle decay, collimation (with accurate representation of all apertures) and spin tracking.
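
As a sanity check on the physics being simulated, the anomalous spin precession frequency that the experiment measures can be estimated from first principles: at the "magic" momentum the electric-field term vanishes and omega_a = (e/m) * a_mu * B. The sketch below uses the nominal 1.45 T storage-ring field; it is a back-of-envelope estimate, not part of the Bmad simulation.

```python
import math

# Physical constants (SI); a_mu is the approximate muon anomalous magnetic moment.
E_CHARGE = 1.602176634e-19   # elementary charge, C
M_MU = 1.883531627e-28       # muon mass, kg
A_MU = 1.16592e-3            # (g-2)/2 for the muon, approximate
B = 1.45                     # storage-ring dipole field, T

# Anomalous precession: spin rotates relative to momentum at omega_a.
omega_a = (E_CHARGE / M_MU) * A_MU * B      # rad/s
period_us = 2 * math.pi / omega_a * 1e6     # precession period in microseconds

print(f"omega_a = {omega_a:.3e} rad/s, period = {period_us:.2f} us")
```

The resulting period of roughly 4.4 microseconds is the characteristic "wiggle" timescale in the decay-positron counts; reaching 140 ppb on a_mu requires tracking exactly the systematic effects (beam dynamics, spin-orbit coupling) that the end-to-end Bmad simulation models.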

  2. The Kepler End-to-End Data Pipeline: From Photons to Far Away Worlds

    NASA Technical Reports Server (NTRS)

    Cooke, Brian; Thompson, Richard; Standley, Shaun

    2012-01-01

    Launched by NASA on 6 March 2009, the Kepler Mission has been observing more than 100,000 targets in a single patch of sky between the constellations Cygnus and Lyra almost continuously for the last two years looking for planetary systems using the transit method. As of October 2011, the Kepler spacecraft has collected and returned to Earth just over 290 GB of data, identifying 1235 planet candidates with 25 of these candidates confirmed as planets via ground observation. Extracting the telltale signature of a planetary system from stellar photometry, where valid signal transients can be as small as 40 ppm, is a difficult and exacting task. The end-to-end processing of determining planetary candidates from noisy, raw photometric measurements is discussed.
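
To see why a tens-of-ppm transit is recoverable at all from photometry whose per-point scatter is much larger, it helps to note that phase-folding many transits beats the noise down by roughly the square root of the number of in-transit points. The toy simulation below illustrates this; the cadence, depth, and noise numbers are illustrative, not Kepler's actual pipeline parameters.

```python
import random
import statistics

# Simulate a light curve with a periodic box transit buried in noise.
random.seed(1)
period, duration = 500, 10          # cadences per orbit, cadences in transit
depth = 200e-6                      # fractional transit depth (200 ppm)
noise = 1000e-6                     # per-point photometric scatter (1000 ppm)

flux = []
for t in range(50000):
    in_transit = (t % period) < duration
    flux.append(1.0 - (depth if in_transit else 0.0) + random.gauss(0.0, noise))

# Phase-fold: compare mean flux in and out of transit.
in_t = [f for t, f in enumerate(flux) if (t % period) < duration]
out_t = [f for t, f in enumerate(flux) if (t % period) >= duration]
measured_depth = statistics.mean(out_t) - statistics.mean(in_t)
snr = measured_depth / (noise / len(in_t) ** 0.5)

print(f"measured depth = {measured_depth * 1e6:.0f} ppm, SNR = {snr:.1f}")
```

With 100 folded transits of 10 points each, the uncertainty on the in-transit mean drops to about 32 ppm, so a 200 ppm signal stands out clearly even though any single point is dominated by noise.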

  3. End-to-end system test for solid-state microdosemeters.

    PubMed

    Pisacane, V L; Dolecek, Q E; Malak, H; Dicello, J F

    2010-08-01

    The gold standard in microdosemeters has been the tissue equivalent proportional counter (TEPC) that utilises a gas cavity. An alternative is the solid-state microdosemeter that replaces the gas with a condensed phase (silicon) detector with microscopic sensitive volumes. Calibrations of gas and solid-state microdosemeters are generally carried out using radiation sources built into the detector that impose restrictions on their handling, transportation and licensing in accordance with the regulations from international, national and local nuclear regulatory bodies. Here a novel method is presented for carrying out a calibration and end-to-end system test of a microdosemeter using low-energy photons as the initiating energy source, thus obviating the need for a regulated ionising radiation source. This technique may be utilised to calibrate both a solid-state microdosemeter and, with modification, a TEPC with the higher average ionisation energy of a gas.
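
The quantity a microdosemeter of either kind ultimately reports is lineal energy, y = eps / l_bar, where eps is the energy deposited in a single event and l_bar is the mean chord length of the sensitive volume (for any convex body, Cauchy's formula gives l_bar = 4V/S). A minimal sketch, with illustrative values only:

```python
import math

def mean_chord_sphere(d_um):
    """Cauchy mean chord length 4V/S for a sphere of diameter d; reduces to (2/3) d."""
    r = d_um / 2.0
    volume = (4.0 / 3.0) * math.pi * r ** 3
    surface = 4.0 * math.pi * r ** 2
    return 4.0 * volume / surface

def lineal_energy(eps_keV, l_bar_um):
    """Lineal energy y in keV/um from a single-event deposit eps."""
    return eps_keV / l_bar_um

l_bar = mean_chord_sphere(1.0)      # 1-um tissue-equivalent sphere (illustrative)
y = lineal_energy(2.0, l_bar)      # a 2 keV deposit in that volume
print(f"l_bar = {l_bar:.3f} um, y = {y:.1f} keV/um")
```

Whether the sensitive volume is a gas cavity (TEPC) or a silicon microvolume, calibration amounts to establishing the mapping from detector signal to eps, which is what the photon-based end-to-end test in the abstract provides without a regulated radioactive source.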

  4. End-to-end assessment of a large aperture segmented ultraviolet optical infrared (UVOIR) telescope architecture

    NASA Astrophysics Data System (ADS)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-07-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield exo-earth spectroscopic detection. Our team has collectively assessed an optimized end to end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an exo-earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and exo-earth yield calculations to assess the potential performance of the selected architecture. In addition, we discusses the scalability of this architecture to larger apertures and the technological tall poles to enabling these missions.

  5. Enhancing End-to-End Performance of Information Services Over Ka-Band Global Satellite Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul B.; Glover, Daniel R.; Ivancic, William D.; vonDeak, Thomas C.

    1997-01-01

    The Internet has been growing at a rapid rate as the key medium to provide information services such as e-mail, the WWW, and multimedia; however, its global reach is limited. Ka-band communication satellite networks are being developed to increase the accessibility of information services via the Internet at global scale. There is a need to assess satellite networks in their ability to provide these services and interconnect seamlessly with existing and proposed terrestrial telecommunication networks. In this paper the significant issues and requirements in providing end-to-end high performance for the delivery of information services over satellite networks, based on the various layers in the OSI reference model, are identified. Key experiments have been performed to evaluate the performance of digital video and Internet traffic over satellite-like testbeds. The results of the early developments in ATM and TCP protocols over satellite networks are summarized.
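
A quick calculation shows why end-to-end TCP performance over GEO satellites was a central concern in this era: throughput with a fixed window is capped at window/RTT, and a geostationary round trip is on the order of 550 ms. The link rate and RTT below are illustrative.

```python
def bdp_bytes(link_bps, rtt_s):
    """Bandwidth-delay product: bytes that must be in flight to fill the pipe."""
    return link_bps / 8.0 * rtt_s

def window_limited_throughput_bps(window_bytes, rtt_s):
    """Maximum TCP throughput with a fixed window: window / RTT."""
    return window_bytes * 8.0 / rtt_s

rtt = 0.55          # illustrative GEO round-trip time, seconds
link = 45e6         # illustrative 45 Mbps satellite link

print(f"BDP = {bdp_bytes(link, rtt) / 1e6:.2f} MB")
print(f"classic 64 KB window -> "
      f"{window_limited_throughput_bps(65535, rtt) / 1e6:.2f} Mbps")
```

A classic 64 KB TCP window delivers under 1 Mbps over such a link regardless of the link's actual capacity, which is why window scaling and related TCP extensions figured prominently in satellite networking experiments of this kind.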

  6. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Oliver; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end to end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and Exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.
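
The 10^-10 contrast requirement quoted in the abstract comes from the reflected-light flux ratio of an Earth twin, which is roughly albedo/4 * (R_p / a)^2 for a Lambertian planet (order-of-magnitude only; a proper calculation includes the phase function). A quick check:

```python
# Order-of-magnitude planet-to-star flux ratio for an Earth twin at 1 AU.
R_EARTH = 6.371e6    # Earth radius, m
AU = 1.496e11        # astronomical unit, m
ALBEDO = 0.3         # approximate Earth Bond albedo

contrast = ALBEDO / 4.0 * (R_EARTH / AU) ** 2
print(f"Earth-twin contrast ~ {contrast:.1e}")
```

The result lands at roughly 1e-10, which is why both the coronagraph design and the telescope's picometre-level wavefront stability budget are driven to that level.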

  7. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGES

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; ...

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform's capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Taken together, these results demonstrate the microfluidic platform's potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  8. End-to-end interoperability and workflows from building architecture design to one or more simulations

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
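
The pattern the patent abstract describes, a data model of entities and relationships realized as a table schema through a data definition language, can be sketched concretely. The entity and column names below are hypothetical illustrations, not the patent's actual schema.

```python
import sqlite3

# In-memory database standing in for the BIM platform's data store.
conn = sqlite3.connect(":memory:")

# Data definition language: create the table schema for two related entities.
conn.executescript("""
CREATE TABLE building (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE simulation (
    id          INTEGER PRIMARY KEY,
    building_id INTEGER NOT NULL REFERENCES building(id),
    kind        TEXT NOT NULL          -- e.g. 'energy', 'airflow'
);
""")

# Data management services would wrap inserts/queries like these.
conn.execute("INSERT INTO building (id, name) VALUES (1, 'HQ Tower')")
conn.execute("INSERT INTO simulation (building_id, kind) VALUES (1, 'energy')")

rows = conn.execute("""
    SELECT b.name, s.kind
    FROM simulation s JOIN building b ON b.id = s.building_id
""").fetchall()
print(rows)
```

The entity relationship (a simulation belongs to a building) lives in the schema itself, so every layer above it, data management services, APIs, web services, and the user interface, can rely on the same structure.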

  9. Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Hines, Glenn D.; Paschall, Stephen, II; Cohanim, Babak E.; Fill, Thomas; Johnson, Michael C.; Bishop, Robert H.; DeMars, Kyle J.; Sostaric, Ronald r.; Johnson, Andrew E.

    2008-01-01

    Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining design and integration capability and system performance of the lunar descent and landing system and environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. The POST2 simulation provides a six degree-of-freedom capability necessary to test, design and operate a descent and landing system for successful lunar landing. This paper presents advances in the development and model-implementation of the POST2 simulation, as well as preliminary system performance analysis, used for the testing and evaluation of ALHAT project system models.
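
To give a feel for the kind of trajectory propagation POST2 performs (in full six degrees of freedom, with detailed environment models), here is a deliberately minimal one-dimensional lunar descent integrator. All numbers and the constant-thrust guidance are illustrative, not ALHAT values.

```python
G_MOON = 1.62    # lunar surface gravity, m/s^2
DT = 0.1         # integration step, s

def descend(alt, vel, thrust_acc):
    """Euler-integrate altitude/velocity under constant braking thrust
    until touchdown; vel < 0 means descending."""
    t = 0.0
    while alt > 0.0:
        acc = thrust_acc - G_MOON   # thrust up, gravity down
        vel += acc * DT
        alt += vel * DT
        t += DT
    return t, vel

t, v_td = descend(alt=100.0, vel=-10.0, thrust_acc=2.0)
print(f"touchdown after {t:.1f} s at {v_td:.2f} m/s")
```

Even this toy shows the design questions a descent simulation answers: whether a given thrust profile arrests the descent before the surface, and at what touchdown velocity; the real simulation adds attitude dynamics, sensor models, and hazard-avoidance guidance on top.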

  10. End-to-end commissioning demonstration of the James Webb Space Telescope

    NASA Astrophysics Data System (ADS)

    Acton, D. Scott; Towell, Timothy; Schwenker, John; Shields, Duncan; Sabatke, Erin; Contos, Adam R.; Hansen, Karl; Shi, Fang; Dean, Bruce; Smith, Scott

    2007-09-01

    The one-meter Testbed Telescope (TBT) has been developed at Ball Aerospace to facilitate the design and implementation of the wavefront sensing and control (WFSC) capabilities of the James Webb Space Telescope (JWST). We have recently conducted an "end-to-end" demonstration of the flight commissioning process on the TBT. This demonstration started with the Primary Mirror (PM) segments and the Secondary Mirror (SM) in random positions, traceable to the worst-case flight deployment conditions. The commissioning process detected and corrected the deployment errors, resulting in diffraction-limited performance across the entire science FOV. This paper will describe the commissioning demonstration and the WFSC algorithms used at each step in the process.

  11. An end-to-end secure patient information access card system.

    PubMed

    Alkhateeb, A; Singer, H; Yakami, M; Takahashi, T

    2000-03-01

    The rapid development of the Internet and the increasing interest in Internet-based solutions has promoted the idea of creating Internet-based health information applications. This will force a change in the role of IC cards in healthcare card systems from a data carrier to an access key medium. At the Medical Informatics Department of Kyoto University Hospital we are developing a smart card patient information project where patient databases are accessed via the Internet. Strong end-to-end encryption of transmitted patient information is performed transparently via Secure Sockets Layer. The smart card plays the crucial role of access key to the database: user authentication is performed internally without ever revealing the actual key. For easy acceptance by healthcare professionals, the user interface is integrated as a plug-in for two familiar Web browsers, Netscape Navigator and MS Internet Explorer.

  12. End-to-end simulations of the E-ELT/METIS coronagraphs

    NASA Astrophysics Data System (ADS)

    Carlomagno, Brunella; Absil, Olivier; Kenworthy, Matthew; Ruane, Garreth; Keller, Christoph U.; Otten, Gilles; Feldt, Markus; Hippler, Stefan; Huby, Elsa; Mawet, Dimitri; Delacroix, Christian; Surdej, Jean; Habraken, Serge; Forsberg, Pontus; Karlsson, Mikael; Vargas Catalan, Ernesto; Brandl, Bernhard R.

    2016-07-01

    The direct detection of low-mass planets in the habitable zone of nearby stars is an important science case for future E-ELT instruments such as the mid-infrared imager and spectrograph METIS, which features vortex phase masks and apodizing phase plates (APP) in its baseline design. In this work, we present end-to-end performance simulations, using Fourier propagation, of several METIS coronagraphic modes, including focal-plane vortex phase masks and pupil-plane apodizing phase plates, for the centrally obscured, segmented E-ELT pupil. The atmosphere and the AO contributions are taken into account. Hybrid coronagraphs combining the advantages of vortex phase masks and APPs are considered to improve the METIS coronagraphic performance.

  13. End-to-end performance measurement of Internet based medical applications.

    PubMed Central

    Dev, P.; Harris, D.; Gutierrez, D.; Shah, A.; Senger, S.

    2002-01-01

    We present a method to obtain an end-to-end characterization of the performance of an application over a network. This method is not dependent on any specific application or type of network. The method requires characterization of network parameters, such as latency and packet loss, between the expected server or client endpoints, as well as characterization of the application's constraints on these parameters. A subjective metric is presented that integrates these characterizations and that operates over a wide range of applications and networks. We believe that this method may be of wide applicability as research and educational applications increasingly make use of computation and data servers that are distributed over the Internet. PMID:12463816

  14. End-to-End Network Simulation Using a Site-Specific Radio Wave Propagation Model

    SciTech Connect

    Djouadi, Seddik M; Kuruganti, Phani Teja; Nutaro, James J

    2013-01-01

    The performance of systems that rely on a wireless network depends on the propagation environment in which that network operates. To predict how these systems and their supporting networks will perform, simulations must take into consideration the propagation environment and how it affects the performance of the wireless network. Network simulators typically use empirical models of the propagation environment. However, these models are not intended, and cannot be used, to predict how a wireless system will perform in a specific location, e.g., in the center of a particular city or the interior of a specific manufacturing facility. In this paper, we demonstrate how a site-specific propagation model and the NS3 simulator can be used to predict the end-to-end performance of a wireless network.
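    As a toy illustration of why the choice of propagation model matters, the sketch below contrasts the Friis free-space model with the empirical log-distance model that generic network simulators typically assume; a site-specific model effectively replaces the fitted exponent n with measured or ray-traced data. The function names and the exponent value are illustrative, not taken from the paper.

```python
import math

def free_space_path_loss_db(d_m, f_hz):
    """Friis free-space path loss in dB at distance d_m (m), frequency f_hz (Hz)."""
    c = 3.0e8  # speed of light, m/s
    return (20 * math.log10(d_m) + 20 * math.log10(f_hz)
            + 20 * math.log10(4 * math.pi / c))

def log_distance_path_loss_db(d_m, f_hz, d0_m=1.0, n=3.0):
    """Empirical log-distance model: free-space loss at a reference distance
    d0_m plus 10*n*log10(d/d0). The path-loss exponent n is fitted per site;
    n = 2 recovers free space, so the gap below grows with distance."""
    return free_space_path_loss_db(d0_m, f_hz) + 10 * n * math.log10(d_m / d0_m)
```

At 100 m and 2.4 GHz, an indoor-like exponent of n = 3 already predicts 20 dB more loss than free space, which is the kind of discrepancy a site-specific model is meant to resolve.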

  15. End-to-end test of the electron-proton spectrometer

    NASA Technical Reports Server (NTRS)

    Cash, B. L.

    1972-01-01

    A series of end-to-end tests were performed to demonstrate the proper functioning of the complete Electron-Proton Spectrometer (EPS). The purpose of the tests was to provide experimental verification of the design and to provide a complete functional performance check of the instrument from the excitation of the sensors to and including the data processor and equipment test set. Each of the channels of the EPS was exposed to a calibrated beam of energetic particles, and counts were accumulated for a predetermined period of time for each of several energies. The counts were related to the known flux of particles to give a monodirectional response function for each channel. The measured response function of the test unit was compared to the response function determined for the calibration sensors from the data taken from the calibration program.
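    The per-channel calibration described above reduces to a simple ratio of accumulated counts to the known beam exposure. A hedged sketch (the function name and units are assumptions, not from the report):

```python
def response_function(counts, flux, dt):
    """Monodirectional response of one spectrometer channel: counts
    accumulated over dt seconds in a calibrated beam of known flux
    (particles per cm^2 per second) give an effective geometric factor."""
    return counts / (flux * dt)
```

Comparing this ratio, energy step by energy step, against the response determined for the calibration sensors is what the end-to-end test checks.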

  16. Orion MPCV GN and C End-to-End Phasing Tests

    NASA Technical Reports Server (NTRS)

    Neumann, Brian C.

    2013-01-01

    End-to-end integration tests are critical risk reduction efforts for any complex vehicle. Phasing tests are an end-to-end integrated test that validates system directional phasing (polarity) from sensor measurement through software algorithms to end effector response. Phasing tests are typically performed on a fully integrated and assembled flight vehicle where sensors are stimulated by moving the vehicle and the effectors are observed for proper polarity. Orion Multi-Purpose Crew Vehicle (MPCV) Pad Abort 1 (PA-1) Phasing Test was conducted from inertial measurement to Launch Abort System (LAS). Orion Exploration Flight Test 1 (EFT-1) has two end-to-end phasing tests planned. The first test from inertial measurement to Crew Module (CM) reaction control system thrusters uses navigation and flight control system software algorithms to process commands. The second test from inertial measurement to CM S-Band Phased Array Antenna (PAA) uses navigation and communication system software algorithms to process commands. Future Orion flights include Ascent Abort Flight Test 2 (AA-2) and Exploration Mission 1 (EM-1). These flights will include additional or updated sensors, software algorithms and effectors. This paper will explore the implementation of end-to-end phasing tests on a flight vehicle which has many constraints, trade-offs and compromises. Orion PA-1 Phasing Test was conducted at White Sands Missile Range (WSMR) from March 4-6, 2010. This test decreased the risk of mission failure by demonstrating proper flight control system polarity. Demonstration was achieved by stimulating the primary navigation sensor, processing sensor data to commands and viewing propulsion response. PA-1 primary navigation sensor was a Space Integrated Inertial Navigation System (INS) and Global Positioning System (GPS) (SIGI) which has onboard processing, INS (3 accelerometers and 3 rate gyros) and no GPS receiver. SIGI data was processed by GN&C software into thrust magnitude and

  17. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    SciTech Connect

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; Bi, Changhao; Elsbree, Nick; Jiao, Hong; Kim, Jungkyu; Mathies, Richard; Keasling, Jay D.; Hillson, Nathan J.

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Taken together, these results demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  18. End-to-end performance analysis using engineering confidence models and a ground processor prototype

    NASA Astrophysics Data System (ADS)

    Kruse, Klaus-Werner; Sauer, Maximilian; Jäger, Thomas; Herzog, Alexandra; Schmitt, Michael; Huchler, Markus; Wallace, Kotska; Eisinger, Michael; Heliere, Arnaud; Lefebvre, Alain; Maher, Mat; Chang, Mark; Phillips, Tracy; Knight, Steve; de Goeij, Bryan T. G.; van der Knaap, Frits; Van't Hof, Adriaan

    2015-10-01

    The European Space Agency (ESA) and the Japan Aerospace Exploration Agency (JAXA) are co-operating to develop the EarthCARE satellite mission with the fundamental objective of improving the understanding of the processes involving clouds, aerosols and radiation in the Earth's atmosphere. The EarthCARE Multispectral Imager (MSI) is relatively compact for a spaceborne imager. As a consequence, the immediate point-spread function (PSF) of the instrument will be mainly determined by the diffraction caused by the relatively small optical aperture. In order to still achieve a high-contrast image, de-convolution processing is applied to remove the impact of diffraction on the PSF. A Lucy-Richardson algorithm has been chosen for this purpose. This paper will describe the system setup and the necessary data pre-processing and post-processing steps applied in order to compare the end-to-end image quality with the L1b performance required by the science community.
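    The Lucy-Richardson (Richardson-Lucy) step mentioned above can be sketched in one dimension. This is a minimal, pure-Python illustration of the iteration, not the EarthCARE ground-processor implementation; the PSF and array sizes are made up, and a symmetric PSF is assumed so that forward and transposed blurring coincide.

```python
def conv_same(x, k):
    """1D correlation with kernel k, same-length output, zero-padded edges.
    For the symmetric PSF used here this equals convolution."""
    half = len(k) // 2
    return [sum(k[j] * x[i + j - half]
                for j in range(len(k)) if 0 <= i + j - half < len(x))
            for i in range(len(x))]

def richardson_lucy(observed, psf, iterations=50):
    """Richardson-Lucy deconvolution sketch: start from a flat estimate and
    iteratively rescale it by the back-projected ratio of the observed
    signal to the current blurred estimate."""
    est = [sum(observed) / len(observed)] * len(observed)
    for _ in range(iterations):
        blurred = conv_same(est, psf)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        correction = conv_same(ratio, psf)  # symmetric PSF: its own transpose
        est = [e * c for e, c in zip(est, correction)]
    return est
```

On noiseless data the iteration re-concentrates flux that the PSF spread out, which is exactly the diffraction-removal role it plays in the MSI processing chain.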

  19. End-to-End QoS for Differentiated Services and ATM Internetworking

    NASA Technical Reports Server (NTRS)

    Su, Hongjun; Atiquzzaman, Mohammed

    2001-01-01

    The Internet was initially designed for non-real-time data communications and hence does not provide any Quality of Service (QoS). The next-generation Internet will be characterized by high speed and QoS guarantees. The aim of this paper is to develop a prioritized early packet discard (PEPD) scheme for ATM switches to provide service differentiation and QoS guarantees to end applications running over the next-generation Internet. The proposed PEPD scheme differs from previous schemes by taking into account the priority of packets generated by different applications. We develop a Markov chain model for the proposed scheme and verify the model with simulation. Numerical results show that the results from the model and computer simulation are in close agreement. Our PEPD scheme provides service differentiation to end-to-end applications.
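    The admission rule at the heart of a prioritized early packet discard scheme can be stated in a few lines. This is a hedged sketch of the general idea (threshold-based discard differentiated by priority), not the Markov-modeled scheme of the paper; the threshold and buffer values are illustrative.

```python
def pepd_admit(queue_len, buffer_size, threshold, priority):
    """Prioritized early packet discard (sketch): once the queue passes the
    early-discard threshold, newly arriving low-priority packets are dropped
    at the packet (not cell) level, while high-priority packets are still
    admitted until the buffer is physically full."""
    if queue_len >= buffer_size:
        return False                  # hard loss: buffer full for everyone
    if queue_len >= threshold:
        return priority == "high"     # early discard of low priority only
    return True                       # light load: admit everything
```

Discarding whole packets early, rather than individual ATM cells later, avoids wasting bandwidth on cells belonging to packets that are already doomed.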

  20. Sieving of H2 and D2 Through End-to-End Nanotubes

    NASA Astrophysics Data System (ADS)

    Dasgupta, Devagnik; Searles, Debra J.; Rondoni, Lamberto; Bernardi, Stefano

    2014-10-01

    We study the quantum molecular sieving of H2 and D2 through two nanotubes placed end-to-end. An analytic treatment, assuming that the particles have classical motion along the axis of the nanotube and are confined in a potential well in the radial direction, is considered. Using this idealized model, and under certain conditions, it is found that this device can act as a complete sieve, allowing chemically pure deuterium to be isolated from an isotope mixture. We also consider a more realistic model of two carbon nanotubes and carry out molecular dynamics simulations using a Feynman-Hibbs potential to model the quantum effects on the dynamics of H2 and D2. Sieving is also observed in this case, but is caused by a different process.
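    The Feynman-Hibbs approach folds leading-order quantum effects into a classical simulation by adding a mass- and temperature-dependent correction to the pair potential. The sketch below applies the quadratic Feynman-Hibbs correction to a Lennard-Jones potential with numerical derivatives; the LJ parameters and temperature are rough illustrative values, not those used in the paper.

```python
HBAR = 1.054571817e-34   # reduced Planck constant, J s
KB = 1.380649e-23        # Boltzmann constant, J/K

def lj(r, eps, sigma):
    """Lennard-Jones pair potential."""
    sr6 = (sigma / r) ** 6
    return 4.0 * eps * (sr6 * sr6 - sr6)

def feynman_hibbs(r, eps, sigma, mu, temp, h=1e-13):
    """Quadratic Feynman-Hibbs effective potential,
        V_FH = V + hbar^2 / (24 mu kB T) * (V'' + 2 V'/r),
    with the derivatives taken by central differences. The correction grows
    as the mass mu shrinks, so H2 feels a more repulsive effective wall than
    D2 -- the origin of quantum sieving."""
    v = lj(r, eps, sigma)
    d1 = (lj(r + h, eps, sigma) - lj(r - h, eps, sigma)) / (2.0 * h)
    d2 = (lj(r + h, eps, sigma) - 2.0 * v + lj(r - h, eps, sigma)) / (h * h)
    return v + (HBAR * HBAR / (24.0 * mu * KB * temp)) * (d2 + 2.0 * d1 / r)
```

At fixed geometry the correction scales as 1/mu, so the H2 correction is exactly twice the D2 one, which is what makes a narrow channel mass-selective.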

  1. End-to-end communication test on variable length packet structures utilizing AOS testbed

    NASA Technical Reports Server (NTRS)

    Miller, Warner H.; Sank, V.; Fong, Wai; Miko, J.; Powers, M.; Folk, John; Conaway, B.; Michael, K.; Yeh, Pen-Shu

    1994-01-01

    This paper describes a communication test, which successfully demonstrated the transfer of losslessly compressed images in an end-to-end system. These compressed images were first formatted into variable-length Consultative Committee for Space Data Systems (CCSDS) packets in the Advanced Orbiting System Testbed (AOST). The CCSDS data structures were transferred from the AOST to the Radio Frequency Simulations Operations Center (RFSOC), via a fiber optic link, where data was then transmitted through the Tracking and Data Relay Satellite System (TDRSS). The received data acquired at the White Sands Complex (WSC) was transferred back to the AOST, where the data was captured and decompressed back to the original images. This paper describes the compression algorithm, the AOST configuration, key flight components, data formats, and the communication link characteristics and test results.

  2. Development of a Dynamic, End-to-End Free Piston Stirling Convertor Model

    NASA Technical Reports Server (NTRS)

    Regan, Timothy F.; Gerber, Scott S.; Roth, Mary Ellen

    2004-01-01

    A dynamic model for a free-piston Stirling convertor is being developed at the NASA Glenn Research Center. The model is an end-to-end system model that includes the cycle thermodynamics, the dynamics, and electrical aspects of the system. The subsystems of interest are the heat source, the springs, the moving masses, the linear alternator, the controller, and the end-user load. The envisioned use of the model will be in evaluating how changes in a subsystem could affect the operation of the convertor. The model under development will speed the evaluation of improvements to a subsystem and aid in determining the areas in which the most significant improvements may be found. One of the first uses of the end-to-end model will be in the development of controller architectures. Another related area is in evaluating changes to details in the linear alternator.

  3. Somatic cells efficiently join unrelated DNA segments end-to-end.

    PubMed Central

    Wilson, J H; Berget, P B; Pipas, J M

    1982-01-01

    Molecular substrates for probing nonhomologous recombination in somatic cells were constructed by inserting pBR322 sequences at selected sites on the simian virus 40 (SV40) genome. The chimeric products are too large to be packaged into an SV40 capsid. Therefore, production of viable progeny requires that most of the pBR322 sequences be deleted without altering any SV40 sequences that are essential for lytic infection. As judged by plaque assay, these recombination events occur at readily detectable frequencies after transfection into CV1 monkey kidney cells. Depending on the site of pBR322 insertion, the infectivities of the full-length circular or linear chimeras ranged from 0.02 to 2% of the infectivity of linear wild-type SV40 DNA. Nucleotide sequence analysis of several recombinant progeny revealed three distinct classes of recombination junction and indicated that the causative recombination events were minimally dependent on sequence homology. Potential mechanisms involving recombination at internal sites or at ends were distinguished by measuring the infectivity of chimeric molecules from which various lengths of pBR322 had been removed. These data support end-to-end joining as the primary mechanism by which DNA segments recombine nonhomologously in somatic cells. This end joining appears to be very efficient, since SV40 genomes with complementary single-stranded tails or with short non-complementary pBR322 tails were comparably infectious. Overall, this study indicates that mammalian somatic cells are quite efficient at the willy-nilly end-to-end joining of unrelated DNA segments. PMID:6294502

  4. Evaluation of NASA's end-to-end data systems using DSDS+

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Davenport, William; Message, Philip

    1994-01-01

    The Data Systems Dynamic Simulator (DSDS+) is a software tool being developed by the authors to evaluate candidate architectures for NASA's end-to-end data systems. Via modeling and simulation, we are able to quickly predict the performance characteristics of each architecture, to evaluate 'what-if' scenarios, and to perform sensitivity analyses. As such, we are using modeling and simulation to help NASA select the optimal system configuration, and to quantify the performance characteristics of this system prior to its delivery. This paper is divided into the following six sections: (1) The role of modeling and simulation in the systems engineering process. In this section, we briefly describe the different types of results obtained by modeling each phase of the systems engineering life cycle, from concept definition through operations and maintenance; (2) Recent applications of DSDS+. In this section, we describe ongoing applications of DSDS+ in support of the Earth Observing System (EOS), and we present some of the simulation results generated for candidate system designs. So far, we have modeled individual EOS subsystems (e.g. the Solid State Recorders used onboard the spacecraft), and we have also developed an integrated model of the EOS end-to-end data processing and data communications systems (from the payloads onboard to the principal investigator facilities on the ground); (3) Overview of DSDS+. In this section we define what a discrete-event model is, and how it works. The discussion is presented relative to the DSDS+ simulation tool that we have developed, including its run-time optimization algorithms that enable DSDS+ to execute substantially faster than comparable discrete-event simulation tools; (4) Summary. In this section, we summarize our findings and 'lessons learned' during the development and application of DSDS+ to model NASA's data systems; (5) Further Information; and (6) Acknowledgements.
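    DSDS+ is a discrete-event simulator, and the core of any such engine is a time-ordered event queue: pop the earliest event, advance the clock, let the handler schedule follow-up events. A minimal sketch of that loop (illustrative only, not DSDS+ code):

```python
import heapq

def simulate(events, handlers, t_end):
    """Minimal discrete-event loop. `events` is a list of (time, name,
    payload) tuples; `handlers` maps an event name to a function
    (time, payload) -> list of new events. Runs until the queue empties
    or the next event falls past t_end, returning the event log."""
    pq = list(events)
    heapq.heapify(pq)                 # priority queue ordered by event time
    log = []
    while pq:
        t, name, payload = heapq.heappop(pq)
        if t > t_end:
            break                     # simulation horizon reached
        log.append((t, name))
        for new_event in handlers.get(name, lambda t, p: [])(t, payload):
            heapq.heappush(pq, new_event)
    return log
```

Engines like DSDS+ layer run-time optimizations on top of exactly this core, which is why they can model large end-to-end data systems quickly.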

  5. Flexible end-to-end system design for synthetic aperture radar applications

    NASA Astrophysics Data System (ADS)

    Zaugg, Evan C.; Edwards, Matthew C.; Bradley, Joshua P.

    2012-06-01

    This paper presents ARTEMIS, Inc.'s approach to development of end-to-end synthetic aperture radar systems for multiple applications and platforms. The flexible design of the radar and the image processing tools facilitates their inclusion in a variety of application-specific end-to-end systems. Any given application comes with certain requirements that must be met in order to achieve success. A concept of operation is defined which states how the technology is used to meet the requirements of the application. This drives the design decisions. Key to adapting our system to multiple applications is the flexible SlimSAR radar system, which is programmable on-the-fly to meet the imaging requirements of a wide range of altitudes, swath-widths, and platform velocities. The processing software can be used for real-time imagery production or post-flight processing. The ground station is adaptable, and the radar controls can be run by an operator on the ground, on-board the aircraft, or even automated as part of the aircraft autopilot controls. System integration takes the whole operation into account, seeking to flawlessly work with data links and on-board data storage, aircraft and payload control systems, mission planning, and image processing and exploitation. Examples of applications are presented including using a small unmanned aircraft at low altitude with a line of sight data link, a long-endurance UAV maritime surveillance mission with on-board processing, and a manned ground moving target indicator application with the radar using multiple receive channels.

  6. Integrating end-to-end threads of control into object-oriented analysis and design

    NASA Technical Reports Server (NTRS)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than looking at the individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.

  7. Development and evaluation of an end-to-end test for head and neck IMRT with a novel multiple-dosimetric modality phantom.

    PubMed

    Zakjevskii, Viatcheslav V; Knill, Cory S; Rakowski, Joseph T; Snyder, Michael G

    2016-03-01

    A comprehensive end-to-end test for head and neck IMRT treatments was developed using a custom phantom designed to utilize multiple dosimetry devices. The initial end-to-end test and custom H&N phantom were designed to yield maximum information in anatomical regions significant to H&N plans with respect to: (i) geometric accuracy, (ii) dosimetric accuracy, and (iii) treatment reproducibility. The phantom was designed in collaboration with Integrated Medical Technologies. The phantom was imaged on a CT simulator and the CT was reconstructed with 1 mm slice thickness and imported into Varian's Eclipse treatment planning system. OARs and the PTV were contoured with the aid of Smart Segmentation. A clinical template was used to create an eight-field IMRT plan and dose was calculated with heterogeneity correction on. Plans were delivered with a TrueBeam equipped with a high definition MLC. Preliminary end-to-end results were measured using film, ion chambers, and optically stimulated luminescent dosimeters (OSLDs). Ion chamber dose measurements were compared to the treatment planning system. Films were analyzed with FilmQA Pro using the composite gamma index. OSLDs were read with a MicroStar reader using a custom calibration curve. The final phantom design incorporated two axial and one coronal film planes with 18 OSLD locations adjacent to those planes as well as four locations for IMRT ionization chambers below the inferior film plane. The end-to-end test was consistently reproducible, resulting in an average gamma pass rate greater than 99% using 3%/3 mm analysis criteria, and average OSLD and ion chamber measurements within 1% of planned dose. After initial calibration of the OSLD and film systems, the end-to-end test provides next-day results, allowing for integration in routine clinical QA. Preliminary trials have demonstrated that our end-to-end test is a reproducible QA tool that enables the ongoing evaluation of dosimetric and geometric accuracy of clinical head and neck treatments.
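    The composite gamma analysis used for the film measurements combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. The sketch below is a brute-force 1D version with the 3%/3 mm criteria, not FilmQA Pro's implementation; profile values and spacing are illustrative.

```python
import math

def gamma_pass_rate(ref, meas, spacing_mm, dd_pct=3.0, dta_mm=3.0):
    """1D gamma analysis sketch: for each measured point, search all
    reference points for the minimum combined dose-difference /
    distance-to-agreement metric; a point passes when gamma <= 1.
    Dose difference is normalized to dd_pct of the reference maximum."""
    d_norm = max(ref) * dd_pct / 100.0
    gammas = []
    for i, dm in enumerate(meas):
        best = float("inf")
        for j, dr in enumerate(ref):
            dist = abs(i - j) * spacing_mm
            best = min(best, math.sqrt((dist / dta_mm) ** 2
                                       + ((dm - dr) / d_norm) ** 2))
        gammas.append(best)
    return sum(g <= 1.0 for g in gammas) / len(gammas)
```

A pass rate above 99% with 3%/3 mm, as reported above, means nearly every film pixel finds a nearby reference dose within tolerance.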

  8. End-to-end test of spatial accuracy in Gamma Knife treatments for trigeminal neuralgia

    SciTech Connect

    Brezovich, Ivan A. Wu, Xingen; Duan, Jun; Popple, Richard A.; Shen, Sui; Benhabib, Sidi; Huang, Mi; Christian Dobelbower, M.; Fisher III, Winfield S.

    2014-11-01

    Purpose: Spatial accuracy is most crucial when small targets like the trigeminal nerve are treated. Although current quality assurance procedures typically verify that individual apparatus, like the MRI scanner, CT scanner, Gamma Knife, etc., are meeting specifications, the cumulative error of all equipment and procedures combined may exceed safe margins. This study uses an end-to-end approach to assess the overall targeting errors that may have occurred in individual patients previously treated for trigeminal neuralgia. Methods: The trigeminal nerve is simulated by a 3 mm long, 3.175 mm (1/8 in.) diameter MRI-contrast filled cavity embedded within a PMMA plastic capsule. The capsule is positioned within the head frame such that the location of the cavity matches the Gamma Knife coordinates of an arbitrarily chosen, previously treated patient. Gafchromic EBT2 film is placed at the center of the cavity in coronal and sagittal orientations. The films are marked with a pinprick to identify the cavity center. Treatments are planned for radiation delivery with 4 mm collimators according to MRI and CT scans using the clinical localizer boxes and acquisition protocols. Shots are planned so that the 50% isodose surface encompasses the cavity. Following irradiation, the films are scanned and analyzed. Targeting errors are defined as the distance between the pinprick, which represents the intended target, and the centroid of the 50% isodose line, which is the center of the radiation field that was actually delivered. Results: Averaged over ten patient simulations, targeting errors along the x, y, and z coordinates (patient’s left-to-right, posterior-to-anterior, and head-to-foot) were, respectively, −0.060 ± 0.363, −0.350 ± 0.253, and 0.348 ± 0.204 mm when MRI was used for treatment planning. Planning according to CT exhibited generally smaller errors, namely, 0.109 ± 0.167, −0.191 ± 0.144, and 0.211 ± 0.094 mm. The largest errors along individual axes in MRI

  9. Smoothing spline primordial power spectrum reconstruction

    SciTech Connect

    Sealfon, Carolyn; Verde, Licia; Jimenez, Raul

    2005-11-15

    We reconstruct the shape of the primordial power spectrum (PPS) using a smoothing spline. Our adapted smoothing spline technique provides a complementary method to existing efforts to search for smooth features in the PPS, such as a running spectral index. With this technique we find no significant indication with Wilkinson Microwave Anisotropy Probe first-year data that the PPS deviates from a Harrison-Zeldovich spectrum and no evidence for loss of power on large scales. We also examine the effect on the cosmological parameters of the additional PPS freedom. Smooth variations in the PPS are not significantly degenerate with other cosmological parameters, but the spline reconstruction greatly increases the errors on the optical depth and baryon fraction.

  10. Towards end-to-end models for investigating the effects of climate and fishing in marine ecosystems

    NASA Astrophysics Data System (ADS)

    Travers, M.; Shin, Y.-J.; Jennings, S.; Cury, P.

    2007-12-01

    End-to-end models that represent ecosystem components from primary producers to top predators, linked through trophic interactions and affected by the abiotic environment, are expected to provide valuable tools for assessing the effects of climate change and fishing on ecosystem dynamics. Here, we review the main process-based approaches used for marine ecosystem modelling, focusing on the extent of the food web modelled, the forcing factors considered, the trophic processes represented, as well as the potential use and further development of the models. We consider models of a subset of the food web, models which represent the first attempts to couple low and high trophic levels, integrated models of the whole ecosystem, and size spectrum models. Comparisons within and among these groups of models highlight the preferential use of functional groups at low trophic levels and species at higher trophic levels and the different ways in which the models account for abiotic processes. The model comparisons also highlight the importance of choosing an appropriate spatial dimension for representing organism dynamics. Many of the reviewed models could be extended by adding components and by ensuring that the full life cycles of species components are represented, but end-to-end models should provide full coverage of ecosystem components, the integration of physical and biological processes at different scales and two-way interactions between ecosystem components. We suggest that this is best achieved by coupling models, but there are very few existing cases where the coupling supports true two-way interaction. The advantages of coupling models are that the extent of discretization and representation can be targeted to the part of the food web being considered, making their development time- and cost-effective. Processes such as predation can be coupled to allow the propagation of forcing factors effects up and down the food web. However, there needs to be a stronger focus

  11. Availability and End-to-end Reliability in Low Duty Cycle Multihop Wireless Sensor Networks.

    PubMed

    Suhonen, Jukka; Hämäläinen, Timo D; Hännikäinen, Marko

    2009-01-01

    A wireless sensor network (WSN) is an ad hoc technology that may consist of even thousands of nodes, which necessitates autonomic, self-organizing and multihop operation. A typical WSN node is battery powered, which makes the network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation; however, this alone is not enough. WSNs are deployed for different uses, each requiring an acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling on achieving availability and reliability. The results are obtained by simulations with the ZigBee and proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource-constrained WSNs that guarantees end-to-end reliability while adding a small overhead proportional to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER.
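    The basic trade-off behind any such forwarding algorithm can be worked out directly: with independent per-link losses, end-to-end delivery probability is the per-hop success probability raised to the hop count, and hop-by-hop retransmissions buy reliability at an overhead that grows with the PER. The sketch below is an illustration of that arithmetic, not the TUTWSN algorithm itself.

```python
def end_to_end_success(per, hops, retries):
    """Probability that a packet crosses `hops` links when each link has
    packet error rate `per` and allows up to `retries` retransmissions
    per hop (losses assumed independent across attempts and links)."""
    p_hop = 1.0 - per ** (retries + 1)   # hop succeeds if any attempt does
    return p_hop ** hops

def retries_needed(per, hops, target):
    """Smallest per-hop retry budget meeting an end-to-end delivery target."""
    r = 0
    while end_to_end_success(per, hops, r) < target:
        r += 1
    return r
```

For example, at the 30% PER limit quoted above, a 5-hop path needs a per-hop budget of 5 retransmissions to exceed 99% end-to-end delivery, which shows why the achievable guarantee degrades sharply beyond that error rate.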

  12. End-To-End performance test of the LINC-NIRVANA Wavefront-Sensor system.

    NASA Astrophysics Data System (ADS)

    Berwein, Juergen; Bertram, Thomas; Conrad, Al; Briegel, Florian; Kittmann, Frank; Zhang, Xiangyu; Mohr, Lars

    2011-09-01

    LINC-NIRVANA is an imaging Fizeau interferometer, for use at near-infrared wavelengths, being built for the Large Binocular Telescope. Multi-conjugate adaptive optics (MCAO) increases the sky coverage and the field of view over which diffraction-limited images can be obtained. For its MCAO implementation, LINC-NIRVANA utilizes four wavefront sensors in total: each of the two beams is corrected by both a ground-layer wavefront sensor (GWS) and a high-layer wavefront sensor (HWS). The GWS controls the adaptive secondary deformable mirror (DM), which is based on a DSP slope computing unit, whereas the HWS controls an internal DM via computations provided by an off-the-shelf multi-core Linux system. Using wavefront sensor data collected from a prior lab experiment, we have shown via simulation that the Linux-based system is sufficient to operate at 1 kHz, with jitter well below the needs of the final system. Based on that setup we tested the end-to-end performance and latency through all parts of the system, which includes the camera, the wavefront controller, and the deformable mirror. We will present our loop control structure and the results of those performance tests.

  13. An end-to-end assessment of range uncertainty in proton therapy using animal tissues

    NASA Astrophysics Data System (ADS)

    Zheng, Yuanshui; Kang, Yixiu; Zeidan, Omar; Schreuder, Niek

    2016-11-01

    Accurate assessment of range uncertainty is critical in proton therapy. However, there is a lack of data and consensus on how to evaluate the appropriate amount of uncertainty. The purpose of this study is to quantify the range uncertainty in various treatment conditions in proton therapy, using transmission measurements through various animal tissues. Animal tissues, including a pig head, beef steak, and lamb leg, were used in this study. For each tissue, an end-to-end test closely imitating patient treatments was performed. This included CT scan simulation, treatment planning, image-guided alignment, and beam delivery. Radio-chromic films were placed at various depths in the distal dose falloff region to measure depth dose. Comparisons between measured and calculated doses were used to evaluate range differences. The dose difference at the distal falloff between measurement and calculation depends on tissue type and treatment conditions. The estimated range difference was up to 5, 6 and 4 mm for the pig head, beef steak, and lamb leg irradiation, respectively. Our study shows that the TPS was able to calculate proton range within about 1.5% plus 1.5 mm. Accurate assessment of range uncertainty in treatment planning would allow better optimization of proton beam treatment, thus fully achieving proton beams’ superior dose advantage over conventional photon-based radiation therapy.
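    The reported agreement of about 1.5% plus 1.5 mm maps directly onto the standard way proton distal margins are prescribed: a relative term scaling with beam range plus a fixed absolute term. The helper below is a hedged illustration of that recipe using the values from the abstract; it is not a clinical margin formula from the paper.

```python
def range_margin_mm(nominal_range_mm, rel=0.015, abs_mm=1.5):
    """Distal range margin suggested by a ~1.5% + 1.5 mm range agreement:
    a relative component proportional to the nominal beam range plus a
    fixed absolute component covering range-independent uncertainties."""
    return rel * nominal_range_mm + abs_mm
```

For a 100 mm beam range this gives a 3 mm distal margin, comfortably covering the up to 4-6 mm differences measured in the animal-tissue tests only when the larger tissue-specific uncertainties are folded in.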

  14. An end-to-end architecture for distributing weather alerts to wireless handsets

    NASA Astrophysics Data System (ADS)

    Jones, Karen L.; Nguyen, Hung

    2005-06-01

    This paper describes the current National Weather Service (NWS) system for providing weather alerts in the U.S. and reviews how the existing end-to-end architecture is being leveraged to provide non-weather alerts, also known as "all-hazard alerts", to the general public. The paper then describes how a legacy system that transmits weather and all-hazard alerts can be extended via commercial wireless networks and protocols to reach the 154 million Americans who carry cell phones. This approach uses commercial SATCOM and existing wireless carriers and services such as the Short Messaging Service (SMS) for text and the emerging Multimedia Messaging Service (MMS) protocol, which would allow photos, maps, audio and video alerts to be sent to end users. This wireless broadcast alert delivery architecture is designed to be open and to embrace the National Weather Service's mandate to become an "all-hazards" warning system for the general public. Examples of other public and private sector applications that require timely and intelligent push mechanisms using this alert dissemination approach are also given.

  15. Semantic Complex Event Processing over End-to-End Data Flows

    SciTech Connect

    Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features: easy specification of patterns over diverse information streams, and integrated pattern detection over real-time and historical events. Existing work on CEP has been limited to relational query patterns, and to engines that match events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification, and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of temporal boundaries that exist between event streams and their repository to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.

  16. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites

    PubMed Central

    Gutierrez-Heredia, Luis; Benzoni, Francesca; Murphy, Emma; Reynaud, Emmanuel G.

    2016-01-01

    Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, yet only a very small percentage of reefs have been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also single-colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determining percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies of varying size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high-quality texture mapping and detailed topographic morphology were produced, and surface area and volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony- and polyp-scale analysis. PMID:26901845

  17. End-to-end simulation of bunch merging for a muon collider

    SciTech Connect

    Bao, Yu; Stratakis, Diktys; Hanson, Gail G.; Palmer, Robert B.

    2015-05-03

    Muon accelerator beams are commonly produced indirectly through pion decay by interaction of a charged particle beam with a target. Efficient muon capture requires the muons to be first phase-rotated by rf cavities into a train of 21 bunches with much reduced energy spread. Since luminosity is proportional to the square of the number of muons per bunch, it is crucial for a Muon Collider to use relatively few bunches with many muons per bunch. In this paper we will describe a bunch merging scheme that should achieve this goal. We present for the first time a complete end-to-end simulation of a 6D bunch merger for a Muon Collider. The 21 bunches arising from the phase-rotator, after some initial cooling, are merged in longitudinal phase space into seven bunches, which then go through seven paths with different lengths and reach the final collecting "funnel" at the same time. The final single bunch has a transverse and a longitudinal emittance that matches well with the subsequent 6D rectilinear cooling scheme.
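    The motivation for merging can be made concrete with the stated scaling, luminosity per crossing ∝ (muons per bunch)²: at fixed total muon count, merging 21 bunches into one multiplies the luminosity by 21. A toy sketch under that idealization (ignoring emittance growth and transmission losses):

```python
# Relative luminosity per unit time ~ n_bunches * (muons per bunch)**2,
# at fixed total muon count (idealized; no losses or emittance growth).
def relative_luminosity(n_bunches, total_muons=1.0):
    n_per_bunch = total_muons / n_bunches
    return n_bunches * n_per_bunch**2

gain = relative_luminosity(1) / relative_luminosity(21)  # merge 21 -> 1
```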

  18. An end-to-end analysis of drought from smallholder farms in southwest Jamaica

    NASA Astrophysics Data System (ADS)

    Curtis, W. R. S., III; Gamble, D. W.; Popke, J.

    2015-12-01

    Drought can be defined in many ways: meteorological, hydrological, agricultural, and socio-economic. Another way to approach drought is from a "perception" perspective, where individuals whose livelihood is highly dependent on precipitation take adaptive actions. In this study we use two years of data collected from twelve smallholder farms in southern St. Elizabeth, Jamaica, to undertake an end-to-end analysis of drought. At each farm, temperature and soil moisture at 6-hour intervals and tipping-bucket rainfall were recorded from June 2013 to June 2015, and twice monthly farmers indicated whether they were experiencing drought and whether they irrigated (hand-watering, drip irrigation, or pipe and sprinkler). In many cases half of the farmers considered themselves to be in a drought while the others did not, even though the largest separation among farms was about 20 km. This study will use analysis of variance to test the following hypotheses: drought perception is related to (a) absolute amounts of precipitation at the time, (b) other environmental cues at the time (soil moisture, temperature), or (c) relative amounts of precipitation as compared to the same time last year. Irrigation actions and water use following the perception of drought will also be examined.
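    A minimal sketch of the planned analysis-of-variance test, here as a hand-rolled one-way F statistic over hypothetical rainfall samples grouped by drought perception (all data values are invented for illustration):

```python
# Hand-rolled one-way ANOVA F statistic (no SciPy dependency).
def one_way_anova_F(groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented weekly rainfall (mm) for farms that did / did not report drought.
perceived = [12.0, 8.5, 10.1, 9.7, 11.2, 7.9]
not_perceived = [18.3, 15.0, 16.8, 14.2, 17.5, 19.1]
F = one_way_anova_F([perceived, not_perceived])
```

    A large F relative to the F(k-1, n-k) critical value would indicate that mean rainfall differs between perception groups.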

  19. End-To-End Simulation of Launch Vehicle Trajectories Including Stage Separation Dynamics

    NASA Technical Reports Server (NTRS)

    Albertson, Cindy W.; Tartabini, Paul V.; Pamadi, Bandu N.

    2012-01-01

    The development of methodologies, techniques, and tools for analysis and simulation of stage separation dynamics is critically needed for successful design and operation of multistage reusable launch vehicles. As a part of this activity, the Constraint Force Equation (CFE) methodology was developed and implemented in the Program to Optimize Simulated Trajectories II (POST2). The objective of this paper is to demonstrate the capability of POST2/CFE to simulate a complete end-to-end mission. The vehicle configuration selected was the Two-Stage-To-Orbit (TSTO) Langley Glide Back Booster (LGBB) bimese configuration, an in-house concept consisting of a reusable booster and an orbiter having identical outer mold lines. The proximity and isolated aerodynamic databases used for the simulation were assembled using wind-tunnel test data for this vehicle. POST2/CFE simulation results are presented for the entire mission, from lift-off, through stage separation, orbiter ascent to orbit, and booster glide back to the launch site. Additionally, POST2/CFE stage separation simulation results are compared with results from industry standard commercial software used for solving dynamics problems involving multiple bodies connected by joints.

  20. The X-IFU end-to-end simulations performed for the TES array optimization exercise

    NASA Astrophysics Data System (ADS)

    Peille, Philippe; Wilms, J.; Brand, T.; Cobo, B.; Ceballos, M. T.; Dauser, T.; Smith, S. J.; Barret, D.; den Herder, J. W.; Piro, L.; Barcons, X.; Pointecouteau, E.; Bandler, S.; den Hartog, R.; de Plaa, J.

    2015-09-01

    The focal plane assembly of the Athena X-ray Integral Field Unit (X-IFU) includes as its baseline an array of ~4000 single-size calorimeters based on Transition Edge Sensors (TES). Other sensor array configurations could, however, be considered, combining TES of different properties (e.g. size). In an attempt to improve the X-IFU performance in terms of field of view, count rate capability, and even spectral resolution, two alternative TES array configurations to the baseline have been simulated, each combining a small and a large pixel array. With the X-IFU end-to-end simulator, a sub-sample of the Athena core science goals, selected by the X-IFU science team as potentially driving the optimal TES array configuration, has been simulated for the results to be scientifically assessed and compared. In this contribution, we describe the simulation set-up for the various array configurations and highlight some of the results of the test cases simulated.

  1. Forming End-to-End Oligomers of Gold Nanorods Using Porphyrins and Phthalocyanines.

    PubMed

    Stewart, Alexander F; Gagnon, Brandon P; Walker, Gilbert C

    2015-06-23

    The illumination of aggregated metal nanospecies can create strong local electric fields that brighten Raman scattering. This study describes a procedure to self-assemble gold nanorods (NRs) through the use of porphyrin and phthalocyanine agents to create reproducibly stable and robust NR aggregates in the form of end-to-end oligomers. Narrow inter-rod gaps result, creating electric field "hot spots" between the NRs. The organic linker molecules themselves are potential Raman-based optical labels, and the result is significant numbers of Raman-active species located in the hot spots. NR polymerization was quenched by phospholipid encapsulation, which allowed control of the polydispersity of the aggregate solution to optimize the surface-enhanced Raman scattering (SERS) enhancement, and ensured the aqueous solubility of the aggregates. The increased presence of Raman-active species in the hot spots and the optimized solution polydispersity resulted in the observation of scattering enhancements by encapsulated porphyrins/phthalocyanines of up to 3500-fold over molecular chromophores lacking the NR oligomer host.

  2. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    SciTech Connect

    Zhu, Michelle M.; Wu, Chase Q.

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and in many cases application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. The system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  3. Changes in the end-to-end distance distribution in an oligonucleotide following hybridization

    NASA Astrophysics Data System (ADS)

    Parkhurst, Lawrence J.; Parkhurst, Kay M.

    1994-08-01

    A 16-mer deoxyoligonucleotide was labeled at the 5' end with x-rhodamine and at the 3' end with fluorescein. The fluorescence lifetime of the donor, fluorescein, under conditions for resonance energy transfer, was studied using the SLM 4850 multiharmonic frequency phase fluorometer in order to obtain information on the end-to-end distance distribution P(R) in the oligomer. When this doubly labeled oligonucleotide was hybridized to its 16-mer complement, the fluorescein fluorescence decay could be very well described by a P(R) that was a symmetric shifted Gaussian with center at 68.4 Å and σ = 6.4 Å. Simulations suggested that part of the width might be attributable to a distribution in κ². In the single-stranded labeled oligomer, there was enhanced energy transfer from the fluorescein to the rhodamine, and the best-fitting symmetric shifted Gaussian representation of P(R) was centered at 53.8 Å with σ = 6.9 Å. There was significant lack of fit with this model, however. A model-independent procedure was developed for extracting P(R) as a sum of weighted Hermite polynomials. This procedure gave a P(R) with a large negative region at R < 20 Å, suggesting that rotational averaging for κ² was not quite complete prior to significant decay of the donor excited state.
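    The connection between P(R) and the donor decay rests on Förster transfer, whose efficiency falls off as 1/(1 + (R/R0)^6). A toy sketch (not the authors' fitting code) averaging that efficiency over the two reported Gaussians, with an assumed Förster radius R0 = 55 Å:

```python
import math

# Not the authors' fitting code: average the Forster transfer efficiency
# E(R) = 1 / (1 + (R/R0)**6) over a shifted-Gaussian P(R). R0 = 55 A is an
# assumed Forster radius; the centers and widths echo the abstract.
def mean_fret_efficiency(center_A, sigma_A, R0_A=55.0, n=2000):
    lo, hi = center_A - 5.0 * sigma_A, center_A + 5.0 * sigma_A
    dr = (hi - lo) / n
    num = den = 0.0
    for i in range(n):
        R = lo + (i + 0.5) * dr
        p = math.exp(-0.5 * ((R - center_A) / sigma_A) ** 2)  # Gaussian weight
        num += p / (1.0 + (R / R0_A) ** 6)
        den += p
    return num / den

E_duplex = mean_fret_efficiency(68.4, 6.4)  # hybridized duplex
E_single = mean_fret_efficiency(53.8, 6.9)  # single strand: shorter R, more transfer
```

    The shorter single-strand distribution yields a higher average efficiency, consistent with the enhanced transfer reported above.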

  4. Characterisation of residual ionospheric errors in bending angles using GNSS RO end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Liu, C. L.; Kirchengast, G.; Zhang, K. F.; Norman, R.; Li, Y.; Zhang, S. C.; Carter, B.; Fritzer, J.; Schwaerz, M.; Choy, S. L.; Wu, S. Q.; Tan, Z. X.

    2013-09-01

    Global Navigation Satellite System (GNSS) radio occultation (RO) is an innovative meteorological remote sensing technique for measuring atmospheric parameters such as refractivity, temperature, water vapour and pressure for the improvement of numerical weather prediction (NWP) and global climate monitoring (GCM). GNSS RO has many unique characteristics, including global coverage, long-term stability of observations, and high accuracy and high vertical resolution of the derived atmospheric profiles. One of the main error sources in GNSS RO observations that significantly affects the accuracy of the derived atmospheric parameters in the stratosphere is the ionospheric error. In order to mitigate the effect of this error, the linear ionospheric correction approach for dual-frequency GNSS RO observations is commonly used. However, the residual ionospheric errors (RIEs) can still be significant, especially when large ionospheric disturbances occur and prevail, such as during periods of active space weather. In this study, the RIEs were investigated under different local time, propagation direction and solar activity conditions, and their effects on RO bending angles were characterised using end-to-end simulations. A three-step simulation study was designed to investigate the characteristics of the RIEs by comparing the bending angles with and without the effects of the RIEs. This research forms an important step forward in improving the accuracy of the atmospheric profiles derived from the GNSS RO technique.
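    The dual-frequency linear correction mentioned above combines the two bending-angle profiles so that the first-order (1/f²) ionospheric term cancels; what remains after this step is the residual ionospheric error studied here. A sketch using the GPS L1/L2 frequencies (the synthetic 1/f² test profile is illustrative):

```python
# First-order dual-frequency ionospheric correction for bending angles:
# alpha_c = (f1^2*a1 - f2^2*a2) / (f1^2 - f2^2). GPS L1/L2 frequencies in Hz.
F1, F2 = 1.57542e9, 1.22760e9

def linear_iono_correction(alpha1, alpha2, f1=F1, f2=F2):
    return (f1**2 * alpha1 - f2**2 * alpha2) / (f1**2 - f2**2)

# Synthetic check: a pure 1/f^2 ionospheric term cancels exactly.
alpha_n, c = 0.01, 1.0e15        # neutral bending angle (rad), iono coefficient
a1 = alpha_n + c / F1**2
a2 = alpha_n + c / F2**2
corrected = linear_iono_correction(a1, a2)   # recovers alpha_n
```

    Real RIEs arise precisely because the ionospheric contribution is not a pure 1/f² term, so this cancellation is only approximate in practice.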

  5. SPOKES: An end-to-end simulation facility for spectroscopic cosmological surveys

    DOE PAGES

    Nord, B.; Amara, A.; Refregier, A.; ...

    2016-03-03

    The nature of dark matter, dark energy and large-scale gravity pose some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science, and computational performance results of the simulation pipeline. The framework is general, but for the benchmark tests, we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming project, the Dark Energy Spectroscopic Instrument (DESI). As a result, we discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements optimally.

  7. End-to-end performance modeling of passive remote sensing systems

    SciTech Connect

    Smith, B.W.; Borel, C.C.; Clodius, W.B.; Theiler, J.; Laubscher, B.; Weber, P.G.

    1996-07-01

    The ultimate goal of end-to-end system modeling is to simulate all known physical effects which determine the content of the data, before flying an instrument system. In remote sensing, one begins with a scene, viewed either statistically or dynamically, computes the radiance in each spectral band, renders the scene, transfers it through representative atmospheres to create the radiance field at an aperture, and integrates over sensor pixels. We have simulated a comprehensive sequence of realistic instrument hardware elements and the transfer of simulated data to an analysis system. This analysis package is the same as that intended for use of data collections from the real system. By comparing the analyzed image to the original scene, the net effect of nonideal system components can be understood. Iteration yields the optimum values of system parameters to achieve performance targets. We have used simulation to develop and test improved multispectral algorithms for (1) the robust retrieval of water surface temperature, water vapor column, and other quantities; (2) the preservation of radiometric accuracy during atmospheric correction and pixel registration on the ground; and (3) exploitation of on-board multispectral measurements to assess the atmosphere between ground and aperture.

  9. Telecommunications end-to-end systems monitoring on TOPEX/Poseidon: Tools and techniques

    NASA Technical Reports Server (NTRS)

    Calanche, Bruno J.

    1994-01-01

    The TOPEX/Poseidon Project Satellite Performance Analysis Team's (SPAT) roles and responsibilities have grown to include functions that are typically performed by other teams on JPL flight projects. In particular, SPAT Telecommunications' role has expanded beyond the nominal function of monitoring, assessing, characterizing, and trending the spacecraft (S/C) RF/Telecom subsystem to one of End-to-End Information Systems (EEIS) monitoring. This has been accomplished by taking advantage of the spacecraft and ground data system structures and protocols. By processing both the ground-generated CRC flags from received spacecraft telemetry minor frames and the NASCOM block poly error flags, bit error rates (BER) for each link segment can be determined. This provides the capability to characterize the separate link segments, determine science data recovery, and perform fault/anomaly detection and isolation. By monitoring and managing the links, TOPEX has successfully recovered approximately 99.9 percent of the science data with an integrity (BER) of better than 1 x 10^-8. This paper presents the algorithms used to process the above flags and the techniques used for EEIS monitoring.
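    A minimal sketch of the per-segment BER bookkeeping described above, counting flagged frames (CRC flags for the space link, NASCOM poly flags for the ground link) against the bits transferred; the frame size and flag arrays are illustrative assumptions, not TOPEX telemetry formats:

```python
# Count flagged frames against bits transferred to estimate per-segment BER.
def segment_ber(error_flags, bits_per_frame):
    # Treat each flagged frame as (at least) one bit error: a coarse per-bit
    # quality estimate that is adequate for link monitoring purposes.
    return sum(error_flags) / (len(error_flags) * bits_per_frame)

crc_flags = [0] * 99999 + [1]            # one bad minor frame on the space link
ber_space = segment_ber(crc_flags, 1000)  # 1 flagged frame in 10^8 bits
```

    Computing the same statistic separately over CRC and poly flags is what allows the space and ground segments to be characterized independently.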

  10. Development of an End-to-End Model for Free-Space Optical Communications

    NASA Astrophysics Data System (ADS)

    Hemmati, H.

    2005-05-01

    Through funding by NASA's Exploration Systems Research and Technology (ESR&T) Program and the Advanced Space Technology Program (ASTP), a team including JPL, Boeing, NASA-Glenn, and the Georgia Institute of Technology will develop an end-to-end modeling tool for rapid architecture trade-offs of high-data-rate laser communications from lunar, martian, and outer planetary ranges. An objective of the modeling tool is to reduce the inefficient reliance on modeling of discrete subsystems or sequential development of multiple expensive and time-consuming hardware units, thereby saving significant cost and time. This dynamic, time-domain modeling tool will accept measured component and subsystem data inputs and generate "difficult to measure" characteristics required for the performance evaluation of different designs and architectural choices. The planned modeling tool will incorporate actual subsystem performance data to reduce the develop-build-evaluate-refine production cycle. The list of high-level objectives of the program includes (1) development of a bidirectional global link analysis backbone software encompassing all optical communication subsystem parameters; (2) development of a bidirectional global link simulation model encompassing all optical communication parameters; (3) interoperability of the link analysis tool with all relevant detailed subsystem design models; and (4) a model validated against known experimental data at the subsystem and system levels.

  11. Status report of the end-to-end ASKAP software system: towards early science operations

    NASA Astrophysics Data System (ADS)

    Guzman, Juan Carlos; Chapman, Jessica; Marquarding, Malte; Whiting, Matthew

    2016-08-01

    300 MHz bandwidth for Array Release 1; followed by the deployment of the real-time data processing components. In addition to the Central Processor, the first production release of the CSIRO ASKAP Science Data Archive (CASDA) has also been deployed in one of the Pawsey Supercomputing Centre facilities and it is integrated to the end-to-end ASKAP data flow system. This paper describes the current status of the "end-to-end" data flow software system from preparing observations to data acquisition, processing and archiving; and the challenges of integrating an HPC facility as a key part of the instrument. It also shares some lessons learned since the start of integration activities and the challenges ahead in preparation for the start of the Early Science program.

  12. End-to-end self-assembly of RADA 16-I nanofibrils in aqueous solutions.

    PubMed

    Arosio, Paolo; Owczarz, Marta; Wu, Hua; Butté, Alessandro; Morbidelli, Massimo

    2012-04-04

    RADARADARADARADA (RADA 16-I) is a synthetic amphiphilic peptide designed to self-assemble in a controlled way into fibrils and higher ordered structures depending on pH. In this work, we use various techniques to investigate the state of the peptide dispersed in water under dilute conditions at different pH and in the presence of trifluoroacetic acid or hydrochloric acid. We have identified stable RADA 16-I fibrils at pH 2.0-4.5, which have a length of ∼200-400 nm and a diameter of 10 nm. The fibrils have the characteristic antiparallel β-sheet structure of amyloid fibrils, as measured by circular dichroism and Fourier transform infrared spectrometry. During incubation at pH 2.0-4.5, the fibrils elongate very slowly via an end-to-end fibril-fibril aggregation mechanism, without changing their diameter, and the kinetics of such aggregation depends on pH and anion type. At pH 2.0, we also observed a substantial amount of monomers in the system, which do not participate in the fibril elongation and degrade to fragments. The fibril-fibril elongation kinetics has been simulated using a Smoluchowski kinetic model with population balance equations, and the simulation results are in good agreement with the experimental data. It is also found that the aggregation process is not limited by diffusion but rather is an activated process with an energy barrier on the order of 20 kcal/mol.
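    A toy population-balance sketch in the spirit of the Smoluchowski model mentioned above, with a constant aggregation kernel and Euler time stepping (the authors' actual kernel and rate constants are fitted to pH and anion conditions; everything here is illustrative):

```python
# Constant-kernel Smoluchowski coagulation, truncated at fibrils of 10 units:
# dc_k/dt = 0.5*K*sum_{i+j=k} c_i*c_j - K*c_k*sum_j c_j   (c[k] = size k+1).
def smoluchowski_step(c, K, dt):
    total = sum(c)
    new = list(c)
    for k in range(len(c)):
        # Gain: pairs of shorter fibrils whose lengths sum to k+1.
        gain = 0.5 * sum(c[i] * c[k - 1 - i] for i in range(k))
        # Loss: fibrils of size k+1 sticking end-to-end to anything.
        new[k] += dt * K * (gain - c[k] * total)
    return new

c = [1.0] + [0.0] * 9      # start from unit-length fibrils only
for _ in range(100):       # integrate to t = 1 (arbitrary units)
    c = smoluchowski_step(c, K=1.0, dt=0.01)
```

    End-to-end aggregation shifts number density toward longer fibrils while the total fibril count decreases, matching the slow elongation at constant diameter described above.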

  13. An optimized end-to-end process for the analysis of agile earth observation satellite missions

    NASA Astrophysics Data System (ADS)

    Hahn, M.; Müller, T.; Levenhagen, J.

    2014-12-01

    Agile earth observation satellite missions are becoming more and more important due to their capability to perform fast reorientation maneuvers with 3 degrees of freedom to capture different target areas along the orbital path, thus increasing the observed area and the complexity of scans. The design of an agile earth observation satellite mission is a non-trivial task, because a trade-off has to be made between the observed area and complexity of the scans on the one hand and the degree of agility available, and thus the performance of the attitude control devices, on the other. Additionally, the designed mission has to be evaluated in a realistic environment, also taking into account the specific characteristics of the chosen actuators. In the present work, several methods are combined to provide an integrated analysis of agile earth observation satellite missions, starting from the definition of a desired ground scan scenario, going via the creation of a guidance profile to a realistic simulation, and ending at the verification of feasibility by detailed closed-loop simulation. For its technical implementation at Astrium GmbH, well-proven tools for the different tasks of the analysis are incorporated and well-defined interfaces for those tools are specified, allowing a high degree of automation and thus saving time and minimizing errors. This results in a complete end-to-end process for the design, analysis and verification of agile earth observation satellite missions. The process is demonstrated by means of an example analysis using control moment gyros for a high-agility mission.

  14. SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.

    PubMed

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani

    2016-01-01

    Monitoring life-long diseases requires continuous measurements and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained devices resources. The integration of the architecture's components provides information about diseases' recurrences as soon as they occur to expedite taking necessary actions, and thus prevent severe consequences. Our architecture system is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture to be easy to deploy and integrate, and supported by Cloud infrastructure and services to allow high scalability, availability of processes and data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed

  15. End-to-End Self-Assembly of RADA 16-I Nanofibrils in Aqueous Solutions

    PubMed Central

    Arosio, Paolo; Owczarz, Marta; Wu, Hua; Butté, Alessandro; Morbidelli, Massimo

    2012-01-01

    RADARADARADARADA (RADA 16-I) is a synthetic amphiphilic peptide designed to self-assemble in a controlled, pH-dependent way into fibrils and higher-ordered structures. In this work, we use various techniques to investigate the state of the peptide dispersed in water under dilute conditions at different pH and in the presence of trifluoroacetic acid or hydrochloric acid. We have identified stable RADA 16-I fibrils at pH 2.0–4.5, which have a length of ∼200–400 nm and a diameter of 10 nm. The fibrils have the characteristic antiparallel β-sheet structure of amyloid fibrils, as measured by circular dichroism and Fourier transform infrared spectrometry. During incubation at pH 2.0–4.5, the fibrils elongate very slowly via an end-to-end fibril-fibril aggregation mechanism, without changing their diameter, and the kinetics of this aggregation depends on pH and anion type. At pH 2.0, we also observed a substantial amount of monomers in the system, which do not participate in the fibril elongation and degrade to fragments. The fibril-fibril elongation kinetics was simulated using a Smoluchowski-type kinetic model (population balance equations), and the simulation results are in good agreement with the experimental data. We also find that the aggregation process is not diffusion-limited but rather an activated process with an energy barrier on the order of 20 kcal/mol. PMID:22500762
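
    The end-to-end fibril-fibril aggregation described above can be illustrated with a minimal discrete Smoluchowski population balance. The sketch below uses a constant, size-independent kernel and arbitrary units for the rate constant and concentrations; these are illustrative choices, not the fitted parameters of the study.

```python
import numpy as np

def smoluchowski_step(c, K, dt):
    """One explicit-Euler step of the discrete Smoluchowski (population
    balance) equations with a constant kernel K: clusters join end to end,
    so index k holds the concentration of fibrils of size k+1."""
    total = c.sum()
    dc = np.empty_like(c)
    for k in range(len(c)):
        # gain: all pairs of sizes summing to k+1 (indices j and k-1-j)
        gain = 0.5 * sum(c[j] * c[k - 1 - j] for j in range(k))
        dc[k] = K * gain - K * c[k] * total  # loss: c_k merging with anything
    return c + dt * dc

# start from monodisperse "seed" fibrils (arbitrary units) and let them join
c = np.zeros(50)
c[0] = 1.0
for _ in range(100):
    c = smoluchowski_step(c, K=1.0, dt=0.01)

sizes = np.arange(1, len(c) + 1)
mean_length = (sizes * c).sum() / c.sum()  # number-average length
```

    With end-to-end joining only, total fibril mass is conserved while the number concentration decreases, so the number-average length grows slowly, mirroring the elongation behavior reported above.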

  16. SPoRT - An End-to-End R2O Activity

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.

    2009-01-01

    Established in 2002 to demonstrate the weather and forecasting applications of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown into an end-to-end research-to-operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and operational end users at 13 WFOs to develop and test the new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA/NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short-term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide observing capabilities similar to the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to ensure the continuity of its activities.

  17. End-To-END Performance of the Future MOMA Instrument Aboard the ExoMars Mission

    NASA Astrophysics Data System (ADS)

    Pinnick, V. T.; Buch, A.; Szopa, C.; Grand, N.; Danell, R.; Grubisic, A.; van Amerom, F. H. W.; Glavin, D. P.; Freissinet, C.; Coll, P. J.; Stalport, F.; Humeau, O.; Arevalo, R. D., Jr.; Brinckerhoff, W. B.; Steininger, H.; Goesmann, F.; Raulin, F.; Mahaffy, P. R.

    2015-12-01

    Following the SAM experiment aboard the Curiosity rover, the Mars Organic Molecule Analyzer (MOMA) experiment aboard the 2018 ExoMars mission will continue the search for organic matter on the Mars surface. One advancement with the ExoMars mission is that the sample will be extracted from as deep as 2 meters below the Martian surface to minimize the effects of radiation and oxidation on organic materials. To analyze the wide range of organic composition (volatile and non-volatile compounds) of the Martian soil, MOMA is equipped with a dual-ion-source ion trap mass spectrometer utilizing UV laser desorption/ionization (LDI) and pyrolysis gas chromatography (pyr-GC). In order to analyze refractory organic compounds and chiral molecules during GC-ITMS analysis, samples may be submitted to a derivatization process, consisting of the reaction of the sample components with specific reactants (MTBSTFA [1], DMF-DMA [2] or TMAH [3]). Previous experimental reports have focused on coupling campaigns between the breadboard versions of the GC, provided by the French team (LISA, LATMOS, CentraleSupelec), and the MS, provided by the US team (NASA-GSFC). This work focuses on the performance verification and optimization of the GC-ITMS experiment using the Engineering Test Unit (ETU) models, which are representative of the form, fit and function of the flight instrument, including a flight-like pyrolysis oven and tapping station provided by the German team (MPS). The results obtained demonstrate the current status of the end-to-end performance of the gas chromatography-mass spectrometry mode of operation. References: [1] Buch, A. et al. (2009) J. Chrom. A, 43, 143-151. [2] Freissinet, C. et al. (2011) J. Chrom. A, 1306, 59-71. [3] Geffroy-Rodier, C. et al. (2009) JAAP, 85, 454-459.

  18. In vivo laser assisted end-to-end anastomosis with ICG-infused chitosan patches

    NASA Astrophysics Data System (ADS)

    Rossi, Francesca; Matteini, Paolo; Esposito, Giuseppe; Scerrati, Alba; Albanese, Alessio; Puca, Alfredo; Maira, Giulio; Rossi, Giacomo; Pini, Roberto

    2011-07-01

    Laser-assisted vascular repair is a new optimized technique based on the use of an ICG-infused chitosan patch to close a vessel wound, with or even without a few supporting single stitches. We present an in vivo experimental study of an innovative end-to-end laser-assisted vascular anastomosis (LAVA) technique, performed with the application of ICG-infused chitosan patches. The photostability and the mechanical properties of ICG-infused chitosan films were preliminarily measured. The in vivo study was performed in 10 New Zealand rabbits. After anesthesia, a 3-cm segment of the right common carotid artery was exposed and clamped proximally and distally. The artery was then interrupted by means of a full-thickness cut. Three single microsutures were used to approximate the two vessel edges. The ICG-infused chitosan patch was rolled all over the anastomotic site and welded by the use of a diode laser emitting at 810 nm and equipped with a 300 μm diameter optical fiber. Welding was obtained by delivering single laser spots to induce local patch/tissue adhesion. The result was an immediate closure of the anastomosis, with no bleeding at clamp release. The animals then underwent different follow-up periods in order to evaluate the welded vessels over time. At follow-up examinations, all the anastomoses were patent and no bleeding signs were documented. Samples of welded vessels underwent histological examination. Results showed that this technique offers several advantages over conventional suturing methods: simplification of the surgical procedure, shortening of the operative time, better re-endothelialization and an optimal vascular healing process.

  19. SU-E-T-150: End to End Tests On the First Clinical EDGETM

    SciTech Connect

    Scheib, S; Schmelzer, P; Vieira, S; Greco, C

    2014-06-01

    Purpose: To quantify the sub-millimeter overall accuracy of EDGETM, the dedicated linac-based SRS/SABR treatment platform from Varian, using a novel End-to-End (E2E) test phantom. Methods: The new E2E test phantom developed by Varian consists of a cube with an outer dimension of 15×15×15 cm3. The phantom is equipped with an exchangeable inner cube (7×7×7 cm3) to hold radiochromic films or a tungsten ball (diameter = 5 mm) for Winston-Lutz tests. 16 ceramic balls (diameter = 5 mm) are embedded in the outer cube. Three embedded Calypso transponders allow for Calypso-based monitoring. The outer surface of the phantom is tracked using the Optical Surface Monitoring System (OSMS). The phantom is positioned using kV, MV and CBCT images. A simCT of the phantom was acquired and SRS/SABR plans were treated using the new phantom on the first clinically installed EDGETM. As a first step, a series of EPID-based Winston-Lutz tests was performed. As a second step, the calculated dose distribution applied to the phantom was verified with radiochromic films in orthogonal planes. The measured dose distribution is compared with the calculated (Eclipse) one based on the known isocenter of both dose distributions. The geometrical shift needed to match both dose distributions is the overall accuracy and is determined using dose profiles, isodose lines or gamma pass rates (3%, 1 mm). Results: Winston-Lutz tests using the central tungsten BB demonstrated a targeting accuracy of 0.44±0.18 mm for jaw-defined (2 cm × 2 cm), 0.39±0.19 mm for MLC-defined (2 cm × 2 cm) and 0.37±0.15 mm for cone-defined (12.5 mm) fields. A treated patient plan (spinal metastasis lesion with integrated boost) showed a dosimetric dose localization accuracy of 0.6 mm. Conclusion: Geometric and dosimetric E2E tests on EDGETM show sub-millimeter E2E targeting and dose localization accuracy.
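
    The gamma pass-rate criterion quoted above (3%, 1 mm) combines a dose-difference tolerance with a distance-to-agreement tolerance. The following 1-D sketch of the standard gamma index is generic and illustrative; it is not the evaluation code used with EDGETM or Eclipse.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.03, dist_tol=1.0):
    """1-D gamma index: for each reference point, the minimum over all
    evaluated points of sqrt((dose diff / dose_tol)^2 + (distance / dist_tol)^2).
    dose_tol is a fraction of the maximum reference dose; dist_tol is in mm.
    A point passes the (3%, 1 mm) criterion when gamma <= 1."""
    d_max = d_ref.max()
    gammas = np.empty(len(x_ref))
    for i, (x, d) in enumerate(zip(x_ref, d_ref)):
        dd = (d_eval - d) / (dose_tol * d_max)   # normalized dose differences
        dx = (x_eval - x) / dist_tol             # normalized distances
        gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()
    return gammas

def pass_rate(gammas):
    """Fraction of reference points with gamma <= 1."""
    return float((gammas <= 1.0).mean())
```

    In practice the same idea is applied in 2-D to the measured film planes and the calculated dose planes, and the pass rate summarizes the agreement.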

  20. Astra: Interdisciplinary study on enhancement of the end-to-end accuracy for spacecraft tracking techniques

    NASA Astrophysics Data System (ADS)

    Iess, Luciano; Di Benedetto, Mauro; James, Nick; Mercolino, Mattia; Simone, Lorenzo; Tortora, Paolo

    2014-02-01

    Navigation of deep-space probes is accomplished through a variety of different radio observables, namely Doppler, ranging and Delta-Differential One-Way Ranging (Delta-DOR). The particular mix of observations used for navigation mainly depends on the available on-board radio system, the mission phase and orbit determination requirements. The accuracy of current ESA and NASA tracking systems is at the level of 0.1 mm/s at 60 s integration time for Doppler, 1-5 m for ranging and 6-15 nrad for Delta-DOR measurements in a wide range of operational conditions. The ASTRA study, funded under ESA's General Studies Programme (GSP), addresses the ways to improve the end-to-end accuracy of Doppler, ranging and Delta-DOR systems by roughly a factor of 10. The target accuracies were set to 0.01 mm/s at 60 s integration time for Doppler, 20 cm for ranging and 1 nrad for Delta-DOR. The companies and universities that took part in the study were the University of Rome Sapienza, ALMASpace, BAE Systems and Thales Alenia Space Italy. The analysis of an extensive data set of radio-metric observables and dedicated tests of the ground station allowed consolidating the error budget for each measurement technique. The radio-metric data set comprises X/X, X/Ka and Ka/Ka range and Doppler observables from the Cassini and Rosetta missions. It also includes measurements from the Advanced Media Calibration System (AMCS) developed by JPL for the radio science experiments of the Cassini mission. The error budget for the three radio-metric observables was consolidated by comparing the statistical properties of the data set with the expected error models. The analysis confirmed the contribution from some error sources, but also revealed some discrepancies and ultimately led to improved error models. The error budget reassessment provides adequate information for building guidelines and strategies to effectively improve the navigation accuracies of future deep space missions. We report both on updated

  1. Identifying Elusive Electromagnetic Counterparts to Gravitational Wave Mergers: An End-to-end Simulation

    NASA Astrophysics Data System (ADS)

    Nissanke, Samaya; Kasliwal, Mansi; Georgieva, Alexandra

    2013-04-01

    Combined gravitational wave (GW) and electromagnetic (EM) observations of compact binary mergers should enable detailed studies of astrophysical processes in the strong-field gravity regime. This decade, ground-based GW interferometers promise to routinely detect compact binary mergers. Unfortunately, networks of GW interferometers have poor angular resolution on the sky and their EM signatures are predicted to be faint. Therefore, a challenging goal will be to unambiguously pinpoint the EM counterparts of GW mergers. We perform the first comprehensive end-to-end simulation that focuses on: (1) GW sky localization, distance measures, and volume errors with two compact binary populations and four different GW networks; (2) subsequent EM detectability by a slew of multiwavelength telescopes; and (3) final identification of the merger counterpart amidst a sea of possible astrophysical false positives. First, we find that double neutron star binary mergers can be detected out to a maximum distance of 400 Mpc (or 750 Mpc) by three (or five) detector GW networks, respectively. Neutron-star-black-hole binary mergers can be detected a factor of 1.5 further out; their median to maximum sky localizations are 50-170 deg2 (or 6-65 deg2) for a three (or five) detector GW network. Second, by optimizing depth, cadence, and sky area, we quantify relative fractions of optical counterparts that are detectable by a suite of different aperture-size telescopes across the globe. Third, we present five case studies to illustrate the diversity of scenarios in secure identification of the EM counterpart. We discuss the case of a typical binary, neither beamed nor nearby, and the challenges associated with identifying an EM counterpart at both low and high Galactic latitudes. For the first time, we demonstrate how construction of low-latency GW volumes in conjunction with local universe galaxy catalogs can help solve the problem of false positives. We conclude with strategies that would

  2. An End-to-End System to Enable Quick, Easy and Inexpensive Deployment of Hydrometeorological Stations

    NASA Astrophysics Data System (ADS)

    Celicourt, P.; Piasecki, M.

    2014-12-01

    The high cost of hydro-meteorological data acquisition, communication and publication systems, along with limited qualified human resources, is considered the main reason why hydro-meteorological data collection remains a challenge, especially in developing countries. Despite significant advances in sensor network technologies, which in the last two decades gave birth to open hardware and software and to low-cost (less than $50), low-power (on the order of a few milliwatts) sensor platforms, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and thus expensive task. These factors give rise to the need to develop an affordable, simple-to-deploy, scalable and self-organizing end-to-end (from sensor to publication) system suitable for deployment in such countries. The design of the envisioned system consists of a few Sensed-And-Programmed Arduino-based sensor nodes with low-cost sensors measuring parameters relevant to hydrological processes, and a Raspberry Pi micro-computer hosting the in-the-field back-end data management. The latter comprises the Python/Django model of the CUAHSI Observations Data Model (ODM), namely DjangODM, backed by a PostgreSQL database server. We are also developing a Python-based data processing script which will be paired with the data autoloading capability of Django to populate the DjangODM database with the incoming data. To publish the data, WOFpy (WaterOneFlow Web Services in Python), developed by the Texas Water Development Board for 'Water Data for Texas', which can produce WaterML web services from a variety of back-end database installations such as SQLite, MySQL, and PostgreSQL, will be used. A step further would be the development of an appealing online visualization tool using Python statistics and analytics tools (SciPy, NumPy, Pandas) showing the spatial distribution of variables across an entire watershed as a time-variant layer on top of a basemap.
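
    The in-the-field data path described above (sensor node to Raspberry Pi to ODM database) hinges on turning incoming node messages into ODM-style data values. A minimal sketch of that parsing step follows; the 'node_id,variable_code,value' line format and the field names are hypothetical stand-ins, not the project's actual protocol or schema.

```python
from datetime import datetime, timezone

def parse_node_line(line):
    """Parse a hypothetical 'node_id,variable_code,value' line sent by a
    sensor node into an ODM-style data-value record (a plain dict that a
    loader script could insert into the database)."""
    node_id, variable_code, value = line.strip().split(",")
    return {
        "site_code": node_id,
        "variable_code": variable_code,  # e.g. an ODM VariableCode such as 'RAINFALL'
        "data_value": float(value),
        "local_date_time": datetime.now(timezone.utc).isoformat(),
    }
```

    A data-autoloading script would apply this to each incoming line and hand the resulting records to the Django ORM for storage.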

  3. IDENTIFYING ELUSIVE ELECTROMAGNETIC COUNTERPARTS TO GRAVITATIONAL WAVE MERGERS: AN END-TO-END SIMULATION

    SciTech Connect

    Nissanke, Samaya; Georgieva, Alexandra; Kasliwal, Mansi

    2013-04-20

    Combined gravitational wave (GW) and electromagnetic (EM) observations of compact binary mergers should enable detailed studies of astrophysical processes in the strong-field gravity regime. This decade, ground-based GW interferometers promise to routinely detect compact binary mergers. Unfortunately, networks of GW interferometers have poor angular resolution on the sky and their EM signatures are predicted to be faint. Therefore, a challenging goal will be to unambiguously pinpoint the EM counterparts of GW mergers. We perform the first comprehensive end-to-end simulation that focuses on: (1) GW sky localization, distance measures, and volume errors with two compact binary populations and four different GW networks; (2) subsequent EM detectability by a slew of multiwavelength telescopes; and (3) final identification of the merger counterpart amidst a sea of possible astrophysical false positives. First, we find that double neutron star binary mergers can be detected out to a maximum distance of 400 Mpc (or 750 Mpc) by three (or five) detector GW networks, respectively. Neutron-star-black-hole binary mergers can be detected a factor of 1.5 further out; their median to maximum sky localizations are 50-170 deg2 (or 6-65 deg2) for a three (or five) detector GW network. Second, by optimizing depth, cadence, and sky area, we quantify relative fractions of optical counterparts that are detectable by a suite of different aperture-size telescopes across the globe. Third, we present five case studies to illustrate the diversity of scenarios in secure identification of the EM counterpart. We discuss the case of a typical binary, neither beamed nor nearby, and the challenges associated with identifying an EM counterpart at both low and high Galactic latitudes. For the first time, we demonstrate how construction of low-latency GW volumes in conjunction with local universe galaxy catalogs can help solve the problem of false positives. We conclude with strategies

  4. Achieving End-to-End QoS in the Next Generation Internet: Integrated Services over Differentiated Service Networks

    NASA Technical Reports Server (NTRS)

    Bai, Haowei; Atiquzzaman, Mohammed; Ivancic, William

    2001-01-01

    Currently there are two approaches to providing Quality of Service (QoS) in the next generation Internet: an early one is Integrated Services (IntServ), with the goal of allowing end-to-end QoS to be provided to applications; the other is the Differentiated Services (DiffServ) architecture, which provides QoS in the backbone. In this context, a DiffServ network may be viewed as a network element in the total end-to-end path. The objective of this paper is to investigate the possibility of providing end-to-end QoS when IntServ runs over a DiffServ backbone in the next generation Internet. Our results show that the QoS requirements of IntServ applications can be successfully achieved when IntServ traffic is mapped to the DiffServ domain in the next generation Internet.
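
    Mapping IntServ traffic into a DiffServ domain, as investigated above, amounts to marking each flow's packets with a DSCP that selects an appropriate per-hop behavior at the DiffServ routers. The mapping choices in this sketch (Guaranteed Service onto EF, Controlled-Load onto AF11) are a common convention rather than the paper's specific scheme, though the DSCP values themselves are the standard ones.

```python
# Illustrative IntServ -> DiffServ mapping: Guaranteed Service onto the
# Expedited Forwarding PHB, Controlled-Load onto Assured Forwarding AF11,
# and everything else onto the default (best-effort) PHB.
DSCP = {
    "guaranteed": 46,       # Expedited Forwarding (EF)
    "controlled_load": 10,  # Assured Forwarding class 1, low drop (AF11)
    "best_effort": 0,       # default PHB
}

def mark_packet(intserv_class):
    """Return the DSCP to write into the IP header for a flow of this class."""
    return DSCP.get(intserv_class, 0)
```

    The edge router of the DiffServ domain would apply this marking once per packet; interior routers then schedule purely on the DSCP, with no per-flow state.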

  6. SensorKit: An End-to-End Solution for Environmental Sensor Networking

    NASA Astrophysics Data System (ADS)

    Silva, F.; Graham, E.; Deschon, A.; Lam, Y.; Goldman, J.; Wroclawski, J.; Kaiser, W.; Benzel, T.

    2008-12-01

    Modern day sensor network technology has shown great promise to transform environmental data collection. However, despite the promise, these systems have remained the purview of the engineers and computer scientists who design them rather than a useful tool for the environmental scientists who need them. SensorKit is conceived of as a way to make wireless sensor networks accessible to The People: it is an advanced, powerful tool for sensor data collection that does not require advanced technological know-how. We aim to make wireless sensor networks for environmental science as simple as setting up a standard home computer network by providing simple, tested configurations of commercially-available hardware, free and easy-to-use software, and step-by-step tutorials. We designed and built SensorKit using a simplicity-through-sophistication approach, supplying users a powerful sensor-to-database end-to-end system with a simple and intuitive user interface. Our objective in building SensorKit was to make the prospect of using environmental sensor networks as simple as possible. We built SensorKit from off-the-shelf hardware components, using the CompactRIO platform from National Instruments for data acquisition due to its modular architecture and flexibility to support a large number of sensor types. In SensorKit, we support various types of analog, digital and networked sensors. Our modular software architecture allows us to abstract sensor details and provide users a common way to acquire data and to command different types of sensors. SensorKit is built on top of the Sensor Processing and Acquisition Network (SPAN), a modular framework for acquiring data in the field, moving it reliably to the scientist's institution, and storing it in an easily-accessible database. SPAN allows real-time access to the data in the field by providing various options for long-haul communication, such as cellular and satellite links. Our system also features reliable data storage
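
    The sensor-abstraction idea described above (one common way to acquire data from analog, digital, and networked sensors) can be sketched as a small class hierarchy; the class and method names here are hypothetical, not SensorKit's actual API.

```python
import time
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Common interface hiding analog/digital/networked sensor details."""

    def __init__(self, name, units):
        self.name, self.units = name, units

    @abstractmethod
    def read_raw(self):
        """Return the current raw reading; subclasses know the hardware."""

    def sample(self):
        """Return one timestamped record in a uniform shape, whatever the
        underlying sensor type."""
        return {"sensor": self.name, "units": self.units,
                "value": self.read_raw(), "time": time.time()}

class AnalogSensor(Sensor):
    def __init__(self, name, units, channel, scale=1.0):
        super().__init__(name, units)
        self.channel, self.scale = channel, scale

    def read_raw(self):
        # placeholder for a real ADC read on self.channel
        return 0.0 * self.scale
```

    Downstream storage and transport code then only ever sees uniform records, which is what makes swapping sensor types transparent to the user.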

  7. On the importance of risk knowledge for an end-to-end tsunami early warning system

    NASA Astrophysics Data System (ADS)

    Post, Joachim; Strunz, Günter; Riedlinger, Torsten; Mück, Matthias; Wegscheider, Stephanie; Zosseder, Kai; Steinmetz, Tilmann; Gebert, Niklas; Anwar, Herryal

    2010-05-01

    context has been worked out. The generated results contribute significantly in the fields of (1) warning decision and warning levels, (2) warning dissemination and warning message content, (3) early warning chain planning, (4) increasing response capabilities and protective systems, (5) emergency relief and (6) enhancing communities' awareness and preparedness towards tsunami threats. Additionally, examples will be given of the potential of an operational use of risk information in early warning systems, as first experiences exist for the tsunami early warning center in Jakarta, Indonesia. Besides this, the importance of linking national-level early warning information with tsunami risk information available at the local level (e.g. linking warning message information on expected intensity with respective tsunami hazard zone maps at community level for effective evacuation) will be demonstrated through experiences gained in three pilot areas in Indonesia. The presentation seeks to provide new insights on the benefits of using risk information in early warning and will provide further evidence that practical use of risk information is an important and indispensable component of end-to-end early warning.

  8. Unidata's Vision for Providing Comprehensive and End-to-end Data Services

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.

    2009-05-01

    This paper presents Unidata's vision for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users, no matter where they are or how they are connected to the Internet, will be able to find and access a plethora of geosciences data and use Unidata-provided tools and services both productively and creatively in their research and education. What that vision means for the Unidata community is elucidated by drawing a simple analogy. Most users are familiar with the Amazon and eBay e-commerce sites and content-sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods and millions of customers not only buy those goods, but provide a review or opinion of the products they buy and share their experiences as purchasers. Similarly, YouTube and Flickr are sites tailored to video- and photo-sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites, together with social-networking applications like MySpace and Facebook, have enabled is a sense of a virtual community in which users can search and browse products or content, comment on and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases. Unidata believes that similar approaches, appropriately tailored to meet the needs of the scientific

  9. Dynamic Hop Service Differentiation Model for End-to-End QoS Provisioning in Multi-Hop Wireless Networks

    NASA Astrophysics Data System (ADS)

    Youn, Joo-Sang; Seok, Seung-Joon; Kang, Chul-Hee

    This paper presents a new QoS model for end-to-end service provisioning in multi-hop wireless networks. In legacy IEEE 802.11e-based multi-hop wireless networks, the fixed assignment of service classes according to a flow's priority at every node causes a priority inversion problem when performing end-to-end service differentiation. Thus, this paper proposes a new QoS provisioning model called Dynamic Hop Service Differentiation (DHSD) to alleviate the problem and support effective service differentiation between end-to-end nodes. Much previous work on QoS models based on 802.11e service differentiation focuses on packet scheduling over several service queues with different service rates and priorities. Our model, however, concentrates on a dynamic class selection scheme, called Per Hop Class Assignment (PHCA), in the node's MAC layer, which selects a proper service class for each packet, in accordance with queue states and service requirements, at every node along the packet's end-to-end route. The proposed QoS solution is evaluated using the OPNET simulator. The simulation results show that the proposed model outperforms both best-effort and 802.11e-based strict priority service models in mobile ad hoc environments.
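
    The Per Hop Class Assignment idea (choosing a service class per packet, per hop, from local queue state rather than fixing it end to end) can be sketched as follows. The access categories are the standard 802.11e EDCA ones, but the threshold rule is an illustrative simplification, not the paper's exact algorithm.

```python
# 802.11e EDCA access categories, highest priority first:
# voice, video, best effort, background.
ACS = ["AC_VO", "AC_VI", "AC_BE", "AC_BK"]

def assign_class(required_ac, queue_len, capacity=50):
    """Per-hop class assignment (illustrative): start from the class the
    flow nominally requires, but step down to a less-loaded class when the
    preferred queue at this node is congested, instead of fixing the class
    at every hop along the route."""
    start = ACS.index(required_ac)
    for ac in ACS[start:]:
        if queue_len[ac] < capacity:
            return ac
    return ACS[-1]  # all candidate queues congested: fall back to background
```

    Because each node re-evaluates the class against its own queue occupancy, a high-priority flow is no longer forced into a saturated queue at an intermediate hop, which is the situation that produces priority inversion.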

  10. Integration proposal through standard-based design of an end-to-end platform for p-Health environments.

    PubMed

    Martínez, I; Trigo, J D; Martínez-Espronceda, M; Escayola, J; Muñoz, P; Serrano, L; García, J

    2009-01-01

    Interoperability among medical devices and compute engines in the personal environment of the patient, and with healthcare information systems in the remote monitoring and management process, is a key need that requires developments supported by standards-based design. Even though there have been some international initiatives to combine different standards, the vision of an entire end-to-end standard-based system is the next challenge. This paper presents the implementation guidelines of a ubiquitous platform for Personal Health (p-Health). It is standard-based, using the two main medical standards in this context: ISO/IEEE 11073 in the patient environment for medical device interoperability, and EN13606 to allow the interoperable communication of the Electronic Healthcare Record of the patient. Furthermore, a new protocol for End-to-End Standard Harmonization (E2ESHP) is proposed in order to make end-to-end standard integration possible. The platform has been designed to comply with the latest available ISO/IEEE 11073 and EN13606 versions, and tested in a laboratory environment as a proof-of-concept to illustrate its feasibility as an end-to-end standard-based solution.

  11. A vision for end-to-end data services to foster international partnerships through data sharing

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M.; Yoksas, T.

    2009-04-01

    Increasingly, the conduct of science requires scientific partnerships and sharing of knowledge, information, and other assets. This is particularly true in our field where the highly-coupled Earth system and its many linkages have heightened the importance of collaborations across geographic, disciplinary, and organizational boundaries. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. As articulated in the NSF Strategic Plan: FY 2006-2011, "…discovery increasingly requires expertise of individuals from different disciplines, with diverse perspectives, and often from different nations, working together to accommodate the extraordinary complexity of today's science and engineering challenges." The Nobel Prize winning IPCC assessments are a prime example of such an effort. Earth science education is also uniquely suited to drawing connections between the dynamic Earth system and societal issues. Events like the 2004 Indian Ocean tsunami and Hurricane Katrina provide ample evidence of this relevance, as they underscore the importance of timely and interdisciplinary integration and synthesis of data. Our success in addressing such complex problems and advancing geosciences depends on the availability of a state-of-the-art and robust cyberinfrastructure, transparent and timely access to high-quality data from diverse sources, and requisite tools to integrate and use the data effectively, toward creating new knowledge. To that end, Unidata's vision calls for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are, how they are connected to the Internet, or what

  12. A high resolution spectrum reconstruction algorithm using compressive sensing theory

    NASA Astrophysics Data System (ADS)

    Zheng, Zhaoyu; Liang, Dakai; Liu, Shulin; Feng, Shuqing

    2015-07-01

    This paper proposes a quick spectrum scanning and reconstruction method using compressive sensing for composite structures. The strain field of a corrugated structure is simulated by finite element analysis, and the reflected spectrum is then calculated using an improved transfer matrix algorithm. A K-means singular value decomposition (K-SVD) sparse dictionary is trained. In the test, a spectrum with a limited number of sample points is obtained, and the high-resolution spectrum is reconstructed by solving the sparse representation equation. Compared with other conventional bases, this method performs better: the match rate between the recovered spectrum and the original spectrum is over 95%.
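
    The recovery step, solving a sparse representation equation from a limited number of measurements, can be illustrated with a generic Orthogonal Matching Pursuit sketch. This is not the authors' K-SVD pipeline; the random dictionary, sizes, seed, and support below are illustrative assumptions.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x with y = A @ x."""
    residual = y.copy()
    support = []
    for _ in range(k):
        # atom most correlated with the current residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares refit on the selected atoms
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
m, n, k = 30, 60, 3                      # measurements, dictionary size, sparsity
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)           # unit-norm dictionary atoms
x_true = np.zeros(n)
x_true[[5, 17, 33]] = [1.5, -2.0, 1.0]   # hypothetical sparse spectrum coefficients
y = A @ x_true                           # the "limited sample points"
x_hat = omp(A, y, k)
```

    In the noiseless, well-conditioned case sketched here, the greedy fit recovers the sparse coefficients essentially exactly.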

  13. Exploring the requirements for multimodal interaction for mobile devices in an end-to-end journey context.

    PubMed

    Krehl, Claudia; Sharples, Sarah

    2012-01-01

    The paper investigates the requirements for multimodal interaction on mobile devices in an end-to-end journey context. Traditional interfaces are deemed cumbersome and inefficient for exchanging information with the user. Multimodal interaction provides a different, user-centred approach, allowing for more natural and intuitive interaction between humans and computers. It is especially suitable for mobile interaction, as it can overcome additional constraints including small screens, awkward keypads, and continuously changing settings - an inherent property of mobility. This paper is based on end-to-end journeys, in which users encounter several contexts along the way. Interviews and focus groups explore the requirements for multimodal interaction design for mobile devices by examining journey stages and identifying the users' information needs and sources. Findings suggest that multimodal communication is crucial when users multitask. Choosing suitable modalities depends on user context, characteristics and tasks.

  14. POST2 End-To-End Descent and Landing Simulation for the Autonomous Landing and Hazard Avoidance Technology Project

    NASA Technical Reports Server (NTRS)

    Fisher, Jody L.; Striepe, Scott A.

    2007-01-01

    The Program to Optimize Simulated Trajectories II (POST2) is used as the basis for an end-to-end descent and landing trajectory simulation that is essential in determining the design and performance capability of lunar descent and landing system models and lunar environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This POST2-based ALHAT simulation provides descent and landing simulation capability by integrating lunar environment and lander system models (including terrain, sensor, guidance, navigation, and control models), along with the data necessary to design and operate a landing system for robotic, human, and cargo lunar-landing success. This paper presents the current and planned development and model validation of the POST2-based end-to-end trajectory simulation used for testing and performance evaluation of ALHAT project systems and models.

  15. End-to-end average BER analysis for multihop free-space optical communications with pointing errors

    NASA Astrophysics Data System (ADS)

    Sheng, Ming; Jiang, Peng; Hu, Qingsong; Su, Qin; Xie, Xiu-xiu

    2013-05-01

    This paper addresses the end-to-end average BER (ABER) performance of decode-and-forward (DF) relay free-space optical (FSO) communications over weak and strong turbulence channels with pointing errors. For the weak and strong turbulence channels, the probability distribution function (PDF) of the irradiance can be modeled by a lognormal and a Gamma-Gamma distribution, respectively. Considering the effects of atmospheric attenuation, turbulence and pointing errors, we present a statistical model for the optical intensity fluctuation at the receiver. Then the end-to-end ABER performance is analyzed and closed-form expressions are derived. The simulation results indicate that the derived closed-form expressions provide sufficiently accurate approximations.
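
    The core of such an analysis is averaging a conditional BER over the irradiance PDF. A minimal Monte Carlo sketch for the lognormal (weak-turbulence) case is shown below; the modulation model (conditional BER = Q(I*sqrt(SNR))), the unit-mean normalization, and all parameter values are illustrative assumptions, and the paper's Gamma-Gamma case and closed-form expressions are not reproduced.

```python
import math
import random

def q_func(x):
    """Gaussian Q-function, Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def aber_lognormal(snr_db, sigma_x, n_samples=200_000, seed=1):
    """Monte Carlo average BER over a lognormal turbulence channel.

    The irradiance I is lognormal with E[I] = 1 (mu = -sigma_x**2 / 2),
    and the conditional BER given I is taken as Q(I * sqrt(SNR))."""
    rng = random.Random(seed)
    snr = 10.0 ** (snr_db / 10.0)
    mu = -0.5 * sigma_x ** 2
    total = 0.0
    for _ in range(n_samples):
        irradiance = rng.lognormvariate(mu, sigma_x)
        total += q_func(irradiance * math.sqrt(snr))
    return total / n_samples
```

    Because Q is convex on its positive domain, the turbulence-averaged BER always exceeds the no-turbulence BER at the same mean SNR (Jensen's inequality).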

  16. End-to-End Demonstrator of the Safe Affordable Fission Engine (SAFE) 30: Power Conversion and Ion Engine Operation

    NASA Technical Reports Server (NTRS)

    Hrbud, Ivana; VanDyke, Melissa; Houts, Mike; Goodfellow, Keith; Schafer, Charles (Technical Monitor)

    2001-01-01

    The Safe Affordable Fission Engine (SAFE) test series addresses Phase 1 Space Fission Systems issues, in particular non-nuclear testing and system integration issues, leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shut-downs), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, a conversion system and a thruster, in which the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  17. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    ERIC Educational Resources Information Center

    Patrick, Christina M.

    2011-01-01

    This thesis presents a uniquely designed, end-to-end interference-minimizing, high-performance I/O stack that spans multi-level shared buffer cache hierarchies accessing shared I/O servers. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  18. Debris mitigation measures by satellite design and operational methods - Findings from the DLR space debris End-to-End Service

    NASA Astrophysics Data System (ADS)

    Sdunnus, H.; Beltrami, P.; Janovsky, R.; Koppenwallner, G.; Krag, H.; Reimerdes, H.; Schäfer, F.

    Debris mitigation has been recognised as an issue to be addressed by the space-faring nations around the world. Currently, various activities are under way, aiming at the establishment of debris mitigation guidelines on various levels, reaching from the UN down to national space agencies. Though guidelines established on the national level already provide concrete information on how things should be done (rather than specifying what should be done or providing fundamental principles), potential users of the guidelines will still need to explore the technical, management, and financial implications of the guidelines for their projects. Those questions are addressed by the so-called "Space Debris End-to-End Service" project, which has been initiated as a national initiative of the German Aerospace Centre (DLR). Based on a review of existing mitigation guidelines or guidelines under development, and following an identification of needs from a circle of industrial users, the "End-to-End Service Guidelines" have been established for designers and operators of spacecraft. The End-to-End Service Guidelines are based on requirements addressed by the mitigation guidelines and provide recommendations on how and when the technical consideration of the mitigation guidelines should take place. By referencing requirements from the mitigation guidelines, the End-to-End Service Guidelines address the consideration of debris mitigation measures in spacecraft design and operational measures. This paper will give an introduction to the End-to-End Service Guidelines. It will focus on the proposals made for mitigation measures in the spacecraft system design, i.e. on protective design measures inside the spacecraft and on design measures, e.g. innovative protective (shielding) systems. Furthermore, approaches to the analytical optimisation of protective systems will be presented, aiming at the minimisation of shield mass while preserving the protective effect. On the

  19. Spread Spectrum Visual Sensor Network Resource Management Using an End-to-End Cross-Layer Design

    DTIC Science & Technology

    2011-02-01

    APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED. PA Case Number: WPAFB-08-3693 DATE CLEARED: 11 JUNE 2008 13. SUPPLEMENTARY NOTES © 2011 IEEE...Elizabeth S. Bentley, Lisimachos P. Kondi, Member, IEEE, John D. Matyjas, Michael J. Medley, Senior Member, IEEE, and Bruce W. Suter Abstract—In this...was Dr. Qian Zhang. E. S. Bentley, J. D. Matyjas, M. J. Medley, and B. W. Suter are with the Air Force Research Laboratory, Rome, NY 13441 USA. L. P

  20. JADS JT&E: Phase 3 and Phase 4 Verification and Validation Report for the End-to-End Test

    DTIC Science & Technology

    1999-03-01

    Plan for the ETE Test and the Phase 3 Verification and Validation Plan for the End-to-End Test 1.2 Verification and Validation Tasks The V&V tasks...performed on 23 February and 13 March 1999 during Phase 3 were conducted on the T3 aircraft parked on the ramp and are described in the Phase 3 Verification...properly was to perform an abbreviated version of the Phase 3 V&V. There were also two V&V tasks that were either not completed or were not resolved when

  1. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.
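
    A toy version of the transit signal that ETEM simulates can be sketched as a box-shaped light-curve model. This is a deliberately crude stand-in: ETEM's astrophysics (limb darkening, stellar variability, noise, spacecraft systematics and data formatting) is far richer, and all parameter names below are illustrative.

```python
def transit_light_curve(times, period, t0, duration, depth):
    """Box-shaped transit model: relative flux dips by `depth` during transit."""
    flux = []
    for t in times:
        phase = (t - t0) % period
        dist = min(phase, period - phase)   # time from the nearest transit centre
        flux.append(1.0 - depth if dist < duration / 2.0 else 1.0)
    return flux

# Hypothetical example: 10-day period, 0.2-day transit, 1% depth
flux = transit_light_curve([0.0, 0.05, 2.5, 10.0],
                           period=10.0, t0=0.0, duration=0.2, depth=0.01)
```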

  2. End-to-End Study of the Transfer of Energy from Magnetosheath Ion Precipitation to the Cusp

    NASA Technical Reports Server (NTRS)

    Coffey, V. N.; Chandler, M. O.; Singh, Nagendra; Avanov, Levon

    2005-01-01

    This paper describes a study of the effects of unstable magnetosheath distributions on the cusp ionosphere. An end-to-end numerical model was used to study, first, the evolved distributions from precipitation due to reconnection and, second, the energy transfer into the high-latitude ionosphere based on these solar wind/magnetosheath inputs. Using inputs from several representative examples of magnetosheath injections, waves were generated at the lower hybrid frequency and energy was transferred to the ionospheric electrons and ions. The resulting wave spectra and ion and electron particle heating were analyzed. Keywords: ion heating; magnetosheath/ionosphere coupling; particle/wave interactions; simulations.

  3. End-to-end testing. [to verify electrical equipment failure due to carbon fibers released in aircraft-fuel fires

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1979-01-01

    The principal objective of the kinds of demonstration tests discussed is to verify whether or not carbon fibers released by burning composite parts in aircraft-fuel fires can produce failures in electrical equipment. A secondary objective is to experimentally validate the analytical models for some of the key elements in the risk analysis. The approach to this demonstration testing is twofold: limited end-to-end tests are to be conducted in a shock tube, and planning for some large outdoor burn tests is being done.

  4. End-to-end self-assembly of gold nanorods in isopropanol solution: experimental and theoretical studies

    NASA Astrophysics Data System (ADS)

    Gordel, M.; Piela, K.; Kołkowski, R.; Koźlecki, T.; Buckle, M.; Samoć, M.

    2015-12-01

    We describe here a modification of properties of colloidal gold nanorods (NRs) resulting from the chemical treatment used to carry out their transfer into isopropanol (IPA) solution. The NRs acquire a tendency to attach one to another by their ends (end-to-end assembly). We focus on the investigation of the change in position and shape of the longitudinal surface plasmon (l-SPR) band after self-assembly. The experimental results are supported by a theoretical calculation, which rationalizes the dramatic change in optical properties when the NRs are positioned end-to-end at short distances. The detailed spectroscopic characterization performed at the consecutive stages of transfer of the NRs from water into IPA solution revealed the features of the interaction between the polymers used as ligands and their contribution to the final stage, when the NRs were dispersed in IPA solution. The efficient method of aligning the NRs detailed here may facilitate applications of the self-assembled NRs as building blocks for optical materials and biological sensing.

  5. Context-driven, prescription-based personal activity classification: methodology, architecture, and end-to-end implementation.

    PubMed

    Xu, James Y; Chang, Hua-I; Chien, Chieh; Kaiser, William J; Pottie, Gregory J

    2014-05-01

    Enabling large-scale monitoring and classification of a range of motion activities is of primary importance due to the need by healthcare and fitness professionals to monitor exercises for quality and compliance. Past work has not fully addressed the unique challenges that arise from scaling. This paper presents a novel end-to-end system solution to some of these challenges. The system is built on the prescription-based context-driven activity classification methodology. First, we show that by refining the definition of context, and introducing the concept of scenarios, a prescription model can provide personalized activity monitoring. Second, through a flexible architecture constructed from interface models, we demonstrate the concept of a context-driven classifier. Context classification is achieved through a classification committee approach, and activity classification follows by means of context specific activity models. Then, the architecture is implemented in an end-to-end system featuring an Android application running on a mobile device, and a number of classifiers as core classification components. Finally, we use a series of experimental field evaluations to confirm the expected benefits of the proposed system in terms of classification accuracy, rate, and sensor operating life.
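
    The two-stage dispatch described above, a committee that classifies the context followed by a context-specific activity model, can be sketched generically. The committee members, feature names, and toy models below are hypothetical illustrations, not the paper's actual classifiers.

```python
class ContextDrivenClassifier:
    """Two-stage classification: a committee scores contexts, then the
    winning context's activity model classifies the sample."""

    def __init__(self, context_committee, activity_models):
        self.committee = context_committee   # list of (context, score_fn)
        self.models = activity_models        # context -> activity classifier

    def classify(self, features):
        # highest-scoring committee member selects the context
        context = max(self.committee, key=lambda member: member[1](features))[0]
        # dispatch to the context-specific activity model
        return context, self.models[context](features)

# Hypothetical toy committee and activity models for illustration only.
committee = [
    ("walking", lambda f: f["cadence"]),
    ("cycling", lambda f: f["pedal_rate"]),
]
models = {
    "walking": lambda f: "brisk" if f["cadence"] > 100 else "slow",
    "cycling": lambda f: "fast" if f["pedal_rate"] > 80 else "easy",
}
classifier = ContextDrivenClassifier(committee, models)
```

    The design point this mirrors is that adding a new context only means registering one committee scorer and one activity model, which is what makes the approach scale.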

  6. Effect of swirling flow on platelet concentration distribution in small-caliber artificial grafts and end-to-end anastomoses

    NASA Astrophysics Data System (ADS)

    Zhan, Fan; Fan, Yu-Bo; Deng, Xiao-Yan

    2011-10-01

    Platelet concentration near the blood vessel wall is one of the major factors in the adhesion of platelets to the wall. In our previous studies, it was found that swirling flows could suppress platelet adhesion in small-caliber artificial grafts and end-to-end anastomoses. In order to better understand the beneficial effect of the swirling flow, we numerically analyzed the near-wall concentration distribution of platelets in a straight tube and a sudden tubular expansion tube under both swirling flow and normal flow conditions. The numerical models were created based on our previous experimental studies. The simulation results revealed that when compared with the normal flow, the swirling flow could significantly reduce the near-wall concentration of platelets in both the straight tube and the expansion tube. The present numerical study therefore indicates that the reduction in platelet adhesion under swirling flow conditions in small-caliber arterial grafts, or in end-to-end anastomoses as observed in our previous experimental study, was possibly through a mechanism of platelet transport, in which the swirling flow reduced the near-wall concentration of platelets.

  7. Far-Infrared Therapy Promotes Nerve Repair following End-to-End Neurorrhaphy in Rat Models of Sciatic Nerve Injury

    PubMed Central

    Chen, Tai-Yuan; Yang, Yi-Chin; Sha, Ya-Na; Chou, Jiun-Rou

    2015-01-01

    This study employed a rat model of sciatic nerve injury to investigate the effects of postoperative low-power far-infrared (FIR) radiation therapy on nerve repair following end-to-end neurorrhaphy. The rat models were divided into the following 3 groups: (1) nerve injury without FIR biostimulation (NI/sham group); (2) nerve injury with FIR biostimulation (NI/FIR group); and (3) noninjured controls (normal group). Walking-track analysis results showed that the NI/FIR group exhibited significantly higher sciatic functional indices at 8 weeks after surgery (P < 0.05) compared with the NI/sham group. The decreased expression of CD4 and CD8 in the NI/FIR group indicated that FIR irradiation modulated the inflammatory process during recovery. Compared with the NI/sham group, the NI/FIR group exhibited a significant reduction in muscle atrophy (P < 0.05). Furthermore, histomorphometric assessment indicated that the nerves regenerated more rapidly in the NI/FIR group than in the NI/sham group; furthermore, the NI/FIR group regenerated neural tissue over a larger area, as well as nerve fibers of greater diameter and with thicker myelin sheaths. Functional recovery, inflammatory response, muscular reinnervation, and histomorphometric assessment all indicated that FIR radiation therapy can accelerate nerve repair following end-to-end neurorrhaphy of the sciatic nerve. PMID:25722734

  8. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.
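
    The "familiar Laplacian pyramid" mentioned above can be sketched in one dimension: each level stores the band-pass detail lost by low-pass filtering and decimation, plus a coarse tail, and the decomposition inverts exactly. The pairwise-average filter and sample-and-hold upsampling here are simplifying assumptions, not the Burt-Adelson kernels.

```python
def laplacian_pyramid(signal, levels=3):
    """Build a 1D Laplacian pyramid: band-pass residuals plus a coarse tail."""
    pyramid = []
    current = list(signal)
    for _ in range(levels):
        # low-pass by pairwise averaging, then decimate by 2
        low = [(current[2 * i] + current[2 * i + 1]) / 2.0
               for i in range(len(current) // 2)]
        # upsample (sample-and-hold) and keep the detail lost in decimation
        up = [v for v in low for _ in (0, 1)]
        pyramid.append([c - u for c, u in zip(current, up)])
        current = low
    pyramid.append(current)  # coarsest approximation
    return pyramid

def reconstruct(pyramid):
    """Invert the pyramid exactly by upsampling and adding back each band."""
    current = pyramid[-1]
    for band in reversed(pyramid[:-1]):
        up = [v for v in current for _ in (0, 1)]
        current = [u + b for u, b in zip(up, band)]
    return current
```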

  9. Influence of DBT reconstruction algorithm on power law spectrum coefficient

    NASA Astrophysics Data System (ADS)

    Vancamberg, Laurence; Carton, Ann-Katherine; Abderrahmane, Ilyes H.; Palma, Giovanni; Milioni de Carvalho, Pablo; Iordache, Rǎzvan; Muller, Serge

    2015-03-01

    In breast X-ray images, texture has been characterized by a noise power spectrum (NPS) that has an inverse power-law shape described by its slope β in the log-log domain. It has been suggested that the magnitude of the power-law spectrum coefficient β is related to mass lesion detection performance. We assessed β in reconstructed digital breast tomosynthesis (DBT) images to evaluate its sensitivity to different typical reconstruction algorithms including simple back projection (SBP), filtered back projection (FBP) and a simultaneous iterative reconstruction algorithm (SIRT 30 iterations). Results were further compared to the β coefficient estimated from 2D central DBT projections. The calculations were performed on 31 unilateral clinical DBT data sets and simulated DBT images from 31 anthropomorphic software breast phantoms. Our results show that β highly depends on the reconstruction algorithm; the highest β values were found for SBP, followed by reconstruction with FBP, while the lowest β values were found for SIRT. In contrast to previous studies, we found that β is not always lower in reconstructed DBT slices, compared to 2D projections and this depends on the reconstruction algorithm. All β values estimated in DBT slices reconstructed with SBP were larger than β values from 2D central projections. Our study also shows that the reconstruction algorithm affects the symmetry of the breast texture NPS; the NPS of clinical cases reconstructed with SBP exhibit the highest symmetry, while the NPS of cases reconstructed with SIRT exhibit the highest asymmetry.
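
    Estimating the slope β of an inverse power-law NPS reduces to a straight-line fit in log-log coordinates. A minimal sketch (the real estimation would work on radially averaged 2D spectra over a fitted frequency range; this just shows the log-log regression):

```python
import math

def power_law_slope(freqs, nps):
    """Estimate beta in NPS(f) ~ 1/f**beta by least squares in log-log space."""
    xs = [math.log(f) for f in freqs]
    ys = [math.log(p) for p in nps]
    n = float(len(xs))
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope  # NPS falls as f**(-beta), so beta is minus the fitted slope
```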

  10. Performances of the fractal iterative method with an internal model control law on the ESO end-to-end ELT adaptive optics simulator

    NASA Astrophysics Data System (ADS)

    Béchet, C.; Le Louarn, M.; Tallon, M.; Thiébaut, É.

    2008-07-01

    Adaptive optics systems under study for the Extremely Large Telescopes have given rise to a new generation of algorithms for both wavefront reconstruction and the control law. In the first place, the large number of controlled actuators imposes the use of computationally efficient methods. Secondly, the performance criterion is no longer solely based on nulling residual measurements; priors on turbulence must be inserted. In order to satisfy these two requirements, we suggested associating the Fractal Iterative Method for the estimation step with an Internal Model Control. This combination has now been tested on an end-to-end adaptive optics numerical simulator at ESO, named Octopus. Results are presented here and the performance of our method is compared to the classical matrix-vector multiplication combined with a pure integrator. In the light of a theoretical analysis of our control algorithm, we investigate the influence of several error contributions on our simulations. The reconstruction error varies with the signal-to-noise ratio but is limited by the use of priors. The ratio between the system loop delay and the wavefront coherence time also impacts the reachable Strehl ratio. Whereas no instabilities are observed, correction quality is obviously affected at low flux, when subaperture extinctions are frequent. Last but not least, the simulations have demonstrated the robustness of the method with respect to sensor modeling errors and actuator misalignments.
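
    The "pure integrator" baseline against which the authors compare can be illustrated with a scalar closed-loop toy: the command accumulates gain times residual, and the sensor sees the disturbance minus the applied correction. This is only the baseline control idea, not the matrix-vector reconstructor or the FrIM/IMC combination; gain and disturbance are illustrative.

```python
def closed_loop_residuals(disturbance, gain=0.5):
    """Scalar pure-integrator control loop.

    Each step the residual is disturbance minus the current command, and
    the command integrates gain * residual."""
    command = 0.0
    residuals = []
    for d in disturbance:
        residual = d - command
        residuals.append(residual)
        command += gain * residual  # integrator update
    return residuals
```

    For a constant disturbance the residual decays geometrically as (1 - gain)^k, which is why the loop gain trades convergence speed against noise propagation.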

  11. HITSZ_CDR: an end-to-end chemical and disease relation extraction system for BioCreative V

    PubMed Central

    Li, Haodi; Tang, Buzhou; Chen, Qingcai; Chen, Kai; Wang, Xiaolong; Wang, Baohua; Wang, Zhe

    2016-01-01

    In this article, an end-to-end system is proposed for the challenge task of disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction in BioCreative V, where DNER includes disease mention recognition (DMR) and normalization (DN). Evaluation on the challenge corpus showed that our system achieved F1-scores of 86.93% on DMR, 84.11% on DN, and 43.04% on CID relation extraction. The F1-score on DMR is higher than our previous one reported by the challenge organizers (86.76%), the highest F1-score of the challenge. Database URL: http://database.oxfordjournals.org/content/2016/baw077 PMID:27270713

  12. The MARS pathfinder end-to-end information system: A pathfinder for the development of future NASA planetary missions

    NASA Technical Reports Server (NTRS)

    Cook, Richard A.; Kazz, Greg J.; Tai, Wallace S.

    1996-01-01

    The development of the Mars Pathfinder is considered with emphasis on the End-to-End Information System (EEIS) development approach. The primary mission objective is to successfully develop and deliver a single flight system to the Martian surface, demonstrating entry, descent and landing. The EEIS is a set of functions distributed throughout the flight, ground and Mission Operation Systems (MOS) that inter-operate in order to control, collect, transport, process, store and analyze the uplink and downlink information flows of the mission. Coherence between the mission systems is achieved through the EEIS architecture. The key characteristics of the system are: a concurrent engineering approach for the development of flight, ground and mission operation systems; the fundamental EEIS architectural heuristics; a phased, incremental EEIS development and test approach; and an EEIS design deploying flight, ground and MOS operability features, including integrated ground- and flight-based toolsets.

  13. End-to-End Trajectory for Conjunction Class Mars Missions Using Hybrid Solar-Electric/Chemical Transportation System

    NASA Technical Reports Server (NTRS)

    Chai, Patrick R.; Merrill, Raymond G.; Qu, Min

    2016-01-01

    NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and solar-electric propulsion systems are used to deliver crew and cargo to exploration destinations. By combining chemical and solar-electric propulsion into a single spacecraft and applying each where it is most effective, the hybrid architecture enables a series of Mars trajectories that are more fuel efficient than an all chemical propulsion architecture without significant increases to trip time. The architecture calls for the aggregation of exploration assets in cislunar space prior to departure for Mars and utilizes high energy lunar-distant high Earth orbits for the final staging prior to departure. This paper presents the detailed analysis of various cislunar operations for the EMC Hybrid architecture as well as the result of the higher fidelity end-to-end trajectory analysis to understand the implications of the design choices on the Mars exploration campaign.

  14. NASA End-to-End Data System /NEEDS/ information adaptive system - Performing image processing onboard the spacecraft

    NASA Technical Reports Server (NTRS)

    Kelly, W. L.; Howle, W. M.; Meredith, B. D.

    1980-01-01

    The Information Adaptive System (IAS) is an element of the NASA End-to-End Data System (NEEDS) Phase II and is focused toward onboard image processing. Since the IAS is a data preprocessing system which is closely coupled to the sensor system, it serves as a first step in providing a 'smart' imaging sensor. Some of the functions planned for the IAS include sensor response nonuniformity correction, geometric correction, data set selection, data formatting, packetization, and adaptive system control. The inclusion of these sensor data preprocessing functions onboard the spacecraft will significantly improve the extraction of information from the sensor data in a timely and cost-effective manner and provide the opportunity to design sensor systems which can be reconfigured in near real time for optimum performance. The purpose of this paper is to present the preliminary design of the IAS and the plans for its development.
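
    Of the preprocessing functions listed, sensor response nonuniformity correction is the simplest to sketch. A common two-point form subtracts a dark (offset) frame and divides by a flat-field (gain) frame per pixel; this generic form is an assumption here, not the IAS's documented algorithm.

```python
def nonuniformity_correct(raw, dark, flat):
    """Two-point sensor correction: subtract the dark (offset) frame,
    then divide by the flat-field (gain) frame, pixel by pixel."""
    return [(r - d) / f for r, d, f in zip(raw, dark, flat)]

# Hypothetical 4-pixel line: unequal gains and offsets flatten to the same value
corrected = nonuniformity_correct(raw=[10.0, 20.0, 6.0, 12.0],
                                  dark=[2.0, 4.0, 2.0, 4.0],
                                  flat=[2.0, 4.0, 1.0, 2.0])
```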

  15. End-To-End Risk Assessment: From Genes and Protein to Acceptable Radiation Risks for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Schimmerling, Walter

    2000-01-01

    The human exploration of Mars will impose unavoidable health risks from galactic cosmic rays (GCR) and possibly solar particle events (SPE). It is the goal of NASA's Space Radiation Health Program to develop the capability to predict health risks with significant accuracy, to ensure that risks are well below acceptable levels and to allow mitigation approaches to be effective at reasonable cost. End-to-end risk assessment is the approach being followed to understand proton and heavy ion damage at the molecular, cellular, and tissue levels in order to predict the probability of the major health risks, including cancer, neurological disorders, hereditary effects, cataracts, and acute radiation sickness, and to develop countermeasures for mitigating risks.

  16. SU-E-T-282: Dose Measurements with An End-To-End Audit Phantom for Stereotactic Radiotherapy

    SciTech Connect

    Jones, R; Artschan, R; Thwaites, D; Lehmann, J

    2015-06-15

    Purpose: To report on dose measurements as part of an end-to-end test for stereotactic radiotherapy, using a new audit tool which allows audits to be performed efficiently either by an onsite team or as a postal audit. Methods: Film measurements have been performed with a new stereotactic cube phantom. The phantom has been designed to perform Winston-Lutz-type position verification measurements and dose measurements in one setup. It comprises a plastic cube with a high-density ball in its centre (used for MV imaging with film or EPID) and low-density markers in the periphery (used for Cone Beam Computed Tomography, CBCT, imaging). It also features strategically placed gold markers near the posterior and right surfaces, which can be used to calculate phantom rotations on MV images. Slit-like openings allow insertion of film or other detectors. The phantom was scanned and small-field treatment plans were created. The fields do not traverse any inhomogeneities of the phantom on their paths to the measurement location. The phantom was set up at the delivery system using CBCT imaging. The calculated treatment fields were delivered, each with a piece of radiochromic film (EBT3) placed in the anterior film holder of the phantom. MU had been selected in planning to achieve similar exposures on all films. Calibration films were exposed in solid water for dose levels around the expected doses. Films were scanned and analysed following established procedures. Results: Setup of the cube showed excellent suitability for CBCT 3D alignment. MV imaging with EPID allowed clear identification of all markers. Film-based dose measurements showed good agreement for MLC-created fields down to 0.5 mm × 0.5 mm. Conclusion: An end-to-end audit phantom for stereotactic radiotherapy has been developed and tested.

  17. Model selection applied to reconstruction of the Primordial Power Spectrum

    NASA Astrophysics Data System (ADS)

    Vázquez, J. Alberto; Bridges, M.; Hobson, M. P.; Lasenby, A. N.

    2012-06-01

    The preferred shape for the primordial spectrum of curvature perturbations is determined by performing a Bayesian model selection analysis of cosmological observations. We first reconstruct the spectrum modelled as piecewise linear in log k between nodes in k-space whose amplitudes and positions are allowed to vary. The number of nodes together with their positions are chosen by the Bayesian evidence, so that we can both determine the complexity supported by the data and locate any features present in the spectrum. In addition to the node-based reconstruction, we consider a set of parameterised models for the primordial spectrum: the standard power-law parameterisation, the spectrum produced from the Lasenby & Doran (LD) model and a simple variant parameterisation. By comparing the Bayesian evidence for different classes of spectra, we find the power-law parameterisation is significantly disfavoured by current cosmological observations, which show a preference for the LD model.
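
    The node-based model, piecewise linear in log k between movable nodes, can be sketched as a simple interpolator. Interpolating in (log k, log P) and using flat extrapolation outside the node range are assumptions made here for illustration; the paper's exact parameterisation and priors are set by the Bayesian evidence machinery.

```python
def node_spectrum(log_k, nodes):
    """Piecewise-linear spectrum: interpolate between (log_k, log_P) nodes,
    with flat extrapolation outside the node range (an assumed convention)."""
    if log_k <= nodes[0][0]:
        return nodes[0][1]
    if log_k >= nodes[-1][0]:
        return nodes[-1][1]
    for (x0, y0), (x1, y1) in zip(nodes, nodes[1:]):
        if x0 <= log_k <= x1:
            t = (log_k - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

# Hypothetical two-node spectrum: amplitude rises linearly in log k
nodes = [(0.0, 0.0), (1.0, 2.0)]
```

    Model selection then compares the evidence of this node model (for varying node counts) against parameterised alternatives such as the power law.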

  18. End-to-End Information System design at the NASA Jet Propulsion Laboratory. [data transmission between user and space-based sensor

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1978-01-01

    In recognition of a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote-space-based sensor, an end-to-end approach to the design of information systems has been adopted at the JPL. This paper reviews End-to-End Information System (EEIS) activity at the JPL, with attention given to the scope of the EEIS transfer function, and functional and physical elements of the EEIS. The relationship between the EEIS and the NASA End-to-End Data System program is discussed.

  19. Monte Carlo Estimate to Improve Photon Energy Spectrum Reconstruction

    NASA Astrophysics Data System (ADS)

    Sawchuk, S.

    Improvements to radiation treatment planning for cancer patients and quality control of medical linear accelerators (linacs) can be achieved with explicit knowledge of the photon energy spectrum. Monte Carlo (MC) simulations of linac treatment heads and experimental attenuation analysis are among the most popular ways of obtaining these spectra. Attenuation methods, which combine measurements under narrow beam geometry with the associated calculation techniques to reconstruct the spectrum from the acquired data, are very practical in a clinical setting and can also serve to validate MC simulations. A novel reconstruction method [1] which has been modified [2] utilizes Simpson's rule (SR) to approximate and discretize (1)
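
    As a generic illustration of the SR step (the method's actual integrand in Eq. (1) is not reproduced here), composite Simpson's rule can discretize a narrow-beam transmission integral. The toy spectrum, attenuation coefficient, and energy grid below are invented for illustration.

```python
import numpy as np

def simpson(y, x):
    """Composite Simpson's rule on a uniform grid with an even number
    of intervals (odd number of samples)."""
    h = x[1] - x[0]
    return h / 3.0 * (y[0] + y[-1] + 4.0 * y[1:-1:2].sum() + 2.0 * y[2:-1:2].sum())

# Toy transmission T(t) = integral S(E) exp(-mu(E) t) dE / integral S(E) dE
# for attenuator thickness t:
E = np.linspace(0.1, 6.0, 101)       # photon energy grid, MeV (hypothetical)
S = E * np.exp(-E)                   # toy bremsstrahlung-like spectrum
mu = 0.2 / np.sqrt(E)                # toy attenuation coefficient, cm^-1
t = 5.0                              # attenuator thickness, cm

T = simpson(S * np.exp(-mu * t), E) / simpson(S, E)
```

    Reconstruction then amounts to inverting a set of such transmission measurements at several thicknesses for the unknown discretized spectrum values.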

  20. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: A vision for the future

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.

    2006-05-01

    yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML - a widely accepted format for exchanging data and corresponding semantics over the Internet - enables applications, computer systems, and information processes to work together in fundamentally different ways. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational landscape, discuss recent developments in cyberinfrastructure, and Unidata's role in and vision for providing easy-to-use, robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  1. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: Unidata's Plans and Directions

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M.

    2005-12-01

    work together in a fundamentally different way. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational drivers and discuss recent developments in cyberinfrastructure and Unidata's role and directions in providing robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  2. An end-to-end software solution for the analysis of high-throughput single-cell migration data

    PubMed Central

    Masuzzo, Paola; Huyck, Lynn; Simiczyjew, Aleksandra; Ampe, Christophe; Martens, Lennart; Van Troys, Marleen

    2017-01-01

    The systematic study of single-cell migration requires the availability of software for assisting data inspection, quality control and analysis. This is especially important for high-throughput experiments, where multiple biological conditions are tested in parallel. Although the field of cell migration can count on different computational tools for cell segmentation and tracking, downstream data visualization, parameter extraction and statistical analysis are still left to the user and are currently not possible within a single tool. This article presents a completely new module for the open-source, cross-platform CellMissy software for cell migration data management. This module is the first tool to focus specifically on single-cell migration data downstream of image processing. It allows fast comparison across all tested conditions, providing automated data visualization, assisted data filtering and quality control, extraction of various commonly used cell migration parameters, and non-parametric statistical analysis. Importantly, the module enables parameter computation at both the trajectory and the step level. Moreover, this single-cell analysis module is complemented by a new data import module that accommodates multiwell plate data obtained from high-throughput experiments, and is easily extensible through a plugin architecture. In conclusion, the end-to-end software solution presented here tackles a key bioinformatics challenge in the cell migration field, assisting researchers in their high-throughput data processing. PMID:28205527

  3. Scaffold-integrated microchips for end-to-end in vitro tumor cell attachment and xenograft formation

    PubMed Central

    Lee, Jungwoo; Kohl, Nathaniel; Shanbhang, Sachin; Parekkadan, Biju

    2015-01-01

    Microfluidic technologies have substantially advanced cancer research by enabling the isolation of rare circulating tumor cells (CTCs) for diagnostic and prognostic purposes. The characterization of isolated CTCs has been limited due to the difficulty in recovering and growing isolated cells with high fidelity. Here, we present a strategy that uses a 3D scaffold, integrated into a microfluidic device, as a transferable substrate that can be readily isolated after device operation for serial use in vivo as a transplanted tissue bed. Hydrogel scaffolds were incorporated into a PDMS fluidic chamber prior to bonding and were rehydrated in the chamber after fluid contact. The hydrogel matrix completely filled the fluid chamber, significantly increasing the surface area to volume ratio, and could be directly visualized under a microscope. Computational modeling defined different flow and pressure regimes that guided the conditions used to operate the chip. As a proof of concept using a model cell line, we confirmed human prostate tumor cell attachment in the microfluidic scaffold chip, retrieval of the scaffold en masse, and serial implantation of the scaffold to a mouse model with preserved xenograft development. With further improvement in capture efficiency, this approach can offer an end-to-end platform for the continuous study of isolated cancer cells from a biological fluid to a xenograft in mice. PMID:26709385

  4. Scaffold-integrated microchips for end-to-end in vitro tumor cell attachment and xenograft formation.

    PubMed

    Lee, Jungwoo; Kohl, Nathaniel; Shanbhang, Sachin; Parekkadan, Biju

    2015-12-01

    Microfluidic technologies have substantially advanced cancer research by enabling the isolation of rare circulating tumor cells (CTCs) for diagnostic and prognostic purposes. The characterization of isolated CTCs has been limited due to the difficulty in recovering and growing isolated cells with high fidelity. Here, we present a strategy that uses a 3D scaffold, integrated into a microfluidic device, as a transferable substrate that can be readily isolated after device operation for serial use in vivo as a transplanted tissue bed. Hydrogel scaffolds were incorporated into a PDMS fluidic chamber prior to bonding and were rehydrated in the chamber after fluid contact. The hydrogel matrix completely filled the fluid chamber, significantly increasing the surface area to volume ratio, and could be directly visualized under a microscope. Computational modeling defined different flow and pressure regimes that guided the conditions used to operate the chip. As a proof of concept using a model cell line, we confirmed human prostate tumor cell attachment in the microfluidic scaffold chip, retrieval of the scaffold en masse, and serial implantation of the scaffold to a mouse model with preserved xenograft development. With further improvement in capture efficiency, this approach can offer an end-to-end platform for the continuous study of isolated cancer cells from a biological fluid to a xenograft in mice.

  5. Hardware and Methods of the Optical End-to-End Test of the Far Ultraviolet Spectroscopic Explorer (FUSE)

    NASA Technical Reports Server (NTRS)

    Conard, Steven J.; Redman, Kevin W.; Barkhouser, Robert H.; McGuffey, Doug B.; Smee, Stephen; Ohl, Raymond G.; Kushner, Gary

    1999-01-01

    The Far Ultraviolet Spectroscopic Explorer (FUSE), currently being tested and scheduled for a 1999 launch, is an astrophysics satellite designed to provide high spectral resolving power (λ/Δλ = 24,000-30,000) over the interval 90.5-118.7 nm. The FUSE optical path consists of four co-aligned, normal incidence, off-axis parabolic, primary mirrors which illuminate separate Rowland circle spectrograph channels equipped with holographic gratings and delay line microchannel plate detectors. We describe the hardware and methods used for the optical end-to-end test of the FUSE instrument during satellite integration and test. Cost and schedule constraints forced us to devise a simplified version of the planned optical test which occurred in parallel with satellite thermal-vacuum testing. The optical test employed a collimator assembly which consisted of four co-aligned, 15" Cassegrain telescopes which were positioned above the FUSE instrument, providing a collimated beam for each optical channel. A windowed UV light source, remotely adjustable in three axes, was mounted at the focal plane of each collimator. Problems with the UV light sources, including high F-number and window failures, were the only major difficulties encountered during the test. The test succeeded in uncovering a significant problem with the secondary structure used for the instrument closeout cavity and, furthermore, showed that the mechanical solution was successful. The hardware was also used extensively for simulations of science observations, providing both UV light for spectra and visible light for the fine error sensor camera.

  6. End-to-end and side-by-side assemblies of gold nanorods induced by dithiol poly(ethylene glycol)

    NASA Astrophysics Data System (ADS)

    Liu, Jinsheng; Kan, Caixia; Li, Yuling; Xu, Haiying; Ni, Yuan; Shi, Daning

    2014-06-01

    The assemblies of gold nanorods (Au NRs) exhibit unique properties distinct from those of isolated Au NRs. We report an effective and simple method for the end-to-end (E-E) and side-by-side (S-S) assembly of Au NRs with a molecularly defined nanogap (1-2 nm) in the presence of only dithiol poly(ethylene glycol) (HS-PEG-SH). The method requires neither pH adjustment nor the addition of organic solvent. With increasing amounts of dithiol molecules, the assembly mode of the Au NRs undergoes an interesting transition from E-E to S-S orientation. The experimental results indicate that when the concentration of HS-PEG-SH is less than 0.25 μM, the electrostatic repulsion of positively charged CTA+ is stronger than the affinity of the Au-S binding, resulting in E-E oriented assembly; otherwise, the S-S oriented mode predominates. This assembly method is potentially useful for optoelectronics and biomedical engineering.

  7. A novel end-to-end fault detection and localization protocol for wavelength-routed WDM networks

    NASA Astrophysics Data System (ADS)

    Zeng, Hongqing; Vukovic, Alex; Huang, Changcheng

    2005-09-01

    Wavelength division multiplexing (WDM) networks are becoming prevalent in telecommunications. However, because of the high data rates and the increasing number and density of wavelengths, even a very short service disruption caused by a network fault can lead to high data loss. Network survivability is therefore critical and has been intensively studied; fault detection and localization is a vital part of it but has received disproportionately little attention. In this paper we describe and analyze an end-to-end lightpath fault detection scheme in the data plane with fault notification in the control plane. The endeavor is focused on reducing the fault detection time. In this protocol, the source node of each lightpath keeps sending hello packets to the destination node, exactly following the path of the data traffic. The destination node generates an alarm once a certain number of consecutive hello packets are missed within a given time period. The network management unit then collects all alarms and locates the fault based on the network topology, and sends fault notification messages via the control plane to either the source node or all upstream nodes along the lightpath. The performance evaluation shows that such a protocol can achieve fast fault detection while the overhead added to the user data by hello packets remains negligible.
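
    The destination-side alarm logic amounts to counting consecutively missed hello packets. A minimal sketch follows; the class name and threshold are illustrative choices, not the paper's actual protocol parameters.

```python
from dataclasses import dataclass

@dataclass
class HelloMonitor:
    """Raise an alarm after `threshold` consecutive missed hello packets."""
    threshold: int
    missed: int = 0
    alarm: bool = False

    def on_interval(self, hello_received: bool) -> bool:
        # Called once per expected hello interval at the destination node.
        if hello_received:
            self.missed = 0
        else:
            self.missed += 1
            if self.missed >= self.threshold:
                self.alarm = True   # latched until cleared by management
        return self.alarm

mon = HelloMonitor(threshold=3)
events = [True, True, False, False, True, False, False, False]
alarms = [mon.on_interval(e) for e in events]
```

    The threshold trades detection speed against false alarms from isolated packet loss; the paper's evaluation concerns exactly this trade-off between detection time and hello-packet overhead.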

  8. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    NASA Astrophysics Data System (ADS)

    Joshi, Parag

    2006-02-01

    Publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built with the model of human driven execution. Another challenge is to orchestrate various components in the publishing and print production pipeline such that they work in a seamless manner to enable the system to perform automatic detection of potential failures and take corrective actions in a proactive manner. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  9. End-to-end simulation of high-contrast imaging systems: methods and results for the PICTURE mission family

    NASA Astrophysics Data System (ADS)

    Douglas, Ewan S.; Hewasawam, Kuravi; Mendillo, Christopher B.; Cahoy, Kerri L.; Cook, Timothy A.; Finn, Susanna C.; Howe, Glenn A.; Kuchner, Marc J.; Lewis, Nikole K.; Marinan, Anne D.; Mawet, Dimitri; Chakrabarti, Supriya

    2015-09-01

    We describe a set of numerical approaches to modeling the performance of space flight high-contrast imaging payloads. Mission design for high-contrast imaging requires numerical wavefront error propagation to ensure accurate component specifications. For constructed instruments, wavelength and angle-dependent throughput and contrast models allow detailed simulations of science observations, allowing mission planners to select the most productive science targets. The PICTURE family of missions seeks to quantify the optical brightness of scattered light from extrasolar debris disks via several high-contrast imaging techniques: sounding rocket (the Planet Imaging Concept Testbed Using a Rocket Experiment) and balloon flights of a visible nulling coronagraph, as well as a balloon flight of a vector vortex coronagraph (the Planetary Imaging Concept Testbed Using a Recoverable Experiment - Coronagraph, PICTURE-C). The rocket mission employs an on-axis 0.5m Gregorian telescope, while the balloon flights will share an unobstructed off-axis 0.6m Gregorian. This work details the flexible approach to polychromatic, end-to-end physical optics simulations used for both the balloon vector vortex coronagraph and rocket visible nulling coronagraph missions. We show the preliminary PICTURE-C telescope and vector vortex coronagraph design will achieve 10⁻⁸ contrast without post-processing as limited by realistic optics, but not considering polarization or low-order errors. Simulated science observations of the predicted warm ring around Epsilon Eridani illustrate the performance of both missions.

  10. Reconstruction of the primordial power spectrum from CMB data

    SciTech Connect

    Guo, Zong-Kuan; Zhang, Yuan-Zhong; Schwarz, Dominik J. E-mail: dschwarz@physik.uni-bielefeld.de

    2011-08-01

    Measuring the deviation from scale invariance of the primordial power spectrum is a critical test of inflation. In this paper we reconstruct the shape of the primordial power spectrum of curvature perturbations from the cosmic microwave background data, including the 7-year Wilkinson Microwave Anisotropy Probe data and the Atacama Cosmology Telescope 148 GHz data, by using a binning method of a cubic spline interpolation in log-log space. We find that the power-law spectrum is preferred by the data and that the Harrison-Zel'dovich spectrum is disfavored at 95% confidence level. These conclusions hold with and without allowing for tensor modes, however the simpler model without tensors is preferred by the data. We do not find evidence for a feature in the primordial power spectrum — in full agreement with generic predictions from cosmological inflation.
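
    The binning method can be illustrated with a natural cubic spline through bin nodes in log-log space. This is a from-scratch sketch with invented bin values; the paper's actual binning, priors, and likelihood are not reproduced here.

```python
import numpy as np

def log_log_spline(k_bins, P_bins, k_eval):
    """Reconstruct P(k) between bin nodes with a natural cubic spline
    fitted in (log k, log P) space."""
    x, y = np.log(k_bins), np.log(P_bins)
    n = len(x)
    h = np.diff(x)
    # Solve for the spline's second derivatives M (natural: M0 = Mn-1 = 0).
    A = np.zeros((n, n))
    r = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = h[i - 1], 2 * (h[i - 1] + h[i]), h[i]
        r[i] = 6 * ((y[i + 1] - y[i]) / h[i] - (y[i] - y[i - 1]) / h[i - 1])
    M = np.linalg.solve(A, r)
    # Evaluate the piecewise cubic on each segment.
    xe = np.log(k_eval)
    i = np.clip(np.searchsorted(x, xe) - 1, 0, n - 2)
    t = xe - x[i]
    ye = (y[i] + t * ((y[i + 1] - y[i]) / h[i] - h[i] * (2 * M[i] + M[i + 1]) / 6)
          + t ** 2 * M[i] / 2 + t ** 3 * (M[i + 1] - M[i]) / (6 * h[i]))
    return np.exp(ye)

k_bins = np.array([1e-4, 1e-3, 1e-2, 1e-1])          # hypothetical node positions, Mpc^-1
P_bins = np.array([2.4e-9, 2.3e-9, 2.2e-9, 2.1e-9])  # hypothetical bin amplitudes
P = log_log_spline(k_bins, P_bins, np.array([3e-4, 3e-3, 3e-2]))
```

    In the actual analysis the bin amplitudes are free parameters constrained by the CMB likelihood; a power law corresponds to bin values lying on a straight line in this log-log space.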

  11. Does the end-to-end venous anastomosis offer a functional advantage over the end-to-side venous anastomosis in high-output arteriovenous grafts?

    PubMed

    Fillinger, M F; Kerns, D B; Bruch, D; Reinitz, E R; Schwartz, R A

    1990-12-01

    This study explores the hemodynamics, mechanics, and biologic response of end-to-end versus end-to-side venous anastomoses in a canine arteriovenous graft model. Femoral polytetrafluoroethylene grafts were implanted bilaterally in a paired fashion (n = 22). Detailed local hemodynamic measurements were made by use of color Doppler ultrasound imaging at 1, 4, 8, and 12 weeks after implant. Measurements included volumetric flow rate and Doppler-derived spectral window (percent window) as a measure of turbulence. Amplitude and velocity of vessel wall movement were also measured. Volume of perivascular tissue vibration quantitated kinetic energy transfer through the vessel wall. Volumetric flow rate (end to end, 1013 +/- 70 ml/min; end to side, 1015 +/- 72 ml/min), percent window (end to end, 6.6% +/- 0.6%; end to side, 5.6% +/- 0.4%) and volume of perivascular tissue vibration (end to end, 19.6 +/- 1.2 ml; end to side, 16.3 +/- 1.8 ml) were statistically equivalent in the two graft types (end to end vs end to side p greater than 0.05). Both graft types developed venous intimal-medial thickening of a similar magnitude: end to end, 0.35 +/- 0.05 mm; end to side, 0.43 +/- 0.09 mm; normal vein, 0.070 +/- 0.004 mm (analysis of variance [ANOVA] p less than 0.001; p less than 0.01 for end to end or end to side vs control; end to end vs end to side p greater than 0.05 by Student-Newman-Keuls test). The best correlations with venous intimal-medial thickening were obtained from inverse percent window (r = 0.84, p less than 0.001) and volume of perivascular tissue vibration (r = 0.68, p less than 0.001). In the end to end configuration the relative amplitude of venous wall movement decreased, and the relative velocity of wall motion increased over time. We conclude that in the circumstances of this high flow arteriovenous graft model the end-to-end venous anastomosis does not significantly differ from the end-to-side venous anastomosis in terms of flow stability, turbulence, or

  12. Results from Solar Reflective Band End-to-End Testing for VIIRS F1 Sensor Using T-SIRCUS

    NASA Technical Reports Server (NTRS)

    McIntire, Jeff; Moyer, David; McCarthy, James K.; DeLuccia, Frank; Xiong, Xiaoxiong; Butler, James J.; Guenther, Bruce

    2011-01-01

    Verification of the Visible Infrared Imager Radiometer Suite (VIIRS) End-to-End (E2E) sensor calibration is highly recommended before launch, to identify any anomalies and to improve our understanding of the sensor on-orbit calibration performance. E2E testing of the Reflective Solar Bands (RSB) calibration cycle was performed pre-launch for the VIIRS Flight 1 (F1) sensor at the Ball Aerospace facility in Boulder, CO in March 2010. The VIIRS reflective band calibration cycle is very similar to that of the heritage sensor MODIS in that solar illumination, via a diffuser, is used to correct for temporal variations in the instrument responsivity. Monochromatic light from the NIST T-SIRCUS was used to illuminate both the Earth View (EV), via an integrating sphere, and the Solar Diffuser (SD) view, through a collimator. The collimator illumination was cycled through a series of angles intended to simulate the range of possible angles for which solar radiation will be incident on the solar attenuation screen on-orbit. Ideally, the measured instrument responsivity (defined here as the ratio of the detector response to the at-sensor radiance) should be the same whether the EV or SD view is illuminated. The ratio of the measured responsivities was determined at each collimator angle and wavelength. In addition, the Solar Diffuser Stability Monitor (SDSM), a ratioing radiometer designed to track the temporal variation in the SD BRF by direct comparison to solar radiation, was illuminated by the collimator. The measured SDSM ratio was compared to the predicted ratio. An uncertainty analysis was also performed on both the SD and SDSM calibrations.

  13. End-to-end sensor simulation for spectral band selection and optimization with application to the Sentinel-2 mission.

    PubMed

    Segl, Karl; Richter, Rudolf; Küster, Theres; Kaufmann, Hermann

    2012-02-01

    An end-to-end sensor simulation is a suitable tool for predicting a sensor's performance over a range of conditions that cannot easily be measured. In this study, such a tool has been developed that enables the assessment of the optimum spectral resolution configuration of a sensor based on key applications. It employs the spectral molecular absorption and scattering properties of materials that are used for the identification and determination of the abundances of surface and atmospheric constituents and their interdependence on spatial resolution and signal-to-noise ratio as a basis for the detailed design and consolidation of spectral bands for the future Sentinel-2 sensor. The developed tools allow the computation of synthetic Sentinel-2 spectra that form the frame for the subsequent twofold analysis of bands in the atmospheric absorption and window regions. One part of the study comprises the assessment of optimal spatial and spectral resolution configurations for those bands used for atmospheric correction, optimized with regard to the retrieval of aerosols, water vapor, and the detection of cirrus clouds. The second part of the study presents the optimization of thematic bands, mainly driven by the spectral characteristics of vegetation constituents and minerals. The investigation is performed for different wavelength ranges because most remote sensing applications require the use of specific band combinations rather than single bands. The results from the important "red-edge" and the "short-wave infrared" domains are presented. The recommended optimum spectral design predominantly confirms the sensor parameters given by the European Space Agency. The system is capable of retrieving atmospheric and geobiophysical parameters with enhanced quality compared to existing multispectral sensors. Minor spectral changes of single bands are discussed in the context of typical remote sensing applications, supplemented by the recommendation of a few new bands for

  14. COMPUTATIONAL SIMULATIONS DEMONSTRATE ALTERED WALL SHEAR STRESS IN AORTIC COARCTATION PATIENTS TREATED BY RESECTION WITH END-TO-END ANASTOMOSIS

    PubMed Central

    LaDisa, John F.; Dholakia, Ronak J.; Figueroa, C. Alberto; Vignon-Clementel, Irene E.; Chan, Frandics P.; Samyn, Margaret M.; Cava, Joseph R.; Taylor, Charles A.; Feinstein, Jeffrey A.

    2011-01-01

    Background Atherosclerotic plaque in the descending thoracic aorta (dAo) is related to altered wall shear stress (WSS) for normal patients. Resection with end-to-end anastomosis (RWEA) is the gold standard for coarctation of the aorta (CoA) repair, but may lead to altered WSS indices that contribute to morbidity. Methods Computational fluid dynamics (CFD) models were created from imaging and blood pressure data for control subjects and age- and gender-matched CoA patients treated by RWEA (4 male, 2 female, 15±8 years). CFD analysis incorporated downstream vascular resistance and compliance to generate blood flow velocity, time-averaged WSS (TAWSS) and oscillatory shear index (OSI) results. These indices were quantified longitudinally and circumferentially in the dAo, and several visualization methods were used to highlight regions of potential hemodynamic susceptibility. Results The total dAo area exposed to subnormal TAWSS and OSI was similar between groups, but several statistically significant local differences were revealed. Control subjects experienced left-handed rotating patterns of TAWSS and OSI down the dAo. TAWSS was elevated in CoA patients near the site of residual narrowings and OSI was elevated distally, particularly along the left dAo wall. Differences in WSS indices between groups were negligible more than 5 dAo diameters distal to the aortic arch. Conclusions Localized differences in WSS indices within the dAo of CoA patients treated by RWEA suggest that plaque may form in unique locations influenced by the surgical repair. These regions can be visualized in familiar and intuitive ways allowing clinicians to track their contribution to morbidity in longitudinal studies. PMID:21801315
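
    The WSS indices referenced above have standard definitions: TAWSS is the time average of the WSS magnitude over a cardiac cycle, and OSI = 0.5 (1 - |integral of tau dt| / integral of |tau| dt). A sketch with synthetic WSS traces (the traces and time step are invented for illustration):

```python
import numpy as np

def wss_indices(tau, dt):
    """TAWSS and OSI from a wall-shear-stress vector time series
    tau of shape (N_t, 3), sampled at uniform step dt over one cycle."""
    T = dt * len(tau)
    int_abs = np.linalg.norm(tau, axis=1).sum() * dt     # integral of |tau| dt
    abs_int = np.linalg.norm(tau.sum(axis=0) * dt)       # |integral of tau dt|
    return int_abs / T, 0.5 * (1.0 - abs_int / int_abs)

# Synthetic traces: unidirectional WSS gives OSI = 0; fully reversing
# WSS gives OSI close to 0.5.
t = np.linspace(0.0, 1.0, 100, endpoint=False)
uni = np.column_stack([np.ones_like(t), 0 * t, 0 * t])
rev = np.column_stack([np.sin(2 * np.pi * t), 0 * t, 0 * t])
tawss_uni, osi_uni = wss_indices(uni, 0.01)
_, osi_rev = wss_indices(rev, 0.01)
```

    In a CFD post-processing pipeline these quantities are evaluated per surface node from the simulated WSS vectors, then summarized longitudinally and circumferentially as in the study above.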

  15. Designing an End-to-End System for Data Storage, Analysis, and Visualization for an Urban Environmental Observatory

    NASA Astrophysics Data System (ADS)

    McGuire, M. P.; Welty, C.; Gangopadhyay, A.; Karabatis, G.; Chen, Z.

    2006-05-01

    The urban environment is formed by complex interactions between natural and human dominated systems, the study of which requires the collection and analysis of very large datasets that span many disciplines. Recent advances in sensor technology and automated data collection have improved the ability to monitor urban environmental systems and are making the idea of an urban environmental observatory a reality. This in turn has created a number of potential challenges in data management and analysis. We present the design of an end-to-end system to store, analyze, and visualize data from a prototype urban environmental observatory based at the Baltimore Ecosystem Study, a National Science Foundation Long Term Ecological Research site (BES LTER). We first present an object-relational design of an operational database to store high resolution spatial datasets as well as data from sensor networks, archived data from the BES LTER, data from external sources such as USGS NWIS, EPA Storet, and metadata. The second component of the system design includes a spatiotemporal data warehouse consisting of a data staging plan and a multidimensional data model designed for the spatiotemporal analysis of monitoring data. The system design also includes applications for multi-resolution exploratory data analysis, multi-resolution data mining, and spatiotemporal visualization based on the spatiotemporal data warehouse. It further includes interfaces with water quality models such as HSPF, SWMM, and SWAT, and applications for real-time sensor network visualization, data discovery, data download, QA/QC, and backup and recovery, all of which are based on the operational database. Both internet and workstation-based interfaces are included. Finally, we present the design of a laboratory for spatiotemporal analysis and visualization as well as real-time monitoring of the sensor network.

  16. End-to-end simulations and planning of small space telescopes: Galaxy Evolution Spectroscopic Explorer: a case study

    NASA Astrophysics Data System (ADS)

    Heap, Sara; Folta, David; Gong, Qian; Howard, Joseph; Hull, Tony; Purves, Lloyd

    2016-08-01

    Large astronomical missions are usually general-purpose telescopes with a suite of instruments optimized for different wavelength regions, spectral resolutions, etc. Their end-to-end (E2E) simulations are typically photons-in to flux-out calculations made to verify that each instrument meets its performance specifications. In contrast, smaller space missions are usually single-purpose telescopes, and their E2E simulations start with the scientific question to be answered and end with an assessment of the effectiveness of the mission in answering the scientific question. Thus, E2E simulations for small missions consist of a longer string of calculations than for large missions, as they include not only the telescope and instrumentation, but also the spacecraft, orbit, and external factors such as coordination with other telescopes. Here, we illustrate the strategy and organization of small-mission E2E simulations using the Galaxy Evolution Spectroscopic Explorer (GESE) as a case study. GESE is an Explorer/Probe-class space mission concept with the primary aim of understanding galaxy evolution. Operation of a small survey telescope in space like GESE is usually simpler than that of large telescopes, which are driven by the varied scientific programs of the observers or by transient events. Nevertheless, both types of telescopes share two common challenges: maximizing the integration time on target, while minimizing operation costs including communication costs and staffing on the ground. We show in the case of GESE how these challenges can be met through a custom orbit and a system design emphasizing simplification and leveraging information from ground-based telescopes.

  17. WE-G-BRD-08: End-To-End Targeting Accuracy of the Gamma Knife for Trigeminal Neuralgia

    SciTech Connect

    Brezovich, I; Wu, X; Duan, J; Benhabib, S; Huang, M; Shen, S; Cardan, R; Popple, R

    2014-06-15

    Purpose: Current QA procedures verify accuracy of individual equipment parameters, but may not include CT and MRI localizers. This study uses an end-to-end approach to measure the overall targeting errors in individual patients previously treated for trigeminal neuralgia. Methods: The trigeminal nerve is simulated by a 3 mm long, 3.175 mm (1/8 inch) diameter MRI contrast-filled cavity embedded within a PMMA plastic capsule. The capsule is positioned within the head frame such that the cavity position matches the Gamma Knife coordinates of 10 previously treated patients. Gafchromic EBT2 film is placed at the center of the cavity in coronal and sagittal orientations. The films are marked with a pin prick to identify the cavity center. Treatments are planned for delivery with 4 mm collimators using MRI and CT scans acquired with the clinical localizer boxes and acquisition protocols. Coordinates of shots are chosen so that the cavity is centered within the 50% isodose volume. Following irradiation, the films are scanned and analyzed. Targeting errors are defined as the distance between the pin prick and the centroid of the 50% isodose line. Results: Averaged over 10 patient simulations, targeting errors along the x, y and z coordinates (patient left-to-right, posterior-anterior, head-to-foot) were, respectively, −0.060 ± 0.363, −0.350 ± 0.253, and 0.364 ± 0.191 mm when MRI was used for treatment planning. Planning according to CT exhibited generally smaller errors, namely 0.109 ± 0.167, −0.191 ± 0.144, and 0.211 ± 0.94 mm. The largest errors in MRI and CT planned treatments were, respectively, y = −0.761 and x = 0.428 mm. Conclusion: Unless patient motion or stronger MRI image distortion in actual treatments caused additional errors, all patients received the prescribed dose, i.e., the targeted section of the trigeminal nerve was contained within the 50% isodose surface in all cases.
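
    The targeting-error metric reduces to a centroid calculation on the scanned film. A simplified sketch on a synthetic dose image follows; the grid size, spot position, and pixel pitch are invented for illustration.

```python
import numpy as np

def targeting_error(dose, pixel_mm, prick_px):
    """Distance in mm between the pin-prick pixel and the centroid of
    the region enclosed by the 50% isodose line."""
    ys, xs = np.nonzero(dose >= 0.5 * dose.max())
    centroid = np.array([ys.mean(), xs.mean()])
    return float(np.linalg.norm((centroid - np.asarray(prick_px)) * pixel_mm))

# Synthetic Gaussian dose spot centred at pixel (60, 50):
yy, xx = np.mgrid[0:101, 0:101]
dose = np.exp(-((yy - 60.0) ** 2 + (xx - 50.0) ** 2) / (2 * 8.0 ** 2))
err_on_target = targeting_error(dose, pixel_mm=0.1, prick_px=(60, 50))
err_offset = targeting_error(dose, pixel_mm=0.1, prick_px=(60, 55))
```

    On real film the pin prick is digitized by hand and the 50% isodose region comes from the calibrated dose map, but the geometric reduction to a centroid-to-point distance is the same.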

  18. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    NASA Astrophysics Data System (ADS)

    Arndt, J.; Kreimer, J.

    2010-09-01

The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation, this manned laboratory forms part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes as foreseen by the Mission Increment planning but also those configuration changes on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and changes - most challenging since unpredictable - due to on-orbit anomalies. The safety verification covers, on the one hand, the on-orbit configuration itself, including the hardware and software products, and on the other hand the related ground facilities needed for commanding of, and communication with, the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process rigid enough to provide reliable verification of on-board safety while likewise providing enough

  19. Investigating end-to-end accuracy of image guided radiation treatment delivery using a micro-irradiator

    NASA Astrophysics Data System (ADS)

    Rankine, L. J.; Newton, J.; Bache, S. T.; Das, S. K.; Adamovics, J.; Kirsch, D. G.; Oldham, M.

    2013-11-01

    irradiator was verified to be within 0.5 mm (or 1.0 mm for the 5.0 mm cone) and the cone alignment was verified to be within 0.2 mm (or 0.4 mm for the 1.0 mm cone). The PRESAGE®/DMOS system proved valuable for end-to-end verification of small field IGRT capabilities.

  20. SBSS Demonstrator: A design for efficient demonstration of Space-based Space Surveillance end-to-end capabilities

    NASA Astrophysics Data System (ADS)

    Utzmann, Jens; Flohrer, Tim; Schildknecht, Thomas; Wagner, Axel; Silha, Jiri; Willemsen, Philip; Teston, Frederic

This paper presents the capabilities of a Space-Based Space Surveillance (SBSS) demonstration mission for Space Surveillance and Tracking (SST) based on a micro-satellite platform. The results have been produced in the frame of ESA’s "Assessment Study for Space Based Space Surveillance Demonstration Mission" performed by the Airbus Defence and Space consortium. Space Surveillance and Tracking is part of Space Situational Awareness (SSA) and covers the detection, tracking and cataloguing of space debris and satellites. Derived SST services comprise a catalogue of these man-made objects, collision warning, detection and characterisation of in-orbit fragmentations, sub-catalogue debris characterisation, etc. The assessment of SBSS in a SST system architecture has shown that not only an operational SBSS system but also a well-designed space-based demonstrator can provide substantial performance in terms of surveillance and tracking of beyond-LEO objects. In particular, the early deployment of a demonstrator, made possible by using standard equipment, could boost initial operating capability and create a self-maintained object catalogue. Furthermore, unique statistical information about small-size LEO debris (mm size) can be collected in-situ. Unlike classical technology demonstration missions, the primary goal is the demonstration and optimisation of the functional elements in a complex end-to-end chain (mission planning, observation strategies, data acquisition, processing and fusion, etc.) until the final products can be offered to the users. Also past and current missions by the US (SBV, SBSS) and Canada (Sapphire, NEOSSat) underline the advantages of space-based space surveillance. The presented SBSS system concept takes the ESA SST System Requirements (derived within the ESA SSA Preparatory Program) into account and aims at fulfilling SST core requirements in a stand-alone manner. Additionally, requirements for detection and characterisation of small-sized LEO debris are

  1. Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum

    PubMed Central

    Kresnaraman, Brahmastro; Deguchi, Daisuke; Takahashi, Tomokazu; Mekada, Yoshito; Ide, Ichiro; Murase, Hiroshi

    2016-01-01

During the night or in poorly lit areas, thermal cameras are a better choice than normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from only thermal information is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in both thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step while the second step processes patches in an image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations. PMID:27110781
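The core of the CCA-based mapping between spectra can be sketched with a numpy-only toy. This is not the authors' two-step, patch-based method; it shows only the basic idea of learning canonical weight matrices from paired thermal/visible data and regressing one domain on the shared canonical scores (all data below is synthetic):

```python
import numpy as np

def cca_fit(Xc, Yc, k, reg=1e-6):
    """Canonical weight matrices (Wx, Wy) from centered paired data,
    via SVD of the whitened cross-covariance (numpy-only sketch)."""
    n = len(Xc)
    def inv_sqrt(C):
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    Kx = inv_sqrt(Xc.T @ Xc / n + reg * np.eye(Xc.shape[1]))
    Ky = inv_sqrt(Yc.T @ Yc / n + reg * np.eye(Yc.shape[1]))
    U, _, Vt = np.linalg.svd(Kx @ (Xc.T @ Yc / n) @ Ky)
    return Kx @ U[:, :k], Ky @ Vt[:k].T

# Synthetic paired data: rows are flattened "thermal" and "visible" images
# sharing a low-dimensional latent content.
rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 5))
thermal = latent @ rng.normal(size=(5, 64))
visible = latent @ rng.normal(size=(5, 64))
tm, vm = thermal.mean(0), visible.mean(0)

Wx, Wy = cca_fit(thermal - tm, visible - vm, k=5)

# Reconstruct: project thermal data onto its canonical scores, then regress
# the visible data on those scores to obtain a reconstruction operator.
scores = (thermal - tm) @ Wx
B, *_ = np.linalg.lstsq(scores, visible - vm, rcond=None)
recon = ((thermal[:1] - tm) @ Wx) @ B + vm
```

With noiseless linearly related data the reconstruction is essentially exact; on real images the second, patch-level step in the paper refines the coarse whole-image estimate.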

  2. Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum.

    PubMed

    Kresnaraman, Brahmastro; Deguchi, Daisuke; Takahashi, Tomokazu; Mekada, Yoshito; Ide, Ichiro; Murase, Hiroshi

    2016-04-21

During the night or in poorly lit areas, thermal cameras are a better choice than normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from only thermal information is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in both thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step while the second step processes patches in an image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations.

  3. Reconstruction of Rain Microstructure From Spectrum of Scattering Light

    NASA Astrophysics Data System (ADS)

    Sterlyadkin, V.; Gluschenko, A.

Night photoregistration of light scattered by drops has shown that practically all drops oscillate as they fall. Since the drop oscillation frequency W decreases monotonically as the drop volume V rises, different fractions of rain form different parts of the spectrum. It is therefore possible to reconstruct rain microstructure from remote optical measurements. In the general case, the form of the spectrum depends not only on the drop size distribution N(V) but also on the oscillation amplitude function, the scattering phase function for oscillating drops, and the frequency dependence W(V). Statistical treatment of our field data showed that the average oscillation amplitude rises with drop volume V as , where A is some constant. This result allows us to solve the inverse problem: to reconstruct the drop size distribution N(V) from the power spectrum of light scattered by rain. The scattering phase function for nonspherical and oscillating drops was calculated in the straight-line approximation. Analysis of the optical properties of an oscillating water drop indicated an optimal measurement geometry for registration of rain microstructure. For low-intensity rains it is reasonable to use the effect of abnormally high modulation of light scattered by oscillating drops, which we discovered earlier in laboratory conditions and under field measurements. (The effect of abnormally high modulation allows us to detect 2-3 mm raindrop deformations from a 5 m distance.) The results of reconstruction of drop size distributions from spectra of light scattered by rains are presented and discussed.
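The inversion itself is a change of variables: if each drop volume V oscillates at a single frequency f = W(V), monotonic in V, then a spectral band S(f) df maps to a size band N(V) dV. A sketch with an assumed, purely illustrative power-law W(V) (the paper determines the real W(V) and amplitude weighting from field data):

```python
import numpy as np

# Assumed, illustrative oscillation-frequency law; not the measured W(V).
c = 50.0
def W(V):    return c * V ** -0.5            # frequency falls as volume rises
def dWdV(V): return -0.5 * c * V ** -1.5

V = np.linspace(0.5, 30.0, 200)              # drop volumes (arbitrary units)
f = W(V)                                     # corresponding frequencies
S = np.exp(-(((f - 20.0) / 6.0) ** 2))       # toy measured power spectrum

# Change of variables: N(V) |dV| = S(f) |df|  =>  N(V) = S(W(V)) * |dW/dV|
N = S * np.abs(dWdV(V))
```

Because the mapping conserves the integrated power, the recovered N(V) integrates to the same total as the spectral band it came from, which is a useful self-consistency check.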

  4. Reconstruction of a Broadband Spectrum of Alfvenic Fluctuations

    NASA Technical Reports Server (NTRS)

    Vinas, Adolfo F.; Fuentes, Pablo S. M.; Araneda, Jaime A.; Maneva, Yana G.

    2014-01-01

Alfvenic fluctuations in the solar wind exhibit a high degree of correlation between velocity and magnetic field fluctuations, consistent with Alfven waves propagating away from and toward the Sun. Two remarkable properties of these fluctuations are the tendencies to have either positive or negative magnetic helicity (-1 ≤ σ_m ≤ +1), associated with either left- or right-handed topology of the fluctuations, and to have a constant magnetic field magnitude. This paper provides, for the first time, a theoretical framework for reconstructing both the magnetic and velocity field fluctuations with a divergence-free magnetic field, with any specified power spectral index and normalized magnetic- and cross-helicity spectra, for any plasma species. The spectrum is constructed in the Fourier domain by imposing two conditions - a divergence-free magnetic field and the preservation of the sense of magnetic helicity in both spaces - as well as by using Parseval's theorem for the conservation of energy between configuration and Fourier spaces. Applications to one-dimensional spatial Alfvenic propagation are presented. The theoretical construction is in agreement with typical time series and power spectra properties observed in the solar wind. The theoretical ideas presented in this spectral reconstruction provide a foundation for more realistic simulations of plasma waves, solar wind turbulence, and the propagation of energetic particles in such fluctuating fields.
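A one-dimensional numpy sketch of the Fourier-domain construction: transverse fluctuations with a k^(-α) power spectrum and a prescribed normalized magnetic helicity σ_m, built by splitting each mode's power between left- and right-handed circular polarizations. With k along x and no B_x fluctuation, the divergence-free condition holds trivially; this is a toy illustration, not the paper's full multi-species construction:

```python
import numpy as np

def alfvenic_field(n, alpha=5/3, sigma_m=0.8, seed=1):
    """Transverse fluctuations B_y(x), B_z(x) on n grid points with power
    spectrum ~ k**-alpha and normalized magnetic helicity sigma_m, set by
    the split of power between left- and right-handed circular
    polarizations: sigma_m = (P_L - P_R) / (P_L + P_R)."""
    rng = np.random.default_rng(seed)
    m = n // 2 + 1                        # length of the rfft spectrum
    k = np.arange(1, m - 1)               # skip the mean and Nyquist bins
    amp = k ** (-alpha / 2.0)             # so power |b_k|^2 ~ k**-alpha
    pL, pR = (1.0 + sigma_m) / 2.0, (1.0 - sigma_m) / 2.0
    hL = amp * np.sqrt(pL) * np.exp(2j * np.pi * rng.random(k.size))
    hR = amp * np.sqrt(pR) * np.exp(2j * np.pi * rng.random(k.size))
    By_k = np.zeros(m, complex)
    Bz_k = np.zeros(m, complex)
    By_k[1:-1] = (hL + hR) / np.sqrt(2.0)      # (e_y ± i e_z) helical basis
    Bz_k[1:-1] = 1j * (hL - hR) / np.sqrt(2.0)
    # With k along x and no B_x fluctuation, div B = dB_x/dx = 0 trivially.
    return np.fft.irfft(By_k, n), np.fft.irfft(Bz_k, n)

By, Bz = alfvenic_field(1024)
```

Transforming the realization back to Fourier space and forming 2 Im(B_y,k* B_z,k) / (|B_y,k|² + |B_z,k|²) recovers the prescribed σ_m mode by mode, which is how the helicity constraint can be verified.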

  5. Investigating end-to-end security in the fifth generation wireless capabilities and IoT extensions

    NASA Astrophysics Data System (ADS)

    Uher, J.; Harper, J.; Mennecke, R. G.; Patton, P.; Farroha, B.

    2016-05-01

    The emerging 5th generation wireless network will be architected and specified to meet the vision of allowing the billions of devices and millions of human users to share spectrum to communicate and deliver services. The expansion of wireless networks from its current role to serve these diverse communities of interest introduces new paradigms that require multi-tiered approaches. The introduction of inherently low security components, like IoT devices, necessitates that critical data be better secured to protect the networks and users. Moreover high-speed communications that are meant to enable the autonomous vehicles require ultra reliable and low latency paths. This research explores security within the proposed new architectures and the cross interconnection of the highly protected assets with low cost/low security components forming the overarching 5th generation wireless infrastructure.

  6. End-to-End System Test and Optical Performance Evaluation for the Solar and Heliosphere Observatory (SOHO) Ultraviolet Coronagraph Spectrometer (UVCS)

    NASA Technical Reports Server (NTRS)

    Carosso, Paolo A.; Gardner, Larry D.; Jhabvala, Marzy; Nicolosi, P.

    1997-01-01

    The UVCS is one of the instruments carried by the Solar and Heliospheric Observatory (SOHO), a joint NASA/ESA Spacecraft launched in November 1995. It is designed to perform ultraviolet spectroscopy and visible light polarimetry of the extended solar corona. The primary scientific objectives of the UVCS investigation are to study the physical processes occurring in the extended solar corona, such as: the mechanism of acceleration of the solar wind, the mechanism of coronal plasma heating, the identification of solar wind sources, and the investigation of the plasma properties of the solar wind. The UVCS End-to-End test activities included a comprehensive set of system level functional and optical tests. Although performed under severe schedule constraints, the End-to-End System Test was very successful and served to fully validate the UVCS optical design. All test results showed that the primary scientific objectives of the UVCS Mission were achievable.

  7. Modelling and simulation of the mechanical response of a Dacron graft in the pressurization test and an end-to-end anastomosis.

    PubMed

    Bustos, Claudio A; García-Herrera, Claudio M; Celentano, Diego J

    2016-08-01

This work presents the modeling and simulation of the mechanical response of a Dacron graft in the pressurization test and its clinical application in the analysis of an end-to-end anastomosis. Both problems are studied via an anisotropic constitutive model that was calibrated by means of previously reported uniaxial tensile tests. First, the simulation of the pressurization test allows validation of the experimental material characterization, which included tests carried out at different levels of axial stretching. Then, the analysis of an end-to-end anastomosis under an idealized geometry is proposed. This case consists of evaluating the mechanical performance of the graft together with the stresses and deformations in the neighborhood of the junction between the Dacron and the artery. This research contributes important data toward understanding the functioning of the graft and the possibility of extending the analysis to complex numerical cases such as its insertion in the aortic arch.
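As a first-order sanity check on pressurization-test data, the thin-walled (Laplace) estimate of circumferential stress is often computed before fitting a full anisotropic constitutive model like the one used in the paper; the numbers below are illustrative, not taken from the study:

```python
def hoop_stress_kpa(pressure_kpa, radius_mm, thickness_mm):
    """Thin-walled-tube (Laplace) estimate of circumferential stress:
    sigma = P * r / t.  Units: kPa in, kPa out (mm/mm cancel)."""
    return pressure_kpa * radius_mm / thickness_mm

# 120 mmHg (~16 kPa) in a graft of 10 mm radius and 0.4 mm wall thickness:
sigma = hoop_stress_kpa(16.0, 10.0, 0.4)   # ≈ 400 kPa
```

The finite-element model in the paper is needed precisely where this estimate breaks down: near the anastomosis, where geometry and material properties change abruptly.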

  8. End-to-end small bowel anastomosis by temperature controlled CO2 laser soldering and an albumin stent: a feasibility study

    NASA Astrophysics Data System (ADS)

    Simhon, David; Kopelman, Doron; Hashmonai, Moshe; Vasserman, Irena; Dror, Michael; Vasilyev, Tamar; Halpern, Marissa; Kariv, Naam; Katzir, Abraham

    2004-07-01

Introduction: A feasibility study of small intestinal end-to-end anastomosis was performed in a rabbit model using a temperature controlled CO2 laser system and an albumin stent. Compared with standard suturing or clipping, this method does not introduce foreign materials into the repaired wound and therefore may lead to better and faster wound healing of the anastomotic site. Methods: Transected rabbit small intestines were either laser soldered using 47% bovine serum albumin and an intraluminal albumin stent or served as controls in which a conventional continuous two-layer end-to-end anastomosis was performed manually. The integrity of the anastomosis was investigated on the 14th postoperative day. Results: The postoperative course in both treatments was uneventful. The sutured group presented signs of partial bowel obstruction. Macroscopically, no signs of intraluminal fluid leakage were observed in either treatment. Yet, laser soldered intestinal anastomoses demonstrated significant superiority with respect to adhesions and narrowing of the intestinal lumen. Serial histological examinations revealed better wound healing characteristics of the laser soldered anastomotic site. Conclusion: Laser soldering of an intestinal end-to-end anastomosis provides a faster surgical procedure, compared to the standard suture technique, with better wound healing results. It is expected that this technique may be adopted in the future for minimally invasive surgery.

  9. Reconstructing the primordial power spectrum from the CMB

    NASA Astrophysics Data System (ADS)

    Gauthier, Christopher; Bucher, Martin

    2012-10-01

    We propose a straightforward and model independent methodology for characterizing the sensitivity of CMB and other experiments to wiggles, irregularities, and features in the primordial power spectrum. Assuming that the primordial cosmological perturbations are adiabatic, we present a function space generalization of the usual Fisher matrix formalism applied to a CMB experiment resembling Planck with and without ancillary data. This work is closely related to other work on recovering the inflationary potential and exploring specific models of non-minimal, or perhaps baroque, primordial power spectra. The approach adopted here, however, most directly expresses what the data is really telling us. We explore in detail the structure of the available information and quantify exactly what features can be reconstructed and at what statistical significance.
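After discretizing the primordial spectrum into band powers, the function-space Fisher formalism reduces to the standard matrix form F_ij = Σ_l (∂C_l/∂p_i)(∂C_l/∂p_j)/σ_l². A toy sketch with a hypothetical smooth transfer kernel and cosmic-variance-limited errors (no real CMB transfer functions involved):

```python
import numpy as np

ells = np.arange(2, 1500)
kbins = np.linspace(1e-4, 0.1, 20)           # band powers p_i of P(k)

# Hypothetical smooth kernel mapping band powers to C_l (shape only; a
# real analysis would use the radiation transfer functions).
T = np.exp(-0.5 * ((ells[:, None] / 1.4e4 - kbins[None, :]) / 0.005) ** 2)

p_fid = np.ones(kbins.size)                  # featureless fiducial spectrum
C_fid = T @ p_fid

# Cosmic-variance-limited error bar on each C_l.
sigma = np.sqrt(2.0 / (2.0 * ells + 1.0)) * C_fid

# Fisher matrix over band powers: F_ij = sum_l dC_l/dp_i dC_l/dp_j / sigma_l^2
F = (T / sigma[:, None] ** 2).T @ T

# Marginalized 1-sigma error on each band power.
errors = np.sqrt(np.diag(np.linalg.inv(F + 1e-12 * np.eye(kbins.size))))
```

The structure of F (its eigenvalues and eigenvectors) is exactly what quantifies which features in P(k) the data can resolve and at what significance.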

  10. Planning for Mars Sample Return: Results from the MEPAG Mars Sample Return End-to-End International Science Analysis Group (E2E-iSAG)

    NASA Astrophysics Data System (ADS)

    McLennan, S. M.; Sephton, M.; Mepag E2E-Isag

    2011-12-01

    The National Research Council 2011 Planetary Decadal Survey (2013-2022) placed beginning a Mars sample return campaign (MSR) as the top priority for large Flagship missions in the coming decade. Recent developments in NASA-ESA collaborations and Decadal Survey recommendations indicate MSR likely will be an international effort. A joint ESA-NASA 2018 rover (combining the previously proposed ExoMars and MAX-C missions), designed, in part, to collect and cache samples, would thus represent the first of a 3-mission MSR campaign. The End-to-End International Science Analysis Group (E2E-iSAG) was chartered by MEPAG in August 2010 to develop and prioritize MSR science objectives and investigate implications of these objectives for defining the highest priority sample types, landing site selection criteria (and identification of reference landing sites to support engineering planning), requirements for in situ characterization on Mars to support sample selection, and priorities/strategies for returned sample analyses to determine sample sizes and numbers that would meet the objectives. MEPAG approved the E2E-iSAG report in June 2011. Science objectives, summarized in priority order, are: (1) critically assess any evidence for past life or its chemical precursors, and place constraints on past habitability and potential for preservation of signs of life, (2) quantitatively constrain age, context and processes of accretion, early differentiation and magmatic and magnetic history, (3) reconstruct history of surface and near-surface processes involving water, (4) constrain magnitude, nature, timing, and origin of past climate change, (5) assess potential environmental hazards to future human exploration, (6) assess history and significance of surface modifying processes, (7) constrain origin and evolution of the Martian atmosphere, (8) evaluate potential critical resources for future human explorers. All returned samples also would be fully evaluated for extant life as a

  11. An end-to-end system in support of a broad scope of GOES-R sensor and data processing study

    NASA Astrophysics Data System (ADS)

    Huang, Hung-Lung

    2005-08-01

The mission of NOAA's Geostationary Operational Environmental Satellite System (GOES) R series satellites, in the 2012 time frame, is to provide continuous, near real-time meteorological, oceanographic, solar, and space environment data that supports NOAA's strategic mission goals. It presents an exciting opportunity to explore new instruments, satellite designs, and system architectures utilizing new communication and instrument technologies in order to meet the ever-increasing demands made of Earth observation systems by national agencies and end users alike. The GOES-R sensor suite includes a 16 spectral band Advanced Baseline Imager (ABI), an approximately 1500 high spectral resolution band Hyperspectral Environmental Suite (HES), plus other sensors designed to detect lightning and to explore the ocean, solar and space environment. The Cooperative Institute for Meteorological Satellite Studies (CIMSS), as part of the Space Science and Engineering Center (SSEC) of the University of Wisconsin-Madison and a long-time partner of NOAA, has developed the first operational end-to-end processing system for GOES. Based on this heritage, and with recent support from the NASA/NOAA Geosynchronous Imaging FTS (GIFTS) project, the Navy's Multiple University Research Initiative (MURI), and NOAA's GOES-R Risk Reduction program, SSEC has built a near-complete end-to-end system that is capable of simulating sensor measurements from top of atmosphere radiances, raw sensor data (level 0) through calibrated and navigated sensor physical measurements (level 1) to the processed products (level 2). In this paper, the SSEC Hyperspectral Imaging and Sounding Simulator and Processor (HISSP) will be presented in detail. HISSP is capable of demonstrating most of the processing functions such as data compression/decompression, sensor calibration, data processing, algorithm development, and product generation. In summary, HISSP is an end-to-end system designed to support both government and

  12. Complications of rectal anastomoses with end-to-end anastomosis (EEA) stapling instrument. Clinical and radiological leak rates and some practical hints.

    PubMed Central

    Dorricott, N. J.; Baddeley, R. M.; Keighley, M. R.; Lee, J.; Oates, G. D.; Alexander-Williams, J.

    1982-01-01

The complications and results of rectal anastomoses carried out with the end-to-end anastomosis (EEA) stapling instrument on 50 patients by 5 consultant surgeons are recorded. There was a clinical leakage rate of 6% and a radiological leakage rate of 20% assessed by water-soluble contrast enema. The technique has advantages compared with hand-suture by allowing low anastomoses and preservation of sphincters and is accompanied by an acceptably low leakage rate. Despite the cost of disposable cartridges these advantages make the technique economical because of the avoidance of colostomies and reduction in hospital stay. PMID:7044253

13. The Gamma-ray Cherenkov Telescope, an end-to-end Schwarzschild-Couder telescope prototype proposed for the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Dournaux, J. L.; Abchiche, A.; Allan, D.; Amans, J. P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J.-J.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Dangeon, L.; Daniel, M. K.; De Franco, A.; De Frondat, F.; Dumas, D.; Ernenwein, J. P.; Fasola, G.; Funk, S.; Gironnet, J.; Graham, J. A.; Greenshaw, T.; Hameau, B.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J. M.; Jégouzo, I.; Jogler, T.; Kawashima, T.; Kraush, M.; Lapington, J. S.; Laporte, P.; Lefaucheur, J.; Markoff, S.; Melse, T.; Mohrmann, L.; Molyneux, P.; Nolan, S. J.; Okumura, A.; Osborne, J. P.; Parsons, R. D.; Rosen, S.; Ross, D.; Rowell, G.; Rulten, C. B.; Sato, Y.; Sayède, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Sol, H.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Vink, J.; Watson, J. J.; White, R.; Yamane, N.; Zech, A.; Zink, A.

    2016-08-01

The GCT (Gamma-ray Cherenkov Telescope) is a dual-mirror prototype of the Small-Sized Telescopes proposed for the Cherenkov Telescope Array (CTA), made by an Australian-Dutch-French-German-Indian-Japanese-UK-US consortium. The integration of this end-to-end telescope was achieved in 2015. On-site tests and measurements of the first Cherenkov images on the night sky began in November 2015. This contribution describes the telescope and the plans for the pre-production and a large-scale production within CTA.

  14. On-Orbit Performance Verification and End-to-End Characterization of the TDRS-H Ka-Band Communications Payload

    NASA Technical Reports Server (NTRS)

    Toral, Marco; Wesdock, John; Kassa, Abby; Pogorelc, Patsy; Jenkens, Robert (Technical Monitor)

    2002-01-01

    In June 2000, NASA launched the first of three next generation Tracking and Data Relay Satellites (TDRS-H) equipped with a Ka-band forward and return service capability. This Ka-band service supports forward data rates up to 25 Mb/sec using the 22.55 - 23.55 GHz space-to-space allocation. Return services are supported via channel bandwidths of 225 and 650 MHz for data rates up to 800 Mb/sec (QPSK) using the 25.25 - 27.5 GHz space-to-space allocation. As part of NASA's acceptance of the TDRS-H spacecraft, an extensive on-orbit calibration, verification and characterization effort was performed to ensure that on-orbit spacecraft performance is within specified limits. This process verified the compliance of the Ka-band communications payload with all performance specifications and demonstrated an end-to-end Ka-band service capability. This paper summarizes the results of the TDRS-H Ka-band communications payload on-orbit performance verification and end-to-end service characterization. Performance parameters addressed include Effective Isotropically Radiated Power (EIRP), antenna Gain-to-System Noise Temperature (G/T), antenna gain pattern, frequency tunability and accuracy, channel magnitude response, and Ka-band service Bit-Error-Rate (BER) performance.
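Parameters such as EIRP and G/T combine in the textbook link-budget relation C/N0 = EIRP + G/T − FSPL − L + 228.6 dB, where 228.6 dB(W/K/Hz) is −10·log10 of Boltzmann's constant. A sketch with purely illustrative numbers (not TDRS-H specification values):

```python
import math

def cn0_dbhz(eirp_dbw, gt_dbk, freq_hz, range_m, extra_losses_db=0.0):
    """Received C/N0 (dB-Hz) from the textbook link-budget relation
    C/N0 = EIRP + G/T - FSPL - L + 228.6, with free-space path loss
    FSPL = 20*log10(4*pi*d*f/c)."""
    fspl_db = 20.0 * math.log10(4.0 * math.pi * range_m * freq_hz / 2.998e8)
    return eirp_dbw + gt_dbk - fspl_db - extra_losses_db + 228.6

# Illustrative numbers only: a 26 GHz return link over a 44,000 km
# space-to-space path.
cn0 = cn0_dbhz(eirp_dbw=63.0, gt_dbk=26.0, freq_hz=26e9, range_m=4.4e7)
```

Dividing C/N0 by the data rate gives Eb/N0, which is what the measured BER curves are ultimately compared against.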

  15. Comparison of Peripheral Nerve Regeneration with Side-to-side, End-to-side, and End-to-end Repairs: An Experimental Study

    PubMed Central

    Göransson, Harry; Taskinen, Hanna-Stiina; Paavilainen, Pasi; Vahlberg, Tero; Röyttä, Matias

    2016-01-01

Background: The present study was conducted to identify a technique to enable improved functional recovery after proximal nerve injury. In this experimental study, nerve regeneration was compared between side-to-side (STS), end-to-side (ETS), and end-to-end repairs. Methods: Walking-track analysis was used as the measure of functional recovery. Nerve regeneration was studied with morphometry and histology 6 or 26 weeks postoperatively. Results: All 3 repair techniques showed regeneration of the nerve. From 12 weeks onward, the functional results of the 3 intervention groups were significantly better than those of the unrepaired control group. End-to-end repair was significantly better than the STS and ETS repairs. At 26 weeks, the functional and morphometric results and histologic findings did not differ between the STS and ETS groups. The functional results correlated with the morphometric findings in all groups. Conclusions: STS neurorrhaphy showed nerve regeneration, and the end results did not differ from the clinically widely used ETS repair. Further studies are warranted to optimize the neurorrhaphy technique and examine possible applications of STS repair in peripheral nerve surgery. PMID:28293523

  16. Mixed integer nonlinear programming model of wireless pricing scheme with QoS attribute of bandwidth and end-to-end delay

    NASA Astrophysics Data System (ADS)

    Irmeilyana, Puspita, Fitri Maya; Indrawati

    2016-02-01

The pricing for wireless networks is developed by considering linearity factors, price elasticity, and price factors. A mixed-integer nonlinear programming (MINLP) wireless pricing model is proposed and solved optimally using LINGO 13.0. The solutions are expected to give some information about the connection between the acceptance factor and the price. Previous work focused on bandwidth as the sole QoS attribute. The models attempt to maximize the total price for a connection based on QoS parameters; here the QoS attributes are the bandwidth and the end-to-end delay that affect the traffic. The maximum price is achieved when the provider determines the required increment or decrement of the price in response to a QoS change and the amount of QoS value.
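Because the decision variables are integer QoS levels, a tiny instance of this kind of pricing problem can be solved by exhaustive search rather than a MINLP solver such as LINGO; the objective and coefficients below are illustrative stand-ins, not the paper's model:

```python
# All coefficients below are illustrative stand-ins for the MINLP model:
# price rises with bandwidth and falls with delay, while user acceptance
# falls as the price rises and as delay worsens.
def revenue(bw, delay, base=1.0, alpha=0.5, beta=0.3):
    """Provider revenue for one connection at integer QoS levels."""
    price = base + alpha * bw - beta * delay
    acceptance = max(0.0, 1.0 - 0.05 * price - 0.1 * delay)
    return price * acceptance

# Exhaustive search over integer bandwidth (1-10 Mb/s) and delay class (1-5).
best = max(((bw, d) for bw in range(1, 11) for d in range(1, 6)),
           key=lambda q: revenue(*q))
```

With these coefficients the search selects the highest bandwidth and lowest delay class; in a realistic model, capacity constraints and the acceptance factor would trade off against each other instead.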

  17. End-to-End Study of the Transfer of Energy from Magnetosheath Ion Precipitation to the Ionospheric Cusp and Resulting Ion Outflow to the Magnetosphere

    NASA Technical Reports Server (NTRS)

    Coffey, Victoria; Chandler, Michael; Singh, Nagendra; Avanov, Levon

    2003-01-01

    We will show results from an end-to-end study of the energy transfer from injected magnetosheath plasmas to the near-Earth magnetospheric and ionospheric plasmas and the resulting ion outflow to the magnetosphere. This study includes modeling of the evolution of the magnetosheath precipitation in the cusp using a kinetic code with a realistic magnetic field configuration. These evolved, highly non-Maxwellian distributions are used as input to a 2D PIC code to analyze the resulting wave generation. The wave analysis is used in the kinetic code as input to the cold ionospheric ions to study the transfer of energy to these ions and their outflow to the magnetosphere. Observations from the Thermal Ion Dynamics Experiment (TIDE) and other instruments on the Polar Spacecraft will be compared to the modeling.

  18. Demonstration of a fully-coupled end-to-end model for small pelagic fish using sardine and anchovy in the California Current

    NASA Astrophysics Data System (ADS)

    Rose, Kenneth A.; Fiechter, Jerome; Curchitser, Enrique N.; Hedstrom, Kate; Bernal, Miguel; Creekmore, Sean; Haynie, Alan; Ito, Shin-ichi; Lluch-Cota, Salvador; Megrey, Bernard A.; Edwards, Chris A.; Checkley, Dave; Koslow, Tony; McClatchie, Sam; Werner, Francisco; MacCall, Alec; Agostini, Vera

    2015-11-01

We describe and document an end-to-end model of anchovy and sardine population dynamics in the California Current as a proof of principle that such coupled models can be developed and implemented. The end-to-end model is 3-dimensional, time-varying, and multispecies, and consists of four coupled submodels: hydrodynamics, Eulerian nutrient-phytoplankton-zooplankton (NPZ), an individual-based full life cycle anchovy and sardine submodel, and an agent-based fishing fleet submodel. A predator roughly mimicking albacore was included as individuals that consumed anchovy and sardine. All submodels were coded within the ROMS open-source community model, used the same-resolution spatial grid, and were solved simultaneously to allow for possible feedbacks among the submodels. We used a super-individual approach and solved the coupled models on a distributed memory parallel computer, both of which created difficult but resolvable bookkeeping challenges. The anchovy and sardine growth, mortality, reproduction, and movement, and the fishing fleet submodel, were each calibrated using simplified grids before being inserted into the full end-to-end model. An historical simulation of 1959-2008 was performed, and the latter 45 years analyzed. Sea surface height (SSH) and sea surface temperature (SST) for the historical simulation showed strong horizontal gradients and multi-year scale temporal oscillations related to various climate indices (PDO, NPGO), and both showed responses to ENSO variability. Simulated total phytoplankton was lower during strong El Nino events and higher for the strong 1999 La Nina event. The three zooplankton groups generally corresponded to the spatial and temporal variation in simulated total phytoplankton. Simulated biomasses of anchovy and sardine were within the historical range of observed biomasses but predicted biomasses showed much less inter-annual variation. Anomalies of annual biomasses of anchovy and sardine showed a switch in the mid

  19. The role of environmental controls in determining sardine and anchovy population cycles in the California Current: Analysis of an end-to-end model

    NASA Astrophysics Data System (ADS)

    Fiechter, Jerome; Rose, Kenneth A.; Curchitser, Enrique N.; Hedstrom, Katherine S.

    2015-11-01

    Sardine and anchovy are two forage species of particular interest because of their low-frequency cycles in adult abundance in boundary current regions, combined with a commercially relevant contribution to the global marine food catch. While several hypotheses have been put forth to explain decadal shifts in sardine and anchovy populations, a mechanistic basis for how the physics, biogeochemistry, and biology combine to produce patterns of synchronous variability across widely separated systems has remained elusive. The present study uses a 50-year (1959-2008) simulation of a fully coupled end-to-end ecosystem model configured for sardine and anchovy in the California Current System to investigate how environmental processes control their population dynamics. The results illustrate that slightly different temperature and diet preferences can lead to significantly different responses to environmental variability. Simulated adult population fluctuations are associated with age-1 growth (via age-2 egg production) and prey availability for anchovy, while they depend primarily on age-0 survival and temperature for sardine. The analysis also hints at potential linkages to known modes of climate variability, whereby changes in adult abundance are related to ENSO for anchovy and to the PDO for sardine. The connection to the PDO and ENSO is consistent with modes of interannual and decadal variability that would alternatively favor anchovy during years of cooler temperatures and higher prey availability, and sardine during years of warmer temperatures and lower prey availability. While the end-to-end ecosystem model provides valuable insight on potential relationships between environmental conditions and sardine and anchovy population dynamics, understanding the complex interplay, and potential lags, between the full array of processes controlling their abundances in the California Current System remains an on-going challenge.

  20. A statistical iterative reconstruction framework for dual energy computed tomography without knowing tube spectrum

    NASA Astrophysics Data System (ADS)

    Chang, Shaojie; Mou, Xuanqin

    2016-09-01

    Dual energy computed tomography (DECT) has significant impacts on material characterization, bone mineral density inspection, nondestructive evaluation and so on. Although great progress has been made recently on reconstruction algorithms for DECT, two main problems remain: 1) For a polyenergetic X-ray source, the tube spectrum needed in reconstruction is not always available. 2) The reconstructed image of DECT is very sensitive to noise, which demands a special noise suppression strategy in reconstruction algorithm design. In this paper, we propose a novel method for DECT reconstruction that reconstructs the tube spectrum from projection data and suppresses image noise by introducing l1-norm based regularization into statistical reconstruction for polychromatic DECT. The contribution of this work is twofold. 1) A three-parameter model is devised to represent the spectrum of a polyenergetic X-ray source, and the parameters can be estimated from projection data by solving an optimization problem. 2) With the estimated tube spectrum, we propose a computational framework for l1-norm regularized statistical iterative reconstruction of polychromatic DECT. Simulation experiments with two phantoms were conducted to evaluate the proposed method. Experimental results demonstrate the accuracy and robustness of the spectrum model, in that comparable reconstructed image quality can be achieved with the estimated and the ideal spectrum, and validate the accuracy of the images reconstructed by the proposed method. The root mean square errors (RMSE) between the reconstructed images and the ground truth images are 7.648 × 10^-4 and 2.687 × 10^-4 for the two phantoms, respectively.
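    The l1-norm regularized statistical reconstruction described above can be illustrated, in simplified form, with an ISTA (iterative soft-thresholding) solver for a Gaussian-noise least-squares data term; the paper's actual framework uses a Poisson likelihood and the estimated tube spectrum, so the system matrix, sizes, and regularization weight below are purely illustrative.

```python
import numpy as np

def ista(A, y, lam=0.05, iters=200):
    """Toy ISTA solver for min_x 0.5*||Ax - y||^2 + lam*||x||_1.

    A simplified (Gaussian-noise) stand-in for l1-regularized statistical
    reconstruction; the real method uses a Poisson likelihood.
    """
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L, L = Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - y)                # gradient of the data term
        z = x - step * grad                     # gradient descent step
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return x

# recover a sparse "image" from noisy linear projections
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 40))
x_true = np.zeros(40)
x_true[[3, 17, 29]] = [2.0, -1.5, 1.0]
y = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = ista(A, y)
rmse = np.sqrt(np.mean((x_hat - x_true) ** 2))
```

    The soft-thresholding step is what the l1 penalty buys: small noisy coefficients are driven exactly to zero while large ones are kept (slightly shrunk).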

  1. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    NASA Astrophysics Data System (ADS)

    Bowen, S. R.; Nyflot, M. J.; Herrmann, C.; Groh, C. M.; Meyer, J.; Wollenweber, S. D.; Stearns, C. W.; Kinahan, P. E.; Sandison, G. A.

    2015-05-01

    Effective positron emission tomography / computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10-20%, treatment planning errors were 5-10%, and treatment delivery errors were 5-30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5-10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning, and RT
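    The 2%-2 mm gamma passing rate used above to quantify delivery errors can be sketched with a brute-force 1D global gamma analysis; clinical QA tools use optimized 2D/3D searches and interpolation, and the dose profiles here are synthetic.

```python
import numpy as np

def gamma_pass_rate(ref, evl, spacing_mm, dd_pct=2.0, dta_mm=2.0):
    """1D global gamma analysis: fraction of reference points with gamma <= 1.

    Brute-force sketch of the 2%/2 mm criterion: for each reference point,
    take the minimum over evaluated points of the combined normalized
    dose-difference / distance-to-agreement metric.
    """
    x = np.arange(len(ref)) * spacing_mm
    dd = dd_pct / 100.0 * ref.max()          # global dose-difference tolerance
    gammas = []
    for i, d_r in enumerate(ref):
        dist = (x - x[i]) / dta_mm           # normalized distances to eval points
        dose = (evl - d_r) / dd              # normalized dose differences
        gammas.append(np.sqrt(dist ** 2 + dose ** 2).min())
    return np.mean(np.array(gammas) <= 1.0)

# Gaussian dose profile vs. the same profile shifted by ~1 mm: within 2 mm DTA
dose_ref = np.exp(-0.5 * ((np.arange(100) - 50) / 10.0) ** 2)
dose_shifted = np.roll(dose_ref, 1)          # 1-sample shift at 1 mm spacing
rate = gamma_pass_rate(dose_ref, dose_shifted, spacing_mm=1.0)
```

    A rigid 1 mm shift passes everywhere under a 2 mm distance-to-agreement criterion, which is why gamma analysis tolerates small residual motion while still flagging genuine dose discrepancies.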

  2. Reconstruction of the electron spectrum in a metal hydrogen sulfide

    NASA Astrophysics Data System (ADS)

    Kudryashov, N. A.; Kutukov, A. A.; Mazur, E. A.

    2017-01-01

    Generalized Eliashberg theory of the normal properties of a metal electron-phonon system with a nonconstant electron density of states has been used to study the effect of conduction band reconstruction. The electron density of states of the metallic phase of hydrogen sulfide, renormalized by the strong electron-phonon coupling at a pressure of P = 225 GPa, has been calculated. It has been found that the reconstructed conduction band contains a series of narrow energy pockets.

  3. Overview of Non-nuclear Testing of the Safe, Affordable 30-kW Fission Engine, Including End-to-End Demonstrator Testing

    NASA Technical Reports Server (NTRS)

    VanDyke, M. K.; Martin, J. J.; Houts, M. G.

    2003-01-01

    Successful development of space fission systems will require an extensive program of affordable and realistic testing. In addition to tests related to design/development of the fission system, realistic testing of the actual flight unit must also be performed. At the power levels under consideration (3-300 kW electric power), almost all technical issues are thermal or stress related and will not be strongly affected by the radiation environment. These issues can be resolved more thoroughly, less expensively, and in a more timely fashion with non-nuclear testing, provided it is prototypic of the system in question. This approach was used for the safe, affordable fission engine test article development program and was accomplished via cooperative efforts with Department of Energy laboratories, industry, universities, and other NASA centers. This Technical Memorandum covers the analysis, testing, and data reduction of a 30-kW simulated reactor as well as an end-to-end demonstrator, including a power conversion system and an electric propulsion engine, the first of its kind in the United States.

  4. Land Mobile Satellite Service (LMSS) channel simulator: An end-to-end hardware simulation and study of the LMSS communications links

    NASA Technical Reports Server (NTRS)

    Salmasi, A. B. (Editor); Springett, J. C.; Sumida, J. T.; Richter, P. H.

    1984-01-01

    The design and implementation of the Land Mobile Satellite Service (LMSS) channel simulator as a facility for an end-to-end hardware simulation of the LMSS communications links, primarily with the mobile terminal, is described. A number of studies are reported which show the applications of the channel simulator as a facility for validation and assessment of the LMSS design requirements and capabilities by performing quantitative measurements and qualitative audio evaluations for various link design parameters and channel impairments under simulated LMSS operating conditions. As a first application, the LMSS channel simulator was used in the evaluation of a system based on the voice processing and modulation (e.g., NBFM with 30 kHz of channel spacing and a 2 kHz rms frequency deviation for average talkers) selected for the Bell System's Advanced Mobile Phone Service (AMPS). The various details of the hardware design, qualitative audio evaluation techniques, signal-to-channel-impairment measurement techniques, the justifications for the criteria of different parameter selections with regard to the voice processing and modulation methods, and the results of a number of parametric studies are further described.
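    The NBFM parameters quoted above (2 kHz rms frequency deviation for an average talker) can be illustrated with a minimal FM modulator; the sample rate, carrier placement, and single-tone stand-in for a talker below are illustrative, not the AMPS voice processing chain.

```python
import numpy as np

fs = 480_000                          # sample rate in Hz (illustrative)
fc = 30_000                           # carrier frequency (illustrative)
t = np.arange(9600) / fs              # 20 ms of signal

# unit-rms "average talker" stand-in: a single 1 kHz tone
msg = np.sqrt(2) * np.sin(2 * np.pi * 1_000 * t)

rms_dev = 2_000.0                     # target rms frequency deviation, Hz
inst_dev = rms_dev * msg              # instantaneous frequency deviation, Hz
phase = 2 * np.pi * np.cumsum(inst_dev) / fs   # integrate deviation to phase
s = np.cos(2 * np.pi * fc * t + phase)         # NBFM signal
```

    Scaling a unit-rms message by the target deviation is what makes "2 kHz rms deviation" well defined independently of the talker's waveform shape.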

  5. An End-to-End Trainable Neural Network for Image-based Sequence Recognition and Its Application to Scene Text Recognition.

    PubMed

    Shi, Baoguang; Bai, Xiang; Yao, Cong

    2016-12-29

    Image-based sequence recognition has been a long-standing research topic in computer vision. In this paper, we investigate the problem of scene text recognition, which is among the most important and challenging tasks in image-based sequence recognition. A novel neural network architecture, which integrates feature extraction, sequence modeling and transcription into a unified framework, is proposed. Compared with previous systems for scene text recognition, the proposed architecture possesses four distinctive properties: (1) It is end-to-end trainable, in contrast to most of the existing algorithms whose components are separately trained and tuned. (2) It naturally handles sequences of arbitrary lengths, involving no character segmentation or horizontal scale normalization. (3) It is not confined to any predefined lexicon and achieves remarkable performance in both lexicon-free and lexicon-based scene text recognition tasks. (4) It generates an effective yet much smaller model, which is more practical for real-world application scenarios. The experiments on standard benchmarks, including the IIIT-5K, Street View Text and ICDAR datasets, demonstrate the superiority of the proposed algorithm over prior art. Moreover, the proposed algorithm performs well in the task of image-based music score recognition, which clearly verifies its generality.
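    End-to-end sequence recognizers of this kind commonly transcribe per-frame predictions with CTC-style decoding; a minimal greedy (best-path) decoder, with a synthetic score matrix standing in for real network outputs, might look like:

```python
import numpy as np

def ctc_greedy_decode(logits, charset, blank=0):
    """Best-path CTC decoding: argmax per frame, collapse repeats, drop blanks.

    Assumes the blank class has index 0 and character classes 1..len(charset)
    follow it; `logits` has shape (frames, classes).
    """
    best = logits.argmax(axis=1)               # most likely class per frame
    out, prev = [], blank
    for k in best:
        if k != blank and k != prev:           # collapse repeats, skip blanks
            out.append(charset[k - 1])
        prev = k
    return "".join(out)

# synthetic frame labels spelling "-hh-e-ll-lo", which decodes to "hello"
charset = "abcdefghijklmnopqrstuvwxyz"
seq = [0, 8, 8, 0, 5, 0, 12, 12, 0, 12, 15]
T, C = len(seq), len(charset) + 1
logits = np.full((T, C), -5.0)
logits[np.arange(T), seq] = 5.0
decoded = ctc_greedy_decode(logits, charset)   # -> "hello"
```

    The blank symbol is what lets the decoder distinguish a repeated character ("ll") from one character held across several frames.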

  6. End-to-end simulation of a K-band LEO-LEO satellite link for estimating water vapor in the low troposphere

    NASA Astrophysics Data System (ADS)

    Facheris, Luca; Cuccoli, Fabrizio; Argenti, Fabrizio

    2004-11-01

    A new differential measurement concept is presented for retrieving the total content of water vapor (IWV, Integrated Water Vapor) along the propagation path between two Low Earth Orbiting (LEO) satellites, while the path descends into the atmosphere during a so-called setting occultation. The new approach, referred to as the DSA (Differential Spectral Absorption) method, is based on the simultaneous measurement of the total attenuation at two relatively close frequencies in the K band, and on the estimate of a "spectral sensitivity parameter" that is highly correlated with the IWV content of the LEO-LEO link in the low troposphere. The DSA approach has the potential to overcome all spectrally 'flat' and spectrally correlated phenomena (atmospheric scintillation among these) and provides estimates that can then be usefully integrated with standard radio occultation data products. In the paper we describe the signaling structure chosen for DSA measurements and the transmit-receive system used to simulate an end-to-end transmission during a complete LEO-LEO setting occultation. Simulations are based on atmospheric models and on real radiosonde data, which allows us to account for the natural variability of the atmospheric conditions. The effects on the IWV estimates of impairments such as thermal noise at the receiver, atmospheric scintillation, multipath and defocusing are evaluated.
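    The key property of the differential measurement, that spectrally flat impairments cancel in the two-frequency difference, can be shown with a toy model; the absorption coefficients and the flat loss term below are invented for illustration, not a real K-band water-vapor absorption model.

```python
import numpy as np

def spectral_sensitivity(att_f1_db, att_f2_db):
    """Differential attenuation between two nearby K-band tones (in dB).

    Any spectrally flat impairment (scintillation, defocusing, ...) adds
    the same dB offset to both tones and cancels in the difference.
    """
    return att_f1_db - att_f2_db

iwv = np.linspace(5, 50, 10)     # integrated water vapor along the link (toy units)
k1, k2 = 0.30, 0.18              # toy specific attenuations, dB per unit IWV
flat = 7.5                       # spectrally flat loss in dB, common to both tones

att1 = k1 * iwv + flat
att2 = k2 * iwv + flat
s = spectral_sensitivity(att1, att2)   # = (k1 - k2) * iwv; flat term cancels
```

    In this linearized picture the differential observable is directly proportional to IWV, which is why it can be inverted without knowing the flat loss.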

  7. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high-speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but is unusual in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high-speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large-scale, high-speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high-speed distributed applications. Finally, the DPSS is part of an overall architecture for using high-speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.

  8. Shared path protection through reconstructing sharable bandwidth based on spectrum segmentation for elastic optical networks

    NASA Astrophysics Data System (ADS)

    Liu, Huanlin; Zhang, Mingjia; Yi, Pengfei; Chen, Yong

    2016-12-01

    In order to address the problems of spectrum fragmentation and the low sharing degree of spectrum resources in survivable elastic optical networks, an improved algorithm, called shared path protection by reconstructing sharable bandwidth based on spectrum segmentation (SPP-RSB-SS), is proposed in this paper. In the SPP-RSB-SS algorithm, to reduce spectrum fragmentation and improve the success rate of spectrum allocation, the whole spectrum resource is partitioned into several spectrum segments, and each spectrum segment is preferentially allocated to requests with the same bandwidth requirement. Meanwhile, the protection path with the higher spectrum sharing degree is selected by optimizing the link cost function and reconstructing sharable bandwidth. Hence, the protection path can maximize sharable spectrum usage among multiple protection paths. The simulation results indicate that the SPP-RSB-SS algorithm can increase the sharing degree of the protection spectrum effectively. Furthermore, the SPP-RSB-SS algorithm can enhance spectrum utilization and reduce the bandwidth blocking probability significantly.
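    The segment-based allocation idea, serving each bandwidth class from its own spectrum segment, can be sketched with a first-fit search restricted to one segment; the slot counts and segment boundaries are invented, and the real SPP-RSB-SS algorithm additionally optimizes link costs and reconstructs sharable protection bandwidth.

```python
def first_fit_in_segment(spectrum, seg_start, seg_end, width):
    """Allocate the first run of `width` contiguous free slots in one segment.

    Restricting each bandwidth class to its own segment keeps equal-size
    requests together, which limits fragmentation. Returns the start slot
    index, or None if the segment has no fitting run.
    """
    run = 0
    for i in range(seg_start, seg_end):
        run = run + 1 if not spectrum[i] else 0
        if run == width:
            start = i - width + 1
            for j in range(start, start + width):
                spectrum[j] = True          # mark the slots as occupied
            return start
    return None

spectrum = [False] * 16                     # 16 frequency slots, all free
# segment [0, 8) serves 2-slot requests, segment [8, 16) serves 3-slot requests
a = first_fit_in_segment(spectrum, 0, 8, 2)
b = first_fit_in_segment(spectrum, 8, 16, 3)
c = first_fit_in_segment(spectrum, 0, 8, 2)
```

    Because every allocation inside a segment has the same width, freed slots always leave holes that exactly fit a future request of that class.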

  9. Assessing the value of seasonal climate forecast information through an end-to-end forecasting framework: Application to U.S. 2012 drought in central Illinois

    NASA Astrophysics Data System (ADS)

    Shafiee-Jood, Majid; Cai, Ximing; Chen, Ligang; Liang, Xin-Zhong; Kumar, Praveen

    2014-08-01

    This study proposes an end-to-end forecasting framework to incorporate operational seasonal climate forecasts to help farmers improve their decisions prior to the crop growth season, decisions that are vulnerable to unanticipated drought conditions. The framework couples a crop growth model with a decision-making model for rainfed agriculture and translates probabilistic seasonal forecasts into more user-related information that can be used to support farmers' decisions on crop type and some market choices (e.g., contracts with an ethanol refinery). The regional Climate-Weather Research and Forecasting model (CWRF) driven by two operational general circulation models (GCMs) is used to provide the seasonal forecasts of weather parameters. To better assess the developed framework, CWRF is also driven by observational reanalysis data, which theoretically can be considered as the best seasonal forecast. The proposed framework is applied to the Salt Creek watershed in Illinois, which experienced an extreme drought event during the 2012 crop growth season. The results show that the forecasts cannot capture the 2012 drought condition in Salt Creek and therefore the suggested decisions can make farmers worse off if the suggestions are adopted. Alternatively, the optimal decisions based on reanalysis-based CWRF forecasts, which can capture the 2012 drought conditions, make farmers better off by suggesting "no-contract" with ethanol refineries. This study suggests that the conventional metric used for ex ante value assessment is not capable of providing meaningful information in the case of extreme drought. Also, it is observed that institutional interventions (e.g., crop insurance) highly influence farmers' decisions and, thereby, the assessment of forecast value.

  10. Design of a satellite end-to-end mission performance simulator for imaging spectrometers and its application to the ESA's FLEX/Sentinel-3 tandem mission

    NASA Astrophysics Data System (ADS)

    Vicent, Jorge; Sabater, Neus; Tenjo, Carolina; Acarreta, Juan R.; Manzano, María.; Rivera, Juan P.; Jurado, Pedro; Franco, Raffaella; Alonso, Luis; Moreno, Jose

    2015-09-01

    The performance analysis of a satellite mission requires specific tools that can simulate the behavior of the platform, its payload, and the acquisition of scientific data from synthetic scenes. These software tools, called End-to-End Mission Performance Simulators (E2ES), are promoted by the European Space Agency (ESA) with the goal of consolidating the instrument and mission requirements as well as optimizing the implemented data processing algorithms. Nevertheless, most E2ES developed so far are designed for a specific satellite mission and can hardly be adapted to other satellite missions. In the frame of ESA's FLEX mission activities, an E2ES is being developed based on a generic architecture for passive optical missions. The FLEX E2ES implements a state-of-the-art synthetic scene generator that is coupled with dedicated algorithms that model the platform and instrument characteristics. This work will describe the flexibility of the FLEX E2ES to simulate complex synthetic scenes with a variety of land cover classes, topography and cloud cover that are observed separately by each instrument (FLORIS, OLCI and SLSTR). The implemented algorithms allow modelling of the sensor behavior, i.e. the spectral/spatial resampling of the input scene, the geometry of acquisition, the sensor noises and non-uniformity effects (e.g. stray-light, spectral smile and radiometric noise), and the full retrieval scheme up to Level-2 products. It is expected that the design methodology implemented in the FLEX E2ES can be used as a baseline for other imaging spectrometer missions and will be further expanded towards a generic E2ES software tool.
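    The spectral-resampling step mentioned above, mapping a fine-grid scene spectrum onto the instrument's bands, can be sketched by weighting the scene with normalized Gaussian spectral response functions; the band centers, FWHM, and scene spectrum below are illustrative, not FLORIS values.

```python
import numpy as np

def resample_to_bands(wl, spectrum, centers, fwhm):
    """Resample a fine-grid spectrum to instrument bands with Gaussian SRFs.

    Each band integrates the scene spectrum weighted by a unit-area Gaussian
    spectral response function centered on the band.
    """
    sigma = fwhm / 2.3548                       # convert FWHM to std deviation
    out = []
    for c in centers:
        srf = np.exp(-0.5 * ((wl - c) / sigma) ** 2)
        srf /= srf.sum()                        # normalize the response weights
        out.append(np.sum(srf * spectrum))
    return np.array(out)

wl = np.linspace(500.0, 800.0, 3001)            # 0.1 nm scene wavelength grid
scene = 1.0 + 0.001 * (wl - 500.0)              # linear toy radiance spectrum
bands = resample_to_bands(wl, scene,
                          centers=np.array([600.0, 700.0]), fwhm=10.0)
```

    For a locally linear spectrum the symmetric SRF returns the value at the band center, so each resampled band here lands on the underlying line.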

  11. OpenCyto: an open source infrastructure for scalable, robust, reproducible, and automated, end-to-end flow cytometry data analysis.

    PubMed

    Finak, Greg; Frelinger, Jacob; Jiang, Wenxin; Newell, Evan W; Ramey, John; Davis, Mark M; Kalams, Spyros A; De Rosa, Stephen C; Gottardo, Raphael

    2014-08-01

    Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis framework designed to lower the barrier of entry to automated flow data analysis algorithms by addressing key areas that we believe have held back wider adoption of automated approaches. OpenCyto supports end-to-end data analysis that is robust and reproducible while generating results that are easy to interpret. We have improved the existing, widely used core BioConductor flow cytometry infrastructure by allowing analysis to scale in a memory efficient manner to the large flow data sets that arise in clinical trials, and integrating domain-specific knowledge as part of the pipeline through the hierarchical relationships among cell populations. Pipelines are defined through a text-based csv file, limiting the need to write data-specific code, and are data agnostic to simplify repetitive analysis for core facilities. We demonstrate how to analyze two large cytometry data sets: an intracellular cytokine staining (ICS) data set from a published HIV vaccine trial focused on detecting rare, antigen-specific T-cell populations, where we identify a new subset of CD8 T-cells with a vaccine-regimen specific response that could not be identified through manual analysis, and a CyTOF T-cell phenotyping data set where a large staining panel and many cell populations are a challenge for traditional analysis. The substantial improvements to the core BioConductor flow cytometry packages give OpenCyto the potential for wide adoption. It can rapidly leverage new developments in computational
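    The text-table-driven hierarchical gating idea can be shown in miniature: child populations are defined relative to their parents, so a pipeline is just a table walked top to bottom. The marker names, cutoffs, and fixed 1D threshold gates below are invented for illustration; OpenCyto itself derives gates from the data rather than from hard-coded cutoffs.

```python
import numpy as np

# synthetic single-cell marker intensities for 10,000 events
rng = np.random.default_rng(1)
n_cells = 10_000
cells = {"CD3": rng.normal(0, 1, n_cells), "CD8": rng.normal(0, 1, n_cells)}

# a csv-like gating table: (population, parent, marker, cutoff)
gating_table = [
    ("T cells", None, "CD3", 0.0),
    ("CD8 T cells", "T cells", "CD8", 1.0),
]

# walk the table: each child mask is the parent mask AND its own gate
populations = {None: np.ones(n_cells, dtype=bool)}
for name, parent, marker, cutoff in gating_table:
    populations[name] = populations[parent] & (cells[marker] > cutoff)

frac_cd8 = populations["CD8 T cells"].mean()   # fraction of all events
```

    Because each row only names its parent, the same table can be re-applied unchanged to any data set with those markers, which is the "data agnostic" property the abstract describes.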

  12. Combined fishing and climate forcing in the southern Benguela upwelling ecosystem: an end-to-end modelling approach reveals dampened effects.

    PubMed

    Travers-Trolet, Morgane; Shin, Yunne-Jai; Shannon, Lynne J; Moloney, Coleen L; Field, John G

    2014-01-01

    The effects of climate and fishing on marine ecosystems have usually been studied separately, but their interactions make ecosystem dynamics difficult to understand and predict. Of particular interest to management, the potential synergism or antagonism between fishing pressure and climate forcing is analysed in this paper, using an end-to-end ecosystem model of the southern Benguela ecosystem, built from coupling hydrodynamic, biogeochemical and multispecies fish models (ROMS-N2P2Z2D2-OSMOSE). Scenarios of different intensities of upwelling-favourable wind stress combined with scenarios of fishing top-predator fish were tested. Analyses of isolated drivers show that the bottom-up effect of the climate forcing propagates up the food chain whereas the top-down effect of fishing cascades down to zooplankton in unfavourable environmental conditions but dampens before it reaches phytoplankton. When considering both climate and fishing drivers together, it appears that top-down control dominates the link between top-predator fish and forage fish, whereas interactions between the lower trophic levels are dominated by bottom-up control. The forage fish functional group appears to be a central component of this ecosystem, being the meeting point of two opposite trophic controls. The set of combined scenarios shows that fishing pressure and upwelling-favourable wind stress have mostly dampened effects on fish populations, compared to predictions from the separate effects of the stressors. Dampened effects result in biomass accumulation at the top predator fish level but a depletion of biomass at the forage fish level. This should draw our attention to the evolution of this functional group, which appears as both structurally important in the trophic functioning of the ecosystem, and very sensitive to climate and fishing pressures. In particular, diagnoses considering fishing pressure only might be more optimistic than those that consider combined effects of fishing and

  13. The Hurricane-Flood-Landslide Continuum: An Integrated, End-to-end Forecast and Warning System for Mountainous Islands in the Tropics

    NASA Astrophysics Data System (ADS)

    Golden, J.; Updike, R. G.; Verdin, J. P.; Larsen, M. C.; Negri, A. J.; McGinley, J. A.

    2004-12-01

    In the 10 days of 21-30 September 1998, Hurricane Georges left a trail of destruction in the Caribbean region and U.S. Gulf Coast. Subsequently, in the same year, Hurricane Mitch caused widespread destruction and loss of life in four Central American nations, and in December 1999 a tropical disturbance impacted the north coast of Venezuela, causing hundreds of deaths and several million dollars of property loss. More recently, an off-season disturbance in the Central Caribbean dumped nearly 250 mm of rain over Hispaniola during the 24-hr period of May 23, 2004. Resultant flash floods and debris flows in the Dominican Republic and Haiti killed at least 1400 people. In each instance, the tropical system served as the catalyst for major flooding and landslides at landfall. Our goal is to develop and transfer an end-to-end warning system for a prototype region in the Central Caribbean, specifically the islands of Puerto Rico and Hispaniola, which experience frequent tropical cyclones and other disturbances. The envisioned system would include satellite and surface-based observations to track and nowcast dangerous levels of precipitation, atmospheric and hydrological models to predict short-term runoff and streamflow changes, geological models to warn when and where landslides and debris flows are imminent, and the capability to communicate forecast guidance products via satellite to vital government offices in Puerto Rico, Haiti, and the Dominican Republic. In this paper, we shall present a preliminary proof-of-concept study for the May 21-24, 2004 floods and debris-flows over Hispaniola to show that the envisaged flow of data, models and graphical products can produce the desired warning outputs.
The multidisciplinary research and technology transfer effort will require blending the talents of hydrometeorologists, geologists, remote sensing and GIS experts, and social scientists to ensure timely delivery of tailored graphical products to both weather offices and local

  14. Endogenous amino nitrogen collected from pigs with end-to-end ileorectal anastomosis is affected by the method of estimation and altered by dietary fiber.

    PubMed

    Mariscal-Landín, G; Sève, B; Colléaux, Y; Lebreton, Y

    1995-01-01

    Endogenous protein loss at the end of the small intestine was determined in two experiments using 10 pigs surgically prepared with end-to-end ileo-rectal anastomosis to allow total collection of ileal digesta. In the first experiment pigs were fed graded protein levels of 0 (protein-free), 55, 110 or 165 g/kg diet. Optimal durations for the adaptation and collection periods were found to be 4 and 3 d, respectively (combination 4:3), as shown by the higher correlation coefficient (r2 = 0.95) between excreted and ingested nitrogen compared with the other combinations tested (5:2, 5:3, 9:3, 9:5). The estimated amounts of endogenous N and amino acids were less accurate and tended to be smaller (P < 0.20) when obtained by extrapolation to zero nitrogen intake than when measured in pigs fed the protein-free diet. The endogenous protein was rich in proline, glutamic acid, glycine, aspartic acid, serine and threonine. In comparison to other amino acid patterns, this composition suggested a low bacterial contamination of the digesta. In the second experiment three levels of dietary fiber from wheat straw, corn cobs and wood cellulose were studied in pigs fed protein-free diets. Between 17 and 34 g crude fiber/kg diet, fiber increased the endogenous losses of nitrogen and amino acids per kilogram of dry matter intake (P < 0.05), but the excretion reached a plateau at higher dietary fiber concentration (102 g/kg). In contrast, glucosamine and galactosamine excretion increased continuously and linearly (P < 0.05) with fiber intake. We conclude that endogenous amino acid loss may be considered constant at usual and high levels of the fibrous mixture under study.
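    The extrapolation-to-zero-intake estimate described above is simply the intercept of a linear regression of excreted nitrogen on ingested nitrogen across the graded protein levels; a toy sketch with invented numbers (not the study's data) shows the mechanics, including the correlation coefficient used to pick the adaptation/collection schedule.

```python
import numpy as np

# hypothetical ileal N excretion (g/d) vs. N intake (g/d) at graded
# dietary protein levels; all values invented for illustration
intake = np.array([0.0, 10.0, 20.0, 30.0])
excreted = np.array([1.9, 3.1, 4.2, 5.2])

slope, intercept = np.polyfit(intake, excreted, 1)   # least-squares line
r2 = np.corrcoef(intake, excreted)[0, 1] ** 2        # goodness of fit

endogenous_extrapolated = intercept     # predicted excretion at zero N intake
endogenous_protein_free = excreted[0]   # measured directly on protein-free diet
```

    The two endogenous-loss estimates need not agree, which is the study's point: the extrapolated intercept carries regression uncertainty, whereas the protein-free diet gives a direct measurement.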

  15. Combined Fishing and Climate Forcing in the Southern Benguela Upwelling Ecosystem: An End-to-End Modelling Approach Reveals Dampened Effects

    PubMed Central

    Travers-Trolet, Morgane; Shin, Yunne-Jai; Shannon, Lynne J.; Moloney, Coleen L.; Field, John G.

    2014-01-01

    The effects of climate and fishing on marine ecosystems have usually been studied separately, but their interactions make ecosystem dynamics difficult to understand and predict. Of particular interest to management, the potential synergism or antagonism between fishing pressure and climate forcing is analysed in this paper, using an end-to-end ecosystem model of the southern Benguela ecosystem, built from coupling hydrodynamic, biogeochemical and multispecies fish models (ROMS-N2P2Z2D2-OSMOSE). Scenarios of different intensities of upwelling-favourable wind stress combined with scenarios of fishing top-predator fish were tested. Analyses of isolated drivers show that the bottom-up effect of the climate forcing propagates up the food chain whereas the top-down effect of fishing cascades down to zooplankton in unfavourable environmental conditions but dampens before it reaches phytoplankton. When considering both climate and fishing drivers together, it appears that top-down control dominates the link between top-predator fish and forage fish, whereas interactions between the lower trophic levels are dominated by bottom-up control. The forage fish functional group appears to be a central component of this ecosystem, being the meeting point of two opposite trophic controls. The set of combined scenarios shows that fishing pressure and upwelling-favourable wind stress have mostly dampened effects on fish populations, compared to predictions from the separate effects of the stressors. Dampened effects result in biomass accumulation at the top predator fish level but a depletion of biomass at the forage fish level. This should draw our attention to the evolution of this functional group, which appears as both structurally important in the trophic functioning of the ecosystem, and very sensitive to climate and fishing pressures. In particular, diagnoses considering fishing pressure only might be more optimistic than those that consider combined effects of fishing and

  16. Spectrum reconstruction method based on the detector response model calibrated by x-ray fluorescence.

    PubMed

    Li, Ruizhe; Li, Liang; Chen, Zhiqiang

    2017-02-07

    Accurate estimation of distortion-free spectra is important but difficult in various applications, especially for spectral computed tomography. Two key problems must be solved to reconstruct the incident spectrum. One is the acquisition of the detector energy response. It can be calculated by Monte Carlo simulation, which requires detailed modeling of the detector system and a high computational power. It can also be acquired by establishing a parametric response model and calibrating it using monochromatic x-ray sources, such as synchrotron sources or radioactive isotopes. However, these monochromatic sources are difficult to obtain. Inspired by x-ray fluorescence (XRF) spectrum modeling, we propose a feasible method to obtain the detector energy response based on an optimized parametric model for CdZnTe or CdTe detectors. The other key problem is the reconstruction of the incident spectrum with the detector response. Directly obtaining an accurate solution from noisy data is difficult because the reconstruction problem is severely ill-posed. Different from the existing spectrum stripping method, a maximum likelihood-expectation maximization iterative algorithm is developed based on the Poisson noise model of the system. Simulation and experiment results show that our method is effective for spectrum reconstruction and markedly increases the accuracy of XRF spectra compared with the spectrum stripping method. The applicability of the proposed method is discussed, and promising results are presented.
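    The maximum likelihood-expectation maximization reconstruction can be sketched with the classic multiplicative ML-EM update applied to a synthetic detector response matrix; the triangular peak-plus-tail response below is a toy stand-in for a calibrated CdZnTe/CdTe model, and the measurement is noise-free.

```python
import numpy as np

def mlem_unfold(R, measured, iters=2000):
    """Poisson ML-EM spectrum unfolding: s <- s * (R^T (m / (R s))) / (R^T 1).

    R is the detector response matrix (column j = pulse-height response to
    incident energy bin j); the multiplicative update preserves positivity.
    """
    s = np.ones(R.shape[1])                  # flat initial spectrum estimate
    sensitivity = R.sum(axis=0)              # normalization term R^T 1
    for _ in range(iters):
        proj = R @ s                         # forward-model the estimate
        proj[proj == 0] = 1e-12              # guard against division by zero
        s *= (R.T @ (measured / proj)) / sensitivity
    return s

# synthetic response: full-energy peak plus a flat tail at lower channels
n = 32
R = np.triu(np.full((n, n), 0.02)) + 0.5 * np.eye(n)
s_true = np.exp(-0.5 * ((np.arange(n) - 20) / 3.0) ** 2)   # incident spectrum
measured = R @ s_true                        # noise-free pulse-height spectrum
s_hat = mlem_unfold(R, measured)
```

    Unlike spectrum stripping, which subtracts the tail channel by channel and can go negative, the ML-EM estimate stays nonnegative by construction.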

  17. Spectrum reconstruction method based on the detector response model calibrated by x-ray fluorescence

    NASA Astrophysics Data System (ADS)

    Li, Ruizhe; Li, Liang; Chen, Zhiqiang

    2017-02-01

    Accurate estimation of distortion-free spectra is important but difficult in various applications, especially for spectral computed tomography. Two key problems must be solved to reconstruct the incident spectrum. One is the acquisition of the detector energy response. It can be calculated by Monte Carlo simulation, which requires detailed modeling of the detector system and a high computational power. It can also be acquired by establishing a parametric response model and be calibrated using monochromatic x-ray sources, such as synchrotron sources or radioactive isotopes. However, these monochromatic sources are difficult to obtain. Inspired by x-ray fluorescence (XRF) spectrum modeling, we propose a feasible method to obtain the detector energy response based on an optimized parametric model for CdZnTe or CdTe detectors. The other key problem is the reconstruction of the incident spectrum with the detector response. Directly obtaining an accurate solution from noisy data is difficult because the reconstruction problem is severely ill-posed. Different from the existing spectrum stripping method, a maximum likelihood-expectation maximization iterative algorithm is developed based on the Poisson noise model of the system. Simulation and experiment results show that our method is effective for spectrum reconstruction and markedly increases the accuracy of XRF spectra compared with the spectrum stripping method. The applicability of the proposed method is discussed, and promising results are presented.

  18. RTEMP: Exploring an end-to-end, agnostic platform for multidisciplinary real-time analytics in the space physics community and beyond

    NASA Astrophysics Data System (ADS)

    Chaddock, D.; Donovan, E.; Spanswick, E.; Jackel, B. J.

    2014-12-01

    Large-scale, real-time, sensor-driven analytics are a highly effective set of tools in many research environments; however, the barrier to entry is high and the learning curve is steep. These systems need to operate efficiently from end to end, the key aspects being data transmission, acquisition, management and organization, and retrieval. When building a generic multidisciplinary platform, acquisition and data management need to be designed with scalability and flexibility as the primary focus. Additionally, in order to leverage current sensor web technologies, the integration of common sensor data standards (i.e. SensorML and SWE services) should be supported. Perhaps most importantly, researchers should be able to get started and integrate the platform into their set of research tools as easily and quickly as possible. The largest issue with current platforms is that the sensor data must be formed and described using the previously mentioned standards. As useful as these standards are for organizing data, they are cumbersome to adopt, often restrictive, and required to be geospatially driven. Our solution, RTEMP (Real-time Environment Monitoring Platform), is a real-time analytics platform with over ten years and an estimated two million dollars of investment. It has been developed for our continuously expanding requirements of operating and building remote sensors and supporting equipment for space physics research. A key benefit of our approach is RTEMP's ability to manage agnostic data. This allows data that flows through the system to be structured in whatever way best addresses the needs of the sensor operators and data users, enabling extensive flexibility and streamlined development and research. Here we begin with an overview of RTEMP and how it is structured. Additionally, we will showcase the ways that we are using RTEMP and how it is being adopted by researchers in an increasingly broad range of other research fields. We will lay out a

  19. SU-E-T-19: A New End-To-End Test Method for ExacTrac for Radiation and Plan Isocenter Congruence

    SciTech Connect

    Lee, S; Nguyen, N; Liu, F; Huang, Y; Jung, J; Pyakuryal, A; Jang, S

    2014-06-01

    Purpose: To combine and integrate quality assurance (QA) of target localization and the radiation isocenter End-to-End (E2E) test of the BrainLAB ExacTrac system, a new QA approach was devised using an anthropomorphic head and neck phantom. This test ensures target localization as well as radiation isocenter congruence, which is one step ahead of the current ExacTrac QA procedures. Methods: The head and neck phantom typically used for the CyberKnife E2E test was irradiated at the sphere target that was visible in CT-sim images. The CT-sim was performed with a 1 mm slice thickness using a helical scanning technique. The sphere was 3 cm in diameter and was contoured as a target volume using iPlan V.4.5.2. A conformal arc plan was generated using MLC-based delivery with 7 fields, five of which included couch rotations. The prescription dose was 5 Gy with 95% coverage of the target volume. For the irradiation, two Gafchromic films were perpendicularly inserted into the cube that holds the sphere inside. The linac used for the irradiation was a TrueBeam STx equipped with an HD120 MLC. In order to use ExacTrac, an infra-red head array was used to correlate the orthogonal X-ray images. Results: The phantom was positioned using the orthogonal X-rays of ExacTrac. For each field, the phantom was checked again with X-rays and re-positioned if necessary. After each setup using ExacTrac, the target was irradiated. The films were analyzed to determine the deviation of the radiation isocenter in all three dimensions: superior-inferior, left-right and anterior-posterior. The total combined error was found to be 0.76 mm ± 0.05 mm, which is within sub-millimeter accuracy. Conclusion: Until now, the E2E test for ExacTrac was implemented separately to test image localization and radiation isocenter. This new method can be used for periodic QA procedures.

  20. Quantifying residual ionospheric errors in GNSS radio occultation bending angles based on ensembles of profiles from end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Liu, C. L.; Kirchengast, G.; Zhang, K.; Norman, R.; Li, Y.; Zhang, S. C.; Fritzer, J.; Schwaerz, M.; Wu, S. Q.; Tan, Z. X.

    2015-01-01

    The radio occultation (RO) technique using signals from the Global Navigation Satellite System (GNSS), in particular from the Global Positioning System (GPS) so far, is now widely used to observe the atmosphere for applications such as numerical weather prediction and global climate monitoring. The ionosphere is a major error source in RO measurements at stratospheric altitudes, and a linear ionospheric correction of dual-frequency RO bending angles is commonly used to remove the first-order ionospheric effect. However, the residual ionospheric error (RIE) can still be significant, so it needs to be further mitigated for high-accuracy applications, especially above about 30 km altitude where the RIE is most relevant compared to the magnitude of the neutral atmospheric bending angle. Quantification and careful analyses for better understanding of the RIE are therefore important towards enabling benchmark-quality stratospheric RO retrievals. Here we present such an analysis of bending angle RIEs covering the stratosphere and mesosphere, using quasi-realistic end-to-end simulations for a full-day ensemble of RO events. Based on the ensemble simulations we assessed the variation of bending angle RIEs, both biases and SDs, with solar activity, latitudinal region, and with or without the assumption of ionospheric spherical symmetry and of co-existing observing system errors. We find that the bending angle RIE biases in the upper stratosphere and mesosphere, and in all latitudinal zones from low to high latitudes, have a clear negative tendency and a magnitude increasing with solar activity, in line with recent empirical studies based on real RO data. The maximum RIE biases are found at low latitudes during daytime, where they amount to within -0.03 to -0.05 μrad, and the smallest at high latitudes (0 to -0.01 μrad; quiet space weather and winter conditions). Ionospheric spherical symmetry or asymmetries about the RO event location have only a minor influence on

  1. Quantifying residual ionospheric errors in GNSS radio occultation bending angles based on ensembles of profiles from end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Liu, C. L.; Kirchengast, G.; Zhang, K.; Norman, R.; Li, Y.; Zhang, S. C.; Fritzer, J.; Schwaerz, M.; Wu, S. Q.; Tan, Z. X.

    2015-07-01

    The radio occultation (RO) technique using signals from the Global Navigation Satellite System (GNSS), in particular from the Global Positioning System (GPS) so far, is currently widely used to observe the atmosphere for applications such as numerical weather prediction and global climate monitoring. The ionosphere is a major error source in RO measurements at stratospheric altitudes, and a linear ionospheric correction of dual-frequency RO bending angles is commonly used to remove the first-order ionospheric effect. However, the residual ionospheric error (RIE) can still be significant, so it needs to be further mitigated for high-accuracy applications, especially above about 30 km altitude where the RIE is most relevant compared to the magnitude of the neutral atmospheric bending angle. Quantification and careful analyses for better understanding of the RIE are therefore important for enabling benchmark-quality stratospheric RO retrievals. Here we present such an analysis of bending angle RIEs covering the stratosphere and mesosphere, using quasi-realistic end-to-end simulations for a full-day ensemble of RO events. Based on the ensemble simulations we assessed the variation of bending angle RIEs, both biases and standard deviations, with solar activity, latitudinal region, and with or without the assumption of ionospheric spherical symmetry and co-existing observing system errors. We find that the bending angle RIE biases in the upper stratosphere and mesosphere, and in all latitudinal zones from low to high latitudes, have a clear negative tendency and a magnitude increasing with solar activity, in line with recent empirical studies based on real RO data, although we find smaller bias magnitudes, which deserves further study. The maximum RIE biases are found at low latitudes during daytime, where they amount to within -0.03 to -0.05 μrad, and the smallest at high latitudes (0 to -0.01 μrad; quiet space weather and winter conditions
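
The linear (first-order) dual-frequency correction referred to in the abstract combines the two bending-angle profiles with frequency-squared weights; a minimal sketch, using the standard GPS L1/L2 carrier frequencies and made-up bending angles:

```python
import numpy as np

def linear_iono_correction(alpha1, alpha2, f1=1575.42e6, f2=1227.60e6):
    """First-order ionospheric correction of dual-frequency bending angles.

    alpha1, alpha2 : bending angles (rad) observed at frequencies f1, f2 (Hz)
    Returns the ionosphere-corrected bending angle
        alpha_c = (f1^2 * alpha1 - f2^2 * alpha2) / (f1^2 - f2^2),
    which cancels the first-order (1/f^2) ionospheric term exactly.
    """
    return (f1**2 * alpha1 - f2**2 * alpha2) / (f1**2 - f2**2)
```

Because the first-order ionospheric contribution scales as 1/f², this combination removes it exactly; what remains is the higher-order residual ionospheric error (RIE) quantified in the paper.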

  2. SU-E-T-360: End-To-End Dosimetric Testing of a Versa HD Linear Accelerator with the Agility Head Modeled in Pinnacle3

    SciTech Connect

    Saenz, D; Narayanasamy, G; Cruz, W; Papanikolaou, N; Stathakis, S

    2015-06-15

    Purpose: The Versa HD incorporates a variety of upgrades, primarily including the Agility head. The distinct dosimetric properties of the head from its predecessors combined with flattening-filter-free (FFF) beams require a new investigation of modeling in planning systems and verification of modeling accuracy. Methods: A model was created in Pinnacle³ v9.8 with commissioned beam data. Leaf transmission was modeled as <0.5% with a maximum leaf speed of 3 cm/s. Photon spectra were tuned for FFF beams, for which profiles were modeled with arbitrary profiles rather than with cones. For verification, a variety of plans with varied parameters were devised, and point dose measurements were compared to calculated values. A phantom of several plastic water and Styrofoam slabs was scanned and imported into Pinnacle³. Beams of different field sizes, SSD, wedges, and gantry angles were created. All available photon energies (6 MV, 10 MV, 18 MV, 6 FFF, 10 FFF) as well as four clinical electron energies (6, 9, 12, and 15 MeV) were investigated. The plans were verified at a calculation point (8 cm deep for photons, variable for electrons) by measurement with a PTW Semiflex ionization chamber. In addition, IMRT testing was performed with three standard plans (step and shoot IMRT, small and large field VMAT plans). The plans were delivered on the Delta4 IMRT QA phantom (ScandiDos, Uppsala, Sweden). Results: Homogeneous point dose measurement agreed within 2% for all photon and electron beams. Open field photon measurements along the central axis at 100 cm SSD passed within 1%. Gamma passing rates were >99.5% for all plans with a 3%/3mm tolerance criterion. The IMRT QA results for the first 23 patients yielded gamma passing rates of 97.4±2.3%. Conclusion: The end-to-end testing ensured confidence in the ability of Pinnacle³ to model photon and electron beams with the Agility head.

  3. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    SciTech Connect

    Corradini, N; Leick, M; Bonetti, M; Negretti, L

    2015-06-15

    Purpose: To determine the MVCT imaging uncertainty on the TomoHDA system for intracranial radiosurgery treatments. To determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from the butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine uncertainty in the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in combination with differing scan lengths. A sensitivity test was performed to determine the registration algorithm's ability to find the absolute position of known translations/rotations of the pig head. The algorithm's ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and a radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, which measured less than 0.10 mm and 0.12 deg for all axes. For the sensitivity tests, the bone registration algorithm performed better than the human observers, and a maximum difference of 0.3 mm and 0.4 deg was observed across the axes. E2E test results in absolute position difference measured 0.03 ± 0.21 mm in the x-axis and 0.28 ± 0.18 mm in the y-axis. A maximum difference of 0.32 and 0.66 mm was observed in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have sub-millimeter overall accuracy for intracranial radiosurgery.

  4. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control of and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large, complex systems engineering challenge, being addressed in part by focusing on the specific subsystems' handling of off-nominal mission states and fault tolerance. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA also has formed an M&FM team for addressing fault management early in the development lifecycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes. Risk reduction is addressed by working with other organizations such as S

  5. Micro-ARES, an electric-field sensor for ExoMars 2016: Electric fields modelling, sensitivity evaluations and end-to-end tests.

    NASA Astrophysics Data System (ADS)

    Déprez, Grégoire; Montmessin, Franck; Witasse, Olivier; Lapauw, Laurent; Vivat, Francis; Abbaki, Sadok; Granier, Philippe; Moirin, David; Trautner, Roland; Hassen-Khodja, Rafik; d'Almeida, Éric; Chardenal, Laurent; Berthelier, Jean-Jacques; Esposito, Francesca; Debei, Stefano; Rafkin, Scott; Barth, Erika

    2014-05-01

    Earth and transposed to the Martian atmospheric parameters. Knowing the expected electric fields and simulating them, the next step in evaluating the performance of the instrument is to determine its sensitivity by modelling the response of the instrument. The last step is to compare the model of the instrument, and the expected results for a given signal, with the actual outputs of the electronics board for the same signal as input. To achieve this end-to-end test, we use a signal generator followed by an electrical circuit reproducing the electrode behaviour in the Martian environment, in order to inject a realistic electric signal into the processing board and finally compare the produced formatted data with the expected ones.

  6. Spectrum Reconstruction of a Spatially Modulated Fourier Transform Spectrometer Based on Stepped Mirrors.

    PubMed

    Gao, Jianhua; Liang, Zhongzhu; Liang, Jingqiu; Wang, Weibiao; Lü, Jinguang; Qin, Yuxin

    2016-11-23

    Based on the basic configuration and interference principle of a static step-mirror-based Fourier transform spectrometer, an image segmentation method is proposed to obtain a one-dimensional interferogram. The direct-current component of the interferogram is fitted using the least squares (LS) method and subsequently removed. An empirical-mode-decomposition-based high-pass filter is constructed to denoise the spectrum and enhance the spectral resolution simultaneously. Several experiments were performed and the spectrum was reconstructed using these methods. The spectral resolution is 81 cm⁻¹ at 2254 cm⁻¹.
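
The DC-removal-then-transform pipeline can be sketched in simplified form; here a least-squares linear baseline stands in for both the LS DC fit and the EMD-based high-pass filter of the paper, and the interferogram in the usage example is a synthetic monochromatic signal, not instrument data.

```python
import numpy as np

def reconstruct_spectrum(interferogram, opd_step):
    """Toy spectrum recovery from a 1-D interferogram.

    interferogram : samples I(x) at equal optical-path-difference steps
    opd_step      : OPD increment between samples, in cm
    Returns (wavenumbers in cm^-1, spectrum magnitude).
    """
    n = len(interferogram)
    x = np.arange(n)
    # least-squares linear baseline approximates the DC component
    baseline = np.polyval(np.polyfit(x, interferogram, 1), x)
    ac = interferogram - baseline
    spectrum = np.abs(np.fft.rfft(ac))          # magnitude spectrum
    wavenumbers = np.fft.rfftfreq(n, d=opd_step)  # cycles per cm = cm^-1
    return wavenumbers, spectrum
```

Note that the wavenumber bin width is 1/(N·Δx), which is why the achievable resolution is tied to the total optical path difference spanned by the stepped mirrors.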

  7. Noise correction in power spectrum and image reconstruction with speckle holography.

    NASA Astrophysics Data System (ADS)

    Qiu, Yao-Hui; Zhang, Rui-Long; Lou, Ke; Lu, Ru-Wei; Liu, Zhong

    First, the principle of speckle holography is briefly introduced; then the influences on the power spectrum arising from noise in real data are analysed, and a method to correct the noise bias is discussed. Finally, a high-resolution image reconstruction experiment for two double stars is reported.
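
A noise-bias correction of the kind discussed can be sketched as follows, assuming the common speckle-imaging approach of estimating the (frequency-flat) photon-noise bias from spatial frequencies beyond the telescope cutoff; the frame stack and the 0.45 cutoff radius here are illustrative choices, not values from the paper.

```python
import numpy as np

def debiased_power_spectrum(frames):
    """Average power spectrum of speckle frames minus a noise-bias estimate.

    frames : (K, N, N) stack of short-exposure images
    The noise contributes an approximately flat additive bias to the
    power spectrum, so it is estimated from the highest spatial
    frequencies (assumed to lie beyond the optical cutoff) and subtracted.
    """
    ps = np.mean(np.abs(np.fft.fft2(frames)) ** 2, axis=0)
    n = ps.shape[0]
    fy, fx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
    high = np.hypot(fx, fy) > 0.45          # annulus near Nyquist
    bias = ps[high].mean()                  # flat-bias estimate
    return ps - bias
```

Applied to frames of pure noise, the debiased spectrum should scatter around zero, which is a quick sanity check of the bias estimate.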

  8. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASAs Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM

  9. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen

    2015-01-01

    integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test cases into flight software compounded with potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. 
VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW

  10. Reconstruction of absolute absorption spectrum of reduced heme a in cytochrome C oxidase from bovine heart.

    PubMed

    Dyuba, A V; Vygodina, T V; Konstantinov, A A

    2013-12-01

    This paper presents a new experimental approach for determining the individual optical characteristics of reduced heme a in bovine heart cytochrome c oxidase starting from a small selective shift of the heme a absorption spectrum induced by calcium ions. The difference spectrum induced by Ca2+ actually corresponds to the first derivative (differential) of the heme a(2+) absolute absorption spectrum. Such an absolute spectrum was obtained for the mixed-valence cyanide complex of cytochrome oxidase (a(2+)a3(3+)-CN) and was subsequently used as a basis spectrum for further processing and modeling. The individual absorption spectrum of the reduced heme a in the Soret region was reconstructed as the integral of the difference spectrum induced by addition of Ca2+. The spectrum of heme a(2+) in the Soret region obtained in this way is characterized by a peak with a maximum at 447 nm and a half-width of 17 nm and can be decomposed into two Gaussians with maxima at 442 and 451 nm and half-widths of ~10 nm (589 cm⁻¹) corresponding to the perpendicularly oriented electronic π→π* transitions B0x and B0y in the porphyrin ring. The reconstructed spectrum in the Soret band differs significantly from the "classical" absorption spectrum of heme a(2+) originally described by Vanneste (Vanneste, W. H. (1966) Biochemistry, 65, 838-848). The differences indicate that the overall γ-band of heme a(2+) in cytochrome oxidase contains, in addition to the B0x and B0y transitions, extra components that are not sensitive to calcium ions, or, alternatively, that Vanneste's spectrum of heme a(2+) contains a significant contribution from heme a3(2+). The reconstructed absorption band of heme a(2+) in the α-band, with a maximum at 605 nm and a half-width of 18 nm (850 cm⁻¹), corresponds most likely to the individual Q0y transition of heme a, whereas the Q0x transition contributes only weakly to the spectrum.
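
The integration step can be sketched numerically: for a small spectral shift δ, the difference spectrum obeys ΔA(λ) ≈ −δ·dA/dλ, so the absolute band is recovered as A(λ) ≈ −(1/δ)∫ΔA dλ. The Gaussian test band in the usage example is illustrative, not the paper's data.

```python
import numpy as np

def band_from_shift_difference(wavelengths, diff_spectrum, shift_nm):
    """Recover an absolute absorption band from a small-shift difference
    spectrum via the relation diff(l) ~ -shift * dA/dl, i.e.
    A(l) ~ -(1/shift) * integral of diff.

    wavelengths   : uniformly spaced grid (nm, ascending)
    diff_spectrum : A_shifted - A at each wavelength
    shift_nm      : small red shift of the band (nm)
    """
    step = wavelengths[1] - wavelengths[0]   # uniform grid assumed
    return -np.cumsum(diff_spectrum) * step / shift_nm
```

The reconstruction is exact only to first order in the shift, which is why the method relies on the Ca2+-induced shift being small compared to the bandwidth.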

  11. SU-E-T-109: Development of An End-To-End Test for the Varian TrueBeamtm with a Novel Multiple-Dosimetric Modality H and N Phantom

    SciTech Connect

    Zakjevskii, V; Knill, C; Rakowski, J; Snyder, M

    2014-06-01

    Purpose: To develop a comprehensive end-to-end test for Varian's TrueBeam linear accelerator for head and neck IMRT using a custom phantom designed to utilize multiple dosimetry devices. Methods: The initial end-to-end test and custom H and N phantom were designed to yield maximum information in anatomical regions significant to H and N plans with respect to: i) geometric accuracy, ii) dosimetric accuracy, and iii) treatment reproducibility. The phantom was designed in collaboration with Integrated Medical Technologies. A CT image was taken with a 1 mm slice thickness. The CT was imported into Varian's Eclipse treatment planning system, where OARs and the PTV were contoured. A clinical template was used to create an eight-field static gantry angle IMRT plan. After optimization, dose was calculated using the Analytic Anisotropic Algorithm with inhomogeneity correction. Plans were delivered with a TrueBeam equipped with a high definition MLC. Preliminary end-to-end results were measured using film and ion chambers. Ion chamber dose measurements were compared to the TPS. Films were analyzed with FilmQAPro using a composite gamma index. Results: Film analysis for the initial end-to-end plan with a geometrically simple PTV showed average gamma pass rates >99% with a passing criterion of 3% / 3mm. Film analysis of a plan with a more realistic, i.e. complex, PTV yielded pass rates >99% in clinically important regions containing the PTV, spinal cord and parotid glands. Ion chamber measurements were on average within 1.21% of calculated dose for both plans. Conclusion: Trials have demonstrated that our end-to-end testing methods provide baseline values for the dosimetric and geometric accuracy of Varian's TrueBeam system.

  12. Monte Carlo simulations for 20 MV X-ray spectrum reconstruction of a linear induction accelerator

    NASA Astrophysics Data System (ADS)

    Wang, Yi; Li, Qin; Jiang, Xiao-Guo

    2012-09-01

    To study the spectrum reconstruction of the 20 MV X-ray generated by the Dragon-I linear induction accelerator, the Monte Carlo method is applied to simulate the attenuations of the X-ray in the attenuators of different thicknesses and thus provide the transmission data. As is known, the spectrum estimation from transmission data is an ill-conditioned problem. The method based on iterative perturbations is employed to derive the X-ray spectra, where initial guesses are used to start the process. This algorithm takes into account not only the minimization of the differences between the measured and the calculated transmissions but also the smoothness feature of the spectrum function. In this work, various filter materials are put to use as the attenuator, and the condition for an accurate and robust solution of the X-ray spectrum calculation is demonstrated. The influences of the scattering photons within different intervals of emergence angle on the X-ray spectrum reconstruction are also analyzed.
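
The forward model that links a candidate spectrum to the simulated transmission data, T(t) = ∫S(E)e^(−μ(E)t)dE / ∫S(E)dE, can be sketched as follows; this is only the forward model, not the iterative-perturbation solver, and the attenuation values in the usage example are illustrative.

```python
import numpy as np

def transmission_curve(spectrum, mu, thicknesses):
    """Forward model linking an X-ray spectrum to attenuator transmissions.

    spectrum    : (N,) relative fluence per energy bin
    mu          : (N,) linear attenuation coefficient of the attenuator
                  material at each bin energy (1/cm)
    thicknesses : (M,) attenuator thicknesses (cm)
    Returns the (M,) normalized transmissions
        T(t_i) = sum_j S_j * exp(-mu_j * t_i) / sum_j S_j.
    """
    atten = np.exp(-np.outer(thicknesses, mu))   # (M, N) attenuation factors
    return (atten @ spectrum) / spectrum.sum()
```

Inverting this relation for S given measured T is the ill-conditioned step the abstract refers to, which is why the perturbation iteration additionally enforces smoothness of the spectrum.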

  13. Reconstructing metabolic flux vectors from extreme pathways: defining the alpha-spectrum.

    PubMed

    Wiback, Sharon J; Mahadevan, Radhakrishnan; Palsson, Bernhard Ø

    2003-10-07

    The move towards genome-scale analysis of cellular functions has necessitated the development of analytical (in silico) methods to understand such large and complex biochemical reaction networks. One such method is extreme pathway analysis, which uses stoichiometry and thermodynamic irreversibility to define mathematically unique, systemic metabolic pathways. These extreme pathways form the edges of a high-dimensional convex cone in the flux space that contains all the attainable steady state solutions, or flux distributions, for the metabolic network. By definition, any steady state flux distribution can be described as a nonnegative linear combination of the extreme pathways. To date, much effort has been focused on calculating, defining, and understanding these extreme pathways. However, little work has been performed to determine how these extreme pathways contribute to a given steady state flux distribution. This study represents an initial effort aimed at defining how physiological steady state solutions can be reconstructed from a network's extreme pathways. In general, there is not a unique set of nonnegative weightings on the extreme pathways that produce a given steady state flux distribution but rather a range of possible values. This range can be determined using linear optimization to maximize and minimize the weightings of a particular extreme pathway in the reconstruction, resulting in what we have termed the alpha-spectrum. The alpha-spectrum defines which extreme pathways can and cannot be included in the reconstruction of a given steady state flux distribution and to what extent they individually contribute to the reconstruction. It is shown that accounting for transcriptional regulatory constraints can considerably shrink the alpha-spectrum. The alpha-spectrum is computed and interpreted for two cases; first, optimal states of a skeleton representation of core metabolism that include transcriptional regulation, and second for human red blood cell
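
The alpha-spectrum computation described above amounts to a pair of linear programs per extreme pathway, minimizing and maximizing that pathway's weighting subject to reproducing the flux distribution; a minimal sketch using scipy.optimize.linprog, with a hypothetical toy pathway matrix rather than a real metabolic network:

```python
import numpy as np
from scipy.optimize import linprog

def alpha_spectrum(P, v):
    """Min/max weighting of each extreme pathway reproducing flux vector v.

    P : (n_fluxes, n_paths) matrix whose columns are extreme pathways
    v : (n_fluxes,) steady-state flux distribution, assumed to lie in the
        cone spanned by the pathways (P @ alpha = v, alpha >= 0 feasible)
    Returns an (n_paths, 2) array of [alpha_min, alpha_max] per pathway.
    """
    n_paths = P.shape[1]
    bounds = [(0, None)] * n_paths        # nonnegative weightings
    out = np.empty((n_paths, 2))
    for k in range(n_paths):
        c = np.zeros(n_paths)
        c[k] = 1.0
        lo = linprog(c, A_eq=P, b_eq=v, bounds=bounds)    # minimize alpha_k
        hi = linprog(-c, A_eq=P, b_eq=v, bounds=bounds)   # maximize alpha_k
        out[k] = [lo.fun, -hi.fun]
    return out
```

For example, with two independent pathways and a third equal to their sum, any flux vector they span admits a one-parameter family of decompositions, and the returned intervals make that degeneracy explicit.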

  14. Polychromatic sparse image reconstruction and mass attenuation spectrum estimation via B-spline basis function expansion

    SciTech Connect

    Gu, Renliang E-mail: ald@iastate.edu; Dogandžić, Aleksandar E-mail: ald@iastate.edu

    2015-03-31

    We develop a sparse image reconstruction method for polychromatic computed tomography (CT) measurements under the blind scenario where the material of the inspected object and the incident energy spectrum are unknown. To obtain a parsimonious measurement model parameterization, we first rewrite the measurement equation using our mass-attenuation parameterization, which has the Laplace integral form. The unknown mass-attenuation spectrum is expanded into basis functions using a B-spline basis of order one. We develop a block coordinate-descent algorithm for constrained minimization of a penalized negative log-likelihood function, where constraints and penalty terms ensure nonnegativity of the spline coefficients and sparsity of the density map image in the wavelet domain. This algorithm alternates between a Nesterov proximal-gradient step for estimating the density map image and an active-set step for estimating the incident spectrum parameters. Numerical simulations demonstrate the performance of the proposed scheme.
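The core building block here, a proximal-gradient step under nonnegativity and sparsity constraints, can be sketched on a toy problem. This is plain ISTA on a synthetic nonnegative sparse least-squares instance, not the paper's full block coordinate-descent algorithm; all data below are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[[2, 7, 11]] = [1.0, 0.5, 2.0]      # sparse nonnegative ground truth
b = A @ x_true

lam = 0.01                                 # sparsity penalty weight
t = 1.0 / np.linalg.norm(A, 2) ** 2        # step size 1/L with L = ||A||_2^2

def objective(x):
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(x)

x = np.zeros(20)
for _ in range(500):
    grad = A.T @ (A @ x - b)
    # proximal step for lam*||x||_1 with x >= 0: soft-threshold, then clip
    x = np.maximum(x - t * (grad + lam), 0.0)

print(objective(x))    # small objective; x is sparse and nonnegative
```

The nonnegativity clip plays the role of the spline-coefficient constraint, and the soft threshold the role of the wavelet-domain sparsity penalty.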

  15. Results on the primary CR spectrum and composition reconstructed with the SPHERE-2 detector

    NASA Astrophysics Data System (ADS)

    Antonov, R. A.; Beschapov, S. P.; Bonvech, E. A.; Chernov, D. V.; Dzhatdoev, T. A.; Finger, Mir; Finger, Mix; Galkin, V. I.; Kabanova, N. V.; Petkun, A. S.; Podgrudkov, D. A.; Roganova, T. M.; Shaulov, S. B.; Sysoeva, T. I.

    2013-02-01

    First preliminary results of the balloon-borne experiment SPHERE-2 on the all-nuclei primary cosmic ray (PCR) spectrum and primary composition are presented. The primary spectrum in the energy range 10^16-5 · 10^17 eV was reconstructed using characteristics of Vavilov-Cherenkov radiation of extensive air showers (EAS) reflected from a snow surface. Several sources of systematic uncertainties of the spectrum were analysed. A method for separation of the primary nuclei groups based on the lateral distribution function (LDF) steepness parameter is presented. A preliminary estimate of the mean light-nuclei fraction f30-150 at energies 3 · 10^16-1.5 · 10^17 eV was performed and yielded f30-150 = (21 ± 11)%.

  16. Red, Straight, no bends: primordial power spectrum reconstruction from CMB and large-scale structure

    NASA Astrophysics Data System (ADS)

    Ravenni, Andrea; Verde, Licia; Cuesta, Antonio J.

    2016-08-01

    We present a minimally parametric, model-independent reconstruction of the shape of the primordial power spectrum. Our smoothing spline technique is well suited to searching for smooth features such as deviations from scale invariance, and deviations from a power law such as running of the spectral index or small-scale power suppression. We use a comprehensive set of state-of-the-art cosmological data: Planck observations of the temperature and polarisation anisotropies of the cosmic microwave background, WiggleZ and Sloan Digital Sky Survey Data Release 7 galaxy power spectra and the Canada-France-Hawaii Lensing Survey correlation function. This reconstruction strongly supports the evidence for a power-law primordial power spectrum with a red tilt and disfavours deviations from a power law, including small-scale power suppression such as that induced by significantly massive neutrinos. This offers a powerful confirmation of the inflationary paradigm, justifying the adoption of the inflationary prior in cosmological analyses.
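The smoothing-spline idea can be illustrated with SciPy on synthetic data: fit a penalized spline to a noisy power-law spectrum in log-log space and read off the local tilt. All numbers below (amplitude, tilt, noise level, smoothing factor) are hypothetical, not from the paper:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
k = np.logspace(-4, 0, 60)                 # toy wavenumber grid in Mpc^-1
ns, As = 0.96, 2.1e-9                      # assumed tilt and amplitude
P = As * (k / 0.05) ** (ns - 1.0)          # pure power-law primordial spectrum
logP_noisy = np.log(P) + 0.01 * rng.standard_normal(k.size)

# Smoothing spline in log-log space; s sets the residual budget, so larger s
# means fewer knots and a smoother (more power-law-like) reconstruction.
spl = UnivariateSpline(np.log(k), logP_noisy, k=3, s=2 * len(k) * 0.01 ** 2)

# The fitted local slope should recover the red tilt n_s - 1 ~ -0.04.
slope = spl.derivative()(np.log(0.05))
print(slope)
```

A genuine feature (e.g. a localized power suppression) would show up as a scale-dependent slope that the spline can follow once `s` is reduced.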

  17. Optimizing end-to-end system performance for millimeter and submillimeter spectroscopy of protostars : wideband heterodyne receivers and sideband-deconvolution techniques for rapid molecular-line surveys

    NASA Astrophysics Data System (ADS)

    Sumner, Matthew Casey

    signal and broader tuning range of the Gunn continue to make it the preferred choice. The receiver and high-resolution spectrometer system were brought into a fully operational state late in 2007, when they were used to perform unbiased molecular-line surveys of several galactic sources, including the Orion KL hot core and a position in the L1157 outflow. In order to analyze these data, a new data pipeline was needed to deconvolve the double-sideband signals from the receiver and to model the molecular spectra. A highly automated sideband-deconvolution system has been created, and spectral-analysis tools are currently being developed. The sideband deconvolution relies on chi-square minimization to determine the optimal single-sideband spectrum in the presence of unknown sideband-gain imbalances and spectral baselines. Analytic results are presented for several different methods of approaching the problem, including direct optimization, nonlinear root finding, and a hybrid approach that utilizes a two-stage process to separate out the relatively weak nonlinearities so that the majority of the parameters can be found with a fast linear solver. Analytic derivations of the Jacobian matrices for all three cases are presented, along with a new Mathematica utility that enables the calculation of arbitrary gradients. The direct-optimization method has been incorporated into software, along with a spectral simulation engine that allows different deconvolution scenarios to be tested. The software has been validated through the deconvolution of simulated data sets, and initial results from L1157 and Orion are presented. Both surveys demonstrate the power of the wideband receivers and improved data pipeline to enable exciting scientific studies. The L1157 survey was completed in only 20 hours of telescope time and offers moderate sensitivity over a > 50-GHz range, from 220 GHz to approximately 270 or 280 GHz. The speed with which this survey was completed implies that the new

  18. Reconstruction of a nonminimal coupling theory with scale-invariant power spectrum

    SciTech Connect

    Qiu, Taotao

    2012-06-01

    A nonminimal coupling single scalar field theory, when transformed from Jordan frame to Einstein frame, can act like a minimal coupling one. Making use of this property, we investigate how a nonminimal coupling theory with scale-invariant power spectrum could be reconstructed from its minimal coupling counterpart, which can be applied in the early universe. Thanks to the coupling to gravity, the equation of state of our universe for a scale-invariant power spectrum can be relaxed, and the relation between the parameters in the action can be obtained. This approach also provides a means to address the Big-Bang puzzles and anisotropy problem in the nonminimal coupling model within Jordan frame. Due to the equivalence between the two frames, one may be able to find models that are free of the horizon, flatness, singularity as well as anisotropy problems.

  19. Demonstration of end-to-end cloud-DSL with a PON-based fronthaul supporting 5.76-Gb/s throughput with 48 eCDMA-encoded 1024-QAM discrete multi-tone signals.

    PubMed

    Fang, Liming; Zhou, Lei; Liu, Xiang; Zhang, Xiaofeng; Sui, Meng; Effenberger, Frank; Zhou, Jun

    2015-05-18

    We experimentally demonstrate an end-to-end ultra-broadband cloud-DSL network using passive optical network (PON) based fronthaul with electronic code-division-multiple-access (eCDMA) encoding and decoding. Forty-eight signals that are compliant with the very-high-bit-rate digital subscriber line 2 (VDSL2) standard are transmitted with a record throughput of 5.76 Gb/s over a hybrid link consisting of a 20-km standard single-mode fiber and a 100-m twisted pair.

  20. [A Method to Reconstruct Surface Reflectance Spectrum from Multispectral Image Based on Canopy Radiation Transfer Model].

    PubMed

    Zhao, Yong-guang; Ma, Ling-ling; Li, Chuan-rong; Zhu, Xiao-hua; Tang, Ling-li

    2015-07-01

    Due to the lack of enough spectral bands in multi-spectral sensors, it is difficult to reconstruct a surface reflectance spectrum from the finite spectral information acquired by a multi-spectral instrument. Here, taking full account of the heterogeneity of pixels in remote sensing images, a method is proposed to simulate hyperspectral data from multispectral data based on a canopy radiation transfer model. This method first assumes that mixed pixels contain two types of land cover, i.e., vegetation and soil. The sensitive parameters of the Soil-Leaf-Canopy (SLC) model and a soil ratio factor were retrieved from multi-spectral data based on Look-Up Table (LUT) technology. Then all the parameters, combined with the soil ratio factor, were input into the SLC model to simulate the surface reflectance spectrum from 400 to 2400 nm. Taking a Landsat Enhanced Thematic Mapper Plus (ETM+) image as the reference image, the surface reflectance spectrum was simulated. The simulated reflectance spectrum revealed different feature information for different surface types. To test the performance of this method, the simulated reflectance spectrum was convolved with the Landsat ETM+ spectral response curves and Moderate Resolution Imaging Spectrometer (MODIS) spectral response curves to obtain simulated Landsat ETM+ and MODIS images. Finally, the simulated Landsat ETM+ and MODIS images were compared with the observed Landsat ETM+ and MODIS images. The results generally showed high correlation coefficients (Landsat: 0.90-0.99, MODIS: 0.74-0.85) between most simulated and observed bands and indicated that the reflectance spectrum was well simulated and reliable.

  1. Reconstruction of a broadband spectrum of Alfvénic fluctuations

    SciTech Connect

    Viñas, Adolfo F; Moya, Pablo S.; Maneva, Yana G.; Araneda, Jaime A.

    2014-05-10

    Alfvénic fluctuations in the solar wind exhibit a high degree of correlation between velocity and magnetic field fluctuations, consistent with Alfvén waves propagating away from and toward the Sun. Two remarkable properties of these fluctuations are the tendencies to have either positive or negative magnetic helicity (−1 ≤ σ_m ≤ +1), associated with left- or right-handed topology of the fluctuations, and to have a constant magnetic field magnitude. This paper provides, for the first time, a theoretical framework for reconstructing both the magnetic and velocity field fluctuations with a divergence-free magnetic field, with any specified power spectral index and normalized magnetic- and cross-helicity spectra, for any plasma species. The spectrum is constructed in the Fourier domain by imposing two conditions (a divergence-free magnetic field and the preservation of the sense of magnetic helicity in both spaces) as well as by using Parseval's theorem for the conservation of energy between configuration and Fourier spaces. Applications to one-dimensional spatial Alfvénic propagation are presented. The theoretical construction is in agreement with typical time series and power spectra properties observed in the solar wind. The theoretical ideas presented in this spectral reconstruction provide a foundation for more realistic simulations of plasma waves, solar wind turbulence, and the propagation of energetic particles in such fluctuating fields.
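A reduced sketch of the Fourier-domain construction: for 1D propagation along x with constant B_x, transverse fluctuations synthesized with power-law amplitudes and random phases are divergence-free by construction (div B = dB_x/dx = 0). This is a simplified illustration with an assumed target index and no helicity control; imposing a helicity spectrum would additionally correlate the phases of the two transverse components (circular polarization), which is omitted here:

```python
import numpy as np

rng = np.random.default_rng(2)
N, L = 4096, 1.0
k = 2 * np.pi * np.fft.rfftfreq(N, d=L / N)
alpha = 5.0 / 3.0                      # assumed target power spectral index

# One transverse component with |B(k)|^2 ~ k^-alpha and random phases.
# B_x is held constant, so the divergence-free condition holds exactly.
amp = np.zeros(k.size)
amp[1:] = k[1:] ** (-alpha / 2.0)
phases = np.exp(2j * np.pi * rng.random(k.size))
By = np.fft.irfft(amp * phases)

# Verify the realized spectral slope (excluding DC and Nyquist bins).
Pk = np.abs(np.fft.rfft(By)) ** 2
slope = np.polyfit(np.log(k[1:-1]), np.log(Pk[1:-1]), 1)[0]
print(slope)   # close to -alpha
```

The same recipe, with Parseval's theorem fixing the overall normalization, extends to the velocity field and to prescribed cross-helicity by correlating the velocity and magnetic phases.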

  2. A comparative study of red and blue light-emitting diodes and low-level laser in regeneration of the transected sciatic nerve after an end to end neurorrhaphy in rabbits.

    PubMed

    Takhtfooladi, Mohammad Ashrafzadeh; Sharifi, Davood

    2015-12-01

    This study aimed at evaluating the effects of red and blue light-emitting diodes (LED) and low-level laser (LLL) on the regeneration of the transected sciatic nerve after an end-to-end neurorrhaphy in rabbits. Forty healthy mature male New Zealand rabbits were randomly assigned to four experimental groups: control, LLL (680 nm), red LED (650 nm), and blue LED (450 nm). All animals underwent right sciatic nerve neurotmesis injury under general anesthesia, followed by end-to-end anastomosis. Phototherapy was initiated on the first postoperative day and lasted for 14 consecutive days at the same time of day. On the 30th day post-surgery, the animals were euthanized and their sciatic nerves harvested for histopathological analysis. The nerves were analyzed and the following findings quantified: Schwann cells, large myelinic axons, and neurons. In the LLL group, as compared to the other groups, a significant increase in the number of all analyzed features was observed (P < 0.05). This finding suggests that postoperative LLL irradiation was able to accelerate and potentiate the peripheral nerve regeneration process in rabbits within 14 days of irradiation.

  3. Reconstruction of the primordial power spectrum of curvature perturbations using multiple data sets

    NASA Astrophysics Data System (ADS)

    Hunt, Paul; Sarkar, Subir

    2014-01-01

    Detailed knowledge of the primordial power spectrum of curvature perturbations is essential both in order to elucidate the physical mechanism (`inflation') which generated it, and for estimating the cosmological parameters from observations of the cosmic microwave background and large-scale structure. Hence it ought to be extracted from such data in a model-independent manner; however, this is difficult because the relevant cosmological observables are given by a convolution of the primordial perturbations with some smoothing kernel which depends on both the assumed world model and the matter content of the universe. Moreover, the deconvolution problem is ill-conditioned, so a regularisation scheme must be employed to control error propagation. We demonstrate that `Tikhonov regularisation' can robustly reconstruct the primordial spectrum from multiple cosmological data sets, a significant advantage being that both its uncertainty and resolution are then quantified. Using Monte Carlo simulations we investigate several regularisation parameter selection methods and find that generalised cross-validation and Mallows' C_p method give optimal results. We apply our inversion procedure to data from the Wilkinson Microwave Anisotropy Probe, other ground-based small angular scale CMB experiments, and the Sloan Digital Sky Survey. The reconstructed spectrum (assuming the standard ΛCDM cosmology) is not scale-free but has an infrared cutoff at k ≲ 5 × 10^-4 Mpc^-1 (due to the anomalously low CMB quadrupole) and several features with ~2σ significance at k/Mpc^-1 ~ 0.0013-0.0025, 0.0362-0.0402 and 0.051-0.056, reflecting the `WMAP glitches'. To test whether these are indeed real will require more accurate data, such as from the Planck satellite and new ground-based experiments.
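Tikhonov regularisation of an ill-conditioned linear inversion can be sketched in its simplest ridge form, x_λ = (AᵀA + λI)⁻¹Aᵀb. The toy smoothing-kernel matrix, signal and regularisation parameter below are made up for illustration; the paper's actual pipeline additionally selects λ by generalised cross-validation, which is omitted here:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
# Toy ill-conditioned forward model: convolution with a Gaussian smoothing
# kernel, mimicking the smearing of primordial features by a transfer kernel.
i = np.arange(n)
A = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 3.0) ** 2)
x_true = np.sin(2 * np.pi * i / n)
b = A @ x_true + 1e-3 * rng.standard_normal(n)   # noisy observations

def tikhonov(A, b, lam):
    """Ridge solution x = (A^T A + lam*I)^-1 A^T b."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

x_naive = np.linalg.lstsq(A, b, rcond=None)[0]   # noise wildly amplified
x_reg = tikhonov(A, b, 1e-3)                     # damped, stable inversion

print(np.linalg.norm(x_naive - x_true), np.linalg.norm(x_reg - x_true))
```

The naive inversion amplifies the small-singular-value noise modes, while the λ term damps them at the cost of a small bias, which is exactly the trade-off the regularisation parameter selection methods quantify.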

  4. Reconstruction of the primordial power spectrum of curvature perturbations using multiple data sets

    SciTech Connect

    Hunt, Paul; Sarkar, Subir E-mail: s.sarkar@physics.ox.ac.uk

    2014-01-01

    Detailed knowledge of the primordial power spectrum of curvature perturbations is essential both in order to elucidate the physical mechanism ('inflation') which generated it, and for estimating the cosmological parameters from observations of the cosmic microwave background and large-scale structure. Hence it ought to be extracted from such data in a model-independent manner; however, this is difficult because the relevant cosmological observables are given by a convolution of the primordial perturbations with some smoothing kernel which depends on both the assumed world model and the matter content of the universe. Moreover, the deconvolution problem is ill-conditioned, so a regularisation scheme must be employed to control error propagation. We demonstrate that 'Tikhonov regularisation' can robustly reconstruct the primordial spectrum from multiple cosmological data sets, a significant advantage being that both its uncertainty and resolution are then quantified. Using Monte Carlo simulations we investigate several regularisation parameter selection methods and find that generalised cross-validation and Mallows' C_p method give optimal results. We apply our inversion procedure to data from the Wilkinson Microwave Anisotropy Probe, other ground-based small angular scale CMB experiments, and the Sloan Digital Sky Survey. The reconstructed spectrum (assuming the standard ΛCDM cosmology) is not scale-free but has an infrared cutoff at k ≲ 5 × 10^-4 Mpc^-1 (due to the anomalously low CMB quadrupole) and several features with ~2σ significance at k/Mpc^-1 ~ 0.0013-0.0025, 0.0362-0.0402 and 0.051-0.056, reflecting the 'WMAP glitches'. To test whether these are indeed real will require more accurate data, such as from the Planck satellite and new ground-based experiments.

  5. Rutile TiO2(011)-2 × 1 Reconstructed Surfaces with Optical Absorption over the Visible Light Spectrum.

    PubMed

    Zhou, Rulong; Li, Dongdong; Qu, Bingyan; Sun, Xiaorui; Zhang, Bo; Zeng, Xiao Cheng

    2016-10-12

    The stable structures of the reconstructed rutile TiO2(011) surface are explored based on an evolutionary method. In addition to the well-known "brookite(001)-like" 2 × 1 reconstruction model, three 2 × 1 reconstruction structures are revealed for the first time, all being more stable under highly Ti-rich conditions. Importantly, the predicted Ti4O4-2 × 1 surface model is not only in excellent agreement with the reconstructed metastable surface detected by Tao et al. [Nat. Chem. 3, 296 (2011)] in their STM experiment but also yields a formation mechanism and electronic structure consistent with the measured surface. The computed imaginary part of the dielectric function suggests that the newly predicted reconstructed surfaces are capable of optical absorption over the entire visible light spectrum, thereby offering high potential for photocatalytic applications.

  6. Reconstruction of the Primary Energy Spectrum from Fluorescence Telescope Data of the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Geenen, H.

    2007-07-01

    The Pierre Auger Observatory is the largest extensive air-shower (EAS) experiment in operation. It is still under construction, and the final configuration will have detectors at two sites, in Argentina and the USA, observing both celestial hemispheres. The aim of the experiment is to determine the energy, composition and origin of ultra-high energy cosmic rays (UHECR) using two complementary detection techniques. The detector at the southern site presently contains more than 1400 (as of Jul. 2007) water-Cherenkov detectors at ground level (870 g cm^-2). Completion of the 3000 km^2 detector array, with more than 1600 tanks, is expected by the end of 2007. The atmosphere above the site is observed by 24 fluorescence telescopes located in four buildings at the boundary of the array. During clear moonless nights, this configuration permits hybrid measurement of both the longitudinal development of an EAS and the lateral particle density at ground level. All fluorescence telescopes have been fully operational since February 2007. The aim of this work is to reconstruct the cosmic-ray energy spectrum between a few 10^17 eV and 10^20 eV. This provides an overlap with spectral results from other experiments at lower energies. Hybrid detection provides an accurate geometry determination and thereby a good energy resolution. However, the energy threshold is limited by the threshold of the surface array: larger than a few 10^18 eV. The advantage of FD-monocular events (FD-mono) is a lower energy threshold in the targeted 10^17 eV regime. In addition, the present FD-mono exposure is about 1.5 times larger than the hybrid one. However, the energy resolution of FD-mono events is worse than that of hybrid events, and the detector acceptance is strongly energy dependent. Therefore, the determination of the energy spectrum requires an unfolding procedure which accounts for both the limited acceptance and the limited resolution. In this analysis the FD-mono data are reconstructed. The reconstruction

  7. Modified Reconstruction of Neutron Spectrum Emitted in Dense Plasma Focus Devices by MCNP Code and Monte-Carlo Method

    NASA Astrophysics Data System (ADS)

    Roomi, A.; Habibi, M.; Saion, E.; Amrollahi, R.

    2011-02-01

    In this study we present a Monte Carlo method for obtaining the time-resolved energy spectra of neutrons emitted by the D-D reaction in plasma focus devices. Angular positions of the detectors were chosen to maximize the quality of the reconstructed neutron spectrum. The detectors were arranged over a range of 0-22.5 m from the source and at 0°, 30°, 60°, and 90° with respect to the central axis. The results show that an arrangement of five detectors placed at 0, 2, 7.5, 15 and 22.5 m around the central electrode of the plasma focus, treated as an anisotropic neutron source, is required. As shown in the reconstructed spectrum, when the distance between the neutron source and the detectors is reduced, the final reconstructed signal is obtained with very fine accuracy.

  8. [Novel method of noise power spectrum measurement for computed tomography images with adaptive iterative reconstruction method].

    PubMed

    Nishimaru, Eiji; Ichikawa, Katsuhiro; Hara, Takanori; Terakawa, Shoichi; Yokomachi, Kazushi; Fujioka, Chikako; Kiguchi, Masao; Ishifuro, Minoru

    2012-01-01

    Adaptive iterative reconstruction techniques (IRs) can decrease image noise in computed tomography (CT) and are expected to contribute to reduction of the radiation dose. To evaluate the performance of IRs, the conventional two-dimensional (2D) noise power spectrum (NPS) is widely used. However, when an IR produces an NPS drop at all spatial frequencies (similar to the NPS change produced by a dose increase), the conventional method cannot evaluate the noise property correctly, because it does not account for the volumetric nature of CT image data. The purpose of our study was to develop a new method for NPS measurement that can be adapted to IRs. Our method utilizes thick multi-planar reconstruction (MPR) images. Thick images are generally made by averaging CT volume data in the direction perpendicular to the MPR plane (e.g. the z-direction for an axial MPR plane). By using this averaging technique as a cutter for the 3D NPS, we can obtain an adequate 2D extracted NPS (eNPS) from the 3D NPS. We applied this method to IR images generated with adaptive iterative dose reduction 3D (AIDR-3D, Toshiba) to investigate its validity. A water phantom with a 24 cm diameter was scanned at 120 kV and 200 mAs with a 320-row CT (Aquilion ONE, Toshiba). The results showed that the adequate thickness of MPR images for eNPS was more than 25.0 mm. Our new NPS measurement method utilizing thick MPR images was accurate and effective for evaluating the noise reduction effects of IRs.
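The underlying 2D NPS estimate from noise-only regions of interest can be sketched as follows. This is the generic ensemble definition applied to synthetic white noise with hypothetical pixel size and noise level, not the authors' thick-MPR procedure:

```python
import numpy as np

rng = np.random.default_rng(4)
nx = ny = 64
dx = dy = 0.5                  # pixel size in mm (hypothetical)
sigma = 10.0                   # noise standard deviation in HU (hypothetical)

# NPS(u, v) = (dx*dy / (nx*ny)) * <|DFT of mean-subtracted noise ROI|^2>,
# averaged over many independent noise-only ROIs.
n_roi = 200
acc = np.zeros((nx, ny))
for _ in range(n_roi):
    roi = sigma * rng.standard_normal((nx, ny))
    roi -= roi.mean()
    acc += np.abs(np.fft.fft2(roi)) ** 2
nps = acc / n_roi * dx * dy / (nx * ny)

# Sanity check: integrating the NPS over frequency returns the variance,
# so for white noise the (flat) NPS integrates to sigma^2.
du, dv = 1.0 / (nx * dx), 1.0 / (ny * dy)
print(nps.sum() * du * dv)     # ~ sigma**2
```

For an IR-processed image the NPS is no longer flat, which is exactly why its full shape, not just a single noise figure, is needed to characterize the noise.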

  9. Single-fraction spine SBRT end-to-end testing on TomoTherapy, Vero, TrueBeam, and CyberKnife treatment platforms using a novel anthropomorphic phantom.

    PubMed

    Gallo, John J; Kaufman, Isaac; Powell, Rachel; Pandya, Shalini; Somnay, Archana; Bossenberger, Todd; Ramirez, Ezequiel; Reynolds, Robert; Solberg, Timothy; Burmeister, Jay

    2015-01-08

    Spine SBRT involves the delivery of very high doses of radiation to targets adjacent to the spinal cord and is most commonly delivered in a single fraction. Highly conformal planning and accurate delivery of such plans are imperative for successful treatment without catastrophic adverse effects. End-to-end testing is an important practice for evaluating the entire treatment process from simulation through treatment delivery. We performed end-to-end testing for a set of representative spine targets planned and delivered using four different treatment planning systems (TPSs) and delivery systems to evaluate the capabilities of each. An anthropomorphic E2E SBRT phantom was simulated and treated on each system to evaluate agreement between measured and calculated doses. The phantom accepts ion chambers in the thoracic region and radiochromic film in the lumbar region. Four representative targets were developed within each region (thoracic and lumbar) to represent different presentations of spinal metastases and planned according to RTOG 0631 constraints. Plans were created using the TomoTherapy TPS for delivery on the Hi·Art system, the iPlan TPS for delivery on the Vero system, the Eclipse TPS for delivery on the TrueBeam system in both flattened and flattening-filter-free (FFF) modes, and the MultiPlan TPS for delivery on the CyberKnife system. Delivered doses were measured using a 0.007 cm^3 ion chamber in the thoracic region and EBT3 GAFCHROMIC film in the lumbar region. Films were scanned and analyzed using an Epson Expression 10000XL flatbed scanner in conjunction with FilmQAPro2013. All treatment platforms met all dose constraints required by RTOG 0631. Ion chamber measurements in the thoracic targets showed an overall average difference of 1.5%. Specifically, measurements agreed with the TPS to within 2.2%, 3.2%, 1.4%, 3.1%, and 3.0% for all three measurable cases on TomoTherapy, Vero, TrueBeam (FFF), TrueBeam (flattened), and Cyber

  10. A noise power spectrum study of a new model-based iterative reconstruction system: Veo 3.0.

    PubMed

    Li, Guang; Liu, Xinming; Dodge, Cristina T; Jensen, Corey T; Rong, X John

    2016-09-08

    The purpose of this study was to evaluate the performance of the third generation of the model-based iterative reconstruction (MBIR) system, Veo 3.0, based on noise power spectrum (NPS) analysis with various clinical presets over a wide range of clinically applicable dose levels. A CatPhan 600 surrounded by an oval, fat-equivalent ring to mimic patient size/shape was scanned 10 times at each of six dose levels on a GE HD 750 scanner. NPS analysis was performed on images reconstructed with various Veo 3.0 preset combinations for comparison with images reconstructed using Veo 2.0, filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASiR). The new Target Thickness setting resulted in higher noise in thicker axial images. The new Texture Enhancement function achieved more isotropic noise behavior with fewer image artifacts. Veo 3.0 provides additional reconstruction options designed to allow the user to balance spatial resolution against image noise, relative to Veo 2.0. Veo 3.0 provides more user-selectable options and in general improved isotropic noise behavior in comparison to Veo 2.0. The overall noise reduction performance of both versions of MBIR was improved in comparison to FBP and ASiR, especially at low dose levels.

  11. Reconstruction of the first derivative EPR spectrum from multiple harmonics of the field-modulated continuous wave signal

    PubMed Central

    Tseitlin, Mark; Eaton, Sandra S.; Eaton, Gareth R.

    2011-01-01

    Selection of the amplitude of magnetic field modulation for continuous wave electron paramagnetic resonance (EPR) is often a trade-off between sensitivity and resolution. Increasing the modulation amplitude improves the signal-to-noise ratio, S/N, at the expense of broadening the signal. Combining information from multiple harmonics of the field-modulated signal is proposed as a method to obtain the first derivative spectrum with minimal broadening and improved signal-to-noise. The harmonics are obtained by digital phase-sensitive detection of the signal at the modulation frequency and its integer multiples. Reconstruction of the first derivative EPR line is done in the Fourier conjugate domain, where each harmonic can be represented as the product of the Fourier transform of the 1st derivative signal with an analytical function. The analytical function for each harmonic can be viewed as a filter. The Fourier transform of the 1st derivative spectrum can be calculated from all available harmonics by solving an optimization problem with the goal of maximizing the S/N. Inverse Fourier transformation of the result produces the 1st derivative EPR line in the magnetic field domain. The use of a modulation amplitude greater than the linewidth improves the S/N but does not broaden the reconstructed spectrum. The method works for an arbitrary EPR line shape, but is limited to the case when the magnetization instantaneously follows the modulation field, which is known as the adiabatic approximation. PMID:21349750

  12. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    SciTech Connect

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability covering the range of wave propagation required for nuclear explosion monitoring (NEM), from the buried nuclear device to the seismic sensor, and thereby to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM, including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of a UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNEs in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak shear

  13. The Dosimetric Importance of Six Degree of Freedom Couch End to End Quality Assurance for SRS/SBRT Treatments when Comparing Intensity Modulated Radiation Therapy to Volumetric Modulated Arc Therapy

    NASA Astrophysics Data System (ADS)

    Ulizio, Vincent Michael

    With the advancement of technology, lesions can increasingly be treated with higher radiation doses per fraction, which also allows for hypofractionated treatments. Because the patient receives a higher dose of radiation per fraction, and because of the fast dose falloff around these targets, the delivery must be extremely accurate. The 6 DOF couch allows for additional rotational corrections and a more accurate set-up. The movement of the couch needs to be verified to be accurate, and for this reason end-to-end quality assurance tests for the couch have been developed. Once the set-up is known to be accurate, different treatment techniques can be studied. Spine SBRT requires a very fast dose falloff near the spinal cord and has typically been treated with IMRT. Treatment plans generated using this technique tend to have streaks of low-dose radiation, so VMAT is being studied to determine whether it can reduce the low-dose radiation volume as well as improve OAR sparing. For the 6 DOF couch QA, graph paper is placed on the anterior and right lateral sides of the VisionRT OSMS Cube Phantom. Each rotational shift is then applied individually, with a 3 degree shift in the positive and negative directions for pitch and roll. A mark is drawn on the paper to record each shift. A CBCT is then taken of the Cube, known shifts are applied, and an additional CBCT is taken to return the Cube to isocenter. The original IMRT plans for spine SBRT are evaluated and then a plan is made utilizing VMAT. These plans are then compared for low-dose radiation, OAR sparing, and conformity. If the original IMRT plan is determined to be inferior to what is acceptable, it will be re-planned and compared to the VMAT plan. The 6 DOF couch QA tests have proven to be accurate and reproducible. The average deviations in the 3 degree and -3 degree pitch and roll directions were 0.197, 0.068, 0.091, and 0.110 degrees

  14. Study and Implementation of the End-to-End Data Pipeline for the Virtis Imaging Spectrometer Onboard Venus Express: "From Science Operations Planning to Data Archiving and Higher Level Processing"

    NASA Astrophysics Data System (ADS)

    Cardesín Moinelo, Alejandro

    2010-04-01

    This PhD Thesis describes the activities performed during the Research Program undertaken for two years at the Istituto Nazionale di AstroFisica in Rome, Italy, as an active member of the VIRTIS Technical and Scientific Team, and one additional year at the European Space Astronomy Center in Madrid, Spain, as a member of the Mars Express Science Ground Segment. This document will show a study of all sections of the Science Ground Segment of the Venus Express mission, from the planning of the scientific operations, to the generation, calibration and archiving of the science data, including the production of valuable high level products. We will present and discuss here the end-to-end diagram of the ground segment from the technical and scientific point of view, in order to describe the overall flow of information: from the original scientific requests of the principal investigator and interdisciplinary teams, up to the spacecraft, and down again for the analysis of the measurements and interpretation of the scientific results. These scientific results lead to new and more elaborate scientific requests, which are used as feedback to the planning cycle, closing the circle. Special attention is given here to describe the implementation and development of the data pipeline for the VIRTIS instrument onboard Venus Express. During the research program, both the raw data generation pipeline and the data calibration pipeline were developed and automated in order to produce the final raw and calibrated data products from the input telemetry of the instrument. The final raw and calibrated products presented in this work are currently being used by the VIRTIS Science team for data analysis and are distributed to the whole scientific community via the Planetary Science Archive. More than 20,000 raw data files and 10,000 calibrated products have already been generated after almost four years of the mission. In the final part of the Thesis, we will also present some high level data

  15. The Singular Spectrum Analysis method and its application to seismic data denoising and reconstruction

    NASA Astrophysics Data System (ADS)

    Oropeza, Vicente E.

    Attenuating random and coherent noise is an important part of seismic data processing. Successful removal results in an enhanced image of the subsurface geology, which facilitates economic decisions in hydrocarbon exploration. This motivates the search for new and more efficient techniques for noise removal. The main goal of this thesis is to present an overview of the Singular Spectrum Analysis (SSA) technique, studying its potential application to seismic data processing. An overview of the application of SSA for time series analysis is presented. Subsequently, its application to random and coherent noise attenuation, its expansion to multiple dimensions, and the recovery of unrecorded seismograms are described. To improve the performance of SSA, a faster implementation via a randomized singular value decomposition is proposed. Results obtained in this work show that SSA is a versatile method for both random and coherent noise attenuation, as well as for the recovery of missing traces.
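
    The embed-decompose-average cycle at the heart of SSA can be sketched in a few lines. This is a generic textbook version, not the thesis's implementation; the window length `L` and `rank` are free parameters the user must choose:

```python
import numpy as np

def ssa_denoise(x, L, rank):
    """Basic 1-D SSA denoising: embed the series in an L x K Hankel
    (trajectory) matrix, truncate its SVD to `rank` components, then
    average the anti-diagonals back into a 1-D series."""
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # embedding
    U, s, Vt = np.linalg.svd(X, full_matrices=False)      # decomposition
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]             # rank reduction
    y = np.zeros(N)                                       # diagonal (Hankel) averaging
    hits = np.zeros(N)
    for j in range(K):
        y[j:j + L] += Xr[:, j]
        hits[j:j + L] += 1
    return y / hits
```

    A noiseless sinusoid has a rank-2 trajectory matrix, so keeping two components removes most additive noise. The faster randomized-SVD variant mentioned in the thesis would replace the exact SVD in the middle step.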

  16. Reconstructing the primordial spectrum of fluctuations of the universe from the observed nonlinear clustering of galaxies

    NASA Technical Reports Server (NTRS)

    Hamilton, A. J. S.; Matthews, Alex; Kumar, P.; Lu, Edward

    1991-01-01

    It was discovered that the nonlinear evolution of the two point correlation function in N-body experiments of galaxy clustering with Omega = 1 appears to be described to good approximation by a simple general formula. The underlying form of the formula is physically motivated, but its detailed representation is obtained empirically by fitting to N-body experiments. In this paper, the formula is presented along with an inverse formula which converts a final, nonlinear correlation function into the initial linear correlation function. The inverse formula is applied to observational data from the CfA, IRAS, and APM galaxy surveys, yielding the initial spectrum of fluctuations of the universe, if Omega = 1.

  17. Reconstruction of the energy spectrum of electrons accelerated in the April 15, 2002 solar flare based on IRIS X-ray spectrometer measurements

    NASA Astrophysics Data System (ADS)

    Motorina, G. G.; Kudryavtsev, I. V.; Lazutkov, V. P.; Savchenko, M. I.; Skorodumov, D. V.; Charikov, Yu. E.

    2016-04-01

    We reconstruct the energy distribution of electrons accelerated in the April 15, 2002 solar flare on the basis of the data from the IRIS X-ray spectrometer onboard the CORONAS-F satellite. Using the random search method and the Tikhonov regularization method, we solve the integral equations that describe the transformation of the X-ray photon spectrum during recording, and reconstruct the spectrum of accelerated electrons in the bremsstrahlung source. In this event, we detected a singularity in the electron spectrum associated with the existence of a local minimum in the energy range 40-60 keV, which cannot be detected by a direct method.
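
    In its simplest zeroth-order discretized form, the Tikhonov step described above amounts to a damped least-squares inversion of the instrument's integral equation. The sketch below is generic; the paper's actual kernels and regularization details are not reproduced here:

```python
import numpy as np

def tikhonov_invert(K, y, alpha):
    """Zeroth-order Tikhonov solution of the discretized integral
    equation y = K f: minimize ||K f - y||^2 + alpha ||f||^2, i.e.
    f = (K^T K + alpha I)^{-1} K^T y."""
    n = K.shape[1]
    return np.linalg.solve(K.T @ K + alpha * np.eye(n), K.T @ y)
```

    On an ill-posed problem (a smoothing kernel plus measurement noise), the unregularized least-squares solution amplifies noise catastrophically, while the damped solution stays close to the true distribution; the regularization parameter trades resolution against noise amplification.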

  18. Going End to End to Deliver High-Speed Data

    NASA Technical Reports Server (NTRS)

    2005-01-01

    By the end of the 1990s, the optical fiber "backbone" of the telecommunication and data-communication networks had evolved from megabits-per-second transmission rates to gigabits-per-second transmission rates. Despite this boom in bandwidth, however, users at the end nodes were still not being reached on a consistent basis. (An end node is any device that does not behave like a router or a managed hub or switch. Examples of end node objects are computers, printers, serial interface processor phones, and unmanaged hubs and switches.) The primary reason that prevents bandwidth from reaching the end nodes is the complex local network topology that exists between the optical backbone and the end nodes. This complex network topology consists of several layers of routing and switch equipment which introduce potential congestion points and network latency. By breaking down the complex network topology, a true optical connection can be achieved. Access Optical Networks, Inc., is making this connection a reality with guidance from NASA's nondestructive evaluation experts.

  19. End-to-end experiment management in HPC

    SciTech Connect

    Bent, John M; Kroiss, Ryan R; Torrez, Alfred; Wingate, Meghan

    2010-01-01

    Experiment management in any domain is challenging. There is a perpetual feedback loop cycling through planning, execution, measurement, and analysis. The lifetime of a particular experiment can be limited to a single cycle, although many require myriad more cycles before definite results can be obtained. Within each cycle, a large number of subexperiments may be executed in order to measure the effects of one or more independent variables. Experiment management in high performance computing (HPC) follows this general pattern but also has three unique characteristics. One, computational science applications running on large supercomputers must deal with frequent platform failures which can interrupt, perturb, or terminate running experiments. Two, these applications typically run in parallel using MPI as their communication medium. Three, there is typically a scheduling system (e.g., Condor, Moab, SGE) acting as a gate-keeper for the HPC resources. In this paper, we introduce LANL Experiment Management (LEM), an experiment management framework simplifying all four phases of experiment management. LEM simplifies experiment planning by allowing the user to describe their experimental goals without having to fully construct the individual parameters for each task. To simplify execution, LEM dispatches the subexperiments itself, thereby freeing the user from remembering the often arcane methods for interacting with the various scheduling systems. LEM provides transducers for experiments that automatically measure and record important information about each subexperiment; these transducers can easily be extended to collect additional measurements specific to each experiment. Finally, experiment analysis is simplified by providing a general database visualization framework that allows users to quickly and easily interact with their measured data.
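
    The planning step described above, expanding experimental goals over independent variables without hand-writing every task, can be illustrated with a generic cross-product expansion. This is illustrative only and is not LEM's actual API:

```python
import itertools

def plan_subexperiments(variables):
    """Expand a declarative description of independent variables
    (name -> list of values) into the full cross-product of
    subexperiment parameter sets, the planning step a framework
    like LEM automates."""
    names = sorted(variables)
    for values in itertools.product(*(variables[n] for n in names)):
        yield dict(zip(names, values))
```

    For example, two variables with two values each yield four subexperiment parameter dictionaries, each of which a dispatcher could then submit to the site scheduler.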

  20. End-to-End Performance Management for Large Distributed Storage

    SciTech Connect

    Almadena Chtchelkanova

    2012-03-18

    Storage systems for large distributed clusters of computer servers are themselves large and distributed. Their complexity and scale make it hard to ensure that applications using them get good, predictable performance. At the same time, shared access to the system from multiple applications, users, and internal system activities leads to a need for predictable performance. This research investigates mechanisms for improving storage system performance in large distributed storage systems through mechanisms that integrate the performance aspects of the path that I/O operations take through the system, from the application interface on the compute server, through the network, to the storage servers. The research focuses on five parts of the I/O path in a distributed storage system: I/O scheduling at the storage server, storage server cache management, client-to-server network flow control, client-to-server connection management, and client cache management.

  1. End-to-End Service Oriented Architectures (SOA) Security Project

    DTIC Science & Technology

    2012-02-01

    architecture for SOA. • A novel service invocation control mechanism for SOA using dynamic taint analysis (TA). • A trust broker (TB) system that maintains...architectures (SOAs) because the SOAs stress on machine-to-machine interactions, while most of the IT security mechanisms are based on human-to- machine...listing services worldwide. It is a standard mechanism for registering or publishing and discovering Web services. The services published into the UDDI

  2. Unmanned Aerial Vehicle End-to-End Support Considerations

    DTIC Science & Technology

    2005-01-01

    approximately 2 hours at an altitude of 500 ft, has a maximum airspeed of 88 kts, and carries payloads weighing a maximum of 1 lb (AeroVironment, undated...addition, each ground station costs $16 million. 2 The high altitude and the long operational radius allow great survivability and operational flexibility...takeoff weight (lbs) 2,250 10,000 Speed at altitude (kts) Loiter 70 200 Maximum 120 220 Wingspan (ft) 48.7 64.0 Maximum payload (lbs) Internal 450 750

  3. Ordered End-to-End Multicast for Distributed Multimedia Systems

    DTIC Science & Technology

    2006-01-01

    Ordering of messages compensates for the lack of a global system state and the effects of asynchrony, unpredictable network delay, and disparities in...separate logical propagation graph or global clock synchronization, and ordering is dis- tributed across nodes on the delivery paths between sources...set on route by PN, deciding on a globally valid num- ber, and multicasting the message to the receiver set with a final and binding sequence number

  4. On Estimating End-to-End Network Path Properties

    NASA Technical Reports Server (NTRS)

    Allman, Mark; Paxson, Vern

    1999-01-01

    The more information about current network conditions available to a transport protocol, the more efficiently it can use the network to transfer its data. In networks such as the Internet, the transport protocol must often form its own estimates of network properties based on measurements performed by the connection endpoints. We consider two basic transport estimation problems: determining the setting of the retransmission timer (RTO) for a reliable protocol, and estimating the bandwidth available to a connection as it begins. We look at both of these problems in the context of TCP, using a large TCP measurement set [Pax97b] for trace-driven simulations. For RTO estimation, we evaluate a number of different algorithms, finding that the performance of the estimators is dominated by their minimum values, and to a lesser extent, the timer granularity, while being virtually unaffected by how often round-trip time measurements are made or the settings of the parameters in the exponentially-weighted moving average estimators commonly used. For bandwidth estimation, we explore techniques previously sketched in the literature [Hoe96, AD98] and find that in practice they perform less well than anticipated. We then develop a receiver-side algorithm that performs significantly better.
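
    The EWMA estimators discussed here follow the classic Jacobson-style scheme later codified in RFC 6298. A minimal sketch, with the minimum RTO and the timer granularity (the two factors the study found dominant) as explicit parameters:

```python
import math

class RtoEstimator:
    """EWMA retransmission-timeout estimator in the style of Jacobson's
    algorithm (RFC 6298): RTO = SRTT + 4 * RTTVAR, quantized up to the
    timer granularity and clamped below by a minimum RTO."""

    def __init__(self, alpha=0.125, beta=0.25, min_rto=1.0, granularity=0.5):
        self.alpha, self.beta = alpha, beta
        self.min_rto, self.granularity = min_rto, granularity
        self.srtt = self.rttvar = None

    def update(self, rtt):
        """Fold one RTT sample (seconds) into the smoothed estimates
        and return the resulting RTO."""
        if self.srtt is None:                      # first measurement
            self.srtt, self.rttvar = rtt, rtt / 2.0
        else:
            self.rttvar = (1 - self.beta) * self.rttvar \
                + self.beta * abs(self.srtt - rtt)
            self.srtt = (1 - self.alpha) * self.srtt + self.alpha * rtt
        rto = self.srtt + 4.0 * self.rttvar
        rto = math.ceil(rto / self.granularity) * self.granularity
        return max(rto, self.min_rto)
```

    With steady 200 ms samples, a 1 s minimum, and 500 ms granularity, the returned RTO never drops below the minimum, illustrating the paper's point that the clamp and the granularity, not the EWMA parameters, dominate behavior.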

  5. SU-F-18C-02: Evaluations of the Noise Power Spectrum of a CT Iterative Reconstruction Technique for Radiation Therapy

    SciTech Connect

    Dolly, S; Chen, H; Anastasio, M; Mutic, S; Li, H

    2014-06-15

    Purpose: To quantitatively assess the noise power spectrum (NPS) of the new, commercially released CT iterative reconstruction technique, iDose⁴ from Philips, to compare it with filtered back-projection techniques (FBP), and to provide clinical practice suggestions for radiation therapy. Methods: A uniform phantom was CT imaged with 120 kVp tube potential over a range of mAs (250-3333). The image sets were reconstructed using two reconstruction algorithms (FBP and iDose⁴ with noise reduction levels 1, 3, and 6) and three reconstruction filters (standard B, smooth A, and sharp C), after which NPS variations were analyzed and compared on region of interest (ROI) sizes (16×16 to 128×128 pixels), ROI radii (0–65 mm), reconstruction algorithms, reconstruction filters, and mAs. Results: The NPS magnitude and shape depended considerably on ROI size and location for both reconstruction algorithms. Regional noise variance became more stationary as ROI size decreased, minimizing NPS artifacts. The optimal 32×32-pixel ROI size balanced the trade-off between stationary noise and adequate sampling. NPS artifacts were greatest at the center of reconstruction space and decreased with increasing ROI distance from the center. The optimal ROI position was located near the phantom's radial midpoint (∼40 mm). For sharper filters, the NPS magnitude and the maximum magnitude frequency increased. Higher dose scans yielded lower NPS magnitudes for both reconstruction algorithms and all filters. Compared to FBP, the iDose⁴ algorithm reduced the NPS magnitude while preferentially reducing noise at mid-range spatial frequencies, altering noise texture. This reduction was more significant with increasing iDose⁴ noise reduction level. Conclusion: Compared to pixel standard deviation, NPS has greater clinical potential for task-based image quality assessment, describing both the magnitude and spatial frequency characteristics of image noise. While iDose⁴
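
    The NPS itself is computed by ensemble-averaging squared Fourier magnitudes of mean-subtracted ROIs drawn from a uniform phantom. A generic sketch of that calculation (not the authors' code; the pixel size is an explicit parameter):

```python
import numpy as np

def nps_2d(rois, pixel_size=1.0):
    """Ensemble-averaged 2-D noise power spectrum from uniform-phantom
    ROIs: NPS(u, v) = (dx * dy / (Nx * Ny)) * <|DFT(ROI - mean)|^2>."""
    rois = np.asarray(rois, dtype=float)
    n_rois, nx, ny = rois.shape
    spectra = np.zeros((nx, ny))
    for roi in rois:
        detrended = roi - roi.mean()          # remove the DC / mean signal
        spectra += np.abs(np.fft.fft2(detrended)) ** 2
    return (pixel_size ** 2 / (nx * ny)) * spectra / n_rois
```

    As a sanity check, for unit-variance white noise and unit pixel size the NPS averages to the noise variance over all frequency bins, which is the usual normalization check before comparing reconstruction algorithms.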

  6. Reconstruction of the absorption spectrum of an object spot from the colour values of the corresponding pixel(s) in its digital image: the challenge of algal colours.

    PubMed

    Coltelli, Primo; Barsanti, Laura; Evangelista, Valter; Frassanito, Anna Maria; Gualtieri, Paolo

    2016-12-01

    A novel procedure for deriving the absorption spectrum of an object spot from the colour values of the corresponding pixel(s) in its image is presented. Any digital image acquired by a microscope can be used; typical applications are the analysis of cellular/subcellular metabolic processes under physiological conditions and in response to environmental stressors (e.g. heavy metals), and the measurement of chromophore composition, distribution and concentration in cells. In this paper, we challenged the procedure with images of algae, acquired by means of a CCD camera mounted onto a microscope. The many colours algae display result from combinations of chromophores whose spectroscopic information is limited to organic solvent extracts, which suffer from displacements, amplifications, and contractions/dilatations with respect to spectra recorded inside the cell. Hence, preliminary processing is necessary, which consists of in vivo measurement of the absorption spectra of photosynthetic compartments of algal cells and determination of spectra of the single chromophores inside the cell. The final step of the procedure consists of the reconstruction of the absorption spectrum of the cell spot from the colour values of the corresponding pixel(s) in its digital image by minimization of a system of transcendental equations based on the absorption spectra of the chromophores under physiological conditions.
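
    The final minimization step can be illustrated with a toy version of the forward model (Beer-Lambert transmission seen through camera channel sensitivities) and a small damped Gauss-Newton solver. All spectra and camera curves below are synthetic stand-ins, not the paper's measured data:

```python
import numpy as np

wl = np.linspace(400, 700, 61)                   # wavelengths, nm

def gauss(mu, sig):
    return np.exp(-0.5 * ((wl - mu) / sig) ** 2)

# Hypothetical chromophore absorption basis and RGB sensitivities.
A_basis = np.stack([gauss(430, 25) + 0.6 * gauss(662, 15),   # "chlorophyll-like"
                    gauss(490, 30)])                         # "carotenoid-like"
S_rgb = np.stack([gauss(600, 40), gauss(540, 40), gauss(460, 40)])

def pixel_rgb(weights):
    """Forward model: channel value = sensitivity x transmitted light,
    with transmission T = 10^(-sum_j w_j A_j) (Beer-Lambert)."""
    T = 10.0 ** (-(weights @ A_basis))
    return S_rgb @ T

def unmix(rgb, n_iter=50):
    """Solve the transcendental system pixel_rgb(w) = rgb for
    non-negative weights w by damped Gauss-Newton iteration."""
    w = np.full(A_basis.shape[0], 0.1)
    for _ in range(n_iter):
        r = pixel_rgb(w) - rgb
        J = np.empty((rgb.size, w.size))
        for j in range(w.size):                  # forward-difference Jacobian
            dw = np.zeros_like(w)
            dw[j] = 1e-6
            J[:, j] = (pixel_rgb(w + dw) - pixel_rgb(w)) / 1e-6
        step = np.linalg.lstsq(J, -r, rcond=None)[0]
        for _ in range(20):                      # backtracking keeps residual shrinking
            w_new = np.clip(w + step, 0.0, None)
            if np.linalg.norm(pixel_rgb(w_new) - rgb) <= np.linalg.norm(r):
                break
            step /= 2.0
        w = w_new
    return w
```

    Once the weights are recovered, the pixel's absorption spectrum follows as the weighted sum of the chromophore spectra; with three channels and two unknowns the toy system is overdetermined and the solver recovers the planted weights.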

  7. A broad-spectrum sunscreen prevents UVA radiation-induced gene expression in reconstructed skin in vitro and in human skin in vivo.

    PubMed

    Marionnet, Claire; Grether-Beck, Susanne; Seité, Sophie; Marini, Alessandra; Jaenicke, Thomas; Lejeune, François; Bastien, Philippe; Rougier, André; Bernerd, Françoise; Krutmann, Jean

    2011-06-01

    The efficacy of sunscreens to protect against ultraviolet (UV) A radiation is usually assessed by measuring erythema formation and pigmentation. The biological relevance of these endpoints for UVA-induced skin damage, however, is not known. We therefore carried out two complementary studies to determine UVA protection provided by a broad-spectrum sunscreen product at a molecular level by studying UVA radiation-induced gene expression. One study was performed on human reconstructed skin in vitro with a semi-global gene expression analysis of 227 genes in fibroblasts and 244 in keratinocytes. The second one was conducted in vivo in human volunteers and focused on genes involved in oxidative stress response and photo-ageing (haeme oxygenase-1, superoxide dismutase-2, glutathione peroxidase, catalase, matrix metalloproteinase-1). In vitro, UVA radiation induced modulation of genes involved in extracellular matrix homeostasis, oxidative stress, heat shock responses, cell growth, inflammation and epidermal differentiation. Sunscreen pre-application abrogated or significantly reduced these effects, as underlined by unsupervised clustering analysis. The in vivo study confirmed that the sunscreen prevented UVA radiation-induced transcriptional expression of the five studied genes. These findings indicate the high efficacy of a broad-spectrum sunscreen in protecting human skin against UVA-induced gene responses and suggest that this approach is a biologically relevant complement to existing methods.

  8. Reconstruction of the radionuclide spectrum of liquid radioactive waste released into the Techa river in 1949-1951.

    PubMed

    Mokrov, Yuri G

    2003-04-01

    The major part of the liquid radioactive waste released by the Mayak Production Association (PA) radiochemical plant into the Techa river occurred in 1949-1951, but there is information on only one single radiochemical analysis of a water sample taken on 24 and 25 September 1951. These data are here used to assess the spectrum of radionuclides that were released between 1949 and 1951. For this purpose, details of the radiochemical methods of radionuclide extraction and radiometric measurements of beta-activity used at Mayak PA in the 1950s have been taken into account. It is concluded that the data from the radiochemical measurements agree with the theoretical composition of fission products in uranium after exposure times in the reactor (120 days) and subsequent hold times (35 days) that were typical for the procedures at that time. The results of the analysis are at variance with assumptions that underlie the current Techa river dosimetry system. They confirm the conclusion that the external doses to the Techa river residents in the critical period up to 1952 were predominantly due to short-lived fission products.

  9. Successful liver allograft inflow reconstruction with the right gastroepiploic vein.

    PubMed

    Pinheiro, Rafael S; Cruz, Ruy J; Nacif, Lucas S; Vane, Matheus F; D'Albuquerque, Luiz A C

    2016-02-01

    Portal vein thrombosis is a common complication in cirrhotic patients. When portal vein thrombectomy is not a suitable option, a large collateral vessel can be used for allograft venous inflow reconstruction. We describe an unusual case of successful portal revascularization using the right gastroepiploic vein. The patient underwent a cadaveric orthotopic liver transplantation with end-to-end anastomosis of the portal vein to the right gastroepiploic vein. Six months after liver transplantation the patient is well with good liver function. The use of the right gastroepiploic vein for allograft venous reconstruction is feasible and safe, with a great advantage of avoiding the need of venous jump graft.

  10. SIRENA software for Athena X-IFU event reconstruction

    NASA Astrophysics Data System (ADS)

    Ceballos, M. T.; Cobo, B.; Peille, P.; Wilms, J.; Brand, T.; Dauser, T.; Bandler, S.; Smith, S.

    2017-03-01

    The X-ray Observatory Athena was proposed in April 2014 as the mission to implement the science theme "The Hot and Energetic Universe" selected by ESA for L2 (the second Large-class mission in ESA's Cosmic Vision science programme). One of the two X-ray detectors designed to be onboard Athena is X-IFU, a cryogenic microcalorimeter based on Transition Edge Sensor (TES) technology that will provide spatially resolved high-resolution spectroscopy. X-IFU will be developed by an international consortium led by IRAP (PI), SRON (co-PI) and IAPS/INAF (co-PI) and involving ESA Member States, Japan and the United States. In Spain, IFCA (CSIC-UC) has an anticipated contribution to X-IFU through the Digital Readout Electronics (DRE) unit, in particular in the Event Processor Subsystem. For this purpose and in collaboration with the Athena end-to-end simulations team, we are currently developing the SIRENA package as part of the publicly available SIXTE end-to-end simulator. SIRENA comprises a set of processing algorithms aimed at recognizing, from a noisy signal, the intensity pulses generated by the absorption of the X-ray photons, to later reconstruct their energy, position and arrival time. This poster describes the structure of the package and the different algorithms currently implemented as well as their comparative performance in the energy resolution achieved in the reconstruction of the instrument events.
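
    The energy-reconstruction step in microcalorimeter pipelines of this kind is typically a noise-weighted optimal (matched) filter. A textbook sketch of that estimator, not SIRENA's actual implementation:

```python
import numpy as np

def optimal_filter_amplitude(record, template, noise_psd):
    """Frequency-domain optimal (noise-weighted matched) filter:
    amplitude = Re[ sum(conj(S) * D / N) ] / sum(|S|^2 / N),
    where D and S are the DFTs of the triggered record and the pulse
    template, and N is the noise power spectral density per bin."""
    D = np.fft.rfft(record)
    S = np.fft.rfft(template)
    num = np.sum((np.conj(S) * D / noise_psd).real)
    den = np.sum(np.abs(S) ** 2 / noise_psd)
    return num / den
```

    The returned amplitude scales the template to the data, so with a calibrated template it maps directly to photon energy; the noise weighting is what gives the technique its near-optimal energy resolution.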

  11. MONTAGE: A Methodology for Designing Composable End-to-End Secure Distributed Systems

    DTIC Science & Technology

    2012-08-01

    return codes, one per name var f = index of next available entry in the files[] array ; files[f].data=empty, files[f].Writers=Writers; // Allow the...permissions, which means that SimpFS allows every process to read every file. In more details, our ideal SimpFS maintains an array of files and an associative... array of names: files[] is an array of files (indexed by integers). Each entry is a file, consisting of an array of bytes (i.e., a data blob) and a

  12. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    DOE PAGES

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; ...

    2015-12-23

    The advance of the scientific discovery process is accomplished by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC): the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In our paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  13. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    SciTech Connect

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-23

    The advance of the scientific discovery process is accomplished by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC): the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In our paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  14. Stock assessment and end-to-end ecosystem models alter dynamics of fisheries data

    PubMed Central

    Storch, Laura S.; Glaser, Sarah M.; Ye, Hao; Rosenberg, Andrew A.

    2017-01-01

    Although all models are simplified approximations of reality, they remain useful tools for understanding, predicting, and managing populations and ecosystems. However, a model’s utility is contingent on its suitability for a given task. Here, we examine two model types: single-species fishery stock assessment and multispecies marine ecosystem models. Both are efforts to predict trajectories of populations and ecosystems to inform fisheries management and conceptual understanding. However, many of these ecosystems exhibit nonlinear dynamics, which may not be represented in the models. As a result, model outputs may underestimate variability and overestimate stability. Using nonlinear forecasting methods, we compare predictability and nonlinearity of model outputs against model inputs using data and models for the California Current System. Compared with model inputs, time series of model-processed outputs show more predictability but a higher prevalence of linearity, suggesting that the models misrepresent the actual predictability of the modeled systems. Thus, caution is warranted: using such models for management or scenario exploration may produce unforeseen consequences, especially in the context of unknown future impacts. PMID:28199344
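
    The nonlinear forecasting referred to here belongs to the simplex-projection family: predict each point from the successors of its nearest neighbours in a delay embedding, then score the predictions against the observations. A minimal sketch of that idea (not the authors' implementation; the embedding dimension and neighbour count are free parameters):

```python
import numpy as np

def simplex_skill(x, E=3, k=4):
    """Nearest-neighbour (simplex-style) one-step forecasting on a delay
    embedding: the first half of the series serves as the library, the
    second half is predicted, and the returned skill is the
    prediction-observation correlation."""
    n = len(x)
    emb = np.array([x[t - E + 1:t + 1] for t in range(E - 1, n - 1)])
    succ = np.asarray(x[E:])
    half = len(emb) // 2
    lib, lib_succ = emb[:half], succ[:half]
    preds = []
    for v in emb[half:]:
        d = np.linalg.norm(lib - v, axis=1)
        nn = np.argsort(d)[:k]
        w = np.exp(-d[nn] / (d[nn].min() + 1e-12))   # distance weighting
        preds.append(np.sum(w * lib_succ[nn]) / np.sum(w))
    return np.corrcoef(preds, succ[half:])[0, 1]
```

    A deterministic nonlinear series such as the chaotic logistic map scores near-perfect one-step skill under this test, whereas heavily smoothed or linearized model outputs tend to score as more predictable but less nonlinear, which is the contrast the paper measures.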

  15. The Kepler End-to-End Data Pipeline: From Photons to Far Away Worlds

    NASA Technical Reports Server (NTRS)

    Cooke, Brian; Thompson, Richard; Standley, Shaun

    2012-01-01

    The Kepler mission is described in overview and the Kepler technique for discovering exoplanets is discussed. The design and implementation of the Kepler spacecraft, tracing the data path from photons entering the telescope aperture through raw observation data transmitted to the ground operations team, is described. The technical challenges of operating a large aperture photometer with an unprecedented 95 million pixel detector are addressed, as well as the onboard technique for processing and reducing the large volume of data produced by the Kepler photometer. The technique and challenge of day-to-day mission operations that result in a very high percentage of time on target is discussed. This includes the day-to-day process for monitoring and managing the health of the spacecraft, the annual process for maintaining sun on the solar arrays while still keeping the telescope pointed at the fixed science target, the process for safely but rapidly returning to science operations after a spacecraft-initiated safing event, and the long term anomaly resolution process. The ground data processing pipeline, from the point that science data is received on the ground to the presentation of preliminary planetary candidates and supporting data to the science team for further evaluation, is discussed. Ground management, control, exchange and storage of Kepler's large and growing data set is discussed as well as the process and techniques for removing noise sources and applying calibrations to intermediate data products.

  16. Analysis of End-to-End Encryption and Traffic Flow Confidentiality Options

    DTIC Science & Technology

    1994-04-20

    Fiber Distributed Data Interface (FDDI) ............................. 3-27 3.2.4 Description of 802.3 Carrier Sense Multiple Access with Collision ...Data Interface (FDDI) * IEEE 802.3 Carrier Sense Multiple Access with Collision Detection (CSMA/CD). Common protocols that are excluded from this study...NSA SDNS Security Protocol 3 (SP3) [NIST 90] * ISO 8208 - X.25 Packet Layer Protocol [ISO 90C] * CCITT Link Access Procedures - B (LAPB). [CCITT 88

  17. End-to-end simulation of the image stability for the airborne telescope SOFIA

    NASA Astrophysics Data System (ADS)

    Schoenhoff, Ulrich; Eisentraeger, Peter; Wandner, Karl; Kaercher, Hans J.; Nordmann, Rainer

    2000-06-01

    To provide astronomers access to infrared wavelengths unavailable from the ground, the airborne telescope SOFIA is in development. This paper focuses on the image stability of the telescope, its modeling and simulation. The operation of the telescope under the harsh environmental conditions in the aircraft makes the prediction of the image stability during the design process necessary. For this purpose an integrated mathematical simulation model, which includes the optics, the structural dynamics and the control loops, has been constructed. Because of the high relevance of the structural dynamics for image stability and control design, special attention is paid to the import and reduction of the finite element model of the telescope's mechanical structure. Different control approaches are considered for the attitude control and the compensation of the impact of the structural flexibility on the image motion. Additionally the secondary mirror servo-mechanism is utilized to optimize the image stability. Simulation results are shown.

  18. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  19. Modeling and Simulation of Satellite Subsystems for End-to-End Spacecraft Modeling

    DTIC Science & Technology

    2006-04-01

    reaction wheel power requirements for a particular maneuver. The ADCS only calculates information at a specified time and passes it to SST; no future or...are calculated and fed into a reaction wheel model which will return the power requirements for the maneuver. In the case of calculating slew time

  20. An end-to-end workflow for engineering of biological networks from high-level specifications.

    PubMed

    Beal, Jacob; Weiss, Ron; Densmore, Douglas; Adler, Aaron; Appleton, Evan; Babb, Jonathan; Bhatia, Swapnil; Davidsohn, Noah; Haddock, Traci; Loyall, Joseph; Schantz, Richard; Vasilev, Viktor; Yaman, Fusun

    2012-08-17

    We present a workflow for the design and production of biological networks from high-level program specifications. The workflow is based on a sequence of intermediate models that incrementally translate high-level specifications into DNA samples that implement them. We identify algorithms for translating between adjacent models and implement them as a set of software tools, organized into a four-stage toolchain: Specification, Compilation, Part Assignment, and Assembly. The specification stage begins with a Boolean logic computation specified in the Proto programming language. The compilation stage uses a library of network motifs and cellular platforms, also specified in Proto, to transform the program into an optimized Abstract Genetic Regulatory Network (AGRN) that implements the programmed behavior. The part assignment stage assigns DNA parts to the AGRN, drawing the parts from a database for the target cellular platform, to create a DNA sequence implementing the AGRN. Finally, the assembly stage computes an optimized assembly plan to create the DNA sequence from available part samples, yielding a protocol for producing a sample of engineered plasmids with robotics assistance. Our workflow is the first to automate the production of biological networks from a high-level program specification. Furthermore, the workflow's modular design allows the same program to be realized on different cellular platforms simply by swapping workflow configurations. We validated our workflow by specifying a small-molecule sensor-reporter program and verifying the resulting plasmids in both HEK 293 mammalian cells and in E. coli bacterial cells.

  1. Data compression: The end-to-end information systems perspective for NASA space science missions

    NASA Technical Reports Server (NTRS)

    Tai, Wallace

    1991-01-01

    The unique characteristics of compressed data have important implications to the design of space science data systems, science applications, and data compression techniques. The sequential nature of data dependence between each of the sample values within a block of compressed data introduces an error multiplication or propagation factor which compounds the effects of communication errors. The data communication characteristics of the onboard data acquisition, storage, and telecommunication channels may influence the size of the compressed blocks and the frequency of included re-initialization points. The organization of the compressed data is continually changing depending on the entropy of the input data. This also results in a variable output rate from the instrument which may require buffering to interface with the spacecraft data system. On the ground, there exist key tradeoff issues associated with the distribution and management of the science data products when data compression techniques are applied in order to alleviate the constraints imposed by ground communication bandwidth and data storage capacity.
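    The error-propagation effect described above can be made concrete with a toy delta coder (an illustrative stand-in, not an actual NASA flight compressor): each decoded sample depends on the previous one, so one corrupted value contaminates the rest of its block, while a re-initialization point lets decoding restart cleanly.

```python
# Delta encoding makes each sample depend on the previous one, so a single
# corrupted value propagates to the end of its block; a re-initialization
# point (block restart) bounds the damage. Purely illustrative.

def delta_encode(samples):
    deltas, prev = [], 0
    for s in samples:
        deltas.append(s - prev)
        prev = s
    return deltas

def delta_decode(deltas):
    out, prev = [], 0
    for d in deltas:
        prev += d
        out.append(prev)
    return out

samples = [10, 12, 15, 15, 14, 18, 20, 19]
encoded = delta_encode(samples)
corrupted = encoded.copy()
corrupted[2] += 1                    # a single-bit-style error in one delta

decoded = delta_decode(corrupted)
# Every sample from index 2 onward is now wrong: error propagation.
wrong = [i for i, (a, b) in enumerate(zip(decoded, samples)) if a != b]

# With a re-initialization point at index 4, the second block decodes
# independently, confining the error to indices 2-3.
block2 = delta_decode([samples[4]] + [samples[i] - samples[i - 1]
                                      for i in range(5, 8)])
```

This is why, as the abstract notes, channel error rates drive the choice of block size and re-initialization frequency.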

  2. Emergence of Laplace therapeutics: declaring an end to end-stage heart failure.

    PubMed

    Mehra, Mandeep R; Uber, Patricia A

    2002-01-01

    A large number of chronic heart failure patients escape from the benefits of neurohormonal blockade only to transit into a discouragingly miserable state of what the physician often refers to as end-stage heart failure. Conceptually, the designation of end-stage as a description of a clinical scenario implies pessimism concerning recourse to a therapeutic avenue. A variety of surgical therapeutic techniques that take advantage of the law of Laplace, designed to effectively restore the cardiac shape from a spherical, mechanically inefficient pump to a more elliptical, structurally sound organ are now being employed. Additionally, the field of mechanical device implantation is surging ahead at a rapid pace. The weight of evidence regarding mechanical unloading using assist devices suggests that hemodynamic restoration is accompanied by regression of cellular hypertrophy, normalization of the neuroendocrine axis, improved expression of contractile proteins, enhanced cellular respiratory control, and decreases in markers of apoptosis and cellular stress. Thus, these lines of data point toward discarding the notion of end-stage heart failure. We are at a new crossroad in our quest to tackle chronic heart failure. It is our contention that the use of antiremodeling strategies, including device approaches, will soon signal the end of end-stage heart failure.

  3. CUSat: An End-to-End In-Orbit Inspection System University Nanosatellite Program

    DTIC Science & Technology

    2007-01-01

    the ROPs. 15 Camera Interface Board (CAM IB) The original Camera Interface Board (CAM IB) design used four boards: * The Heron FPGA5 board This board...phase differential GPS enables CUSat to navigate and use its cameras to gather target-satellite imagery. In the ground segment, image-processing techniques verify the CDGPS relative

  4. End-to-end laser radar range code for coherent cw lasers

    NASA Astrophysics Data System (ADS)

    Yoder, M. John; Seliverstov, Dima

    1996-06-01

    A user-friendly modular computer code is described for CW coherent laser radar which includes all relevant physical effects needed to evaluate the probability of detection versus time after launch for ballistic missiles or other targets of interest. The starting point of the code is the conventional laser radar range equation. Atmospheric attenuation is determined from an integral FASCODE calculation, and the laser radar range equation is solved for a curved-earth geometry including free-air-turbulence-induced beam spreading. Several different atmospheric turbulence models are selectable. Target cross-sections can be input into the code as a function of aspect angle. Coherence time and transverse coherence length limits are included in the code. Beam jitter effects are also calculated. The carrier-to-noise ratio is calculated including all of these (complicated) variables and degradations. The code then calculates the probability of detection of the target as a function of time using incoherent integration of coherent sub-pulses. The governing equations and practical results are presented for detection and tracking of long-range theater ballistic missiles from airborne surveillance platforms. The use of CW lasers requires increased measurement times compared to pulsed lasers and results in an averaging of the target fading statistics.
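    As a sketch of the code's stated starting point, the conventional laser radar range equation, the following computes a shot-noise-limited carrier-to-noise ratio. The parameter values, the diffraction-limited transmit-gain form, and the simple exp(-2*alpha*R) extinction term are illustrative assumptions, not the paper's FASCODE-based treatment.

```python
import math

# Hedged sketch of a coherent CW laser radar range equation; all default
# parameter values below are illustrative, not taken from the paper.
def cnr_db(P_t, R, sigma, alpha, D=0.3, wavelength=10.6e-6,
           eta=0.1, eta_q=0.5, B=1e5):
    """Carrier-to-noise ratio [dB] for a coherent CW laser radar.

    P_t: transmit power [W]            R: target range [m]
    sigma: target cross-section [m^2]  alpha: extinction coefficient [1/m]
    D: aperture diameter [m]           eta: optics/system efficiency
    eta_q: detector quantum efficiency B: detection bandwidth [Hz]
    """
    h, c = 6.626e-34, 2.998e8
    A = math.pi * (D / 2) ** 2                # shared Tx/Rx aperture area
    G_t = 4 * math.pi * A / wavelength ** 2   # diffraction-limited gain
    T_atm = math.exp(-2 * alpha * R)          # two-way atmospheric loss
    P_r = P_t * G_t * sigma * A * eta * T_atm / (4 * math.pi * R ** 2) ** 2
    nu = c / wavelength                       # optical frequency
    cnr = eta_q * P_r / (h * nu * B)          # shot-noise-limited CNR
    return 10 * math.log10(cnr)

# Without extinction, CNR falls 12 dB per doubling of range (R^-4 law).
near = cnr_db(100.0, 50e3, 1.0, 1e-5)
far = cnr_db(100.0, 100e3, 1.0, 1e-5)
```

Turbulence-induced beam spreading, coherence limits, and jitter, all included in the actual code, would enter as further multiplicative degradations of this CNR.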

  5. Improving End-To-End Tsunami Warning for Risk Reduction on Canada’s West Coast

    DTIC Science & Technology

    2015-01-01

    several disciplines, including studies of tsunami deposit layers, radiocarbon dating of terrestrial plant specimens, and tree-ring analysis of Sitka spruce...of the Oregon–Washington margin. Tectonics, 9 (1990), pp. 569–583. Atwater, B.F., Stuiver, M., Yamaguchi, D.K. (1990). Radiocarbon test of earthquake

  6. Assessing the Performance Limits of Internal Coronagraphs Through End-to-End Modeling

    NASA Technical Reports Server (NTRS)

    Krist, John E.; Belikov, Ruslan; Pueyo, Laurent; Mawet, Dimitri P.; Moody, Dwight; Trauger, John T.; Shaklan, Stuart B.

    2013-01-01

    As part of the NASA ROSES Technology Demonstrations for Exoplanet Missions (TDEM) program, we conducted a numerical modeling study of three internal coronagraphs (PIAA, vector vortex, hybrid bandlimited) to understand their behaviors in realistically-aberrated systems with wavefront control (deformable mirrors). This investigation consisted of two milestones: (1) develop wavefront propagation codes appropriate for each coronagraph that are accurate to 1% or better (compared to a reference algorithm) but are also time and memory efficient, and (2) use these codes to determine the wavefront control limits of each architecture. We discuss here how the milestones were met and identify some of the behaviors particular to each coronagraph. The codes developed in this study are being made available for community use. We discuss here results for the HBLC and VVC systems, with PIAA having been discussed in a previous proceeding.

  7. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Christopher J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.

    2010-06-01

    The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.
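    A minimal image-chain sketch in the spirit described above (not PICASSO itself, whose internals are not given here): a Gaussian system MTF, photon and read noise, and quantization are applied between an ideal scene and output digital counts. All parameter values are hypothetical.

```python
import numpy as np

# Illustrative three-stage image chain: frequency-domain MTF blur, then
# shot + read noise in electrons, then quantization to digital numbers.
def image_chain(scene, mtf_sigma=0.15, read_noise=2.0,
                full_well=20000, bits=12, rng=None):
    """scene: 2D array of normalized radiance in [0, 1]."""
    if rng is None:
        rng = np.random.default_rng(0)
    fy = np.fft.fftfreq(scene.shape[0])[:, None]   # cycles/pixel
    fx = np.fft.fftfreq(scene.shape[1])[None, :]
    mtf = np.exp(-(fx ** 2 + fy ** 2) / (2 * mtf_sigma ** 2))
    blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * mtf))
    electrons = np.clip(blurred, 0, 1) * full_well
    noisy = rng.poisson(electrons) + rng.normal(0, read_noise, scene.shape)
    dn = np.clip(np.round(noisy * (2 ** bits - 1) / full_well),
                 0, 2 ** bits - 1)
    return dn

scene = np.zeros((64, 64))
scene[:, 32:] = 1.0                  # knife-edge target
dn = image_chain(scene)              # softened, noisy, quantized edge
```

Metrics such as edge response or SNR, the kind of image-quality figures the paper discusses, can then be measured on `dn` rather than on the ideal scene.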

  8. End-to-End Concurrent Multipath Transfer Using Transport Layer Multihoming

    DTIC Science & Technology

    2006-07-01

    insight into the ambient conditions under which cwnd overgrowth can be observed with SCTP, we develop an analytical model of this behavior and analyze...example in Section 6.2. The goal of this model is to provide insight into the ambient conditions un- der which cwnd overgrowth can be observed, thus...to con- gestion control. While some initial work in the area demonstrates feasibility [53], further work is needed to determine how these techniques

  9. An end-to-end assessment of extreme weather impacts on food security

    NASA Astrophysics Data System (ADS)

    Chavez, Erik; Conway, Gordon; Ghil, Michael; Sadler, Marc

    2015-11-01

    Both governments and the private sector urgently require better estimates of the likely incidence of extreme weather events, their impacts on food crop production and the potential consequent social and economic losses. Current assessments of climate change impacts on agriculture mostly focus on average crop yield vulnerability to climate and adaptation scenarios. Also, although new-generation climate models have improved and there has been an exponential increase in available data, the uncertainties in their projections over years and decades, and at regional and local scale, have not decreased. We need to understand and quantify the non-stationary, annual and decadal climate impacts using simple and communicable risk metrics that will help public and private stakeholders manage the hazards to food security. Here we present an 'end-to-end' methodological construct based on weather indices and machine learning that integrates current understanding of the various interacting systems of climate, crops and the economy to determine short- to long-term risk estimates of crop production loss, in different climate and adaptation scenarios. For provinces north and south of the Yangtze River in China, we have found that risk profiles for crop yields that translate climate into economic variability follow marked regional patterns, shaped by drivers of continental-scale climate. We conclude that to be cost-effective, region-specific policies have to be tailored to optimally combine different categories of risk management instruments.
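    The weather-index idea can be illustrated with a deliberately simplified sketch (synthetic data, and a linear fit standing in for the paper's machine-learning methods): regress historical yield anomalies on a heat-stress index, then push a scenario's index values through the fitted relation to get a loss distribution and a tail-risk metric.

```python
import numpy as np

# Hedged, minimal weather-index sketch; the data and the linear relation
# are synthetic stand-ins, not the paper's models or Yangtze results.
rng = np.random.default_rng(42)
heat_index = rng.uniform(0, 30, 200)                   # e.g. degree-days > 30 C
yield_anom = -0.8 * heat_index + rng.normal(0, 4, 200) # synthetic history

# Fit the index-to-yield relation on the historical record.
slope, intercept = np.polyfit(heat_index, yield_anom, 1)

# Apply it to a hotter scenario's index values to get a loss distribution.
scenario_index = rng.uniform(10, 40, 1000)
projected_loss = -(slope * scenario_index + intercept)
var95 = np.quantile(projected_loss, 0.95)              # simple tail-risk metric
```

Risk metrics like `var95` are the kind of "simple and communicable" quantity the abstract argues stakeholders need, in place of raw climate-model output.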

  10. Assessing Natural Product-Drug Interactions: An End-to-End Safety Framework.

    PubMed

    Roe, Amy L; Paine, Mary F; Gurley, Bill J; Brouwer, Kenneth R; Jordan, Scott; Griffiths, James C

    2016-04-01

    The use of natural products (NPs), including herbal medicines and other dietary supplements, by North Americans continues to increase across all age groups. This population has access to conventional medications, with significant polypharmacy observed in older adults. Thus, the safety of the interactions between multi-ingredient NPs and drugs is a topic of paramount importance. Considerations such as history of safe use, literature data from animal toxicity and human clinical studies, and NP constituent characterization would provide guidance on whether to assess NP-drug interactions experimentally. The literature is replete with reports of various NP extracts and constituents as potent inhibitors of drug-metabolizing enzymes and transporters. However, without standard methods for NP characterization or in vitro testing, extrapolating these reports to clinically-relevant NP-drug interactions is difficult. This lack of a clear definition of risk precludes clinicians and consumers from making informed decisions about the safety of taking NPs with conventional medications. A framework is needed that describes an integrated robust approach for assessing NP-drug interactions; and, translation of the data into formulation alterations, dose adjustment, labelling, and/or post-marketing surveillance strategies. A session was held at the 41st Annual Summer Meeting of the Toxicology Forum in Colorado Springs, CO, to highlight the challenges and critical components that should be included in a framework approach.

  11. Integrated Information and Network Management for End-to-End Quality of Service

    DTIC Science & Technology

    2011-11-01

    the actual capacity of the network is unknown. D. Explicit Channel Reservation Explicit Channel Reservation ( ECR ) provides guaranteed bandwidth...for a stream of important packets. ECR works by establishing reservations for a stream of traffic. Although DiffServ is more prevalent, protocols...layer libraries for interacting with the lower-layer features of JCAN. JCAN provides implementations for Mobile IP, ECR WFQ, and CMR. The key

  12. End-to-end observatory software modeling using domain specific languages

    NASA Astrophysics Data System (ADS)

    Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José

    2014-07-01

    The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSLs) that supports a model-driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, facilitate the construction of technical specifications in a uniform way, ease communication between developers and domain experts, and provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.

  13. Non-adaptive End-to-End Diagnosis for Complex Networks

    DTIC Science & Technology

    2015-09-30

    theoretical results by numerical simulations. All three goals have been essentially met. More concretely, for goal number 1, we have shown the lower bound...measurements.

  14. End-to-end design consideration of a radar altimeter for terrain-aided navigation

    NASA Astrophysics Data System (ADS)

    Chun, Joohwan; Choi, Sanghyouk; Paek, Inchan; Park, Dongmin; Yoo, Kyungju

    2013-10-01

    We present a preliminary simulation study of an interferometric SAR altimeter for the terrain-aided navigation application. Our simulation includes raw SAR data generation, azimuth compression, leading-edge detection of the echo signal, maximum likelihood angle estimation and Bayesian state estimation. Our results show that radar altimeter performance can be improved with a feedback loop from the rear-end navigation part.
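    Leading-edge detection of the echo, one stage of the simulation above, can be sketched as a simple threshold retracker (a generic method, not necessarily the authors' detector): find the first range gate whose return crosses a chosen fraction of the peak power.

```python
import numpy as np

# Hedged threshold-retracker sketch: the idealized sigmoid echo and the
# 50% threshold are illustrative choices, not the paper's waveform model.
def leading_edge(waveform, frac=0.5):
    """Return the first range-gate index where power >= frac * peak."""
    thresh = frac * np.max(waveform)
    above = np.nonzero(waveform >= thresh)[0]
    return int(above[0])

gates = np.arange(100)
echo = 1.0 / (1.0 + np.exp(-(gates - 40) / 2.0))  # rising edge near gate 40
edge = leading_edge(echo)
```

The detected gate index would then feed the angle-estimation and Bayesian state-estimation stages downstream.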

  15. HIDE & SEEK: End-to-end packages to simulate and process radio survey data

    NASA Astrophysics Data System (ADS)

    Akeret, J.; Seehars, S.; Chang, C.; Monstein, C.; Amara, A.; Refregier, A.

    2017-01-01

    As several large single-dish radio surveys begin operation within the coming decade, a wealth of radio data will become available and provide a new window to the Universe. In order to fully exploit the potential of these datasets, it is important to understand the systematic effects associated with the instrument and the analysis pipeline. A common approach to tackle this is to forward-model the entire system-from the hardware to the analysis of the data products. For this purpose, we introduce two newly developed, open-source Python packages: the HI Data Emulator (HIDE) and the Signal Extraction and Emission Kartographer (SEEK) for simulating and processing single-dish radio survey data. HIDE forward-models the process of collecting astronomical radio signals in a single-dish radio telescope instrument and outputs pixel-level time-ordered data. SEEK processes the time-ordered data, removes artifacts from Radio Frequency Interference (RFI), automatically applies flux calibration, and aims to recover the astronomical radio signal. The two packages can be used separately or together depending on the application. Their modular and flexible nature allows easy adaptation to other instruments and datasets. We describe the basic architecture of the two packages and examine in detail the noise and RFI modeling in HIDE, as well as the implementation of gain calibration and RFI mitigation in SEEK. We then apply HIDE & SEEK to forward-model a Galactic survey in the frequency range 990-1260 MHz based on data taken at the Bleien Observatory. For this survey, we expect to cover 70% of the full sky and achieve a median signal-to-noise ratio of approximately 5-6 in the cleanest channels including systematic uncertainties. However, we also point out the potential challenges of high RFI contamination and baseline removal when examining the early data from the Bleien Observatory.
    The fully documented HIDE & SEEK packages are available at http://hideseek.phys.ethz.ch/ and are published under the GPLv3 license on GitHub.
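    SEEK's actual RFI mitigation is more sophisticated than this, but the basic idea of masking interference in time-ordered data can be sketched with per-channel robust sigma clipping (an illustrative stand-in, not SEEK's algorithm or API).

```python
import numpy as np

# Hedged RFI-masking sketch: flag samples deviating from a per-channel
# median by more than k robust standard deviations (MAD-based).
def flag_rfi(tod, k=5.0):
    """tod: 2D array (frequency channels x time samples).
    Returns a boolean mask, True where a sample is flagged as RFI."""
    med = np.median(tod, axis=1, keepdims=True)
    mad = np.median(np.abs(tod - med), axis=1, keepdims=True)
    sigma = 1.4826 * mad              # MAD -> std for Gaussian noise
    return np.abs(tod - med) > k * sigma

rng = np.random.default_rng(0)
tod = rng.normal(10.0, 1.0, size=(4, 1000))   # synthetic time-ordered data
tod[2, 100:105] += 50.0                        # inject an interference burst
mask = flag_rfi(tod)                           # burst samples get flagged
```

In a real pipeline the mask would be applied before flux calibration and signal extraction, the SEEK stages described above.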

  16. End-to-End Modeling with the Heimdall Code to Scope High-Power Microwave Systems

    DTIC Science & Technology

    2007-06-01

    which is modeled as a Thevenin-equivalent voltage source, with open-circuit voltage VPP and output impedance ZPP . We model the RF Source electrically...chosen device radius and VPP and ZPP . As a final example, our treatment of the tunable relativistic magnetron uses a piecewise-linear model for...the left and the RF Source to the right, with VPP and ZPP the open-circuit voltage and output impedance of the Pulsed Power and VRF and IRF the

  17. Topological Constraints on Identifying Additive Link Metrics via End-to-end Paths Measurements

    DTIC Science & Technology

    2012-09-20

    U.K. Ministry of Defence or the U.K. Government. The U.S. and U.K. Governments are authorized to reproduce and distribute reprints for Government...with (partially) unknown probability distributions, and apply various parametric/nonparametric techniques to estimate the link metric distributions

  18. Post2 End-to-End Descent and Landing Simulation for ALHAT Design Analysis Cycle 2

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Johnson, Andrew E.; Paschall, Stephen C., II

    2010-01-01

    The ALHAT project is an agency-level program involving NASA centers, academia, and industry, with a primary goal to develop a safe, autonomous, precision-landing system for robotic and crew-piloted lunar and planetary descent vehicles. POST2 is used as the 6DOF descent and landing trajectory simulation for determining integrated system performance of ALHAT landing-system models and lunar environment models. This paper presents updates in the development of the ALHAT POST2 simulation, as well as preliminary system performance analysis for ALDAC-2 used for the testing and assessment of ALHAT system models. The ALDAC-2 POST2 Monte Carlo simulation results have been generated and focus on HRN model performance with the fully integrated system, as well as performance improvements of the AGNC and TSAR models since the previous design analysis cycle.

  19. Experiments with Memory-to-Memory Coupling for End-to-End fusion Simulation Workflows

    SciTech Connect

    Docan, Ciprian; Zhang, Fan; Parashar, Manish; Cummings, Julian; Podhorszki, Norbert; Klasky, Scott A

    2010-01-01

    Scientific applications are striving to accurately simulate multiple interacting physical processes that comprise complex phenomena being modeled. Efficient and scalable parallel implementations of these coupled simulations present challenging interaction and coordination requirements, especially when the coupled physical processes are computationally heterogeneous and progress at different speeds. In this paper, we present the design, implementation and evaluation of a memory-to-memory coupling framework for coupled scientific simulations on high-performance parallel computing platforms. The framework is driven by the coupling requirements of the Center for Plasma Edge Simulation, and it provides simple coupling abstractions as well as efficient asynchronous (RDMA-based) memory-to-memory data transport mechanisms that complement existing parallel programming systems and data sharing frameworks. The framework enables flexible coupling behaviors that are asynchronous in time and space, and it supports dynamic coupling between heterogeneous simulation processes without enforcing any synchronization constraints. We evaluate the performance and scalability of the coupling framework using a specific coupling scenario, on the Jaguar Cray XT5 system at Oak Ridge National Laboratory.

  20. Integrating end-to-end encryption and authentication technology into broadband networks

    SciTech Connect

    Pierson, L.G.

    1995-11-01

    BISDN services will involve the integration of high speed data, voice, and video functionality delivered via technology similar to Asynchronous Transfer Mode (ATM) switching and SONET optical transmission systems. Customers of BISDN services may need a variety of data authenticity and privacy assurances. Cryptographic methods can be used to assure authenticity and privacy, but are hard to scale for implementation at high speed. The incorporation of these methods into computer networks can severely impact functionality, reliability, and performance. While there are many design issues associated with the serving of public keys for authenticated signaling and for establishment of session cryptovariables, this paper is concerned with the impact of encryption itself on such communications once the signaling and setup have been completed. Network security protections should be carefully matched to the threats against which protection is desired. Even after eliminating unnecessary protections, the remaining customer-required network security protections can impose severe performance penalties. These penalties (further discussed below) usually involve increased communication processing for authentication or encryption, increased error rate, increased communication delay, and decreased reliability/availability. Protection measures involving encryption should be carefully engineered so as to impose the least performance, reliability, and functionality penalties, while achieving the required security protection. To study these trade-offs, a prototype encryptor/decryptor was developed. This effort demonstrated the viability of implementing certain encryption techniques in high speed networks. The research prototype processes ATM cells in a SONET OC-3 payload. This paper describes the functionality, reliability, security, and performance design trade-offs investigated with the prototype.
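    The cell-level design constraint discussed above, encrypting end-to-end without breaking switching, can be sketched as follows: only the 48-byte ATM payload is enciphered, while the 5-byte header stays in the clear so switches can still route the cell. The keystream construction here (SHAKE-256 keyed by a per-cell index) is purely illustrative, not the prototype's cipher.

```python
import hashlib

# Hedged sketch: XOR with a hash-derived keystream stands in for whatever
# cipher the prototype encryptor/decryptor actually implemented.
def keystream(key, cell_index, n=48):
    """Derive n keystream bytes for one cell from a key and cell index."""
    return hashlib.shake_256(key + cell_index.to_bytes(8, "big")).digest(n)

def encrypt_cell(cell, key, idx):
    """Encrypt the 48-byte payload of a 53-byte ATM cell; header stays clear."""
    header, payload = cell[:5], cell[5:53]
    ks = keystream(key, idx)
    return header + bytes(p ^ k for p, k in zip(payload, ks))

key = b"demo-key"
cell = bytes(range(53))                    # 5-byte header + 48-byte payload
enc = encrypt_cell(cell, key, 0)
dec = encrypt_cell(enc, key, 0)            # XOR keystream is its own inverse
```

Keying by cell index also means a lost or errored cell does not corrupt its neighbors, which matters for the error-rate penalties the abstract discusses.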

  1. Stock assessment and end-to-end ecosystem models alter dynamics of fisheries data.

    PubMed

    Storch, Laura S; Glaser, Sarah M; Ye, Hao; Rosenberg, Andrew A

    2017-01-01

    Although all models are simplified approximations of reality, they remain useful tools for understanding, predicting, and managing populations and ecosystems. However, a model's utility is contingent on its suitability for a given task. Here, we examine two model types: single-species fishery stock assessment and multispecies marine ecosystem models. Both are efforts to predict trajectories of populations and ecosystems to inform fisheries management and conceptual understanding. However, many of these ecosystems exhibit nonlinear dynamics, which may not be represented in the models. As a result, model outputs may underestimate variability and overestimate stability. Using nonlinear forecasting methods, we compare predictability and nonlinearity of model outputs against model inputs using data and models for the California Current System. Compared with model inputs, time series of model-processed outputs show more predictability but a higher prevalence of linearity, suggesting that the models misrepresent the actual predictability of the modeled systems. Thus, caution is warranted: using such models for management or scenario exploration may produce unforeseen consequences, especially in the context of unknown future impacts.
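    The nonlinear forecasting methods referenced above are in the family of nearest-neighbor ("simplex projection") prediction; here is a hedged sketch on a chaotic logistic-map series (synthetic data, not the California Current records): embed the series in delay coordinates and predict each point from the successors of its nearest library neighbors.

```python
import numpy as np

# Hedged nearest-neighbor forecasting sketch; E, k, and the logistic-map
# test series are illustrative choices, not the paper's settings.
def nn_forecast(x, E=2, k=3):
    """One-step forecasts for x using an E-dimensional delay embedding."""
    emb = np.column_stack([x[i:len(x) - E + i] for i in range(E)])
    targets = x[E:]                   # value following each embedded point
    preds = []
    for i in range(len(targets)):
        d = np.linalg.norm(emb[:len(targets)] - emb[i], axis=1)
        d[i] = np.inf                 # exclude self-match
        nbrs = np.argsort(d)[:k]      # k nearest library neighbors
        preds.append(targets[nbrs].mean())
    return np.array(preds), targets

# Deterministic chaos (logistic map) is highly predictable one step ahead.
x = [0.4]
for _ in range(500):
    x.append(3.9 * x[-1] * (1 - x[-1]))
preds, targets = nn_forecast(np.array(x))
rho = np.corrcoef(preds, targets)[0, 1]   # forecast skill
```

Comparing such skill curves for model inputs versus model outputs is, in spirit, how the paper detects that model processing inflates apparent predictability and linearity.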

  2. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems

    NASA Astrophysics Data System (ADS)

    Cota, Steve A.; Bell, Jabin T.; Boucher, Richard H.; Dutton, Tracy E.; Florio, Chris J.; Franz, Geoffrey A.; Grycewicz, Thomas J.; Kalman, Linda S.; Keller, Robert A.; Lomheim, Terrence S.; Paulson, Diane B.; Wilkinson, Timothy S.

    2008-08-01

    The design of any modern imaging system is the end result of many trade studies, each seeking to optimize image quality within real world constraints such as cost, schedule and overall risk. Image chain analysis - the prediction of image quality from fundamental design parameters - is an important part of this design process. At The Aerospace Corporation we have been using a variety of image chain analysis tools for many years, the Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) among them. In this paper we describe our PICASSO tool, showing how, starting with a high quality input image and hypothetical design descriptions representative of the current state of the art in commercial imaging satellites, PICASSO can generate standard metrics of image quality in support of the decision processes of designers and program managers alike.

  3. Privacy in Pharmacogenetics: An End-to-End Case Study of Personalized Warfarin Dosing

    PubMed Central

    Fredrikson, Matthew; Lantz, Eric; Jha, Somesh; Lin, Simon; Page, David; Ristenpart, Thomas

    2014-01-01

    We initiate the study of privacy in pharmacogenetics, wherein machine learning models are used to guide medical treatments based on a patient’s genotype and background. Performing an in-depth case study on privacy in personalized warfarin dosing, we show that suggested models carry privacy risks, in particular because attackers can perform what we call model inversion: an attacker, given the model and some demographic information about a patient, can predict the patient’s genetic markers. As differential privacy (DP) is an oft-proposed solution for medical settings such as this, we evaluate its effectiveness for building private versions of pharmacogenetic models. We show that DP mechanisms prevent our model inversion attacks when the privacy budget is carefully selected. We go on to analyze the impact on utility by performing simulated clinical trials with DP dosing models. We find that for privacy budgets effective at preventing attacks, patients would be exposed to increased risk of stroke, bleeding events, and mortality. We conclude that current DP mechanisms do not simultaneously improve genomic privacy while retaining desirable clinical efficacy, highlighting the need for new mechanisms that should be evaluated in situ using the general methodology introduced by our work. PMID:27077138
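    The privacy budget's effect on utility can be illustrated with the Laplace mechanism, the standard epsilon-DP building block; the dose statistic below is hypothetical, not drawn from the paper's models or data.

```python
import numpy as np

# Hedged sketch of the Laplace mechanism: smaller epsilon (stronger privacy)
# means noisier released values, which is the privacy/utility trade-off the
# paper's simulated clinical trials quantify.
def laplace_mechanism(value, sensitivity, epsilon, rng):
    """Release `value` with epsilon-DP noise for a query of given sensitivity."""
    scale = sensitivity / epsilon
    return value + rng.laplace(0.0, scale)

rng = np.random.default_rng(1)
true_mean_dose = 35.0     # hypothetical cohort statistic (mg/week)
loose = [laplace_mechanism(true_mean_dose, 1.0, 10.0, rng) for _ in range(2000)]
strict = [laplace_mechanism(true_mean_dose, 1.0, 0.1, rng) for _ in range(2000)]

spread_loose = np.std(loose)     # small noise at epsilon = 10
spread_strict = np.std(strict)   # ~100x larger noise at epsilon = 0.1
```

The paper's central finding is that the epsilon values small enough to defeat model inversion sit in the "strict" regime, where the added noise degrades dosing accuracy enough to raise clinical risk.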

  4. End-to-End Network QoS via Scheduling of Flexible Resource Reservation Requests

    SciTech Connect

    Sharma, S.; Katramatos, D.; Yu, D.

    2011-11-14

    Modern data-intensive applications move vast amounts of data between multiple locations around the world. To enable predictable and reliable data transfer, next generation networks allow such applications to reserve network resources for exclusive use. In this paper, we solve an important problem (called SMR3) to accommodate multiple and concurrent network reservation requests between a pair of end-sites. Given the varying availability of bandwidth within the network, our goal is to accommodate as many reservation requests as possible while minimizing the total time needed to complete the data transfers. We first prove that SMR3 is an NP-hard problem. Then we solve it by developing a polynomial-time heuristic, called RRA. The RRA algorithm hinges on an efficient mechanism to accommodate a large number of requests by minimizing the bandwidth wastage. Finally, via numerical results, we show that RRA constructs schedules that accommodate a significantly larger number of requests compared to other, seemingly efficient, heuristics.
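    The abstract does not specify RRA's internals; as a hedged illustration of the underlying scheduling problem, this greedy sketch admits flexible (volume, window) requests earliest-deadline-first onto a single link of fixed capacity, filling each window front-to-back.

```python
# Hedged admission-scheduling sketch (not the RRA algorithm itself): each
# request asks to move `vol` units of data anywhere inside [start, deadline);
# a request is admitted if enough spare capacity-volume exists in its window.

def schedule(requests, capacity, horizon):
    """requests: list of (volume, start, deadline); returns admitted requests.
    Time is discretized into unit slots; usage[t] tracks allocated bandwidth."""
    usage = [0.0] * horizon
    admitted = []
    for vol, start, deadline in sorted(requests, key=lambda r: r[2]):
        free = [capacity - usage[t] for t in range(start, deadline)]
        if sum(free) >= vol:               # enough spare volume in the window
            remaining = vol
            for t in range(start, deadline):
                take = min(free[t - start], remaining)
                usage[t] += take
                remaining -= take
                if remaining <= 0:
                    break
            admitted.append((vol, start, deadline))
    return admitted

reqs = [(10, 0, 5), (20, 0, 10), (30, 5, 10)]
done = schedule(reqs, capacity=10.0, horizon=10)   # all three fit here
```

RRA's advantage, per the abstract, lies in choosing placements that minimize stranded bandwidth, which a naive front-to-back fill like this does not attempt.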

  5. EQUIP: end-to-end quantification of uncertainty for impacts prediction

    NASA Astrophysics Data System (ADS)

    Morse, A. P.; Challinor, A. J.; Equip Consortium

    2010-12-01

    Inherent uncertainties in climate prediction present a serious challenge to attempts to assess future impacts and adaptation options. Such assessments are critical to any policy decisions regarding investment in resources to ensure human and environmental wellbeing in the face of environmental change and a growing population. Current methods for quantifying uncertainty in projections of climate and its impacts tend to focus first on taking full account of uncertainty, with a subsequent step assessing utility. We argue that a new approach is required, whereby climate and impacts models are used to develop risk-based prediction systems that focus on the information content of models and utility for decision-making. Preliminary steps in this direction are explored, principally using the example of climate-induced changes in crop yield. The relative contribution of uncertainty in crop and climate simulation to the total uncertainty in projected yield changes is examined. A focus on governing bio-physical processes across a number of crop models is used to characterise the robustness of the results. Further development of this approach relies on the development of decision-focussed techniques that analyse sources of uncertainty and assess and improve the information content of models of climate and its impacts. Such an approach is significantly different from tagging impacts models onto climate models. It implies substantial interaction with other organisations and stakeholders from development NGOs to the insurance sector and policy makers. These interactions should be aimed at ensuring that the principal lead-times, and formats, for the impact projections are those relevant to decision-making. The EQUIP project, and its associated open network of scientists, aims to develop the approach outlined above. The project is examining the cascade of uncertainty from climate to impacts by conducting integrated analyses of a range of sectors, principally crops, marine ecosystems, water management, heat waves and droughts. The research includes assessment of the information content of climate model projections, combination of climate models and data-driven models to support decisions, and evaluation of the quality of climate and impacts predictions.

  6. From End to End: tRNA Editing at 5'- and 3'-Terminal Positions

    PubMed Central

    Betat, Heike; Long, Yicheng; Jackman, Jane E.; Mörl, Mario

    2014-01-01

    During maturation, tRNA molecules undergo a series of individual processing steps, ranging from exo- and endonucleolytic trimming reactions at their 5'- and 3'-ends, specific base modifications and intron removal to the addition of the conserved 3'-terminal CCA sequence. Especially in mitochondria, this plethora of processing steps is completed by various editing events, where base identities at internal positions are changed and/or nucleotides at 5'- and 3'-ends are replaced or incorporated. In this review, we will focus predominantly on the latter reactions, where a growing number of cases indicate that these editing events represent a rather frequent and widespread phenomenon. While the mechanistic basis for 5'- and 3'-end editing differs dramatically, both reactions represent an absolute requirement for generating a functional tRNA. Current in vivo and in vitro model systems support a scenario in which these highly specific maturation reactions might have evolved out of ancient promiscuous RNA polymerization or quality control systems. PMID:25535083

  7. The Use of End-to-End Multicast Measurements for Characterizing Internal Network Behavior

    DTIC Science & Technology

    2002-08-01

    and {S_i}_{i=1}^m a collection of subsets of R. (i) ∪_i S_i ... if and only if the equations {f_{S_i} = Σ_{j ∈ S_i} x_j}_{i=1}^m have a unique solution x. (ii) Assume P..._{S_i} identifies S_i for each i. Then {P_{S_i}}_{i=1}^m identifies iff either (and hence both) of the conditions of part (i) are satisfied. Remarks: ... terminates some segment K_{S_{i_k}}, and hence x_k = Σ_j f_{S_{i_k}} x_j. By the maximality assumption, all terms on the RHS are unique, and hence so is x_k.

  8. End-to-end information system concept for the Mars Telecommunications Orbiter

    NASA Technical Reports Server (NTRS)

    Breidenthal, Julian C.; Edwards, Charles D.; Greenberg, Edward; Kazz, Greg J.; Noreen, Gary K.

    2006-01-01

    The Mars Telecommunications Orbiter (MTO) was intended to provide high-performance deep space relay links to landers, orbiters, sample-return missions, and approaching spacecraft in the vicinity of Mars, to demonstrate interplanetary laser communications, to demonstrate autonomous navigation, and to carry out its own science investigations.

  9. End-to-end information system concept for the Mars Telecommunications Orbiter

    NASA Technical Reports Server (NTRS)

    Bridenthal, Julian C.; Edwards, Charles D.; Greenberg, Edward; Kazz, Greg J.; Noreen, Gary K.

    2006-01-01

    The Mars Telecommunications Orbiter (MTO) was intended to provide high-performance deep space relay links to landers, orbiters, sample-return missions, and approaching spacecraft in the vicinity of Mars, to demonstrate interplanetary laser communications, to demonstrate autonomous navigation, and to carry out its own science investigations.

  10. End-to-end imaging information rate advantages of various alternative communication systems

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1982-01-01

    The efficiency of various deep space communication systems which are required to transmit both imaging and a typically error sensitive class of data called general science and engineering (gse) are compared. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an advanced imaging communication system (AICS) which exhibits the rather significant advantages of sophisticated data compression coupled with powerful yet practical channel coding. For example, under certain conditions the improved AICS efficiency could provide as much as two orders of magnitude increase in imaging information rate compared to a single channel uncoded, uncompressed system while maintaining the same gse data rate in both systems. Additional details describing AICS compression and coding concepts as well as efforts to apply them are provided in support of the system analysis.

  11. End-to-end Encryption for SMS Messages in the Health Care Domain.

    PubMed

    Hassinen, Marko; Laitinen, Pertti

    2005-01-01

    The health care domain has high expectations for the security and privacy of patient information. The security, privacy, and confidentiality issues are consistent across the domain. Technical development and the increasing use of mobile phones have led to a situation in which SMS messages are used in electronic interactions between health care professionals and patients. We show that it is possible to send, receive and store text messages securely with a mobile phone, with no additional hardware required. More importantly, we show that reliable user authentication can be obtained in systems using text message communication. The Java programming language is used to implement the system. This paper describes the general application structure, while details of the technical implementation and encryption methods are described in the referenced articles. We also identify some crucial areas where the implementation of encrypted SMS can remedy previous gaps in security.
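
The record above does not reproduce the authors' Java implementation. As a hedged sketch of the message-authentication idea only (the function names and payload format here are invented, not taken from the paper), a shared secret and an HMAC-SHA256 tag let the receiver verify both origin and integrity of a text message:

```python
import hashlib
import hmac

def make_authenticated_sms(secret: bytes, message: str) -> str:
    """Append an HMAC-SHA256 tag so the receiver can verify origin and integrity."""
    tag = hmac.new(secret, message.encode("utf-8"), hashlib.sha256).hexdigest()
    return f"{message}|{tag}"

def verify_authenticated_sms(secret: bytes, payload: str) -> bool:
    """Recompute the tag over the message part and compare in constant time."""
    message, _, tag = payload.rpartition("|")
    expected = hmac.new(secret, message.encode("utf-8"), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

key = b"shared-secret-established-out-of-band"
sms = make_authenticated_sms(key, "Your appointment is at 10:30")
assert verify_authenticated_sms(key, sms)                                # genuine message passes
assert not verify_authenticated_sms(key, sms.replace("10:30", "11:30"))  # tampering is detected
```

Confidentiality would additionally require encryption and out-of-band key exchange, which this sketch omits.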

  12. Science and Applications Space Platform (SASP) End-to-End Data System Study

    NASA Technical Reports Server (NTRS)

    Crawford, P. R.; Kasulka, L. H.

    1981-01-01

    The capability of present technology and the Tracking and Data Relay Satellite System (TDRSS) to accommodate Science and Applications Space Platforms (SASP) payload user's requirements, maximum service to the user through optimization of the SASP Onboard Command and Data Management System, and the ability and availability of new technology to accommodate the evolution of SASP payloads were assessed. Key technology items identified to accommodate payloads on a SASP were onboard storage devices, multiplexers, and onboard data processors. The primary driver is the limited access to TDRSS for single access channels due to sharing with all the low Earth orbit spacecraft plus the shuttle. Advantages of onboard data processing include long term storage of processed data until TDRSS is accessible, thus reducing the loss of data, eliminating large data processing tasks at the ground stations, and providing more timely access to the data.

  13. Design and Evaluation for the End-to-End Detection of TCP/IP Header Manipulation

    DTIC Science & Technology

    2014-06-01

    [Front matter only: contents and acronym list] 7.3 HICCUPS Details; 7.4 Testing in a Controlled Environment. Acronyms: HICCUPS, Hash-based Integrity Check of Critical Underlying Protocol Semantics; HTTP, Hypertext Transfer Protocol; ICMP, Internet Control Message Protocol; ICSI; SSL, Secure Sockets Layer; SYN, synchronize; TCP, Transmission Control Protocol; TCP-AO, TCP Authentication Option; TFO, TCP Fast Open; TLS, Transport Layer Security.

  14. Eulerian BAO reconstructions and N -point statistics

    NASA Astrophysics Data System (ADS)

    Schmittfull, Marcel; Feng, Yu; Beutler, Florian; Sherwin, Blake; Chu, Man Yat

    2015-12-01

    As galaxy surveys begin to measure the imprint of baryonic acoustic oscillations (BAO) on large-scale structure at the subpercent level, reconstruction techniques that reduce the contamination from nonlinear clustering become increasingly important. Inverting the nonlinear continuity equation, we propose an Eulerian growth-shift reconstruction algorithm that does not require the displacement of any objects, which is needed for the standard Lagrangian BAO reconstruction algorithm. In real-space dark matter-only simulations the algorithm yields 95% of the BAO signal-to-noise obtained from standard reconstruction. The reconstructed power spectrum is obtained by adding specific simple 3- and 4-point statistics to the prereconstruction power spectrum, making it very transparent how additional BAO information from higher-point statistics is included in the power spectrum through the reconstruction process. Analytical models of the reconstructed density for the two algorithms agree at second order. Based on similar modeling efforts, we introduce four additional reconstruction algorithms and discuss their performance.
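
The analysis above is phrased in terms of power spectra measured from density fields. As a minimal illustrative sketch (the conventions and names here are mine, not the paper's), a periodic-box power spectrum estimator for a 1-D overdensity field:

```python
import numpy as np

def power_spectrum_1d(delta, boxsize):
    """P(k) of a periodic 1-D overdensity field (continuum Fourier convention)."""
    n = delta.size
    delta_k = np.fft.rfft(delta) * (boxsize / n)   # discrete -> continuum FT
    k = 2 * np.pi * np.fft.rfftfreq(n, d=boxsize / n)
    pk = np.abs(delta_k) ** 2 / boxsize            # P(k) = |delta(k)|^2 / L
    return k[1:], pk[1:]                           # drop the k = 0 mean mode

rng = np.random.default_rng(0)
boxsize, n = 1000.0, 512
delta = rng.standard_normal(n)                     # white-noise test field
k, pk = power_spectrum_1d(delta, boxsize)
# for unit-variance white noise, <P(k)> = boxsize / n, independent of k
```

A reconstruction pipeline would apply such an estimator to the pre- and post-reconstruction fields and difference the results.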

  15. Multi time-step wavefront reconstruction for tomographic adaptive-optics systems.

    PubMed

    Ono, Yoshito H; Akiyama, Masayuki; Oya, Shin; Lardiére, Olivier; Andersen, David R; Correia, Carlos; Jackson, Kate; Bradley, Colin

    2016-04-01

    In tomographic adaptive-optics (AO) systems, errors due to tomographic wavefront reconstruction limit the performance and angular size of the scientific field of view (FoV), where AO correction is effective. We propose a multi time-step tomographic wavefront reconstruction method to reduce the tomographic error by using measurements from both the current and previous time steps simultaneously. We further outline the method to feed the reconstructor with both wind speed and direction of each turbulence layer. An end-to-end numerical simulation, assuming a multi-object AO (MOAO) system on a 30 m aperture telescope, shows that the multi time-step reconstruction increases the Strehl ratio (SR) over a scientific FoV of 10 arc min in diameter by a factor of 1.5-1.8 when compared to the classical tomographic reconstructor, depending on the guide star asterism and with perfect knowledge of wind speeds and directions. We also evaluate the multi time-step reconstruction method and the wind estimation method on the RAVEN demonstrator under laboratory setting conditions. The wind speeds and directions at multiple atmospheric layers are measured successfully in the laboratory experiment by our wind estimation method with errors below 2  ms-1. With these wind estimates, the multi time-step reconstructor increases the SR value by a factor of 1.2-1.5, which is consistent with a prediction from the end-to-end numerical simulation.
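
The multi time-step reconstructor exploits Taylor's frozen-flow hypothesis: a layer measured at an earlier time step is, to first order, the current layer translated by the wind. A toy sketch of that translation step (invented names, integer-pixel shifts on a periodic grid; the actual reconstructor interpolates and works through measurement models):

```python
import numpy as np

def shift_layer(phase, wind_xy, dt, pixel_scale):
    """Frozen-flow prediction: translate a turbulence layer by wind * dt.
    Integer-pixel shifts on a periodic grid, for simplicity."""
    dx = int(round(wind_xy[0] * dt / pixel_scale))
    dy = int(round(wind_xy[1] * dt / pixel_scale))
    return np.roll(np.roll(phase, dy, axis=0), dx, axis=1)

layer = np.arange(16.0).reshape(4, 4)
# a 10 m/s wind over 0.1 s at 0.5 m/pixel shifts the screen by 2 pixels in x
predicted = shift_layer(layer, wind_xy=(10.0, 0.0), dt=0.1, pixel_scale=0.5)
```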

  16. Venous anastomosis in free flap reconstruction after radical neck dissection: is the external jugular vein a feasible option?

    PubMed

    Reiter, Maximilian; Baumeister, Philipp

    2017-01-13

    Free microvascular tissue transfer has become a reliable and well-established technique in reconstructive surgery. Success rates greater than 95% are consistently reported in the literature. End-to-end anastomosis to the external jugular vein (EJ) is supposed to be equally successful as anastomosis to the internal jugular vein (IJ) in patients treated with selective neck dissection. No data have been published so far for cases in which the IJ had to be resected during neck dissection. The purpose of this study was to evaluate the success rate and complications of end-to-end anastomosis to the EJ in cases of (modified) radical neck dissection with a resected IJ. A retrospective mono-center cohort study was performed. All patients with end-to-end anastomosis to either the IJ or EJ system were reviewed. 423 free-tissue transfers performed between 2009 and 2016 were included. The overall success rate was 97.0%, with an anastomotic revision rate due to venous thrombosis of 12.3%. In patients in whom the IJ had to be resected and the venous anastomosis was performed at the ipsilateral side to the EJ (n = 53), overall flap loss was significantly higher (5/53; 9.4%). The revision rate in these cases was 22.6%. The success rate of anastomosis to the EJ when the ipsilateral IJ was still intact was 100% (n = 20). The success rate when the anastomosis was performed at the contralateral side was 100%. End-to-end anastomosis to the EJ in cases with a resected IJ is more likely to result in free flap loss. Furthermore, it is associated with a higher revision rate. Therefore, in cases with a resected IJ, we suggest planning the operation beforehand with anastomosis at the contralateral side whenever possible.

  17. Success of free flap anastomoses performed within the zone of trauma in acute lower limb reconstruction.

    PubMed

    Bendon, Charlotte L; Giele, Henk P

    2016-07-01

    Traditionally, in free flap cover of lower limb injuries, every attempt is made to perform anastomoses proximal to the zone of injury. We report on the success of anastomoses within the zone of trauma, at the level of the fracture, avoiding further dissection and exposure. The records of free flap reconstructions for fractures of the lower extremity at a tertiary trauma centre between 2004 and 2010 were retrospectively reviewed. A total of 48 lower limb fractures required free flap reconstruction, performed at 28 days post injury (0-275 days). Anastomoses were proximal (21), distal (5) or within the zone of trauma (22). There was no significant difference (p > 0.05) in return to theatre, revision of anastomosis or flap survival between groups. Of the 22 performed within the zone of injury, five returned to theatre but only two for revision of anastomosis and 20 (91%) of these flaps survived. Of the 48 free flaps, arterial anastomoses were end to end in 34 (71%) and end to side in 14 (30%). There was no significant difference (p > 0.05) in return to theatre, revision of anastomosis or flap survival between the end-to-end and end-to-side groups. There was a tendency for arterial anastomoses to be performed end to end outside the zone of trauma (23/26) compared to within the zone of trauma (11/22). Our data suggest that free flap anastomoses can be performed safely in the zone of trauma in lower limb injuries.

  18. Wavefront reconstruction for extremely large telescopes via CuRe with domain decomposition.

    PubMed

    Rosensteiner, Matthias

    2012-11-01

    The Cumulative Reconstructor is an accurate, extremely fast reconstruction algorithm for Shack-Hartmann wavefront sensor data, but it has shown unacceptably high noise propagation for large apertures. Therefore, in this paper we describe a domain decomposition approach to deal with this drawback. We show that this adaptation of the algorithm gives the same reconstruction quality as the original algorithm and leads to a significant improvement with respect to noise propagation. The method is combined with an integral controller and compared to the classical matrix-vector multiplication algorithm on an end-to-end simulation of a single conjugate adaptive optics system. The reconstruction time scales as 20n, where n is the number of subapertures, and the method is parallelizable.
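
The idea behind the Cumulative Reconstructor can be shown in one dimension: the wavefront is recovered, up to piston, by cumulatively summing Shack-Hartmann slopes. A noiseless sketch with invented names, without the domain decomposition that is the paper's contribution:

```python
import numpy as np

def cumulative_reconstruct_1d(slopes, d):
    """Recover a 1-D wavefront from slope measurements by cumulative
    summation over subapertures of pitch d; piston is removed at the end."""
    phase = np.concatenate(([0.0], np.cumsum(slopes) * d))
    return phase - phase.mean()

d = 0.5                                   # subaperture pitch
x = np.arange(9) * d
true_phase = 0.3 * x ** 2
slopes = np.diff(true_phase) / d          # ideal finite-difference slopes
rec = cumulative_reconstruct_1d(slopes, d)
# rec matches the piston-removed true phase exactly for noiseless data
```

Noise injected early in a chain propagates into every later sample, which is exactly the noise-accumulation drawback that the domain decomposition addresses.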

  19. Comparison of methods for the reduction of reconstructed layers in atmospheric tomography.

    PubMed

    Saxenhuber, Daniela; Auzinger, Günter; Louarn, Miska Le; Helin, Tapio

    2017-04-01

    For the new generation of extremely large telescopes (ELTs), the computational effort for adaptive optics (AO) systems is demanding even for fast reconstruction algorithms. In wide-field AO, atmospheric tomography, i.e., the reconstruction of turbulent atmospheric layers from wavefront sensor data in several directions of view, is the crucial step for an overall reconstruction. Along with the number of deformable mirrors, wavefront sensors and their resolution, as well as the guide star separation, the number of reconstruction layers contributes significantly to the numerical effort. To reduce the computational cost, a sparse reconstruction profile that still yields good reconstruction quality is needed. In this paper, we analyze existing methods and present new approaches to determine optimal layer heights and turbulence weights for the tomographic reconstruction. Two classes of methods are discussed. On the one hand, we have compression methods that downsample a given input profile to fewer layers; among these, we present a new compression method based on discrete optimization, which collects atmospheric layers into subgroups, and a compression scheme that conserves turbulence moments. On the other hand, we consider a joint optimization of the tomographic reconstruction and the reconstruction profile during atmospheric tomography, which is independent of any a priori information on the underlying input profile. We analyze and study the qualitative performance of these methods for different input profiles and varying fields of view in an ELT-sized multi-object AO setting on the European Southern Observatory end-to-end simulation tool OCTOPUS.
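
As a hedged sketch of the moment-conserving compression idea (the grouping and all names here are illustrative, not the authors' exact scheme), collapsing each subgroup of layers to its summed weight and weight-averaged altitude preserves the zeroth and first turbulence moments of the profile:

```python
import numpy as np

def compress_profile(heights, weights, groups):
    """Collapse a turbulence profile to fewer layers: each group keeps the
    summed weight and the weight-averaged altitude."""
    new_h, new_w = [], []
    for idx in groups:
        w = weights[idx]
        new_w.append(w.sum())
        new_h.append(np.average(heights[idx], weights=w))
    return np.array(new_h), np.array(new_w)

# illustrative 6-layer input profile (altitudes in m, normalized Cn2 weights)
heights = np.array([0.0, 500.0, 2000.0, 4000.0, 8000.0, 12000.0])
weights = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.05])
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
h3, w3 = compress_profile(heights, weights, groups)
# total strength (moment 0) and the global first moment are unchanged
```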

  20. Penile Reconstruction

    PubMed Central

    Salgado, Christopher J.; Chim, Harvey; Tang, Jennifer C.; Monstrey, Stan J.; Mardini, Samir

    2011-01-01

    A variety of surgical options exists for penile reconstruction. The key to success of therapy is holistic management of the patient, with attention to the psychological aspects of treatment. In this article, we review reconstructive modalities for various types of penile defects inclusive of partial and total defects as well as the buried penis, and also describe recent basic science advances, which may promise new options for penile reconstruction. PMID:22851914

  1. Primordial power spectrum from Planck

    SciTech Connect

    Hazra, Dhiraj Kumar; Shafieloo, Arman; Souradeep, Tarun E-mail: arman@apctp.org

    2014-11-01

    Using a modified Richardson-Lucy algorithm, we reconstruct the primordial power spectrum (PPS) from Planck Cosmic Microwave Background (CMB) temperature anisotropy data. In our analysis we use different combinations of angular power spectra from Planck to reconstruct the shape of the primordial power spectrum and locate possible features. Performing an extensive error analysis, we find that the dip near ℓ ∼ 750–850 represents the most prominent feature in the data. The feature near ℓ ∼ 1800–2000 is detectable with high confidence only in the 217 GHz spectrum and is apparently a consequence of a small systematic as described in the revised Planck 2013 papers. Fixing the background cosmological parameters and the foreground nuisance parameters to their best fit baseline values, we report that the best fit power law primordial power spectrum is consistent with the reconstructed form of the PPS at 2σ C.L. of the estimated errors (apart from the local features mentioned above). As a consistency test, we find that the reconstructed primordial power spectrum from Planck temperature data can also substantially improve the fit to WMAP-9 angular power spectrum data (with respect to the power-law form of the PPS), allowing an overall amplitude shift of ∼ 2.5%. In this context, the low-ℓ and 100 GHz spectra from Planck, which properly overlap the WMAP multipole range, are found to be completely consistent with WMAP-9 (allowing an amplitude shift). As another important result of our analysis, we report evidence of gravitational lensing through the reconstruction analysis. Finally, we present two smooth forms of the PPS containing only the important features. These smooth forms of the PPS can provide significant improvements in fitting the data (with respect to the power-law PPS) and can be helpful in giving hints for inflationary model building.
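
The record refers to a modified Richardson-Lucy algorithm; the modifications themselves are not reproduced here. As a minimal textbook sketch of the underlying iteration for d = A x with non-negative kernel and data (a toy one-dimensional deconvolution with invented names, not the authors' CMB pipeline):

```python
import numpy as np

def richardson_lucy(data, A, n_iter=500):
    """Textbook Richardson-Lucy iteration; the iterate stays non-negative."""
    x = np.ones(A.shape[1])
    norm = A.sum(axis=0)                   # column sums for the correction step
    for _ in range(n_iter):
        model = A @ x
        x *= (A.T @ (data / model)) / norm
    return x

# toy problem: a smooth Gaussian kernel blurring a two-spike "spectrum"
n = 40
i = np.arange(n)
A = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 2.0) ** 2)
x_true = np.zeros(n)
x_true[10], x_true[25] = 1.0, 0.5
x_rec = richardson_lucy(A @ x_true, A)
# the dominant spike near index 10 is sharpened back out of the blurred data
```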

  2. Primordial power spectrum from Planck

    NASA Astrophysics Data System (ADS)

    Hazra, Dhiraj Kumar; Shafieloo, Arman; Souradeep, Tarun

    2014-11-01

    Using a modified Richardson-Lucy algorithm, we reconstruct the primordial power spectrum (PPS) from Planck Cosmic Microwave Background (CMB) temperature anisotropy data. In our analysis we use different combinations of angular power spectra from Planck to reconstruct the shape of the primordial power spectrum and locate possible features. Performing an extensive error analysis, we find that the dip near l ~ 750-850 represents the most prominent feature in the data. The feature near l ~ 1800-2000 is detectable with high confidence only in the 217 GHz spectrum and is apparently a consequence of a small systematic as described in the revised Planck 2013 papers. Fixing the background cosmological parameters and the foreground nuisance parameters to their best fit baseline values, we report that the best fit power law primordial power spectrum is consistent with the reconstructed form of the PPS at 2σ C.L. of the estimated errors (apart from the local features mentioned above). As a consistency test, we find that the reconstructed primordial power spectrum from Planck temperature data can also substantially improve the fit to WMAP-9 angular power spectrum data (with respect to the power-law form of the PPS), allowing an overall amplitude shift of ~ 2.5%. In this context, the low-l and 100 GHz spectra from Planck, which properly overlap the WMAP multipole range, are found to be completely consistent with WMAP-9 (allowing an amplitude shift). As another important result of our analysis, we report evidence of gravitational lensing through the reconstruction analysis. Finally, we present two smooth forms of the PPS containing only the important features. These smooth forms of the PPS can provide significant improvements in fitting the data (with respect to the power-law PPS) and can be helpful in giving hints for inflationary model building.

  3. Zellweger Spectrum

    MedlinePlus

    The Zellweger Spectrum: Zellweger Syndrome, Neonatal Adrenoleukodystrophy (NALD), and Infantile Refsum's ... of severity of disease. What causes the Zellweger spectrum of diseases? As we mentioned, disorders of the ...

  4. Penile reconstruction

    PubMed Central

    Garaffa, Giulio; Sansalone, Salvatore; Ralph, David J

    2013-01-01

    During the most recent years, a variety of new techniques of penile reconstruction have been described in the literature. This paper focuses on the most recent advances in male genital reconstruction after trauma, excision of benign and malignant disease, in gender reassignment surgery and aphallia with emphasis on surgical technique, cosmetic and functional outcome. PMID:22426595

  5. Iterative reconstruction methods in atmospheric tomography: FEWHA, Kaczmarz and Gradient-based algorithm

    NASA Astrophysics Data System (ADS)

    Ramlau, R.; Saxenhuber, D.; Yudytskiy, M.

    2014-07-01

    The problem of atmospheric tomography arises in ground-based telescope imaging with adaptive optics (AO), where one aims to compensate in real time for the rapidly changing optical distortions in the atmosphere. Many of these systems depend on a sufficient reconstruction of the turbulence profiles in order to obtain a good correction. Due to steadily growing telescope sizes, there is a strong increase in the computational load for atmospheric reconstruction with current methods, first and foremost the matrix-vector multiplication (MVM). In this paper we present and compare three novel iterative reconstruction methods. The first iterative approach is the Finite Element Wavelet Hybrid Algorithm (FEWHA), which combines wavelet-based techniques and conjugate gradient schemes to efficiently and accurately tackle the problem of atmospheric reconstruction. The method is extremely fast, highly flexible, and yields superior quality. Another novel iterative reconstruction algorithm is the three-step approach, which decouples the problem into the reconstruction of the incoming wavefronts, the reconstruction of the turbulent layers (atmospheric tomography), and the computation of the best mirror correction (fitting step). For the atmospheric tomography problem within the three-step approach, the Kaczmarz algorithm and the gradient-based method have been developed. We present a detailed comparison of our reconstructors both in terms of quality and speed performance in the context of a Multi-Object Adaptive Optics (MOAO) system for the E-ELT setting on OCTOPUS, the ESO end-to-end simulation tool.
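
Of the three methods, the Kaczmarz algorithm is the easiest to sketch in a few lines. A generic, hedged implementation for a consistent system A x = b (cyclic row sweeps; not the tomography-specific variant developed in the paper):

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=200):
    """Cyclic Kaczmarz: project the iterate onto each row's hyperplane
    a_i . x = b_i in turn."""
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
x_true = rng.standard_normal(10)
x = kaczmarz(A, A @ x_true)        # consistent system: converges to x_true
```

Each projection is cheap and matrix-free, which is what makes the scheme attractive at ELT scale.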

  6. Ligament reconstruction.

    PubMed

    Glickel, Steven Z; Gupta, Salil

    2006-05-01

    Volar ligament reconstruction is an effective technique for treating symptomatic laxity of the CMC joint of the thumb. The laxity may be a manifestation of generalized ligament laxity, post-traumatic injury, or a metabolic disorder (Ehlers-Danlos). The reconstruction reduces the shear forces on the joint that contribute to the development and persistence of inflammation. Although there have been only a few reports of the results of volar ligament reconstruction, the use of the procedure to treat Stage I and Stage II disease consistently gives good to excellent results. More advanced stages of disease are best treated by trapeziectomy, with or without ligament reconstruction.

  7. Optimal reconstruction for closed-loop ground-layer adaptive optics with elongated spots.

    PubMed

    Béchet, Clémentine; Tallon, Michel; Tallon-Bosc, Isabelle; Thiébaut, Éric; Le Louarn, Miska; Clare, Richard M

    2010-11-01

    The design of the laser-guide-star-based adaptive optics (AO) systems for the Extremely Large Telescopes requires careful study of the issue of elongated spots produced on Shack-Hartmann wavefront sensors. The importance of a correct modeling of the nonuniformity and correlations of the noise induced by this elongation has already been demonstrated for wavefront reconstruction. We report here on the first (to our knowledge) end-to-end simulations of closed-loop ground-layer AO with laser guide stars with such an improved noise model. The results are compared with the level of performance predicted by a classical noise model for the reconstruction. The performance is studied in terms of ensquared energy and confirms that, thanks to the improved noise model, central or side launching of the lasers does not affect the performance with respect to the laser guide stars' flux. These two launching schemes also perform similarly whatever the atmospheric turbulence strength.
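
The improved noise model enters the wavefront reconstruction as a non-uniform, correlated measurement-noise covariance. A generic sketch of noise-weighted least squares with elongation-dependent noise levels (illustrative names and a small dense solve, not an ELT-scale implementation):

```python
import numpy as np

def wls_reconstruct(G, s, C):
    """Minimize (s - G a)^T C^{-1} (s - G a): weighted least-squares estimate
    of modal coefficients a from slopes s with noise covariance C."""
    Ci = np.linalg.inv(C)
    return np.linalg.solve(G.T @ Ci @ G, G.T @ Ci @ s)

rng = np.random.default_rng(2)
G = rng.standard_normal((20, 5))           # interaction matrix (slopes per mode)
a_true = rng.standard_normal(5)
sigma = np.linspace(0.1, 2.0, 20)          # spot-elongation-dependent noise levels
C = np.diag(sigma ** 2)
s = G @ a_true + sigma * rng.standard_normal(20)
a_hat = wls_reconstruct(G, s, C)           # down-weights the noisiest subapertures
```

A classical reconstructor corresponds to C proportional to the identity; the elongation model replaces it with per-subaperture (and, in general, correlated) noise terms.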

  8. ACL reconstruction

    MedlinePlus

    ... This increases the chance you may have a meniscus tear. ACL reconstruction may be used for these ...: when other ligaments are also injured, or when your meniscus is torn. Before surgery, talk to your health ...

  9. PSF reconstruction for MUSE in wide field mode

    NASA Astrophysics Data System (ADS)

    Villecroze, R.; Fusco, Thierry; Bacon, Roland; Madec, Pierre-Yves

    2012-07-01

    The resolution of ground-based telescopes is dramatically limited by atmospheric turbulence. Adaptive optics (AO) is a real-time opto-mechanical approach which allows correction of the turbulence effect, letting astronomical telescopes and their associated instrumentation reach the ultimate diffraction limit. Nevertheless, the AO correction is never perfect, especially when it has to deal with a large Field of View (FoV). Hence, a posteriori image processing really improves the final estimation of astrophysical data. Such techniques require an accurate knowledge of the system response at any position in the FoV. The purpose of this work is then the estimation of the AO response in the particular case of the MUSE [1]/GALACSI [2] instrument (a 3D multi-object spectrograph combined with a laser-assisted wide-field AO system which will be installed at the VLT in 2013). Using telemetry data coming from both AO laser and natural guide stars, a Point Spread Function (PSF) is derived at any location of the FoV and for every wavelength of the MUSE spectrograph. This document presents the preliminary design of the MUSE WFM PSF reconstruction process. The various hypotheses and approximations are detailed and justified. A first description of the overall process is proposed. Some alternative strategies to improve the performance (in terms of computation time and storage) are described and have been implemented. Finally, after a validation of the proposed algorithm using end-to-end models, a performance analysis is conducted (with the help of a full end-to-end model). This performance analysis will help us to populate an exhaustive error budget table.

  10. Application of the full spectrum inversion algorithm to simulated airborne GPS radio occultation signals

    NASA Astrophysics Data System (ADS)

    Adhikari, Loknath; Xie, Feiqin; Haase, Jennifer S.

    2016-10-01

    With a GPS receiver on board an airplane, the airborne radio occultation (ARO) technique provides dense lower-tropospheric soundings over target regions. Large variations in water vapor in the troposphere cause strong signal multipath, which could lead to systematic errors in RO retrievals with the geometric optics (GO) method. The spaceborne GPS RO community has successfully developed the full-spectrum inversion (FSI) technique to solve the multipath problem. This paper is the first to adapt the FSI technique to retrieve atmospheric properties (bending and refractivity) from ARO signals, where it is necessary to compensate for the receiver traveling on a non-circular trajectory inside the atmosphere, and its use is demonstrated using an end-to-end simulation system. The forward-simulated GPS L1 (1575.42 MHz) signal amplitude and phase are used to test the modified FSI algorithm. The ARO FSI method is capable of reconstructing the fine vertical structure of the moist lower troposphere in the presence of severe multipath, which otherwise leads to large retrieval errors in the GO retrieval. The sensitivity of the modified FSI-retrieved bending angle and refractivity to errors in signal amplitude and errors in the measured refractivity at the receiver is presented. Accurate bending angle retrievals can be obtained from the surface up to ˜ 250 m below the receiver at typical flight altitudes above the tropopause, above which the retrieved bending angle becomes highly sensitive to the phase measurement noise. Abrupt changes in the signal amplitude that are a challenge for receiver tracking and geometric optics bending angle retrieval techniques do not produce any systematic bias in the FSI retrievals when the SNR is high. For very low SNR, the FSI performs as expected from theoretical considerations. The 1 % in situ refractivity measurement errors at the receiver height can introduce a maximum refractivity retrieval error of 0.5 % (1 K) near the receiver, but

  11. Spectrum Recombination.

    ERIC Educational Resources Information Center

    Greenslade, Thomas B., Jr.

    1984-01-01

    Describes several methods of executing lecture demonstrations involving the recombination of the spectrum. Groups the techniques into two general classes: bringing selected portions of the spectrum together using lenses or mirrors and blurring the colors by rapid movement or foreshortening. (JM)

  12. Tracking ocean wave spectrum from SAR images

    NASA Technical Reports Server (NTRS)

    Goldfinger, A. D.; Beal, R. C.; Monaldo, F. M.; Tilley, D. G.

    1984-01-01

    An end-to-end algorithm for recovery of ocean wave spectral peaks from Synthetic Aperture Radar (SAR) images is described. Current approaches allow precisions of 1 percent in wave number and 0.6 deg in direction.
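
A greatly simplified sketch of the peak-recovery step (hypothetical names; the real processing chain includes calibration and SAR modulation corrections): locate the dominant wavenumber and propagation direction from the 2-D power spectrum of an image:

```python
import numpy as np

def dominant_wave(image, dx):
    """Dominant wavenumber (rad/m) and direction (deg) from the 2-D spectrum."""
    n = image.shape[0]
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean()))) ** 2
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=dx)) * 2 * np.pi
    iy, ix = np.unravel_index(np.argmax(spec), spec.shape)
    return np.hypot(freqs[ix], freqs[iy]), np.degrees(np.arctan2(freqs[iy], freqs[ix]))

# synthetic 100 m swell propagating at 30 degrees, 10 m pixels
n, dx = 128, 10.0
y, x = np.mgrid[0:n, 0:n] * dx
k0 = 2 * np.pi / 100.0
field = np.cos(k0 * (x * np.cos(np.radians(30.0)) + y * np.sin(np.radians(30.0))))
k, theta = dominant_wave(field, dx)    # close to k0 and 30 deg (modulo 180)
```

The 180-degree ambiguity is inherent to a real-valued image's symmetric spectrum; resolving it requires extra information.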

  13. Stem cell-based approaches to improve nerve regeneration: potential implications for reconstructive transplantation?

    PubMed

    Khalifian, Saami; Sarhane, Karim A; Tammia, Markus; Ibrahim, Zuhaib; Mao, Hai-Quan; Cooney, Damon S; Shores, Jaimie T; Lee, W P Andrew; Brandacher, Gerald

    2015-02-01

    Reconstructive transplantation has become a viable option to restore form and function after devastating tissue loss. Functional recovery is a key determinant of overall success and critically depends on the quality and pace of nerve regeneration. Several molecular and cell-based therapies have been postulated and tested in pre-clinical animal models to enhance nerve regeneration. Schwann cells remain the mainstay of research focus providing neurotrophic support and signaling cues for regenerating axons. Alternative cell sources such as mesenchymal stem cells and adipose-derived stromal cells have also been tested in pre-clinical animal models and in clinical trials due to their relative ease of harvest, rapid expansion in vitro, minimal immunogenicity, and capacity to integrate and survive within host tissues, thereby overcoming many of the challenges faced by culturing of human Schwann cells and nerve allografting. Induced pluripotent stem cell-derived Schwann cells are of particular interest since they can provide abundant, patient-specific autologous Schwann cells. The majority of experimental evidence on cell-based therapies, however, has been generated using stem cell-seeded nerve guides that were developed to enhance nerve regeneration across "gaps" in neural repair. Although primary end-to-end repair is the preferred method of neurorrhaphy in reconstructive transplantation, mechanistic studies elucidating the principles of cell-based therapies from nerve guidance conduits will form the foundation of further research employing stem cells in end-to-end repair of donor and recipient nerves. This review presents key components of nerve regeneration in reconstructive transplantation and highlights the pre-clinical studies that utilize stem cells to enhance nerve regeneration.

  14. Project Reconstruct.

    ERIC Educational Resources Information Center

    Helisek, Harriet; Pratt, Donald

    1994-01-01

    Presents a project in which students monitor their use of trash, input and analyze information via a database and computerized graphs, and "reconstruct" extinct or endangered animals from recyclable materials. The activity was done with second-grade students over a period of three to four weeks. (PR)

  15. Parameterizations of truncated food web models from the perspective of an end-to-end model approach

    NASA Astrophysics Data System (ADS)

    Fennel, Wolfgang

    2009-02-01

    Modeling of marine ecosystems is broadly divided into two branches: biogeochemical processes and fish production. The biogeochemical models see the fish only implicitly by mortality rates, while fish production models see the lower food web basically through prescribed food, e.g., copepod biomass. The skill assessment of ecological models, which are usually truncated biogeochemical models, also involves the question of how the effects of the missing higher food web are parameterized. This paper contributes to the goal of bridging biogeochemical models and fish-production models by employing a recently developed coupled NPZDF-model, Fennel [Fennel, W., 2007. Towards bridging biogeochemical and fish production models. Journal of Marine Systems, doi:10.1016/j.jmarsys.2007.06.008]. Here we study parameterizations of truncated NPZD-models from the viewpoint of a complete model. The effects of the higher food web on the cycling of the state variables in a truncated NPZD-model cannot be unambiguously imitated. For example, one can mimic effects of fishery by export fluxes of one of the state variables. It is shown that the mass fluxes between the lower and upper part of the full model food web are significantly smaller than the fluxes within the NPZD-model. However, over longer time scales, relatively small changes can accumulate and eventually become important.

  16. Top to Bottom and End to End. Improving the National Security Agency’s Strategic Decision Processes

    DTIC Science & Technology

    2005-01-01

    synchronize, and prioritize strategic and business planning, requirements, programming, acquisition, and ... Improving the National Security Agency’s ... Chapter Two, an overview of the corporate-level strategic decision processes; Chapter Three, the CRG; Chapter Four, strategic and business planning ... a discussion of the Capabilities Generation Process. The DC4 appointed members of his staff to manage the strategic and business planning activities

  17. End-To-End Solution for Integrated Workload and Data Management using GlideinWMS and Globus Online

    NASA Astrophysics Data System (ADS)

    Mhashilkar, Parag; Miller, Zachary; Kettimuthu, Rajkumar; Garzoglio, Gabriele; Holzman, Burt; Weiss, Cathrin; Duan, Xi; Lacinski, Lukasz

    2012-12-01

    Grid computing has enabled scientific communities to effectively share computing resources distributed over many independent sites. Several such communities, or Virtual Organizations (VO), in the Open Science Grid and the European Grid Infrastructure use the GlideinWMS system to run complex application workflows. GlideinWMS is a pilot-based workload management system (WMS) that creates an on-demand, dynamically-sized overlay Condor batch system on Grid resources. While the WMS addresses the management of compute resources, data management in the Grid is still the responsibility of the VO. In general, large VOs have resources to develop complex custom solutions, while small VOs would rather push this responsibility to the infrastructure. The latter requires a tight integration of the WMS and the data management layers, an approach still not common in modern Grids. In this paper we describe a solution developed to address this shortcoming in the context of the Center for Enabling Distributed Peta-scale Science (CEDPS) by integrating GlideinWMS with Globus Online (GO). Globus Online is a fast, reliable file transfer service that makes it easy for any user to move data. The solution eliminates the need for users to provide custom data transfer solutions in the application by making this functionality part of the GlideinWMS infrastructure. To achieve this, GlideinWMS uses the file transfer plug-in architecture of Condor. The paper describes the system architecture and how this solution can be extended to support data transfer services other than Globus Online when used with Condor or GlideinWMS.

  18. Portable air quality sensor unit for participatory monitoring: an end-to-end VESNA-AQ based prototype

    NASA Astrophysics Data System (ADS)

    Vucnik, Matevz; Robinson, Johanna; Smolnikar, Miha; Kocman, David; Horvat, Milena; Mohorcic, Mihael

    2015-04-01

    Key words: portable air quality sensor, CITI-SENSE, participatory monitoring, VESNA-AQ. The emergence of low-cost, easy-to-use portable air quality sensor units is opening new possibilities for individuals to assess their exposure to air pollutants at a specific place and time, and to share this information over the Internet. Such portable sensor units are being used in an ongoing citizen science project called CITI-SENSE, which enables citizens to measure and share the data. Through the creation of citizens' observatories, the project aims to empower citizens to contribute to and participate in environmental governance, enabling them to support and influence community and societal priorities as well as associated decision making. An air quality measurement system based on the VESNA sensor platform was primarily designed within the project for use as a portable sensor unit in selected pilot cities (Belgrade, Ljubljana and Vienna) for monitoring outdoor exposure to pollutants. Functionally, however, the same unit with a different set of sensors could be used, for example, as an indoor platform. The version designed for the pilot studies was equipped with the following sensors: NO2, O3, CO, temperature, relative humidity, pressure and an accelerometer. The personal sensor unit is battery powered and housed in a plastic box. The VESNA-based air quality (AQ) monitoring system comprises the VESNA-AQ portable sensor unit, a smartphone app and the remote server. The personal sensor unit supports wireless connection to an Android smartphone via built-in Wi-Fi. The smartphone in turn also serves as the communication gateway towards the remote server, using any of the available data connections. Besides the gateway functionality, the role of the smartphone is to enrich data coming from the personal sensor unit with the GPS location, timestamps and user-defined context.
    This, together with the accelerometer, enables the user to better estimate one's exposure in relation to physical activities, time and location. The end user can monitor the measured parameters through a smartphone application. The smartphone app implements a custom-developed Lightweight Client Server Protocol (LCSP), which is used to send requests to the VESNA-AQ unit and to exchange information. When the data is obtained from the VESNA-AQ unit, the mobile application visualizes the data. It can also forward the data to the remote server in a custom JSON structure over an HTTP POST request. The server stores the data in the database and, in parallel, translates the data to WFS and forwards it to the main CITI-SENSE platform over WFS-T in a common XML format over an HTTP POST request. From there the data can be accessed through the Internet and visualized in different forms and web applications developed by the CITI-SENSE project. In the course of the project, the collected data will be made publicly available, enabling citizens to participate in environmental governance. Acknowledgements: CITI-SENSE is a Collaborative Project partly funded by the EU FP7-ENV-2012 under grant agreement no 308524 (www.citi-sense.eu).
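    The abstract describes forwarding enriched measurements to the remote server as a custom JSON structure over an HTTP POST request. A minimal Python sketch of that pattern is given below; the field names, payload layout and endpoint are purely illustrative assumptions, since the record does not specify the actual CITI-SENSE schema.

```python
import json
import urllib.request

def build_measurement(sensor_id, readings, lat, lon, timestamp):
    """Assemble one observation record as a JSON string.
    All field names here are hypothetical, not the CITI-SENSE schema."""
    return json.dumps({
        "unit": sensor_id,
        "time": timestamp,
        "location": {"lat": lat, "lon": lon},
        "readings": readings,  # e.g. {"NO2": 21.4, "O3": 48.0, "CO": 0.3}
    })

def post_measurement(url, payload):
    """Forward the JSON payload to the server over HTTP POST."""
    req = urllib.request.Request(
        url,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

    A gateway app would call `build_measurement` for each sample and hand the result to `post_measurement` with the server's ingest URL.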

  19. End-to-end 9-D polarized bunch transport in eRHIC energy-recovery recirculator, some aspects

    SciTech Connect

    Meot, F.; Meot, F.; Brooks, S.; Ptitsyn, V.; Trbojevic, D.; Tsoupas, N.

    2015-05-03

    This paper is a brief overview of some of the numerous beam and spin dynamics investigations undertaken in the framework of the design of the FFAG-based electron energy-recovery recirculator ring of the eRHIC electron-ion collider project.

  20. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    SciTech Connect

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.; Qualls, A. L.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  1. End-to-end Simulations of the Performance of the Whipple Survey of the Outer Solar System

    NASA Astrophysics Data System (ADS)

    Schlichting, Hilke; Whipple Science Team

    2010-10-01

    The proposed Whipple mission will detect occultations of bright stars by many Kuiper Belt Objects, Sedna-like Objects and Oort Cloud Objects. This census will be used to address a large number of questions regarding the physical and dynamical properties of the various small-body populations of the Solar System. These data will help elucidate the process of formation of macroscopic bodies in the primitive solar system, the history of giant planet migration, and the interactions of planet scattering with the local stellar environment that led to the population of the Oort Cloud and, possibly during the first few million years, of the Sedna region. We have developed a series of tools that simulate stellar populations, occultation light curves (including noise), onboard detection and our fitting routines. We describe all the routines and the simulation pipeline as well as details of the detection algorithms, including statistical arguments. Results of expected rates for different model populations are also presented, demonstrating the anticipated performance and event yield for the mission. These simulations show how Whipple will measure size distributions as a function of (three-dimensional) position for these populations.

  2. Scalability Analysis and Use of Compression at the Goddard DAAC and End-to-End MODIS Transfers

    NASA Technical Reports Server (NTRS)

    Menasce, Daniel A.

    1998-01-01

    The goal of this task is to analyze the performance of single and multiple FTP transfers between SCFs and the Goddard DAAC. We developed an analytic model to compute the performance of FTP sessions as a function of various key parameters, implemented the model as a program called FTP Analyzer, and carried out validations with real data obtained by running single and multiple FTP transfers between GSFC and the Miami SCF. The input parameters to the model include the mix of FTP sessions (scenario) and, for each FTP session, the file size. The network parameters include the round-trip time, packet loss rate, the limiting bandwidth of the network connecting the SCF to a DAAC, TCP's basic timeout, TCP's Maximum Segment Size, and TCP's Maximum Receiver Window Size. The modeling approach used consisted of modeling TCP's overall throughput, computing TCP's delay per FTP transfer, and then solving a queuing network model that includes the FTP clients and servers.
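    The record names the model's inputs (round-trip time, packet loss rate, MSS, receiver window, limiting bandwidth) but not its equations. A common steady-state TCP throughput approximation built from the same parameters, in the spirit of the well-known Mathis et al. formula, is sketched below; it is illustrative only and not the FTP Analyzer's actual model.

```python
import math

def tcp_throughput(mss, rtt, loss, rwnd, link_bw):
    """Estimate steady-state TCP throughput in bytes/s.

    mss: maximum segment size (bytes); rtt: round-trip time (s);
    loss: packet loss probability; rwnd: receiver window (bytes);
    link_bw: limiting network bandwidth (bytes/s).
    """
    # The receiver window caps throughput at one window per RTT.
    window_limited = rwnd / rtt
    if loss > 0:
        # Mathis approximation: rate ~ (MSS / RTT) * sqrt(3 / (2 * p)).
        loss_limited = (mss / rtt) * math.sqrt(3.0 / (2.0 * loss))
        rate = min(window_limited, loss_limited)
    else:
        rate = window_limited
    # The bottleneck link bandwidth is a hard upper bound.
    return min(rate, link_bw)
```

    With zero loss the transfer is window-limited (rwnd/RTT); as loss grows, the loss-limited term takes over, which is why long-RTT, lossy paths see low FTP throughput even on fast links.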

  3. Retention of local conformational compactness in unfolding of barnase; Contribution of end-to-end interactions within quasi-modules.

    PubMed

    Shinoda, Kazuki; Takahashi, Ken-Ichi; Go, Mitiko

    2007-01-01

    To understand how a protein reduces the conformational space to be searched for the native structure, it is crucial to characterize ensembles of conformations along the folding process, in particular ensembles of relatively long-range structures connecting an extensively unfolded state and a state with a native-like overall chain topology. To analyze such intermediate conformations, we performed multiple unfolding molecular dynamics simulations of barnase at 498 K. Some short-range structures, such as parts of helices and turns, were well sustained, while most of the secondary structures and the hydrophobic cores were eventually lost, which is consistent with the results of other experimental and computational studies. The most important novel finding was the persistence of relatively compact long-range substructures, which was captured by exploiting the concept of the module. The module was originally introduced to describe the hierarchical structure of a globular protein in the native state. Modules are relatively compact substructures that result from partitioning the native structure of a globular protein completely into several contiguous segments with the least extended conformations. We applied this concept of the module to detect a possible hierarchical structure in each snapshot structure during the unfolding processes as well. Along with this conceptual extension, the detected relatively compact substructures are named quasi-modules. We found almost perfect persistence of quasi-module boundaries positioned close to the native module boundaries throughout the unfolding trajectories. The relatively compact conformations of the quasi-modules seemed to be retained mainly by hydrophobic interactions formed between residues located at both terminal regions within each module. From these results, we propose the hypothesis that hierarchical folding with early formation of quasi-modules effectively reduces the search space for the native structure.

  4. Rethinking the Design of the Internet: The End-to-End Arguments vs. the Brave New World

    DTIC Science & Technology

    2001-08-01

    the millennium.” iMP Magazine, Sept., http://www.cisp.org/imp/september_99/09_99blumenthal.htm. (67) The popular fictional character Harry Potter received ... keeps its brain.” Rowling, J.K., 1998. Harry Potter and the Chamber of Secrets, Bloomsbury, p. 242. (68) Pomfret, J., 2000. “China puts clamps on

  5. SPAN: A Network Providing Integrated, End-to-End, Sensor-to-Database Solutions for Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Benzel, T.; Cho, Y. H.; Deschon, A.; Gullapalli, S.; Silva, F.

    2009-12-01

    In recent years, advances in sensor network technology have shown great promise to revolutionize environmental data collection. Still, widespread adoption of these systems by domain experts has been lacking, and they have remained the purview of the engineers who design them. While there are currently many data-logging options for basic data collection in the field, scientists are often required to visit the deployment sites to retrieve their data and manually import it into spreadsheets. Some advanced commercial software systems do allow scientists to collect data remotely, but most of these systems only allow point-to-point access and require proprietary hardware. Furthermore, these commercial solutions preclude the use of sensors from other manufacturers or integration with Internet-based database repositories and compute engines. Therefore, scientists often must download and manually reformat their data before uploading it to the repositories if they wish to share it. We present an open-source, low-cost, extensible, turnkey solution called the Sensor Processing and Acquisition Network (SPAN), which provides a robust and flexible sensor network service. At the deployment site, SPAN leverages low-power generic embedded processors to integrate a variety of commercially available sensor hardware into the network of environmental observation systems. By bringing intelligence close to the sensed phenomena, we can remotely control configuration and re-use, establish rules to trigger sensor activity, manage power requirements, and control the two-way flow of sensed data as well as control information to the sensors. Key features of our design include (1) adoption of a hardware-agnostic architecture: our solutions are compatible with several programmable platforms, sensor systems, communication devices and protocols;
    (2) information standardization: our system supports several popular communication protocols and data formats; and (3) extensible data support: our system works with several existing data storage systems, data models and web-based services as needed by the domain experts; examples include standard MySQL databases, Sensorbase (from UCLA), as well as SPAN Cloud, a system built using Google App Engine that allows scientists to use Google's cloud-computing cyber-infrastructure. We provide a simple yet flexible data access control mechanism that allows groups of researchers to share their data in SPAN Cloud. In this talk, we will describe the SPAN architecture, its components, our development plans, our vision for the future, and results from current deployments that continue to drive the design of our system.

  6. SU-E-T-268: Proton Radiosurgery End-To-End Testing Using Lucy 3D QA Phantom

    SciTech Connect

    Choi, D; Gordon, I; Ghebremedhin, A; Wroe, A; Schulte, R; Bush, D; Slater, J; Patyal, B

    2014-06-01

    Purpose: To check the overall accuracy of proton radiosurgery treatment delivery using ready-made circular collimator inserts and fixed-thickness compensating boluses. Methods: A Lucy 3D QA phantom (Standard Imaging Inc., WI, USA) loaded with GaFchromic™ film was irradiated with laterally scattered and longitudinally spread-out 126.8 MeV proton beams. The tests followed every step in the proton radiosurgery treatment delivery process: CT scan (GE Lightspeed VCT), target contouring, treatment planning (Odyssey 5.0, Optivus, CA), portal calibration, target localization using a robotic couch with image guidance, and dose delivery at planned gantry angles. A 2 cm diameter collimator insert in a 4 cm diameter radiosurgery cone and a 1.2 cm thick compensating flat bolus were used for all beams. Film dosimetry (RIT114 v5.0, Radiological Imaging Technology, CO, USA) was used to evaluate the accuracy of target localization and relative dose distributions compared to those calculated by the treatment planning system. Results: The localization accuracy was estimated by analyzing the GaFchromic films irradiated at gantry angles of 0, 90 and 270 degrees. We observed a 0.5 mm shift in the lateral direction (patient left), a ±0.9 mm shift in the AP direction and a ±1.0 mm shift in the vertical direction (gantry dependent). The isodose overlays showed good agreement (<2 mm, 50% isodose lines) between measured and calculated doses. Conclusion: Localization accuracy depends on gantry sag, CT resolution and distortion, DRRs from the treatment planning computer, localization accuracy of the image guidance system, and fabrication of the ready-made aperture and cone housing. The total deviation from the isocenter was 1.4 mm. Dose distribution uncertainty comes from distal-end error due to bolus and CT density, in addition to localization error. The planned dose distribution matched the measured values well (>90% passing with 2%/2 mm criteria).
Our test showed the robustness of our proton radiosurgery treatment delivery system using ready-made collimator inserts and fixed thickness compensating boluses.

  7. Building the tree of life from scratch: an end-to-end work flow for phylogenomic studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Whole genome sequences are rich sources of information about organisms that are superbly useful for addressing a wide variety of evolutionary questions. Recent progress in genomics has enabled the de novo decoding of the genome of virtually any organism, greatly expanding its potential for understan...

  8. Mechanistic study of hemicucurbit[6]uril formation by step-growth oligomerization and end-to-end cyclization

    NASA Astrophysics Data System (ADS)

    Yoo, In Kee; Kang, Young Kee

    2017-02-01

    The formation of hemicucurbit[6]uril (hCB[6]) from ethyleneurea with formaldehyde in acidic aqueous solution was explored using density functional methods and the implicit solvation model in water. The oligomerization and cyclization barriers were approximately half of that for iminium formation. Thus, the initial iminium formation is the rate-determining step, and the formation of hCB[6] is kinetically and thermodynamically favored in acidic aqueous solution. In particular, the 'alternate' conformation of hCB[6] is enthalpically and entropically preferred over the 'cone' conformation, which is consistent with the crystal structure of hCB[6].

  9. ORNL IntelligentFreight Initiative:Enhanced End-to-End Supply Chain Visibility of Security Sensitive Hazardous Materials

    SciTech Connect

    Walker, Randy M.; Shankar, Mallikarjun; Gorman, Bryan L.

    2009-01-01

    In the post-September 11, 2001 (9/11) world, the federal government has increased its focus on the manufacturing, distributing, warehousing, and transporting of hazardous materials. In 2002, Congress mandated that the Transportation Security Agency (TSA) designate a subset of hazardous materials that could pose a threat to the American public when transported in sufficiently large quantities. This subset of hazardous materials, which could be weaponized or subjected to a nefarious terrorist act, was designated as Security Sensitive Hazardous Materials (SSHM). Radioactive materials (RAM) were of special concern because actionable intelligence had revealed that Al Qaeda desired to develop a homemade nuclear device or a dirty bomb to use against the United States (US) or its allies [1]. Because of this clear and present danger, it is today a national priority to develop and deploy technologies that will provide for visibility and real-time exception notification of SSHM and Radioactive Materials in Quantities of Concern (RAMQC) in international commerce. Over the past eight years, Oak Ridge National Laboratory (ORNL) has been developing, implementing, and deploying sensor-based technologies to enhance supply chain visibility. ORNL's research into creating a model for shipments, known as IntelligentFreight, has investigated sensors and sensor integration methods at numerous testbeds throughout the national supply chain. As a result of our research, ORNL believes that most of the information needed by supply chain partners to provide shipment visibility and exceptions-based reporting already exists but is trapped in numerous proprietary or agency-centric databases.

  10. Towards a Software Framework to Support Deployment of Low Cost End-to-End Hydroclimatological Sensor Network

    NASA Astrophysics Data System (ADS)

    Celicourt, P.; Piasecki, M.

    2015-12-01

    Deployment of environmental sensor assemblies based on cheap platforms such as the Raspberry Pi and Arduino has gained much attention over the past few years. While they are attractive because they can be controlled with any of several programming languages, the configuration task can become quite complex due to the need to learn several different proprietary data formats and protocols, which constitutes a bottleneck for the expansion of sensor networks. In response to this rising complexity, the Institute of Electrical and Electronics Engineers (IEEE) has sponsored the development of the IEEE 1451 standard in an attempt to introduce a common standard. The most innovative concept of the standard is the Transducer Electronic Data Sheet (TEDS), which enables transducers to self-identify, self-describe, self-calibrate, exhibit plug-and-play functionality, etc. We used Python to develop an IEEE 1451.0 platform-independent graphical user interface to generate and provide sufficient information about almost any sensor and sensor platform for sensor programming purposes, automatic calibration of sensor data, and incorporation of back-end demands on data management in TEDS for automatic standards-based data storage, search and discovery purposes. These features are paramount to making data management much less onerous in large-scale sensor networks. Along with the TEDS Creator, we developed a tool named HydroUnits for three specific purposes: encoding of physical units in the TEDS, dimensional analysis, and on-the-fly conversion of time series, allowing users to retrieve data in a desired equivalent unit while accommodating unforeseen and user-defined units. In addition, our back-end data management comprises the Python/Django equivalent of the CUAHSI Observations Data Model (ODM), namely DjangODM, hosted by a MongoDB database server, which offers more convenience for our application.
    We are also developing a data loader, which will be paired with the data autoloading capability of Django and a TEDS processing script to populate the database with the incoming data. The Python WaterOneFlow web services developed by the Texas Water Development Board will be used to publish the data. The software suite is being tested on the Raspberry Pi as the end node and a laptop PC as the base station in a wireless setting.
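    HydroUnits is described as performing on-the-fly unit conversion of time series. A minimal sketch of how such a conversion can work, using scale factors to a base unit per dimension, is shown below; the table entries and function name are hypothetical illustrations, not the HydroUnits API.

```python
# Scale factors from each unit to an SI base unit of its dimension
# (illustrative subset: lengths -> metres, discharge -> m^3/s).
UNIT_TO_BASE = {
    "m": 1.0, "cm": 0.01, "mm": 0.001, "ft": 0.3048,
    "m3/s": 1.0, "L/s": 0.001,
}

def convert_series(values, from_unit, to_unit):
    """Convert a time series between two units of the same dimension.

    Converting via the base unit means each new unit needs only one
    table entry, which is how user-defined units can be accommodated.
    """
    factor = UNIT_TO_BASE[from_unit] / UNIT_TO_BASE[to_unit]
    return [v * factor for v in values]
```

    A real implementation would also carry dimension tags so that, say, a length cannot be silently converted to a discharge; that is the dimensional-analysis role the abstract mentions.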

  11. An integrated healthcare information system for end-to-end standardized exchange and homogeneous management of digital ECG formats.

    PubMed

    Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José

    2012-07-01

    This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need to find an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE 11073, the Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, the Simple Object Access Protocol (SOAP), the Extensible Markup Language (XML), and the Business Process Execution Language (BPEL). Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.

  12. ACL reconstruction - discharge

    MedlinePlus

    Anterior cruciate ligament reconstruction - discharge; ACL reconstruction - discharge ... had surgery to reconstruct your anterior cruciate ligament (ACL). The surgeon drilled holes in the bones of ...

  13. Primary pulmonary artery sarcoma: a new surgical technique for pulmonary artery reconstruction using a self-made stapled bovine pericardial graft conduit.

    PubMed

    Obeso Carillo, Gerardo Andrés; Casais Pampín, Rocío; Legarra Calderón, Juan José; Pradas Montilla, Gonzalo

    2015-01-01

    Primary pulmonary artery sarcoma is an uncommon neoplasm with a grim prognosis. Complete resection is the only treatment that can improve the patient's survival. The role of multimodality treatment is still controversial, although adjuvant chemotherapy could possibly improve outcomes for these patients. Several pulmonary artery reconstructive techniques have been reported in the scientific literature, such as patch reconstruction, end-to-end anastomosis, synthetic prostheses and biological grafts. In this article, we propose a new surgical option for pulmonary artery reconstruction after radical tumour resection, using a self-made stapled bovine pericardial graft conduit, in a patient with a mass in the pulmonary trunk and right pulmonary artery. We believe that the use of this technique adds safety and effectiveness, and reduces the surgical time.

  14. Analysis of wavefront reconstruction in 8 meter ring solar telescope

    NASA Astrophysics Data System (ADS)

    Dai, Yichun; Jin, Zhenyu

    2016-07-01

    The Chinese Giant Solar Telescope (CGST) is the next-generation infrared and optical solar telescope of China, proposed and promoted by the solar astronomy community of China and listed in the National Plans of Major Science and Technology Infrastructures. CGST is currently proposed to be an 8 meter Ring Solar Telescope (RST) with a ring width of 1 meter; the hollow, symmetric structure of such an annular aperture facilitates thermal control and high-precision magnetic field measurement for a solar telescope. Adaptive optics (AO) is an indispensable tool for RST to obtain diffraction-limited observations. How to realize AO wavefront sensing and correction, and the achievable degree of compensation over a narrow annular aperture, are the primary problems for implementing AO on RST. Problems involved in wavefront reconstruction for RST are first investigated and discussed in this paper using end-to-end simulation based on Shack-Hartmann wavefront sensing (SHWFS). The simulation results show that the performance of zonal reconstruction with measurement noise of no more than 0.05 arcsec meets the requirement of RST for diffraction-limited imaging at a wavelength of 1 μm, which satisfies most science cases of RST in the near-infrared waveband.
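    Zonal wavefront reconstruction of the kind evaluated in this abstract solves a least-squares problem relating the slopes measured by a Shack-Hartmann sensor to phase values at grid points. The one-dimensional toy version below illustrates the principle only; it is not the RST simulation, whose geometry is annular and two-dimensional.

```python
import numpy as np

def zonal_reconstruct_1d(slopes, spacing):
    """Least-squares phase reconstruction from 1-D slope measurements.

    slopes[i] ~ (phi[i+1] - phi[i]) / spacing; the unobservable piston
    term is removed by constraining the mean phase to zero.
    """
    m = len(slopes)
    n = m + 1
    # Geometry matrix: each row is a finite-difference slope operator.
    G = np.zeros((m, n))
    for i in range(m):
        G[i, i] = -1.0 / spacing
        G[i, i + 1] = 1.0 / spacing
    # Append a mean-zero row to pin down piston.
    G = np.vstack([G, np.ones((1, n)) / n])
    rhs = np.append(np.asarray(slopes, float), 0.0)
    phi, *_ = np.linalg.lstsq(G, rhs, rcond=None)
    return phi
```

    On a real annular pupil the geometry matrix is built from the two-dimensional subaperture layout, but the reconstruction step is the same least-squares solve, and measurement noise on the slopes propagates into the recovered phase through it.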

  15. Resection-Reconstruction of Aberrant Right Hepatic Artery During Whipple Procedure (Pancreaticoduodenectomy).

    PubMed

    Sayyed, Raza; Rehman, Iffat; Niazi, Imran Khalid; Yusuf, Muhammed Aasim; Syed, Aamir Ali; V, Faisal

    2016-06-01

    Aberrant hepatic arterial anatomy poses a challenge for the surgeon during Whipple procedure. Intraoperative injury to the aberrant vasculature results in hemorrhagic or ischemic complications involving the liver and biliary tree. We report a case of replaced right hepatic artery arising from the superior mesenteric artery in a patient with periampullary carcinoma of the pancreas, undergoing pancreaticoduodenectomy. The aberrant artery was found to be coursing through the pancreatic parenchyma. This is a rare vascular anomaly. Resection of the arterial segment and end-to-end anastomosis was fashioned. Intrapancreatic course of the replaced right hepatic artery is a rare anomaly and is best managed by preoperative identification on radiology and meticulous intra-operative dissection and preservation. However, for an intrapancreatic course, resection and reconstruction may occasionally be required.

  16. Etalon Array Reconstructive Spectrometry

    NASA Astrophysics Data System (ADS)

    Huang, Eric; Ma, Qian; Liu, Zhaowei

    2017-01-01

    Compact spectrometers are crucial in areas where size and weight may need to be minimized. These types of spectrometers often contain no moving parts, which makes for an instrument that can be highly durable. With the recent proliferation in low-cost and high-resolution cameras, camera-based spectrometry methods have the potential to make portable spectrometers small, ubiquitous, and cheap. Here, we demonstrate a novel method for compact spectrometry that uses an array of etalons to perform spectral encoding, and uses a reconstruction algorithm to recover the incident spectrum. This spectrometer has the unique capability for both high resolution and a large working bandwidth without sacrificing sensitivity, and we anticipate that its simplicity makes it an excellent candidate whenever a compact, robust, and flexible spectrometry solution is needed.

  17. Etalon Array Reconstructive Spectrometry

    PubMed Central

    Huang, Eric; Ma, Qian; Liu, Zhaowei

    2017-01-01

    Compact spectrometers are crucial in areas where size and weight may need to be minimized. These types of spectrometers often contain no moving parts, which makes for an instrument that can be highly durable. With the recent proliferation in low-cost and high-resolution cameras, camera-based spectrometry methods have the potential to make portable spectrometers small, ubiquitous, and cheap. Here, we demonstrate a novel method for compact spectrometry that uses an array of etalons to perform spectral encoding, and uses a reconstruction algorithm to recover the incident spectrum. This spectrometer has the unique capability for both high resolution and a large working bandwidth without sacrificing sensitivity, and we anticipate that its simplicity makes it an excellent candidate whenever a compact, robust, and flexible spectrometry solution is needed. PMID:28074883
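
    The encode-then-invert idea in the two records above can be sketched as a linear inverse problem: each etalon's transmission curve forms one row of a sensing matrix, and the incident spectrum is recovered by regularized least squares. The Airy-type transmission model, working band, and etalon thicknesses below are illustrative assumptions, not the authors' calibration data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_wl, n_etalons = 60, 80
wl = np.linspace(400.0, 700.0, n_wl)             # assumed working band, nm

def airy(wl, thickness, finesse=10.0):
    # idealized etalon transmission (Airy function), thickness in nm
    delta = 4.0 * np.pi * thickness / wl
    return 1.0 / (1.0 + finesse * np.sin(delta / 2.0) ** 2)

# Sensing matrix: one transmission curve per etalon in the array
T = np.stack([airy(wl, t) for t in rng.uniform(500.0, 5000.0, n_etalons)])

spectrum = np.exp(-0.5 * ((wl - 550.0) / 20.0) ** 2)   # incident spectrum
meas = T @ spectrum                                     # camera readings

# Reconstruction: Tikhonov-regularized least squares
lam = 1e-3
s_hat = np.linalg.solve(T.T @ T + lam * np.eye(n_wl), T.T @ meas)
print(np.corrcoef(s_hat, spectrum)[0, 1])    # close to 1 for a good recovery
```

    Because the etalon thicknesses give each filter a distinct free spectral range, the rows of the sensing matrix are diverse enough for the inversion to be well posed without sacrificing throughput.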

  18. Enhanced compressive wideband frequency spectrum sensing for dynamic spectrum access

    NASA Astrophysics Data System (ADS)

    Liu, Yipeng; Wan, Qun

    2012-12-01

    Wideband spectrum sensing detects unused spectrum holes for dynamic spectrum access (DSA). The prohibitively high sampling rate required is the main challenge. Compressive sensing (CS) can reconstruct a sparse signal from far fewer randomized samples than Nyquist sampling requires, with high probability. Since surveys show that the monitored signal is sparse in the frequency domain, CS can relieve the sampling burden. Random samples can be obtained by an analog-to-information converter. Signal recovery can be formulated as the combination of an L0-norm minimization and a linear measurement-fitting constraint. In DSA, the static spectrum allocation of primary radios means the boundaries between different types of primary radios are known in advance. To incorporate this a priori information, we divide the whole spectrum into sections according to the spectrum allocation policy. In the new optimization model, the minimization of the L2 norm of each section encourages clustered distribution locally, while the L0 norm of the L2 norms is minimized to give a sparse distribution globally. Because the L2/L0 optimization is not convex, an iteratively reweighted L2/L1 optimization is proposed to approximate it. Simulations demonstrate that the proposed method outperforms others in accuracy, denoising ability, and other respects.
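
    The section-wise (block-sparse) recovery idea can be sketched with an iteratively reweighted least-squares surrogate: each section's penalty weight is set from the inverse of its current L2 norm, so empty sections are driven toward zero while occupied ones are fitted. The dimensions, section layout, and plain IRLS solver below are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, gsize = 64, 40, 8                 # spectrum bins, samples, section size
groups = np.arange(n) // gsize          # fixed sections from allocation policy

# Block-sparse "spectrum": only two sections (occupied primary bands)
x_true = np.zeros(n)
x_true[8:16] = rng.normal(size=8)
x_true[40:48] = rng.normal(size=8)

A = rng.normal(size=(m, n)) / np.sqrt(m)  # randomized (AIC-style) sampling
y = A @ x_true

# Iteratively reweighted LS surrogate for the L2/L0 objective:
# weight each section by 1/(||x_g|| + eps) so empty sections shrink to zero
x = np.zeros(n)
lam, eps = 1e-3, 1e-6
for _ in range(50):
    gnorm = np.array([np.linalg.norm(x[groups == g]) for g in range(n // gsize)])
    w = 1.0 / (gnorm[groups] + eps)
    x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ y)

support = (groups == 1) | (groups == 5)   # the two occupied sections
print(np.linalg.norm(x[~support]) / np.linalg.norm(x))
```

    The grouped weighting is what encodes the static allocation policy: energy is encouraged to cluster within sections while the set of active sections stays sparse.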

  19. Integration of real-time 3D capture, reconstruction, and light-field display

    NASA Astrophysics Data System (ADS)

    Zhang, Zhaoxing; Geng, Zheng; Li, Tuotuo; Pei, Renjing; Liu, Yongchun; Zhang, Xiao

    2015-03-01

    Effective integration of 3D acquisition, reconstruction (modeling) and display technologies into a seamless system provides an augmented experience of visualizing and analyzing real objects and scenes with realistic 3D sensation. Applications can be found in medical imaging, gaming, virtual or augmented reality and hybrid simulations. Although 3D acquisition, reconstruction, and display technologies have gained significant momentum in recent years, there seems to be a lack of attention on synergistically combining these components into an "end-to-end" 3D visualization system. We designed, built and tested an integrated 3D visualization system that is able to capture 3D light-field images in real time, perform 3D reconstruction to build a 3D model of the objects, and display the 3D model on a large autostereoscopic screen. In this article, we will present our system architecture and component designs, hardware/software implementations, and experimental results. We will elaborate on our recent progress on sparse-camera-array light-field 3D acquisition, real-time dense 3D reconstruction, and autostereoscopic multi-view 3D display. A prototype is finally presented with test results to illustrate the effectiveness of our proposed integrated 3D visualization system.

  20. Deep Learning Segmentation of Optical Microscopy Images Improves 3D Neuron Reconstruction.

    PubMed

    Li, Rongjian; Zeng, Tao; Peng, Hanchuan; Ji, Shuiwang

    2017-03-08

    Digital reconstruction, or tracing, of 3-dimensional (3D) neuron structure from microscopy images is a critical step toward reverse-engineering the wiring and anatomy of a brain. Despite a number of prior attempts, this task remains very challenging, especially when images are contaminated by noise or contain discontinued segments of neurite patterns. An approach to addressing such problems is to identify the locations of neuronal voxels using image segmentation methods prior to applying tracing or reconstruction techniques. This preprocessing step is expected to remove noise from the data, thereby leading to improved reconstruction results. In this work, we proposed to use 3D convolutional neural networks (CNNs) for segmenting the neuronal microscopy images. Specifically, we designed a novel CNN architecture that takes volumetric images as the inputs and their voxel-wise segmentation maps as the outputs. The developed architecture allows us to train and predict using large microscopy images in an end-to-end manner. We evaluated the performance of our model on a variety of challenging 3D microscopy images from different organisms. Results showed that the proposed methods improved the tracing performance significantly when combined with different reconstruction algorithms.
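
    The segment-before-trace rationale can be illustrated with a deliberately tiny stand-in for the network: a single fixed 3x3x3 averaging filter plus a threshold, applied to a synthetic noisy tube. The learned CNN in the paper is of course far richer; the volume size, noise level, and threshold here are assumptions for illustration only.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)

# Toy 3-D "neurite" volume: a bright 3x3 tube along z, plus heavy noise
vol_true = np.zeros((32, 32, 32))
vol_true[15:18, 15:18, :] = 1.0
vol = vol_true + 0.3 * rng.normal(size=vol_true.shape)

# Stand-in for the segmentation network: a 3x3x3 mean filter followed by
# thresholding (a learned CNN extracts far richer voxel-wise features;
# this only illustrates why segmenting before tracing helps)
smooth = np.zeros_like(vol)
for dx, dy, dz in product((-1, 0, 1), repeat=3):
    smooth += np.roll(vol, (dx, dy, dz), axis=(0, 1, 2))
smooth /= 27.0

raw_errors = np.sum((vol > 0.5) != (vol_true > 0.5))
seg_errors = np.sum((smooth > 0.5) != (vol_true > 0.5))
print(raw_errors, seg_errors)   # denoising sharply cuts misclassified voxels
```

    Any tracing algorithm run on the cleaned voxel map faces far fewer spurious foreground voxels, which is the effect the paper quantifies with real CNN segmentations.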

  1. Innovations in diabetic foot reconstruction using supermicrosurgery.

    PubMed

    Suh, Hyun Suk; Oh, Tae Suk; Hong, Joon Pio

    2016-01-01

    The treatment of diabetic foot ulceration is complex with multiple factors involved, and it may often lead to limb amputation. Hence, a multidisciplinary approach is warranted to cover the spectrum of treatment for diabetic foot, but in complex wounds, surgical treatment is inevitable. Surgery may involve the decision to preserve the limb by reconstruction or to amputate it. Reconstruction involves preserving the limb with secure coverage. Local flaps usually are able to provide sufficient coverage for small or moderate sized wound, but for larger wounds, soft tissue coverage involves flaps that are distantly located from the wound. Reconstruction of distant flap usually involves microsurgery, and now, further innovative methods such as supermicrosurgery have further given complex wounds a better chance to be reconstructed and limbs salvaged. This article reviews the microsurgery involved in reconstruction and introduces the new method of supermicrosurgery.

  2. Fission Spectrum

    DOE R&D Accomplishments Database

    Bloch, F.; Staub, H.

    1943-08-18

    Measurements of the spectrum of the fission neutrons of 25 are described, in which the energy of the neutrons is determined from the ionization produced by individual hydrogen recoils. The slow neutrons producing fission are obtained by slowing down the fast neutrons from the Be-D reaction of the Stanford cyclotron. In order to distinguish between fission neutrons and the remaining fast cyclotron neutrons, both the cyclotron current and the pulse amplifier are modulated. A hollow neutron container, in which slow neutrons have a lifetime of about 2 milliseconds, avoids the use of large distances. This method results in much higher intensities than the usual modulation arrangement. The results show a continuous distribution of neutrons with a rather wide maximum at about 0.8 MV, falling off to half of its maximum value at 2.0 MV. The total number of neutrons is determined by comparison with the number of fission fragments. The result seems to indicate that only about 30% of the neutrons have energies below 0.8 MV. Various tests are described which were performed in order to rule out modification of the spectrum by inelastic scattering. Decl. May 4, 1951

  3. Sky reconstruction for the Tianlai cylinder array

    NASA Astrophysics Data System (ADS)

    Zhang, Jiao; Zuo, Shi-Fan; Ansari, Reza; Chen, Xuelei; Li, Yi-Chao; Wu, Feng-Quan; Campagne, Jean-Eric; Magneville, Christophe

    2016-10-01

    We apply our sky map reconstruction method for transit-type interferometers to the Tianlai cylinder array. The method is based on spherical harmonic decomposition and can be applied to cylinder arrays as well as dish arrays; it allows us to compute the instrument response, synthesized beam, transfer function and noise power spectrum. We consider cylinder arrays with feed spacing larger than half a wavelength and, as expected, we find that arrays with regular spacing have grating lobes which produce spurious images in the reconstructed maps. We show that this problem can be overcome using arrays with a different feed spacing on each cylinder. We present the reconstructed maps, and study the performance in terms of noise power spectrum, transfer function and beams for both regular and irregular feed spacing configurations.
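
    The grating-lobe argument can be reproduced with a 1-D array factor: feeds at a regular spacing larger than λ/2 produce full-height grating lobes, while dithering the positions smears them out. The feed count, spacing, and dither amplitude below are illustrative assumptions, not the Tianlai layout.

```python
import numpy as np

rng = np.random.default_rng(3)
lam = 0.21                                   # 21 cm band wavelength, m
n = 32

theta = np.linspace(-np.pi / 2, np.pi / 2, 4001)
u = np.sin(theta)

def array_factor(pos):
    # normalized far-field response of an unweighted 1-D feed line
    phase = 2j * np.pi * np.outer(u, pos) / lam
    return np.abs(np.exp(phase).sum(axis=1)) / len(pos)

regular = np.arange(n) * 0.42                # 2-wavelength spacing > lam/2
irregular = regular + rng.uniform(-0.15, 0.15, n)

side = np.abs(u) > 0.1                       # look away from the main lobe
af_reg = array_factor(regular)
af_irr = array_factor(irregular)
print(af_reg[side].max(), af_irr[side].max())  # grating lobes vs suppressed
```

    The regularly spaced line repeats its main lobe at full height wherever the inter-feed phase wraps by 2π, which is exactly the spurious-image mechanism in the maps; randomizing the spacing decoheres those repetitions.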

  4. The stability of spectrum reproduction by LEDs

    NASA Astrophysics Data System (ADS)

    Yang, Hua; Li, Jing; Yao, Ran; Lu, Pengzhi; Pei, Yanrong

    2015-09-01

    The spectral power distribution, together with the color consistency and constancy, of natural light is studied and simulated before white-light LED systems are fabricated to reproduce natural light. Models with 3, 4, 6 and more primary LEDs, based on both real measured spectra and theoretical spectra, are analyzed. The sensitivity of the color characteristics to the spectral power of LEDs at different wavelengths is also analyzed. This research simplifies the approach to visible-spectrum reconstruction, which is an efficient way to design and realize LED-based luminaires.
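
    The effect of adding primaries can be sketched as a nested least-squares fit: model each LED as a Gaussian emission band and solve for the drive levels that best reproduce a target spectrum. The peak wavelengths, bandwidth, and target curve below are assumptions for illustration; real designs would also constrain the drive levels to be non-negative.

```python
import numpy as np

wl = np.linspace(400.0, 700.0, 301)                        # visible band, nm
target = 0.6 + 0.4 * np.sin((wl - 400.0) / 300.0 * np.pi)  # assumed target

sigma = 30.0 / 2.355                     # 30 nm FWHM Gaussian LED bands

def reproduction_rms(peaks):
    # least-squares drive levels for the given set of primary LEDs
    basis = np.stack([np.exp(-0.5 * ((wl - p) / sigma) ** 2) for p in peaks]).T
    w, *_ = np.linalg.lstsq(basis, target, rcond=None)
    return np.sqrt(np.mean((basis @ w - target) ** 2))

rms3 = reproduction_rms([450.0, 550.0, 650.0])
rms5 = reproduction_rms([450.0, 500.0, 550.0, 600.0, 650.0])
print(rms3, rms5)    # a superset of primaries can only improve the LS fit
```

    Since the 3-primary set here is a subset of the 5-primary set, the residual can only decrease as primaries are added, which mirrors the 3/4/6-primary comparison in the abstract.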

  5. Performance assessment of different pulse reconstruction algorithms for the ATHENA X-ray Integral Field Unit

    NASA Astrophysics Data System (ADS)

    Peille, Philippe; Ceballos, Maria Teresa; Cobo, Beatriz; Wilms, Joern; Bandler, Simon; Smith, Stephen J.; Dauser, Thomas; Brand, Thorsten; den Hartog, Roland; de Plaa, Jelle; Barret, Didier; den Herder, Jan-Willem; Piro, Luigi; Barcons, Xavier; Pointecouteau, Etienne

    2016-07-01

    The X-ray Integral Field Unit (X-IFU) microcalorimeter, on board Athena, with its focal plane comprising 3840 Transition Edge Sensors (TESs) operating at 90 mK, will provide unprecedented spectral-imaging capability in the 0.2-12 keV energy range. It will rely on the on-board digital processing of current pulses induced by the heat deposited in the TES absorber, so as to recover the energy of each individual event. Assessing the capabilities of the pulse reconstruction is required to understand the overall scientific performance of the X-IFU, notably in terms of energy resolution degradation with both increasing energies and count rates. Using synthetic data streams generated by the X-IFU End-to-End simulator, we present here a comprehensive benchmark of various pulse reconstruction techniques, ranging from standard optimal filtering to more advanced algorithms based on noise covariance matrices. Besides deriving the spectral resolution achieved by the different algorithms, a first assessment of the computing power and ground calibration needs is presented. Overall, all methods show similar performances, with the reconstruction based on noise covariance matrices showing the best improvement with respect to the standard optimal filtering technique. Due to prohibitive calibration needs, this method might however not be applicable to the X-IFU, and the best compromise currently appears to be the so-called resistance space analysis, which also features very promising high count rate capabilities.
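
    The baseline technique in this benchmark, optimal filtering, reduces in the white-noise limit to projecting each record onto a normalized pulse template. The pulse shape, noise level, and amplitude below are illustrative assumptions, not X-IFU calibration values.

```python
import numpy as np

rng = np.random.default_rng(4)

n = 1024
t = np.arange(n)
template = np.exp(-t / 200.0) - np.exp(-t / 20.0)   # assumed TES pulse shape

true_energy = 6.0                                   # in template-amplitude units
record = true_energy * template + 0.1 * rng.normal(size=n)

# Optimal filter in the white-noise limit: normalized template projection
filt = template / np.dot(template, template)
energy_hat = np.dot(filt, record)
print(abs(energy_hat - true_energy))    # small compared to the noise level
```

    The covariance-based methods in the benchmark generalize this projection by whitening the record with the measured noise covariance first, which is where the heavy calibration burden comes from.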

  6. Towards an optimal reconstruction of baryon oscillations

    SciTech Connect

    Tassev, Svetlin; Zaldarriaga, Matias E-mail: matiasz@ias.edu

    2012-10-01

    The Baryon Acoustic Oscillations (BAO) in the large-scale structure of the universe leave a distinct peak in the two-point correlation function of the matter distribution. That acoustic peak is smeared and shifted by bulk flows and non-linear evolution. However, it has been shown that it is still possible to sharpen the peak and remove its shift by undoing the effects of the bulk flows. We propose an improvement to the standard acoustic peak reconstruction. Contrary to the standard approach, the new scheme has no free parameters, treats the large-scale modes consistently, and uses optimal filters to extract the BAO information. At redshift zero, the reconstructed linear matter power spectrum leads to a markedly improved sharpening of the reconstructed acoustic peak compared to standard reconstruction.

  7. Neuromagnetic source reconstruction

    SciTech Connect

    Lewis, P.S.; Mosher, J.C.; Leahy, R.M.

    1994-12-31

    In neuromagnetic source reconstruction, a functional map of neural activity is constructed from noninvasive magnetoencephalographic (MEG) measurements. The overall reconstruction problem is under-determined, so some form of source modeling must be applied. We review the two main classes of reconstruction techniques: parametric current dipole models and nonparametric distributed source reconstructions. Current dipole reconstructions use a physically plausible source model, but are limited to cases in which the neural currents are expected to be highly sparse and localized. Distributed source reconstructions can be applied to a wider variety of cases, but must incorporate an implicit source model in order to arrive at a single reconstruction. We examine distributed source reconstruction in a Bayesian framework to highlight the implicit nonphysical Gaussian assumptions of minimum-norm-based reconstruction algorithms. We conclude with a brief discussion of alternative non-Gaussian approaches.
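
    The minimum-norm estimate and its Gaussian-prior smearing can be seen in a few lines: with measurements y = L s + n and an L2 prior on the sources, the estimate is a Tikhonov-regularized pseudo-inverse, which spreads a focal source over many locations. The random stand-in lead field and the dimensions below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

n_sensors, n_sources = 60, 200
L = rng.normal(size=(n_sensors, n_sources))     # stand-in lead-field matrix

s_true = np.zeros(n_sources)
s_true[50] = 1.0                                # one focal current source
y = L @ s_true                                  # noiseless MEG measurements

# Minimum-norm (Gaussian-prior MAP) estimate
lam = 1e-6
s_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), y)

# The peak lands at the true source, but energy is smeared widely:
print(np.argmax(np.abs(s_hat)), np.sum(np.abs(s_hat) > 0.01))
```

    This smearing of a single dipole over many source locations is exactly the nonphysical consequence of the implicit Gaussian prior that the review highlights.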

  8. Zygomatico-maxillary Reconstruction with Computer-aided Manufacturing of a Free DCIA Osseous Flap and Intraoral Anastomoses.

    PubMed

    Roy, Andrée-Anne; Efanov, Johnny I; Mercier-Couture, Geneviève; Chollet, André; Borsuk, Daniel E

    2017-02-01

    Craniomaxillofacial reconstruction using virtual surgical planning, computer-aided manufacturing, and new microsurgical techniques optimizes patient-specific and defect-directed reconstruction. A 3D customized free deep circumflex iliac artery (DCIA) flap with intraoral anastomoses was performed on a 23-year-old man with a posttraumatic right zygomatico-maxillary defect with failure of alloplastic implant reconstruction. An osseous iliac crest flap was sculpted based on a customized 3D model of the mirror image of the patient's unaffected side to allow for perfect fit to the zygomatico-maxillary defect. An intraoral dissection of the facial artery and vein was performed within the right cheek mucosa and allowed for end-to-end microvascular anastomoses. 3D preoperative planning and customized free DCIA osseous flap combined with an intraoral microsurgical technique provided restoration of facial esthetics and function without visible scars. In cases where zygomatico-malar reconstruction by alloplastic material fails, a customized free DCIA osseous flap can be designed by virtual surgical planning to restore facial appearance and function.

  9. Zygomatico-maxillary Reconstruction with Computer-aided Manufacturing of a Free DCIA Osseous Flap and Intraoral Anastomoses

    PubMed Central

    Roy, Andrée-Anne; Efanov, Johnny I.; Mercier-Couture, Geneviève; Chollet, André

    2017-01-01

    Summary: Craniomaxillofacial reconstruction using virtual surgical planning, computer-aided manufacturing, and new microsurgical techniques optimizes patient-specific and defect-directed reconstruction. A 3D customized free deep circumflex iliac artery (DCIA) flap with intraoral anastomoses was performed on a 23-year-old man with a posttraumatic right zygomatico-maxillary defect with failure of alloplastic implant reconstruction. An osseous iliac crest flap was sculpted based on a customized 3D model of the mirror image of the patient’s unaffected side to allow for perfect fit to the zygomatico-maxillary defect. An intraoral dissection of the facial artery and vein was performed within the right cheek mucosa and allowed for end-to-end microvascular anastomoses. 3D preoperative planning and customized free DCIA osseous flap combined with an intraoral microsurgical technique provided restoration of facial esthetics and function without visible scars. In cases where zygomatico-malar reconstruction by alloplastic material fails, a customized free DCIA osseous flap can be designed by virtual surgical planning to restore facial appearance and function. PMID:28280668

  10. Breast Reconstruction after Mastectomy

    PubMed Central

    Schmauss, Daniel; Machens, Hans-Günther; Harder, Yves

    2016-01-01

    Breast cancer is the leading cause of cancer death in women worldwide. Its surgical approach has become less and less mutilating in the last decades. However, the overall number of breast reconstructions has significantly increased lately. Nowadays, breast reconstruction should be individualized at its best, first of all taking into consideration not only the oncological aspects of the tumor, neo-/adjuvant treatment, and genetic predisposition, but also its timing (immediate versus delayed breast reconstruction), as well as the patient’s condition and wish. This article gives an overview over the various possibilities of breast reconstruction, including implant- and expander-based reconstruction, flap-based reconstruction (vascularized autologous tissue), the combination of implant and flap, reconstruction using non-vascularized autologous fat, as well as refinement surgery after breast reconstruction. PMID:26835456

  11. Breast reconstruction - natural tissue

    MedlinePlus

    ... After a mastectomy, some women choose to have cosmetic surgery to remake their breast. This type of surgery ...

  12. Breast Reconstruction with Implants

    MedlinePlus

    ... removes your breast to treat or prevent breast cancer. One type of breast reconstruction uses breast implants — silicone devices filled with silicone gel or salt water (saline) — to reshape your breasts. Breast reconstruction ...

  13. Methods of Voice Reconstruction

    PubMed Central

    Chen, Hung-Chi; Kim Evans, Karen F.; Salgado, Christopher J.; Mardini, Samir

    2010-01-01

    This article reviews methods of voice reconstruction. Nonsurgical methods of voice reconstruction include electrolarynx, pneumatic artificial larynx, and esophageal speech. Surgical methods of voice reconstruction include neoglottis, tracheoesophageal puncture, and prosthesis. Tracheoesophageal puncture can be performed in patients with pedicled flaps such as colon interposition, jejunum, or gastric pull-up or in free flaps such as perforator flaps, jejunum, and colon flaps. Other flaps for voice reconstruction include the ileocolon flap and jejunum. Laryngeal transplantation is also reviewed. PMID:22550443

  14. Reoperative midface reconstruction.

    PubMed

    Acero, Julio; García, Eloy

    2011-02-01

    Reoperative reconstruction of the midface is a challenging issue because of the complexity of this region and the severity of the aesthetic and functional sequelae related to the absence or failure of a primary reconstruction. The different situations that can lead to the indication of a reoperative reconstructive procedure after previous oncologic ablative procedures in the midface are reviewed. Surgical techniques, anatomic problems, and limitations affecting reoperative reconstruction in this region of the head and neck are discussed.

  15. Performances of JEM-EUSO: angular reconstruction. The JEM-EUSO Collaboration

    NASA Astrophysics Data System (ADS)

    Adams, J. H.; Ahmad, S.; Albert, J.-N.; Allard, D.; Anchordoqui, L.; Andreev, V.; Anzalone, A.; Arai, Y.; Asano, K.; Ave Pernas, M.; Baragatti, P.; Barrillon, P.; Batsch, T.; Bayer, J.; Bechini, R.; Belenguer, T.; Bellotti, R.; Belov, K.; Berlind, A. A.; Bertaina, M.; Biermann, P. L.; Biktemerova, S.; Blaksley, C.; Blanc, N.; Błȩcki, J.; Blin-Bondil, S.; Blümer, J.; Bobik, P.; Bogomilov, M.; Bonamente, M.; Briggs, M. S.; Briz, S.; Bruno, A.; Cafagna, F.; Campana, D.; Capdevielle, J.-N.; Caruso, R.; Casolino, M.; Cassardo, C.; Castellinic, G.; Catalano, C.; Catalano, G.; Cellino, A.; Chikawa, M.; Christl, M. J.; Cline, D.; Connaughton, V.; Conti, L.; Cordero, G.; Crawford, H. J.; Cremonini, R.; Csorna, S.; Dagoret-Campagne, S.; de Castro, A. J.; De Donato, C.; de la Taille, C.; De Santis, C.; del Peral, L.; Dell'Oro, A.; De Simone, N.; Di Martino, M.; Distratis, G.; Dulucq, F.; Dupieux, M.; Ebersoldt, A.; Ebisuzaki, T.; Engel, R.; Falk, S.; Fang, K.; Fenu, F.; Fernández-Gómez, I.; Ferrarese, S.; Finco, D.; Flamini, M.; Fornaro, C.; Franceschi, A.; Fujimoto, J.; Fukushima, M.; Galeotti, P.; Garipov, G.; Geary, J.; Gelmini, G.; Giraudo, G.; Gonchar, M.; González Alvarado, C.; Gorodetzky, P.; Guarino, F.; Guzmán, A.; Hachisu, Y.; Harlov, B.; Haungs, A.; Hernández Carretero, J.; Higashide, K.; Ikeda, D.; Ikeda, H.; Inoue, N.; Inoue, S.; Insolia, A.; Isgrò, F.; Itow, Y.; Joven, E.; Judd, E. G.; Jung, A.; Kajino, F.; Kajino, T.; Kaneko, I.; Karadzhov, Y.; Karczmarczyk, J.; Karus, M.; Katahira, K.; Kawai, K.; Kawasaki, Y.; Keilhauer, B.; Khrenov, B. A.; Kim, J.-S.; Kim, S.-W.; Kim, S.-W.; Kleifges, M.; Klimov, P. A.; Kolev, D.; Kreykenbohm, I.; Kudela, K.; Kurihara, Y.; Kusenko, A.; Kuznetsov, E.; Lacombe, M.; Lachaud, C.; Lee, J.; Licandro, J.; Lim, H.; López, F.; Maccarone, M. 
C.; Mannheim, K.; Maravilla, D.; Marcelli, L.; Marini, A.; Martinez, O.; Masciantonio, G.; Mase, K.; Matev, R.; Medina-Tanco, G.; Mernik, T.; Miyamoto, H.; Miyazaki, Y.; Mizumoto, Y.; Modestino, G.; Monaco, A.; Monnier-Ragaigne, D.; Morales de los Ríos, J. A.; Moretto, C.; Morozenko, V. S.; Mot, B.; Murakami, T.; Murakami, M. Nagano; Nagata, M.; Nagataki, S.; Nakamura, T.; Napolitano, T.; Naumov, D.; Nava, R.; Neronov, A.; Nomoto, K.; Nonaka, T.; Ogawa, T.; Ogio, S.; Ohmori, H.; Olinto, A. V.; Orleański, P.; Osteria, G.; Panasyuk, M. I.; Parizot, E.; Park, I. H.; Park, H. W.; Pastircak, B.; Patzak, T.; Paul, T.; Pennypacker, C.; Perez Cano, S.; Peter, T.; Picozza, P.; Pierog, T.; Piotrowski, L. W.; Piraino, S.; Plebaniak, Z.; Pollini, A.; Prat, P.; Prévôt, G.; Prieto, H.; Putis, M.; Reardon, P.; Reyes, M.; Ricci, M.; Rodríguez, I.; Rodríguez Frías, M. D.; Ronga, F.; Roth, M.; Rothkaehl, H.; Roudil, G.; Rusinov, I.; Rybczyński, M.; Sabau, M. D.; Sáez-Cano, G.; Sagawa, H.; Saito, A.; Sakaki, N.; Sakata, M.; Salazar, H.; Sánchez, S.; Santangelo, A.; Santiago Crúz, L.; Sanz Palomino, M.; Saprykin, O.; Sarazin, F.; Sato, H.; Sato, M.; Schanz, T.; Schieler, H.; Scotti, V.; Segreto, A.; Selmane, S.; Semikoz, D.; Serra, M.; Sharakin, S.; Shibata, T.; Shimizu, H. M.; Shinozaki, K.; Shirahama, T.; Siemieniec-Oziȩbło, G.; Silva López, H. H.; Sledd, J.; Słomińska, K.; Sobey, A.; Sugiyama, T.; Supanitsky, D.; Suzuki, M.; Szabelska, B.; Szabelski, J.; Tajima, F.; Tajima, N.; Tajima, T.; Takahashi, Y.; Takami, H.; Takeda, M.; Takizawa, Y.; Tenzer, C.; Tibolla, O.; Tkachev, L.; Tokuno, H.; Tomida, T.; Tone, N.; Toscano, S.; Trillaud, F.; Tsenov, R.; Tsunesada, Y.; Tsuno, K.; Tymieniecka, T.; Uchihori, Y.; Unger, M.; Vaduvescu, O.; Valdés-Galicia, J. F.; Vallania, P.; Valore, L.; Vankova, G.; Vigorito, C.; Villaseñor, L.; von Ballmoos, P.; Wada, S.; Watanabe, J.; Watanabe, S.; Watts, J.; Weber, M.; Weiler, T. 
J.; Wibig, T.; Wiencke, L.; Wille, M.; Wilms, J.; Włodarczyk, Z.; Yamamoto, T.; Yamamoto, Y.; Yang, J.; Yano, H.; Yashin, I. V.; Yonetoku, D.; Yoshida, K.; Yoshida, S.; Young, R.; Zotov, M. Yu.; Zuccaro Marchi, A.

    2015-11-01

    Mounted on the International Space Station (ISS), the Extreme Universe Space Observatory on board the Japanese Experiment Module (JEM-EUSO) relies on the well-established fluorescence technique to observe Extensive Air Showers (EAS) developing in the Earth's atmosphere. Focusing on the detection of Ultra High Energy Cosmic Rays (UHECR) in the decade of 10^20 eV, JEM-EUSO will face new challenges by applying this technique from space. The EUSO Simulation and Analysis Framework (ESAF) has been developed in this context to provide a full end-to-end simulation framework and to assess the overall performance of the detector. Within ESAF, angular reconstruction can be separated into two conceptually different steps. The first step is pattern recognition, or filtering, of the signal to separate it from the background. The second step is to perform different types of fitting in order to search for the relevant geometrical parameters that best describe the previously selected signal. In this paper, we discuss some of the techniques we have implemented in ESAF to perform the geometrical reconstruction of EAS seen by JEM-EUSO. We also conduct thorough tests to assess the performance of these techniques in conditions which are relevant to the scope of the JEM-EUSO mission. We conclude by showing the expected angular resolution in the energy range that JEM-EUSO is expected to observe.

  16. Reconstruction of muon tracks in a buried plastic scintillator muon telescope (BATATA)

    NASA Astrophysics Data System (ADS)

    Riggi, S.; Insolia, A.; Medina-Tanco, G.; Trovato, E.

    2012-10-01

    The BATATA muon counter was designed as one of the foreseen detector upgrades of the Pierre Auger Observatory, with the main goal of quantifying the electromagnetic contamination of the muon signal as a function of depth for cosmic ray shower energies above 10 PeV. Nevertheless, BATATA also offers the possibility of measuring the incoming direction of secondary muons from both GeV and PeV primary cosmic rays. Considerable effort has already been devoted to quantifying, from simulations, the amount of electromagnetic contamination and the expected muon identification performance. The present work is focused on the evaluation of the detector's performance for muon track reconstruction. To this aim, and in view of the detector installation in the field, expected to be completed by the first half of the current year, we performed a GEANT4 end-to-end simulation of the device and set up a track reconstruction procedure. Typical results concerning the achieved acceptance and angular resolution for muons are presented.
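
    At its core, track reconstruction in a stack of scintillator planes reduces to fitting a straight line to the hit positions recorded at each depth. The plane spacing, hit resolution, and zenith angle below are assumptions for illustration, not BATATA parameters.

```python
import numpy as np

rng = np.random.default_rng(6)

z_planes = np.array([0.0, 0.5, 1.0, 1.5, 2.0])      # assumed plane depths, m
true_theta = np.deg2rad(12.0)                       # incoming zenith angle
hits = 0.3 + np.tan(true_theta) * z_planes          # ideal hit positions
hits = hits + 0.01 * rng.normal(size=z_planes.size) # 1 cm strip resolution

# Least-squares straight line x(z) = x0 + slope * z through the hits
slope, x0 = np.polyfit(z_planes, hits, 1)
theta_hat = np.arctan(slope)
print(np.rad2deg(abs(theta_hat - true_theta)))      # angular error, degrees
```

    The achievable angular resolution scales with the strip resolution divided by the lever arm between the outermost planes, which is what the end-to-end GEANT4 study quantifies with full detector effects included.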

  17. Therapeutic Anticoagulation Does Not Modify the Vein Thrombosis Rate after Venous Reconstruction Following Pancreaticoduodenectomy

    PubMed Central

    Ouaïssi, Mehdi; Sielezneff, Igor; Pirro, Nicolas; Bon Mardion, Rémi; Chaix, Jean Batiste; Merad, Abdelrhame; Berdah, Stéphane; Moutardier, Vincent; Cresti, Silvia; Emungania, Olivier; Anderson, Loundou; Christian, Brunet; Bernard, Sastre

    2008-01-01

    Recommendations for anticoagulation following major venous reconstruction for pancreatic adenocarcinoma (PA) are not clearly established. The aim of our study was to determine the relation between postoperative anticoagulant treatment and the thrombosis rate after portal venous resection. Materials and methods. Between 1986 and 2006, twenty-seven portal vein resections were performed in association with pancreaticoduodenectomies (PD) (n = 27). We defined four types of venous resection: type I was performed 1 cm above the confluent of the superior mesenteric vein (SMV) (n = 12); type II was a lateral resection and venorrhaphy at the level of the SMV confluent (n = 12); type III (n = 1) consisted of a primary end-to-end anastomosis above the confluent; and a PTFE graft was used for reconstruction in type IV (n = 2). Curative anticoagulant treatment was always indicated after type IV resection (n = 2), and after type II resection when the length of venous resection was 2 cm or more. Results. The venous thrombosis rate reached 0%, 41%, and 100% for type I, II, and IV resections, respectively. Among them, four patients received curative anticoagulant treatment. Conclusion. After a portal vein resection performed in the course of a PD, curative postoperative anticoagulation does not efficiently prevent the onset of thrombosis. PMID:19043605

  18. [Breast reconstruction after mastectomy].

    PubMed

    Ho Quoc, C; Delay, E

    2013-02-01

    The mutilating surgery for breast cancer causes deep somatic and psychological sequelae. Breast reconstruction can mitigate these effects and help patients rebuild their lives. The purpose of this paper is to focus on breast reconstruction techniques and on the factors involved in breast reconstruction. The methods of breast reconstruction are presented: objectives, indications, different techniques, operative risks, and long-term monitoring. Many different techniques can now allow breast reconstruction in most patients. Clinical cases are also presented in order to illustrate the results that can be expected from a breast reconstruction. Breast reconstruction provides many benefits for patients in terms of rehabilitation, well-being, and quality of life. In our view, breast reconstruction should be considered more as an opportunity and a positive choice (the patient can decide to do it) than as an obligation (that the patient would suffer). The consultation with the surgeon who will perform the reconstruction is an important step for giving all the necessary information. It is important that the patient can speak with the surgeon again before undergoing reconstruction if she has any doubt. The quality of the information given by medical doctors is essential to the success of the psychological intervention. This article was written in a simple and understandable way to help gynecologists give the best information to their patients. It may also be possible to give them a copy of this article, which would provide them with written support and facilitate the future consultation with the surgeon who will perform the reconstruction.

  19. Reconstruction of missing data using iterative harmonic expansion

    NASA Astrophysics Data System (ADS)

    Nishizawa, Atsushi J.; Inoue, Kaiki Taro

    2016-10-01

    In the cosmic microwave background or galaxy density maps, missing fluctuations in masked regions can be reconstructed from fluctuations in the surrounding unmasked regions if the original fluctuations are sufficiently smooth. One reconstruction method involves applying a harmonic expansion iteratively to the fluctuations in the unmasked region. In this paper, we discuss how well this reconstruction method can recover the original fluctuations, depending on the assumed prior on the fluctuations and the properties of the masked region. The reconstruction method is formulated with an asymptotic expansion in terms of the size of the mask for a fixed iteration number. The reconstruction accuracy depends on the mask size, the spectrum of the underlying density fluctuations, the scales of the fluctuations to be reconstructed and the number of iterations. For Gaussian fluctuations with the Harrison-Zel'dovich spectrum, the reconstruction method provides more accurate restoration than naive methods based on brute-force matrix inversion or singular value decomposition. We also demonstrate that an isotropic non-Gaussian prior does not change the results, but an anisotropic non-Gaussian prior can yield a higher reconstruction accuracy than in the Gaussian prior case.
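
    The iterative harmonic expansion can be demonstrated in a 1-D periodic analog (the Papoulis-Gerchberg scheme): alternately project onto the band-limited prior in Fourier space and restore the known data outside the mask. The field, mask size, and band limit below are illustrative assumptions.

```python
import numpy as np

n = 256
x = np.arange(n)
# A smooth, band-limited "fluctuation" field (harmonics 3 and 5 only)
f_true = np.cos(2 * np.pi * 3 * x / n) + 0.5 * np.sin(2 * np.pi * 5 * x / n)

mask = np.zeros(n, dtype=bool)
mask[100:120] = True                     # the masked (missing) region

kmax = 6                                 # prior: field band-limited to k <= 6
g = np.where(mask, 0.0, f_true)          # start with zeros in the gap
for _ in range(500):
    G = np.fft.fft(g)
    G[kmax + 1:n - kmax] = 0.0           # enforce the harmonic (smooth) prior
    g = np.fft.ifft(G).real
    g[~mask] = f_true[~mask]             # re-impose the known, unmasked data

err0 = np.linalg.norm(f_true[mask])              # gap error before iterating
err = np.linalg.norm((g - f_true)[mask])         # gap error after iterating
print(err0, err)
```

    Convergence slows as the mask widens or the band limit grows, which is the trade-off the paper formalizes with its asymptotic expansion in the mask size.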

  20. Inflationary dynamics reconstruction via inverse-scattering theory

    NASA Astrophysics Data System (ADS)

    Mastache, Jorge; Zago, Fernando; Kosowsky, Arthur

    2017-03-01

    The evolution of inflationary fluctuations can be recast as an inverse scattering problem. In this context, we employ the Gel'fand-Levitan method from inverse-scattering theory to reconstruct the evolution of both the inflaton field freeze-out horizon and the Hubble parameter during inflation. We demonstrate this reconstruction procedure numerically for a scenario of slow-roll inflation, as well as for a scenario which temporarily departs from slow-roll. The field freeze-out horizon is reconstructed from the accessible primordial scalar power spectrum alone, while the reconstruction of the Hubble parameter requires additional information from the tensor power spectrum. We briefly discuss the application of this technique to more realistic cases incorporating estimates of the primordial power spectra over limited ranges of scales and with specified uncertainties.
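    As background, the Gel'fand-Levitan method solves a linear integral equation for a kernel K built from the spectral data; in its classical Sturm-Liouville form (a schematic reminder only, not the paper's exact inflationary mapping) it reads:

```latex
% Gel'fand-Levitan integral equation (classical Sturm-Liouville form):
% the kernel K is determined by the spectral data through F, after which
% the potential is recovered locally from the diagonal of K.
K(x,y) + F(x,y) + \int_0^x K(x,s)\, F(s,y)\, \mathrm{d}s = 0,
    \qquad 0 \le y \le x,
\qquad\text{with}\qquad
q(x) = 2\,\frac{\mathrm{d}}{\mathrm{d}x} K(x,x).
```

    In the inflationary application, the analogue of the spectral data entering F is assembled from the primordial power spectra, as the abstract describes.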

  1. Modern reconstructive techniques for abdominal wall defects after oncologic resection.

    PubMed

    Khansa, Ibrahim; Janis, Jeffrey E

    2015-04-01

    Resection of abdominal wall tumors often leaves patients with debilitating soft tissue defects. Modern reconstructive techniques can be used to restore abdominal wall integrity. In this article, we present an overview of preoperative patient evaluation, analysis of the defect, surgical planning, and the spectrum of available surgical techniques, ranging from simple to complex. The established clinical evidence in the field of abdominal wall reconstruction is summarized and a case example is provided.

  2. Anatomical study of the cavernous sinus emphasizing operative approaches and related vascular and neural reconstruction.

    PubMed

    Sekhar, L N; Burgess, J; Akin, O

    1987-12-01

    The efficacy of three operative approaches to the cavernous sinus (CS) and the possibilities of vascular and cranial nerve reconstruction in and around the CS were studied in 50 cadaver specimens (25 heads). The lateral operative approach was through the lateral wall, between Cranial Nerves V1 and IV, or between Cranial Nerves V1 and V2. The superior approach was through the superior wall of the CS after removing the anterior clinoid process and unroofing the optic canal. The inferior approach followed the petrous internal carotid artery (ICA) into the CS after an extradural subtemporal exposure or after a combined subtemporal and infratemporal fossa exposure. The different exposures of the spaces of the CS and of the intracavernous structures provided by the superior and the lateral approaches were complementary. The exposure provided by the inferior approach was minimal; however, the junction of the petrous and cavernous ICA was best exposed by this route. The combined subtemporal and infratemporal fossa approach exposed the petrous ICA (for proximal control or for reconstruction) with the greatest ease and with the least temporal lobe retraction. The combination of the superior and lateral approaches and the complete mobilization of the intracavernous ICA facilitated its repair after experimental lacerations. Lacerations of either the inferior or the inferomedial aspects of any portion of the cavernous ICA or of the anterior surface of the posterior vertical segment of the artery were the most difficult to repair. End-to-end anastomosis was more difficult with the posterior third of the artery than with the anterior two-thirds. A vein graft with an average length of 3.5 cm could be sutured from the petrous to the supraclinoid ICA to bypass the cavernous ICA, with an average occlusion time of 45 minutes.
End-to-end technique was judged better for the proximal anastomosis, but end (graft)-to-side anastomosis was easier to perform at the distal end because of the

  3. Surgical reconstruction of TMJ.

    PubMed

    Ramil Novo, V M; García, A G; Berini Aytès, L; Escoda, C G

    1999-01-01

    Certain situations and pathological processes that arise with temporomandibular joint destruction can only be resolved with surgical reconstructive procedures in order to attempt a functional and anatomical rehabilitation of this joint. Many of these situations can be surgically treated with the patient's own autologous tissues. However, in some patients reconstruction is complex and the use of autologous tissues is unadvisable whereas reconstruction utilizing alloplastic materials may be an appropriate alternative. The following report describes 4 clinical cases in which autologous grafts or Christensen joint prosthesis are employed in temporomandibular joint reconstruction.

  4. Model-based x-ray energy spectrum estimation algorithm from CT scanning data with spectrum filter

    NASA Astrophysics Data System (ADS)

    Li, Lei; Wang, Lin-Yuan; Yan, Bin

    2016-10-01

    With the development of technology, traditional X-ray CT cannot meet modern medical and industrial needs for component discrimination and identification, owing to inconsistencies between the X-ray imaging system and the reconstruction algorithm. In current CT systems, the X-ray spectrum produced by the source is continuous over an energy range determined by the tube voltage and energy filter, and the attenuation coefficient of the object varies with X-ray energy. The distribution of the X-ray energy spectrum therefore plays an important role in beam-hardening correction, dual-energy CT image reconstruction, and dose calculation. However, because the system equations of transmission measurement data are highly ill-conditioned and ill-posed, and the data are corrupted by statistical fluctuations of the X-ray quanta and by noise, it is very hard to obtain a stable and accurate spectrum estimate using existing methods. In this paper, a model-based X-ray energy spectrum estimation method from CT scanning data with an energy spectrum filter is proposed. First, transmission measurement data were accurately acquired by CT scans of phantoms with different energy spectrum filters. Second, a physically meaningful X-ray tube spectrum model was established using weighted Gaussian functions and prior information, such as the continuity of bremsstrahlung, the specificity of characteristic emission, and an estimate of the average attenuation coefficient. The model parameters were optimized to obtain the best estimate of the filtered spectrum. Finally, the original energy spectrum was reconstructed from the filtered spectrum estimate using prior information about the filter. Experimental results demonstrate that the stability and accuracy of X-ray energy spectrum estimation are significantly improved by the proposed method.
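    The core of such an approach, expressing the spectrum as a weighted sum of Gaussians and fitting the weights to transmission measurements, can be sketched as follows (a toy, noise-free illustration with an invented attenuation curve and invented filter thicknesses, not the authors' optimization):

```python
import numpy as np

energies = np.linspace(20, 120, 101)          # keV grid (invented)

def gaussian_basis(energies, centers, width):
    """Each column is a Gaussian bump; the spectrum is a weighted sum."""
    return np.exp(-0.5 * ((energies[:, None] - centers[None, :]) / width) ** 2)

# Invented attenuation coefficient, decreasing with energy as for soft tissue
mu = 5.0 * (30.0 / energies) ** 3 + 0.2

# "True" spectrum to recover: a bremsstrahlung-like hump
true_spectrum = energies * np.exp(-energies / 30.0)
true_spectrum /= true_spectrum.sum()

# Noise-free transmission measurements through known filter thicknesses (cm)
thicknesses = np.linspace(0.0, 4.0, 40)
A = np.exp(-np.outer(thicknesses, mu))        # one row per measurement
transmissions = A @ true_spectrum

# Model the spectrum as weights on a Gaussian basis and solve the
# ridge-regularized least-squares system (A @ B) w = t
B = gaussian_basis(energies, np.linspace(20, 120, 15), width=8.0)
M = A @ B
lam = 1e-3
M_aug = np.vstack([M, lam * np.eye(B.shape[1])])
t_aug = np.concatenate([transmissions, np.zeros(B.shape[1])])
w, *_ = np.linalg.lstsq(M_aug, t_aug, rcond=None)

estimate = np.clip(B @ w, 0.0, None)          # spectra are non-negative
estimate /= estimate.sum()
err = np.abs(estimate - true_spectrum).sum()
print(f"L1 error of the normalized spectrum: {err:.3f}")
```

    The smooth Gaussian basis and the ridge term play the role of the priors the abstract mentions; without them the transmission-to-spectrum inversion is badly ill-conditioned.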

  5. Rolling Contact Force Energy Reconstruction

    NASA Astrophysics Data System (ADS)

    BRACCIALI, A.; CASCINI, G.

    2000-09-01

    Knowledge of the forces at the wheel-rail contact is fundamental to estimate the consequences in terms of noise and vibration. The traditional use of strain gauges mounted on the wheel web and axle is not capable of determining the high-frequency content of the contact force. Measurements made on the rail are characterized by the spatial variability of input-output transfer functions which makes it difficult to estimate the contact force by simple inversion of the point frequency response function. In this study the problem of rolling contact force reconstruction has been approached through the following steps: (i) the track has been characterized precisely for a finite length by the analysis of the time series of several impacts supplied with an instrumented hammer by using an ARMAX model that proved to be capable of modelling the vertical dynamics of the rail up to 5 kHz; (ii) the response of the rail has been simulated with a random force acting on the system, and the variability of the transfer function has been taken into account by distributing the force on adjacent elements; (iii) the simulated response has been compared with the rail acceleration measured for the passage of several trains; (iv) the wheel-rail contact force has been estimated with a closed-loop algorithm. It has thus been possible to reconstruct the 1/3-octave power spectrum of contact forces with a simple and stable iterative procedure. Forces reconstructed from different sensors were found to be practically the same for a given wheel; forces from nominally similar wheels are statistically examined and partial results of comparisons made on different rolling stock are shown.
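    Step (iv), the closed-loop force estimate, can be illustrated in the frequency domain with a Landweber-style update (a generic sketch with an invented frequency response function, not the authors' ARMAX-based procedure): the force estimate is repeatedly nudged by the mismatch between predicted and measured response.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256

# Invented point frequency response function (force -> acceleration)
freqs = np.fft.rfftfreq(n, d=1e-4)            # sampling at 10 kHz
H = 1.0 / (1.0 + 1j * freqs / 800.0)          # smooth low-pass FRF

true_force = rng.normal(size=n)               # the force we try to recover
Y = H * np.fft.rfft(true_force)               # simulated rail response

# Closed-loop (Landweber-style) update: nudge the force estimate by the
# mismatch between predicted and measured response, frequency by frequency
F = np.zeros_like(Y)
alpha = 0.5 / np.max(np.abs(H)) ** 2          # step size for stability
for _ in range(2000):
    F = F + alpha * np.conj(H) * (Y - H * F)

force_est = np.fft.irfft(F, n=n)
err = np.max(np.abs(force_est - true_force))
print(f"max force reconstruction error: {err:.2e}")
```

    The iteration stays stable as long as the step size is below 2 over the squared peak of the transfer function, which is what makes such closed-loop procedures "simple and stable" compared with direct inversion at frequencies where the FRF is small.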

  6. Attractor reconstruction for non-linear systems: a methodological note

    USGS Publications Warehouse

    Nichols, J.M.; Nichols, J.D.

    2001-01-01

    Attractor reconstruction is an important step in the process of making predictions for non-linear time-series and in the computation of certain invariant quantities used to characterize the dynamics of such series. The utility of computed predictions and invariant quantities is dependent on the accuracy of attractor reconstruction, which in turn is determined by the methods used in the reconstruction process. This paper suggests methods by which the delay and embedding dimension may be selected for a typical delay coordinate reconstruction. A comparison is drawn between the use of the autocorrelation function and mutual information in quantifying the delay. In addition, a false nearest neighbor (FNN) approach is used in minimizing the number of delay vectors needed. Results highlight the need for an accurate reconstruction in the computation of the Lyapunov spectrum and in prediction algorithms.
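    A minimal sketch of the delay-coordinate reconstruction, with the delay chosen from the autocorrelation function as the note discusses (the first zero crossing is one common heuristic; the test series here is an invented sine wave, and the FNN step is omitted for brevity):

```python
import numpy as np

def delay_embed(series, dim, delay):
    """Build delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(series) - (dim - 1) * delay
    return np.column_stack([series[i * delay:i * delay + n]
                            for i in range(dim)])

def first_zero_autocorr(series):
    """One common heuristic: take the delay at the first zero crossing
    of the autocorrelation function."""
    centered = series - series.mean()
    acf = np.correlate(centered, centered, mode="full")[len(centered) - 1:]
    acf /= acf[0]
    return int(np.argmax(acf < 0.0))

t = np.arange(0, 60, 0.05)
x = np.sin(t)                      # a simple periodic "time series"
tau = first_zero_autocorr(x)       # roughly a quarter period
vectors = delay_embed(x, dim=2, delay=tau)
print(vectors.shape, tau)
```

    For a sine wave the first zero crossing lands near a quarter period, giving nearly independent coordinates; mutual information, as the note compares, generalizes this choice to nonlinear dependence.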

  7. Beam hardening correction for sparse-view CT reconstruction

    NASA Astrophysics Data System (ADS)

    Liu, Wenlei; Rong, Junyan; Gao, Peng; Liao, Qimei; Lu, HongBing

    2015-03-01

    Beam hardening, which is caused by spectrum polychromatism of the X-ray beam, may result in various artifacts in the reconstructed image and degrade image quality. The artifacts would be further aggravated for the sparse-view reconstruction due to insufficient sampling data. Considering the advantages of the total-variation (TV) minimization in CT reconstruction with sparse-view data, in this paper, we propose a beam hardening correction method for sparse-view CT reconstruction based on Brabant's modeling. In this correction model for beam hardening, the attenuation coefficient of each voxel at the effective energy is modeled and estimated linearly, and can be applied in an iterative framework, such as simultaneous algebraic reconstruction technique (SART). By integrating the correction model into the forward projector of the algebraic reconstruction technique (ART), the TV minimization can recover images when only a limited number of projections are available. The proposed method does not need prior information about the beam spectrum. Preliminary validation using Monte Carlo simulations indicates that the proposed method can provide better reconstructed images from sparse-view projection data, with effective suppression of artifacts caused by beam hardening. With appropriate modeling of other degrading effects such as photon scattering, the proposed framework may provide a new way for low-dose CT imaging.
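    The SART update into which such a correction model would be embedded can be sketched on a toy linear system (no beam-hardening physics here, just the bare iteration; the 2x2 phantom and ray geometry are invented):

```python
import numpy as np

def sart(A, b, n_iter=200, relax=1.0):
    """Simultaneous Algebraic Reconstruction Technique for A @ x = b.
    All rays are updated at once, normalized by row and column sums."""
    m, n = A.shape
    x = np.zeros(n)
    row_sums = A.sum(axis=1)
    col_sums = A.sum(axis=0)
    for _ in range(n_iter):
        residual = (b - A @ x) / np.where(row_sums > 0, row_sums, 1.0)
        x = x + relax * (A.T @ residual) / np.where(col_sums > 0, col_sums, 1.0)
    return x

# Tiny 2x2 "image" probed by horizontal and vertical ray sums
image = np.array([1.0, 2.0, 3.0, 4.0])        # flattened 2x2 phantom
A = np.array([[1, 1, 0, 0],                   # top-row ray
              [0, 0, 1, 1],                   # bottom-row ray
              [1, 0, 1, 0],                   # left-column ray
              [0, 1, 0, 1]], dtype=float)     # right-column ray
b = A @ image

recon = sart(A, b)
print(np.round(recon, 3))
```

    A polychromatic correction of the kind the paper proposes would replace the linear forward projection `A @ x` with the beam-hardening-corrected projector inside the same loop.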

  8. Reconstruction of Japanese Vowels.

    ERIC Educational Resources Information Center

    Aoki, Haruo

    1972-01-01

    This paper discusses the relationship between linguistic reconstructions and their historical validity using the case of Old Japanese (8th century A.D.) vowels as an example. Reconstructions throughout the paper include only those cases in which the modern reflexes and phonological correspondences between two or more genetically related languages…

  9. Education for Reconstruction.

    ERIC Educational Resources Information Center

    Phillips, David; And Others

    This report describes the main questions that various international agencies must address in order to reconstruct education in countries that have experienced crisis. "Crisis" is defined as war, natural disaster, and extreme political and economic upheaval. Many of the problems of educational reconstruction with which the Allies contended in…

  10. Identify bipolar spectrum disorders.

    PubMed

    Mynatt, Sarah; Cunningham, Patricia; Manning, J Sloan

    2002-06-01

    Patients with bipolar spectrum disorders commonly present with depressive symptoms to primary care clinicians. This article details bipolar spectrum disorder assessment, treatment, and treatment response. By intervening early in the course of depressive and hypomanic episodes, you can help decrease the morbidity and suffering associated with bipolar spectrum disorders.

  11. Sequencing-by-hybridization revisited: the analog-spectrum proposal.

    PubMed

    Preparata, Franco P

    2004-01-01

    All published approaches to DNA sequencing by hybridization (SBH) consist of the biochemical acquisition of the spectrum of a target sequence (the set of its subsequences conforming to a given probing pattern) followed by the algorithmic reconstruction of the sequence from its spectrum. In the "standard" or "uniform" approach, the probing pattern is a string of length L and the length of reliably reconstructible sequences is known to be m_len = O(2^L). For a fixed microarray area, higher sequencing performance can be achieved by inserting nonprobing gaps ("wild-cards") in the probing pattern. The reconstruction, however, must cope with the emergence of fooling probes due to the gaps and algorithmic failure occurs when the spectrum becomes too densely populated, although we can achieve m_comp = O(4^L). Despite the combinatorial success of gapped probing, all current approaches are based on a biochemically unrealistic spectrum-acquisition model (digital-spectrum). The reality of hybridization is much more complex. Departing from the conventional model, in this paper, we propose an alternative, called the analog-spectrum model, which more closely reflects the biochemical process. This novel modeling reestablishes probe length as the performance-governing factor, adopting "semidegenerate bases" as suitable emulators of currently inadequate universal bases. One important conclusion is that accurate biochemical measurements are pivotal to the success of SBH. The theoretical proposal presented in this paper should be a convincing stimulus for the needed biotechnological work.
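    In the standard digital-spectrum model, the reconstruction step amounts to walking the overlap graph of the probes. A minimal greedy sketch (illustrative only; real SBH reconstruction must handle fooling probes and repeated (L-1)-mers, which this toy simply rejects as ambiguous; the target sequence is invented):

```python
from collections import Counter

def reconstruct_from_spectrum(spectrum, length, L):
    """Greedy reconstruction of a sequence from its (digital) L-mer
    spectrum, extending one symbol at a time when the extension is unique."""
    remaining = Counter(spectrum)
    # start from a probe that is never preceded by another probe
    starts = {p for p in remaining
              if not any(q[1:] == p[:-1] for q in remaining if q != p)}
    seq = next(iter(starts))
    remaining[seq] -= 1
    while len(seq) < length:
        suffix = seq[-(L - 1):]
        options = [p for p, c in remaining.items()
                   if c > 0 and p[:-1] == suffix]
        if len(options) != 1:
            return None                      # ambiguous: reconstruction fails
        seq += options[0][-1]
        remaining[options[0]] -= 1
    return seq

target = "AACCGGTTAG"
L = 4
spectrum = [target[i:i + L] for i in range(len(target) - L + 1)]
print(reconstruct_from_spectrum(spectrum, len(target), L))  # → AACCGGTTAG
```

    Reconstruction succeeds here because every (L-1)-mer of the target is unique; the O(2^L) bound in the abstract is precisely the length scale at which such repeats, and hence ambiguous branches, start to appear in random sequences.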

  12. Reconstruction of the Mars Science Laboratory Parachute Performance and Comparison to the Descent Simulation

    NASA Technical Reports Server (NTRS)

    Cruz, Juan R.; Way, David W.; Shidner, Jeremy D.; Davis, Jody L.; Adams, Douglas S.; Kipp, Devin M.

    2013-01-01

    The Mars Science Laboratory used a single mortar-deployed disk-gap-band parachute of 21.35 m nominal diameter to assist in the landing of the Curiosity rover on the surface of Mars. The parachute system's performance on Mars has been reconstructed using data from the on-board inertial measurement unit, atmospheric models, and terrestrial measurements of the parachute system. In addition, the parachute performance results were compared against the end-to-end entry, descent, and landing (EDL) simulation created to design, develop, and operate the EDL system. Mortar performance was nominal. The time from mortar fire to suspension lines stretch (deployment) was 1.135 s, and the time from suspension lines stretch to first peak force (inflation) was 0.635 s. These times were slightly shorter than those used in the simulation. The reconstructed aerodynamic portion of the first peak force was 153.8 kN; the median value for this parameter from an 8,000-trial Monte Carlo simulation yielded a value of 175.4 kN, 14% higher than the reconstructed value. Aeroshell dynamics during the parachute phase of EDL were evaluated by examining the aeroshell rotation rate and rotational acceleration. The peak values of these parameters were 69.4 deg/s and 625 deg/sq s, respectively, which were well within the acceptable range. The EDL simulation was successful in predicting the aeroshell dynamics within reasonable bounds. The average total parachute force coefficient for Mach numbers below 0.6 was 0.624, which is close to the pre-flight model nominal drag coefficient of 0.615.

  13. Anterolateral Ligament Reconstruction

    PubMed Central

    Zordan, J.; Etcheto, H. Rivarola; Blanchod, C. Collazo; Palanconi, M.; Salinas, E. Álvarez; Autorino, CM; Escobar, G.

    2017-01-01

    Anterior cruciate ligament (ACL) reconstruction is a common procedure in daily practice, with excellent long-term results in 75 to 97% of cases. In certain cases, however, patients perceive rotational instability, and the revision rate can reach 10 to 15%. Objectives: To evaluate functional outcomes in revision ACL reconstruction associated with anterolateral ligament (ALL) reconstruction. Methods: Between July 2015 and February 2016, eleven revision ACL reconstructions (11 knees) associated with ALL reconstruction were performed with a double-incision technique by the same surgical team. Inclusion criteria were: ACL reconstruction failure with a grade 2 or 3 Lachman test, a grade 3 pivot-shift, no other associated ligamentous injury, and complete range of motion. Results: The concept of rotational instability associated with ACL injury was described more than a decade ago. However, there is no consensus on how to quantify rotational instability in ACL injuries, or on when to associate an extracapsular technique. There is currently a lack of high-level evidence comparing isolated ACL reconstruction with ACL reconstruction associated with ALL reconstruction that would allow therapeutic approaches to be defined; associating ALL reconstruction with ACL reconstruction remains a matter of study. Conclusion: We obtained excellent results in anteroposterior and rotational stability after performing the procedure.

  14. Pinnacle3 modeling and end-to-end dosimetric testing of a Versa HD linear accelerator with the Agility head and flattening filter-free modes.

    PubMed

    Saenz, Daniel L; Narayanasamy, Ganesh; Cruz, Wilbert; Papanikolaou, Nikos; Stathakis, Sotirios

    2016-01-08

    The Elekta Versa HD incorporates a variety of upgrades to the line of Elekta linear accelerators, primarily including the Agility head and flattening filter-free (FFF) photon beam delivery. The completely distinct dosimetric output of the head from its predecessors, combined with the FFF beams, requires a new investigation of modeling in treatment planning systems. A model was created in Pinnacle3 v9.8 with the commissioned beam data. A phantom consisting of several plastic water and Styrofoam slabs was scanned and imported into Pinnacle3, where beams of different field sizes, source-to-surface distances (SSDs), wedges, and gantry angles were devised. Beams included all of the available photon energies (6, 10, 18, 6FFF, and 10 FFF MV), as well as the four electron energies commissioned for clinical use (6, 9, 12, and 15 MeV). The plans were verified at calculation points by measurement with a calibrated ionization chamber. Homogeneous and heterogeneous point-dose measurements agreed within 2% relative to maximum dose for all photon and electron beams. AP photon open field measurements along the central axis at 100 cm SSD passed within 1%. In addition, IMRT testing was also performed with three standard plans (step and shoot IMRT, as well as a small- and large-field VMAT plan). The IMRT plans were delivered on the Delta4 IMRT QA phantom, for which a gamma passing rate was > 99.5% for all plans with a 3% dose deviation, 3 mm distance-to-agreement, and 10% dose threshold. The IMRT QA results for the first 23 patients yielded gamma passing rates of 97.4% ± 2.3%. Such testing ensures confidence in the ability of Pinnacle3 to model photon and electron beams with the Agility head.
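    For reference, the gamma criterion used in such IMRT QA (here 3% dose difference, 3 mm distance-to-agreement, 10% threshold) can be sketched in one dimension (a simplified global gamma, not the Delta4 vendor algorithm; the dose profiles are invented):

```python
import numpy as np

def gamma_pass_rate(ref, meas, positions, dose_tol=0.03, dta_tol=3.0,
                    threshold=0.10):
    """Simplified 1D global gamma analysis (3%/3 mm by default): each
    reference point above the dose threshold passes if some measured point
    is close to it in both dose and distance."""
    d_max = ref.max()
    gammas = []
    for pos, dose in zip(positions, ref):
        if dose < threshold * d_max:
            continue                           # ignore the low-dose region
        dose_term = (meas - dose) / (dose_tol * d_max)
        dist_term = (positions - pos) / dta_tol
        gammas.append(np.sqrt(dose_term ** 2 + dist_term ** 2).min())
    return 100.0 * np.mean(np.array(gammas) <= 1.0)

positions = np.linspace(0, 100, 201)           # mm, 0.5 mm spacing
ref = np.exp(-((positions - 50) / 15) ** 2)    # reference dose profile
meas = np.exp(-((positions - 50.5) / 15) ** 2) # measured: 0.5 mm shift
rate = gamma_pass_rate(ref, meas, positions)
print(f"gamma pass rate: {rate:.1f}%")
```

    A sub-millimeter shift passes easily under a 3 mm distance-to-agreement criterion, which is why pass rates above 99% indicate excellent delivery fidelity.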

  15. Structure and Dynamics of End-to-End Loop Formation of the Penta-Peptide Cys-Ala-Gly-Gln-Trp in Implicit Solvents

    DTIC Science & Technology

    2009-01-01

    reproducing the backbone root-mean-square deviation from the native structure, the number of hydrogen bonds retained in the simulation, and the...calculations without any cutoff distances. Covalent bonds between the heavy atoms and hydrogens were constrained by the SHAKE algorithm. The fully...as the distance between the sulfur atom of the initial cysteine side chain and the closest non-hydrogen atom of the tryptophan indole ring. We defined

  16. Wide Area Recovery and Resiliency Program (WARRP) Biological Attack Response and Recovery: End to End Medical Countermeasure Distribution and Dispensing Processes

    DTIC Science & Technology

    2012-04-24

    evidence to support it. Detect and Characterize Event: No integration between national biosurveillance systems. Could receive disparate signals...resources needed to support data integration and shared analytical capacity." - U.S. Government Accountability Office. "Biosurveillance: Efforts to...Develop a National Biosurveillance Capability Need a National Strategy and a Designated Leader". Washington, DC: Jun 2010. GAO-10-645. Web. http

  17. SU-E-J-55: End-To-End Effectiveness Analysis of 3D Surface Image Guided Voluntary Breath-Holding Radiotherapy for Left Breast

    SciTech Connect

    Lin, M; Feigenberg, S

    2015-06-15

    Purpose: To evaluate the effectiveness of using 3D surface imaging to guide breath-holding (BH) treatment of left-sided breast cancer. Methods: Two 3D-surface-image-guided BH procedures were implemented and evaluated: normal BH, taking a breath-hold at a comfortable level, and deep-inspiration breath-hold (DIBH). A total of 20 patients (10 normal-BH and 10 DIBH) were recruited. Patients received a BH evaluation using a commercial 3D surface tracking system (VisionRT, London, UK) to quantify the reproducibility of BH positions prior to CT scan. Tangential 3D/IMRT plans were generated. Patients were initially set up under free-breathing (FB) conditions using the FB surface obtained from the untagged CT to ensure a correct patient position, and were then guided to reach the planned BH position using the BH surface obtained from the BH CT. Action levels were set at each phase of the treatment process, based on the information provided by the 3D surface tracking system, for proper interventions (eliminate/re-setup/re-coaching). We reviewed the frequency of interventions to evaluate effectiveness. FB-CBCT and port films were used to evaluate the accuracy of the 3D-surface-guided setups. Results: 25% of BH candidates with BH positioning uncertainty > 2 mm were eliminated prior to CT scan. For > 90% of fractions, based on the setup deltas from the 3D surface tracking system, adjustments of the patient setup were needed after the initial laser-based setup. 3D-surface-guided setup accuracy was comparable to CBCT. For BH guidance, the frequency of interventions (re-coaching/re-setup) was 40% (normal-BH) / 91% (DIBH) of treatments for the first 5 fractions, and then dropped to 16% (normal-BH) / 46% (DIBH). The necessity of re-setup was highly patient-specific for normal BH but highly random among patients for DIBH. Overall, an accuracy of −0.8 ± 2.4 mm in the anterior pericardial shadow position was achieved. Conclusion: 3D surface imaging provides effective intervention in the treatment process and ensures favorable day-to-day setup accuracy. DIBH setup appears to be more uncertain, and these patients would benefit most from the additional information provided by 3D surface setup.

  18. End-to-End System Test of the Relative Precision and Stability of the Photometric Method for Detecting Earth-Size Extrasolar Planets

    NASA Technical Reports Server (NTRS)

    Dunham, Edward W.

    2000-01-01

    We developed the CCD camera system for the laboratory test demonstration and designed the optical system for this test. The camera system was delivered to Ames in April, 1999 with continuing support mostly in the software area as the test progressed. The camera system has been operating successfully since delivery. The optical system performed well during the test. The laboratory demonstration activity is now nearly complete and is considered to be successful by the Technical Advisory Group, which met on 8 February, 2000 at the SETI Institute. A final report for the Technical Advisory Group and NASA Headquarters will be produced in the next few months. This report will be a comprehensive report on all facets of the test including those covered under this grant. A copy will be forwarded, if desired, when it is complete.

  19. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management †

    PubMed Central

    Marie, Pierrick; Desprats, Thierry; Chabridon, Sophie; Sibilla, Michelle; Taconet, Chantal

    2015-01-01

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At a time when the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms offers new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario. PMID:26087372

  20. PICASSO: an end-to-end image simulation tool for space and airborne imaging systems II. Extension to the thermal infrared: equations and methods

    NASA Astrophysics Data System (ADS)

    Cota, Stephen A.; Lomheim, Terrence S.; Florio, Christopher J.; Harbold, Jeffrey M.; Muto, B. Michael; Schoolar, Richard B.; Wintz, Daniel T.; Keller, Robert A.

    2011-10-01

    In a previous paper in this series, we described how The Aerospace Corporation's Parameterized Image Chain Analysis & Simulation SOftware (PICASSO) tool may be used to model space and airborne imaging systems operating in the visible to near-infrared (VISNIR). PICASSO is a systems-level tool, representative of a class of such tools used throughout the remote sensing community. It is capable of modeling systems over a wide range of fidelity, anywhere from conceptual design level (where it can serve as an integral part of the systems engineering process) to as-built hardware (where it can serve as part of the verification process). In the present paper, we extend the discussion of PICASSO to the modeling of Thermal Infrared (TIR) remote sensing systems, presenting the equations and methods necessary to modeling in that regime.

  1. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management.

    PubMed

    Marie, Pierrick; Desprats, Thierry; Chabridon, Sophie; Sibilla, Michelle; Taconet, Chantal

    2015-06-16

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At a time when the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms offers new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario.

  2. Auto-Alignment Of A Three-Mirror Off-Axis Telescope By Reverse Optimization And End-To-End Aberration Measurements

    NASA Astrophysics Data System (ADS)

    Jeong, Hwan J.; Lawrence, George N.; Nahm, Kie B.

    1989-02-01

    Butt coupling efficiencies reported to date for conventional circular core optical fiber have been low due to the mismatch between the emission profiles of the laser diode and the optical fiber. Attempts to use lenses to increase the coupling efficiency have resulted in the degradation of the laser performance due to external optical feedback from the lenses or from the front end of the fiber. If the fiber is bonded directly to the front facet of the laser, the external optical feedback from the front facet of the fiber is eliminated. A higher coupling efficiency is realized through the use of a non-circular core optical fiber, such as an elliptical core fiber, which more closely matches the highly divergent, elliptically shaped, laser diode emission profile.

  3. End-to-end crosstalk within the hepatitis C virus genome mediates the conformational switch of the 3′X-tail region

    PubMed Central

    Romero-López, Cristina; Barroso-delJesus, Alicia; García-Sacristán, Ana; Briones, Carlos; Berzal-Herranz, Alfredo

    2014-01-01

    The hepatitis C virus (HCV) RNA genome contains multiple structurally conserved domains that make long-distance RNA–RNA contacts important in the establishment of viral infection. Microarray antisense oligonucleotide assays, improved dimethyl sulfate probing methods and 2′-acylation chemistry (selective 2′-hydroxyl acylation and primer extension, SHAPE) showed the folding of the genomic RNA 3′ end to be regulated by the internal ribosome entry site (IRES) element via direct RNA–RNA interactions. The essential cis-acting replicating element (CRE) and the 3′X-tail region adopted different 3D conformations in the presence and absence of the genomic RNA 5′ terminus. Further, the structural transition in the 3′X-tail from the replication-competent conformer (consisting of three stem-loops) to the dimerizable form (with two stem-loops), was found to depend on the presence of both the IRES and the CRE elements. Complex interplay between the IRES, the CRE and the 3′X-tail region would therefore appear to occur. The preservation of this RNA–RNA interacting network, and the maintenance of the proper balance between different contacts, may play a crucial role in the switch between different steps of the HCV cycle. PMID:24049069

  4. 'End to end' planktonic trophic web and its implications for the mussel farms in the Mar Piccolo of Taranto (Ionian Sea, Italy).

    PubMed

    Karuza, Ana; Caroppo, Carmela; Monti, Marina; Camatti, Elisa; Di Poi, Elena; Stabili, Loredana; Auriemma, Rocco; Pansera, Marco; Cibic, Tamara; Del Negro, Paola

    2016-07-01

    The Mar Piccolo is a semi-enclosed basin subject to different natural and anthropogenic stressors. In order to better understand plankton dynamics and preferential carbon pathways within the planktonic trophic web, an integrated approach was adopted for the first time by examining all trophic levels (virioplankton, the heterotrophic and phototrophic fractions of pico-, nano- and microplankton, as well as mesozooplankton). Plankton abundance and biomass were investigated during four surveys in the period 2013-2014. Besides unveiling the dynamics of different plankton groups in the Mar Piccolo, the study revealed that a high portion of the plankton carbon (C) pool was constituted by small-sized (<2 μm) planktonic fractions. The prevalence of small-sized species within micro- and mesozooplankton communities was observed as well. The succession of planktonic communities was clearly driven by the seasonality, i.e. by the nutrient availability and physical features of the water column. Our hypothesis is that besides the 'bottom-up' control and the grazing pressure, inferred from the C pools of different plankton groups, the presence of mussel farms in the Mar Piccolo exerts a profound impact on plankton communities, not only due to the important sequestration of the plankton biomass but also by strongly influencing its structure.

  5. SU-E-T-508: End to End Testing of a Prototype Eclipse Module for Planning Modulated Arc Therapy On the Siemens Platform

    SciTech Connect

    Huang, L; Sarkar, V; Spiessens, S; Rassiah-Szegedi, P; Huang, Y; Salter, B; Zhao, H; Szegedi, M

    2014-06-01

Purpose: The latest clinical implementation of the Siemens Artiste linac allows for delivery of modulated arcs (mARC) using full-field flattening-filter-free (FFF) photon beams. The maximum dose rate of 2000 MU/min is well suited for high-dose treatments such as SBRT. We tested and report on the performance of a prototype Eclipse TPS module supporting mARC capability on the Artiste platform. Method: Our spine SBRT patients originally treated with 12/13-field static-gantry IMRT (SGIMRT) were chosen for this study. These plans were designed to satisfy RTOG0631 guidelines with a prescription of 16 Gy in a single fraction. The cases were re-planned as mARC plans in the prototype Eclipse module using the 7 MV FFF beam and were required to satisfy RTOG0631 requirements. All plans were transferred from Eclipse, delivered on a Siemens Artiste linac and dose-validated using the Delta4 system. Results: All treatment plans were developed straightforwardly and in a timely fashion using the prototype module. Due to the limited number of segments in a single arc, mARC plans required 2-3 full arcs to yield plan quality comparable to SGIMRT plans containing over 250 total segments. The average (3%/3mm) gamma pass-rate for all arcs was 98.5±1.1%, demonstrating both excellent dose prediction by the AAA dose algorithm and excellent delivery fidelity. Mean delivery times for the mARC plans (10.5±1.7 min) were 50-70% lower than for the SGIMRT plans (26±2 min), with both delivered at 2000 MU/min. Conclusion: A prototype Eclipse module capable of planning for Burst Mode modulated arc delivery on the Artiste platform has been tested and found to perform efficiently and accurately for treatment plan development and delivered-dose prediction. Further investigation of more treatment sites is being carried out and data will be presented.

  6. Operating performance of the gamma-ray Cherenkov telescope: An end-to-end Schwarzschild-Couder telescope prototype for the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Dournaux, J. L.; De Franco, A.; Laporte, P.; White, R.; Greenshaw, T.; Sol, H.; Abchiche, A.; Allan, D.; Amans, J. P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J. J.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Daniel, M.; De Frondat, F.; Dumas, D.; Ernenwein, J. P.; Fasola, G.; Funk, S.; Gaudemard, J.; Graham, J. A.; Gironnet, J.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J. M.; Jégouzo, I.; Jogler, T.; Kawashima, T.; Kraus, M.; Lapington, J. S.; Lefaucheur, J.; Markoff, S.; Melse, T.; Morhrmann, L.; Molnyeux, P.; Nolan, S. J.; Okumura, A.; Parsons, R. D.; Ross, D.; Rowell, G.; Sato, Y.; Sayède, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Vink, J.; Watson, J.; Yamane, N.; Zech, A.; Zink, A.

    2017-02-01

    The Cherenkov Telescope Array (CTA) consortium aims to build the next-generation ground-based very-high-energy gamma-ray observatory. The array will feature different sizes of telescopes allowing it to cover a wide gamma-ray energy band from about 20 GeV to above 100 TeV. The highest energies, above 5 TeV, will be covered by a large number of Small-Sized Telescopes (SSTs) with a field-of-view of around 9°. The Gamma-ray Cherenkov Telescope (GCT), based on Schwarzschild-Couder dual-mirror optics, is one of the three proposed SST designs. The GCT is described in this contribution and the first images of Cherenkov showers obtained using the telescope and its camera are presented. These were obtained in November 2015 in Meudon,

  7. EXSdetect: an end-to-end software for extended source detection in X-ray images: application to Swift-XRT data

    NASA Astrophysics Data System (ADS)

    Liu, T.; Tozzi, P.; Tundo, E.; Moretti, A.; Wang, J.-X.; Rosati, P.; Guglielmetti, F.

    2013-01-01

Aims: We present stand-alone software (named EXSdetect) for the detection of extended sources in X-ray images. Our goal is to provide a flexible tool capable of detecting extended sources down to the lowest flux levels attainable within instrumental limitations, while maintaining robust photometry, high completeness, and low contamination, regardless of source morphology. EXSdetect was developed mainly to exploit the ever-increasing wealth of archival X-ray data, but is also ideally suited to explore the scientific capabilities of future X-ray facilities, with a strong focus on investigations of distant groups and clusters of galaxies. Methods: EXSdetect combines a fast Voronoi tessellation code with a friends-of-friends algorithm and an automated deblending procedure. The values of key parameters are matched to fundamental telescope properties such as angular resolution and instrumental background. In addition, the software is designed to permit extensive tests of its performance via simulations of a wide range of observational scenarios. Results: We applied EXSdetect to simulated data fields modeled to realistically represent the Swift X-ray Cluster Survey (SXCS), which is based on archival data obtained by the X-ray telescope onboard the Swift satellite. We achieve more than 90% completeness for extended sources comprising at least 80 photons in the 0.5-2 keV band, a limit that corresponds to 10-14 erg cm-2 s-1 for the deepest SXCS fields. This detection limit is comparable to the one attained by the most sensitive cluster surveys conducted with much larger X-ray telescopes. While evaluating the performance of EXSdetect, we also explored the impact of improved angular resolution and discuss the ideal properties of the next generation of X-ray survey missions. The Python code EXSdetect is available on the SXCS website http://adlibitum.oats.inaf.it/sxcs
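
The EXSdetect pipeline pairs Voronoi tessellation with a friends-of-friends (FoF) grouping step. As a rough illustration of the FoF idea only (not the EXSdetect implementation; the photon positions and linking length below are invented), here is a minimal union-find version:

```python
import numpy as np

def friends_of_friends(points, linking_length):
    """Group points into clusters: two points are 'friends' if their
    separation is at most the linking length; clusters are the connected
    components of the friendship graph (simple union-find)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(points[i] - points[j]) <= linking_length:
                union(i, j)

    labels = np.array([find(i) for i in range(n)])
    _, labels = np.unique(labels, return_inverse=True)  # relabel 0..k-1
    return labels

# Two well-separated photon clumps plus an isolated event
pts = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
                [5.0, 5.0], [5.1, 5.0],
                [20.0, 20.0]])
labels = friends_of_friends(pts, linking_length=0.5)
print(len(set(labels.tolist())))  # → 3
```

A real detector-scale implementation would replace the O(n²) pair scan with a spatial index; the linking length plays the same role as EXSdetect's resolution-matched parameters.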

  8. Robot-assisted segmental resection of tubal pregnancy followed by end-to-end reanastomosis for preserving tubal patency and fertility

    PubMed Central

    Park, Joo Hyun; Cho, SiHyun; Choi, Young Sik; Seo, Seok Kyo; Lee, Byung Seok

    2016-01-01

Abstract The objective of this study was to evaluate whether robotic tubal reanastomosis after segmental resection of tubal pregnancy is a feasible means of preserving tubal integrity and natural fertility in those with a compromised contralateral tubal condition. The study was performed at a university medical center in a retrospective manner where da Vinci robotic system-guided segmental resection of the tubal ectopic mass followed by reanastomosis was performed to salvage tubal patency and fertility in those with a single viable fallopian tube. Of the 17 patients with tubal pregnancies that were selected, 14 patients with successful tubal segmental resection and reanastomosis were followed up. The reproducibility of anastomosis success and cumulative pregnancy rates up to 24 months were analyzed. Patient mean age was 28.88 ± 4.74 years, mean amenorrheic period was 7.01 ± 1.57 weeks and mean human chorionic gonadotropin (hCG) level was 9289.00 ± 7510.00 mIU/mL. The overall intraoperative cancellation rate due to unfavorable positioning or size of the tubal mass was 17.65% (3/17); these cases were converted to either salpingectomy or milking of the ectopic mass. Of the 14 attempted anastomoses, all were successful, with 1 anastomotic leakage. One patient wishing to postpone pregnancy and 2 patients in whom patency of the contralateral tube was confirmed during the operation were excluded from the pregnancy outcome analysis. The cumulative pregnancy rate was 63.64% (7/11), with 3 (27.27%) ongoing pregnancies, 3 (27.27%) livebirths, and 1 missed abortion at 24 months. During the follow-up, hysterosalpingography (HSG) was performed at 6 months for those who consented, and all 10 fallopian tubes tested were patent. No subsequent tubal pregnancies occurred in the reanastomosed tube for up to 24 months.
For patients with absent or defective contralateral tubal function, da Vinci-guided reanastomosis after segmental resection of tubal pregnancy is feasible for salvaging tubal patency and fertility. PMID:27741101

  9. Nasal reconstruction after epithelioma.

    PubMed

    Rodríguez-Camps, S

    2001-01-01

In this paper we present our procedure for the treatment, histopathological diagnosis, and resection of skin cancer in the nasal pyramid and its subsequent reconstruction. Because we are dealing with the most important anatomical feature of the face, our goal is an aesthetic reconstruction [2,4] according to the anatomical subunits criterion of Burget [3]. First, a histopathological diagnosis is made to determine the nature of the tumor. We then proceed with resection according to Mohs micrographic surgery [1,5,7], and finally begin the first step of the nasal reconstruction.

  10. Efficient reconstruction method for ground layer adaptive optics with mixed natural and laser guide stars.

    PubMed

    Wagner, Roland; Helin, Tapio; Obereder, Andreas; Ramlau, Ronny

    2016-02-20

    The imaging quality of modern ground-based telescopes such as the planned European Extremely Large Telescope is affected by atmospheric turbulence. In consequence, they heavily depend on stable and high-performance adaptive optics (AO) systems. Using measurements of incoming light from guide stars, an AO system compensates for the effects of turbulence by adjusting so-called deformable mirror(s) (DMs) in real time. In this paper, we introduce a novel reconstruction method for ground layer adaptive optics. In the literature, a common approach to this problem is to use Bayesian inference in order to model the specific noise structure appearing due to spot elongation. This approach leads to large coupled systems with high computational effort. Recently, fast solvers of linear order, i.e., with computational complexity O(n), where n is the number of DM actuators, have emerged. However, the quality of such methods typically degrades in low flux conditions. Our key contribution is to achieve the high quality of the standard Bayesian approach while at the same time maintaining the linear order speed of the recent solvers. Our method is based on performing a separate preprocessing step before applying the cumulative reconstructor (CuReD). The efficiency and performance of the new reconstructor are demonstrated using the OCTOPUS, the official end-to-end simulation environment of the ESO for extremely large telescopes. For more specific simulations we also use the MOST toolbox.

  11. Performance comparison of wavefront reconstruction and control algorithms for Extremely Large Telescopes.

    PubMed

    Montilla, I; Béchet, C; Le Louarn, M; Reyes, M; Tallon, M

    2010-11-01

    Extremely Large Telescopes (ELTs) are very challenging with respect to their adaptive optics (AO) requirements. Their diameters and the specifications required by the astronomical science for which they are being designed imply a huge increment in the number of degrees of freedom in the deformable mirrors. Faster algorithms are needed to implement the real-time reconstruction and control in AO at the required speed. We present the results of a study of the AO correction performance of three different algorithms applied to the case of a 42-m ELT: one considered as a reference, the matrix-vector multiply (MVM) algorithm; and two considered fast, the fractal iterative method (FrIM) and the Fourier transform reconstructor (FTR). The MVM and the FrIM both provide a maximum a posteriori estimation, while the FTR provides a least-squares one. The algorithms are tested on the European Southern Observatory (ESO) end-to-end simulator, OCTOPUS. The performance is compared using a natural guide star single-conjugate adaptive optics configuration. The results demonstrate that the methods have similar performance in a large variety of simulated conditions. However, with respect to system misregistrations, the fast algorithms demonstrate an interesting robustness.
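
The reference MVM approach is conceptually simple: an interaction matrix is calibrated once, its pseudoinverse is stored, and each real-time frame then costs a single matrix-vector product. A toy sketch, with a random made-up matrix standing in for a real wavefront-sensor calibration (this is plain least squares, not the MAP estimator the paper's MVM and FrIM provide):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy AO geometry: 8 actuators, 20 slope measurements. D maps actuator
# commands to wavefront-sensor slopes (the "interaction matrix", which
# on a real system is measured, not random).
n_act, n_meas = 8, 20
D = rng.standard_normal((n_meas, n_act))

# MVM reconstructor: precompute the least-squares pseudoinverse once;
# every real-time frame is then one matrix-vector multiply.
R = np.linalg.pinv(D)

true_commands = rng.standard_normal(n_act)
slopes = D @ true_commands          # noiseless measurement
recovered = R @ slopes              # O(n_meas * n_act) per frame

print(np.allclose(recovered, true_commands))  # → True
```

The cost of the per-frame multiply is what scales badly for ELT-sized systems, which motivates the fast FrIM and FTR alternatives compared in the record.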

  12. Hepatic artery reconstruction in living donor liver transplantation from the microsurgeon's point of view.

    PubMed

    Furuta, S; Ikegami, T; Nakazawa, Y; Hashikura, Y; Matsunami, H; Kawasaki, S; Makuuchi, M

    1997-07-01

Microvascular surgery for the reconstruction of the hepatic artery in living donor liver transplantation is discussed from the microsurgeon's point of view. A refined operative procedure to improve the safety of the anastomosis is described. In living donor liver transplantation, the hepatic artery of the graft is short and small, the operative site is deep and mobile, and the anatomic arrangement of the graft left hepatic artery may differ from that of the recipient's dilated hepatic artery. To create a safe anastomosis under these conditions, recipient arteries that were slightly smaller than the graft artery were dissected. Without a size discrepancy, an end-to-end anastomosis could be created. Some refinements to create a good operative field made the anastomosis easy. The apparatus and techniques used in free-flap transfer facilitated a clean anastomosis. We anastomosed 44 arteries in 40 patients undergoing living donor liver transplantation using microsurgical techniques. Neither a decrease in arterial blood flow nor hepatic artery thrombosis was noted. The refined operative procedure we describe in this report can be used to overcome the problems associated with hepatic artery anastomosis in living donor liver transplantation.

  13. Craniofacial reconstruction - series (image)

    MedlinePlus

Patients requiring craniofacial reconstruction have: birth defects (such as hypertelorism, Crouzon's disease, or Apert's syndrome); injuries to the head, face, or jaws (maxillofacial); tumors; or deformities caused by the treatment of tumors.

  14. Breast Reconstruction Options

    MedlinePlus

    ... surgery to allow for better healing. You need radiation therapy. Many doctors recommend that women not have immediate ... al. Ischemic complications in pedicle, free, and muscle sparing transverse rectus abdominis myocutaneous flaps for breast reconstruction. ...

  15. Reconstruction of Mandibular Defects

    PubMed Central

    Chim, Harvey; Salgado, Christopher J.; Mardini, Samir; Chen, Hung-Chi

    2010-01-01

    Defects requiring reconstruction in the mandible are commonly encountered and may result from resection of benign or malignant lesions, trauma, or osteoradionecrosis. Mandibular defects can be classified according to location and extent, as well as involvement of mucosa, skin, and tongue. Vascularized bone flaps, in general, provide the best functional and aesthetic outcome, with the fibula flap remaining the gold standard for mandible reconstruction. In this review, we discuss classification and approach to reconstruction of mandibular defects. We also elaborate upon four commonly used free osteocutaneous flaps, inclusive of fibula, iliac crest, scapula, and radial forearm. Finally, we discuss indications and use of osseointegrated implants as well as recent advances in mandibular reconstruction. PMID:22550439

  16. Breast reconstruction - implants

    MedlinePlus

    ... cosmetic surgery after breast cancer can improve your sense of well-being and your quality of life. Alternative Names Breast implants surgery References Roehl KR, Wilhelmi BJ, Phillips LG. Breast reconstruction. ...

  17. Nonlinear Simulation of the Tooth Enamel Spectrum for EPR Dosimetry

    NASA Astrophysics Data System (ADS)

    Kirillov, V. A.; Dubovsky, S. V.

    2016-07-01

    Software was developed where initial EPR spectra of tooth enamel were deconvoluted based on nonlinear simulation, line shapes and signal amplitudes in the model initial spectrum were calculated, the regression coefficient was evaluated, and individual spectra were summed. Software validation demonstrated that doses calculated using it agreed excellently with the applied radiation doses and the doses reconstructed by the method of additive doses.

  18. Fast Image Reconstruction with L2-Regularization

    PubMed Central

    Bilgic, Berkin; Chatnuntawech, Itthi; Fan, Audrey P.; Setsompop, Kawin; Cauley, Stephen F.; Wald, Lawrence L.; Adalsteinsson, Elfar

    2014-01-01

Purpose: We introduce L2-regularized reconstruction algorithms with closed-form solutions that achieve dramatic computational speed-up relative to state-of-the-art L1- and L2-based iterative algorithms while maintaining similar image quality for various applications in MRI reconstruction. Materials and Methods: We compare fast L2-based methods to state-of-the-art algorithms employing iterative L1- and L2-regularization in numerical phantom and in vivo data in three applications: 1) fast Quantitative Susceptibility Mapping (QSM), 2) lipid artifact suppression in Magnetic Resonance Spectroscopic Imaging (MRSI), and 3) Diffusion Spectrum Imaging (DSI). In all cases, the proposed L2-based methods are compared with the state-of-the-art algorithms, and a two to three orders of magnitude speed-up is demonstrated with similar reconstruction quality. Results: The closed-form solution developed for regularized QSM allows processing of a 3D volume in under 5 seconds, the proposed lipid suppression algorithm takes under 1 second to reconstruct single-slice MRSI data, and the PCA-based DSI algorithm estimates diffusion propagators from undersampled q-space for a single slice in under 30 seconds, all running in Matlab on a standard workstation. Conclusion: For the applications considered herein, closed-form L2-regularization can be a faster alternative to its iterative counterpart or to L1-based iterative algorithms, without compromising image quality. PMID:24395184
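
The computational advantage of L2 regularization comes from its closed-form normal-equations solution, in contrast to iterative L1 solvers. A generic Tikhonov sketch (the toy matrix A and noise level are made up; the paper's QSM/MRSI/DSI operators are far more structured and are solved with FFTs):

```python
import numpy as np

def tikhonov_closed_form(A, b, lam):
    """Closed-form L2-regularized least squares:
       argmin_x ||A x - b||^2 + lam * ||x||^2
             = (A^T A + lam I)^{-1} A^T b
    One linear solve, no iterations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 10))       # toy, well-conditioned forward model
x_true = rng.standard_normal(10)
b = A @ x_true + 0.01 * rng.standard_normal(50)  # noisy measurements

x = tikhonov_closed_form(A, b, lam=1e-3)
print(np.linalg.norm(x - x_true) < 0.1)  # → True
```

For structured operators (e.g. diagonal in Fourier space, as in the QSM case), the same closed form reduces to elementwise division after an FFT, which is where the seconds-scale runtimes come from.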

  19. Accelerating Spectrum Sharing Technologies

    SciTech Connect

    Juan D. Deaton; Lynda L. Brighton; Rangam Subramanian; Hussein Moradi; Jose Loera

    2013-09-01

    Spectrum sharing potentially holds the promise of solving the emerging spectrum crisis. However, technology innovators face the conundrum of developing spectrum sharing technologies without the ability to experiment and test with real incumbent systems. Interference with operational incumbents can prevent critical services, and the cost of deploying and operating an incumbent system can be prohibitive. Thus, the lack of incumbent systems and frequency authorization for technology incubation and demonstration has stymied spectrum sharing research. To this end, industry, academia, and regulators all require a test facility for validating hypotheses and demonstrating functionality without affecting operational incumbent systems. This article proposes a four-phase program supported by our spectrum accountability architecture. We propose that our comprehensive experimentation and testing approach for technology incubation and demonstration will accelerate the development of spectrum sharing technologies.

  20. Chandler wobble excitation reconstruction and analysis

    NASA Astrophysics Data System (ADS)

    Zotov, Leonid

    2010-05-01

Different methods of reconstructing geodetic excitation from observations of polar motion are compared, among them the Wilson-Jeffreys filter, Tikhonov regularization, and Panteleev corrective smoothing. Reconstruction of the Chandler excitation is an inverse problem, aggravated by the strong annual oscillation, which lies nearby in the frequency band. Several approaches to filtering the annual oscillation out were undertaken, among them harmonic model subtraction, Singular Spectrum Analysis (SSA) and Panteleev smoothing. The obtained results were compared with one another and with geophysical excitations, such as atmospheric and oceanic angular momentum, the El Nino event, and solar and lunar tides. An amplitude and phase correlation analysis was performed. The phase change of the Chandler oscillation in the 1930s found a partial explanation. This work is supported by a grant of the President of Russia, MK-4234.2009.5.
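
Singular Spectrum Analysis, one of the annual-removal tools mentioned above, can be sketched in a few lines: embed the series in a trajectory matrix, SVD it, keep selected components, and diagonal-average back. The synthetic series below is a stand-in (the amplitudes and the 433-day Chandler-like period are illustrative, not actual polar-motion data):

```python
import numpy as np

def ssa_reconstruct(x, window, components):
    """Basic SSA: embed the series in a trajectory (Hankel) matrix,
    SVD it, keep the chosen components, and diagonal-average back
    to a 1-D series (Hankelization)."""
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in components)
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):                  # diagonal averaging
        rec[j:j + window] += Xr[:, j]
        counts[j:j + window] += 1
    return rec / counts

# Synthetic polar-motion-like series: annual term plus a weaker
# Chandler-like (~433-day) term, daily sampling over 12 years
t = np.arange(365 * 12)
series = np.sin(2 * np.pi * t / 365.25) + 0.7 * np.sin(2 * np.pi * t / 433.0)

# Two pure sinusoids give a rank-4 trajectory matrix, so the first four
# SVD components reproduce the series exactly; with a long enough window
# the leading pair tends to track the stronger (annual) oscillation.
full = ssa_reconstruct(series, window=600, components=range(4))
annual = ssa_reconstruct(series, window=600, components=[0, 1])
print(np.allclose(full, series))  # → True
```

Separating the annual and Chandler terms cleanly is hard precisely because their beat period (~6.4 years) is comparable to the data span, which is the difficulty the record's filtering comparison addresses.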

  1. Lining in nasal reconstruction.

    PubMed

    Haack, Sebastian; Fischer, Helmut; Gubisch, Wolfgang

    2014-06-01

    Restoring nasal lining is one of the essential parts during reconstruction of full-thickness defects of the nose. Without a sufficient nasal lining the whole reconstruction will fail. Nasal lining has to sufficiently cover the shaping subsurface framework. But in addition, lining must not compromise or even block nasal ventilation. This article demonstrates different possibilities of lining reconstruction. The use of composite grafts for small rim defects is described. The limits and technical components for application of skin grafts are discussed. Then the advantages and limitations of endonasal, perinasal, and hingeover flaps are demonstrated. Strategies to restore lining with one or two forehead flaps are presented. Finally, the possibilities and technical aspects to reconstruct nasal lining with a forearm flap are demonstrated. Technical details are explained by intraoperative pictures. Clinical cases are shown to illustrate the different approaches and should help to understand the process of decision making. It is concluded that although the lining cannot be seen after reconstruction of the cover it remains one of the key components for nasal reconstruction. When dealing with full-thickness nasal defects, there is no way to avoid learning how to restore nasal lining.

  2. Efficient holoscopy image reconstruction.

    PubMed

    Hillmann, Dierck; Franke, Gesa; Lührs, Christian; Koch, Peter; Hüttmann, Gereon

    2012-09-10

Holoscopy is a tomographic imaging technique that combines digital holography and Fourier-domain optical coherence tomography (OCT) to gain tomograms with diffraction-limited resolution and uniform sensitivity over several Rayleigh lengths. The lateral image information is calculated from the spatial interference pattern formed by light scattered from the sample and a reference beam. The depth information is obtained from the spectral dependence of the recorded digital holograms. Numerous digital holograms are acquired at different wavelengths and then reconstructed for a common plane in the sample. Afterwards, standard Fourier-domain OCT signal processing achieves depth discrimination. Here we describe and demonstrate an optimized data reconstruction algorithm for holoscopy which is related to the inverse scattering reconstruction of wavelength-scanned full-field optical coherence tomography data. Instead of calculating a regularized pseudoinverse of the forward operator, the recorded optical fields are propagated back into the sample volume. In one processing step the high-frequency components of the scattering potential are reconstructed on a non-equidistant grid in three-dimensional spatial frequency space. A Fourier transform yields an OCT-equivalent image of the object structure. In contrast to the original holoscopy reconstruction with backpropagation and Fourier transform with respect to the wavenumber, the required processing time depends neither on the confocal parameter nor on the depth of the volume. For an imaging NA of 0.14, the processing time was decreased by a factor of 15; at higher NA the gain in reconstruction speed may reach two orders of magnitude.

  3. Reconstruction of fiber grating refractive-index profiles from complex Bragg reflection spectra.

    PubMed

    Huang, D W; Yang, C C

    1999-07-20

    Reconstruction of the refractive-index profiles of fiber gratings from their complex Bragg reflection spectra is experimentally demonstrated. The amplitude and phase of the complex reflection spectrum were measured with a balanced Michelson interferometer. By integrating the coupled-mode equations, we built the relationship between the complex coupling coefficient and the complex reflection spectrum as an iterative algorithm for reconstructing the index profile. This method is expected to be useful for reconstructing the index profiles of fiber gratings with any apodization, chirp, or dc structures. An apodized chirped grating and a uniform grating with a depression of index modulation were used to demonstrate the technique.
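
The iterative method described works by repeatedly integrating the coupled-mode equations; for intuition, the forward direction has a closed form in the special case of a uniform grating. A sketch of that forward relationship (the coupling coefficient κ and length L below are invented; the paper's algorithm inverts this kind of mapping for arbitrary profiles):

```python
import numpy as np

# Closed-form reflectivity of a *uniform* fiber Bragg grating from
# coupled-mode theory:
#   R(delta) = sinh^2(s L) / (cosh^2(s L) - delta^2 / kappa^2),
#   s = sqrt(kappa^2 - delta^2),
# where delta is the detuning from the Bragg condition. For |delta| > kappa
# the complex square root continues the formula into the side-lobe region.
kappa = 2.0   # coupling coefficient, 1/cm (assumed value)
L = 1.0       # grating length, cm (assumed value)

def reflectivity(delta):
    s = np.sqrt(complex(kappa**2 - delta**2))
    num = np.sinh(s * L) ** 2
    den = np.cosh(s * L) ** 2 - (delta / kappa) ** 2
    return (num / den).real

# Peak reflectivity at zero detuning is tanh^2(kappa * L)
print(round(reflectivity(0.0), 3))  # → 0.929
```

Apodization or chirp makes κ a function of position, which removes the closed form and motivates the layer-by-layer integration the record describes.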

  4. Measuring the activity of a 51Cr neutrino source based on the gamma-radiation spectrum

    NASA Astrophysics Data System (ADS)

    Gorbachev, V. V.; Gavrin, V. N.; Ibragimova, T. V.; Kalikhov, A. V.; Malyshkin, Yu. M.; Shikhin, A. A.

    2015-12-01

    A technique for the measurement of activities of intense β sources by measuring the continuous gamma-radiation (internal bremsstrahlung) spectra is developed. A method for reconstructing the spectrum recorded by a germanium semiconductor detector is described. A method for the absolute measurement of the internal bremsstrahlung spectrum of 51Cr is presented.

  5. Image reconstruction algorithms with wavelet filtering for optoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Gawali, S.; Leggio, L.; Broadway, C.; González, P.; Sánchez, M.; Rodríguez, S.; Lamela, H.

    2016-03-01

Optoacoustic imaging (OAI) is a hybrid biomedical imaging modality based on the generation and detection of ultrasound by illuminating the target tissue with laser light. Typically, laser light in the visible or near-infrared spectrum is used as the excitation source. OAI relies on image reconstruction algorithms that recover the spatial distribution of optical absorption in tissues. In this work, we apply a time-domain back-projection (BP) reconstruction algorithm and wavelet filtering for point and line detection, respectively. A comparative study between point detection and integrated line detection has been carried out by evaluating their effects on the reconstructed image. Our results demonstrate that the proposed back-projection algorithm, when combined with wavelet filtering, is efficient for reconstructing high-resolution images of absorbing spheres embedded in a non-absorbing medium.
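
Time-domain back-projection with point detectors amounts to delay-and-sum: each detector's signal is smeared back along arcs of radius c·t, and the arcs intersect at the source. A minimal 2-D sketch with one idealized point absorber (the geometry, sampling rate, and delta-like pulse are all invented; the paper additionally applies wavelet filtering to the signals):

```python
import numpy as np

c = 1500.0                      # assumed speed of sound in tissue, m/s
fs = 20e6                       # assumed sampling rate, Hz
src = np.array([0.0, 0.01])     # point absorber at (0 mm, 10 mm)

# Ring of 64 point detectors of radius 20 mm around the origin
angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)
dets = 0.02 * np.column_stack([np.cos(angles), np.sin(angles)])

# Simulate idealized signals: a unit spike at the acoustic delay
n_t = 1024
signals = np.zeros((len(dets), n_t))
for i, d in enumerate(dets):
    delay = np.linalg.norm(d - src) / c
    signals[i, int(round(delay * fs))] = 1.0

# Delay-and-sum back-projection onto a 61x61 grid (0.5 mm spacing)
xs = np.linspace(-0.015, 0.015, 61)
ys = np.linspace(-0.015, 0.015, 61)
image = np.zeros((len(ys), len(xs)))
for iy, y in enumerate(ys):
    for ix, x in enumerate(xs):
        p = np.array([x, y])
        for i, d in enumerate(dets):
            t_idx = int(round(np.linalg.norm(d - p) / c * fs))
            if t_idx < n_t:
                image[iy, ix] += signals[i, t_idx]

iy, ix = np.unravel_index(image.argmax(), image.shape)
print(round(xs[ix], 3), round(ys[iy], 3))  # brightest pixel near (0.0, 0.01)
```

Real optoacoustic pulses are bipolar (N-shaped) rather than delta spikes, which is one reason filtering of the raw signals matters before back-projection.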

  6. Radiation detector spectrum simulator

    DOEpatents

    Wolf, M.A.; Crowell, J.M.

    1985-04-09

A small battery-operated nuclear spectrum simulator having a noise source generates pulses with a Gaussian distribution of amplitudes. A switched dc bias circuit cooperating therewith generates several nominal amplitudes of such pulses and a spectral distribution of pulses that closely simulates the spectrum produced by a radiation source such as Americium-241.

  7. Radiation detector spectrum simulator

    DOEpatents

    Wolf, Michael A.; Crowell, John M.

    1987-01-01

A small battery-operated nuclear spectrum simulator having a noise source generates pulses with a Gaussian distribution of amplitudes. A switched dc bias circuit cooperating therewith generates several nominal amplitudes of such pulses and a spectral distribution of pulses that closely simulates the spectrum produced by a radiation source such as Americium-241.

  8. Fetal Alcohol Spectrum Disorder

    ERIC Educational Resources Information Center

    Caley, Linda M.; Kramer, Charlotte; Robinson, Luther K.

    2005-01-01

    Fetal alcohol spectrum disorder (FASD) is a serious and widespread problem in this country. Positioned within the community with links to children, families, and healthcare systems, school nurses are a critical element in the prevention and treatment of those affected by fetal alcohol spectrum disorder. Although most school nurses are familiar…

  9. The CMBR spectrum

    SciTech Connect

    Stebbins, A.

    1997-05-01

    Here we give an introduction to the observed spectrum of the Cosmic Microwave Background Radiation (CMBR) and discuss what can be learned about it. Particular attention will be given to how Compton scattering can distort the spectrum of the CMBR. An incomplete bibliography of relevant papers is also provided.

  10. Reconstruction of bremsstrahlung spectra from attenuation data using generalized simulated annealing.

    PubMed

    Menin, O H; Martinez, A S; Costa, A M

    2016-05-01

A generalized simulated annealing algorithm, combined with a suitable smoothing regularization function, is used to solve the inverse problem of X-ray spectrum reconstruction from attenuation data. The approach is to set the initial acceptance and visitation temperatures and to standardize the terms of the objective function so as to automate the algorithm to accommodate different spectral ranges. Experiments with both numerical and measured attenuation data are presented. Results show that the algorithm reconstructs spectral shapes accurately. It should be noted that the regularization function in this algorithm was formulated to guarantee a smooth spectrum; thus, the presented technique does not apply to X-ray spectra where characteristic radiation is present.
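
The annealing idea can be illustrated with classical simulated annealing on a toy version of the problem (the record uses the *generalized* variant with distinct acceptance and visitation temperatures; the four-bin forward model, temperature, and cooling schedule below are all invented):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy forward model: transmission through absorbers of thickness t for a
# 4-bin spectrum w with per-bin attenuation coefficients mu (made up).
mu = np.array([0.8, 0.4, 0.2, 0.1])    # cm^-1
t = np.linspace(0.0, 10.0, 25)         # absorber thicknesses, cm

def transmission(w):
    return (w[None, :] * np.exp(-t[:, None] * mu[None, :])).sum(axis=1)

w_true = np.array([0.1, 0.3, 0.4, 0.2])
data = transmission(w_true)            # noiseless "attenuation data"

def objective(w, beta=1e-3):
    resid = transmission(w) - data
    smooth = np.diff(w, 2)             # penalize curvature -> smooth spectrum
    return resid @ resid + beta * (smooth @ smooth)

# Classical Metropolis-style annealing loop
w = np.full(4, 0.25)                   # start from a flat spectrum
f = objective(w)
T = 0.01
for _ in range(20000):
    cand = np.clip(w + rng.normal(0.0, 0.02, 4), 0.0, None)
    cand /= cand.sum()                 # keep the spectrum normalized
    fc = objective(cand)
    if fc < f or rng.random() < np.exp((f - fc) / T):
        w, f = cand, fc
    T *= 0.9985                        # geometric cooling

print(np.abs(w - w_true).max() < 0.1)
```

The smoothness penalty plays the same role as the record's regularization function: it selects a physically plausible smooth spectrum among the many that fit the attenuation data equally well.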

  11. Augmented Likelihood Image Reconstruction.

    PubMed

    Stille, Maik; Kleine, Matthias; Hägele, Julian; Barkhausen, Jörg; Buzug, Thorsten M

    2016-01-01

The presence of high-density objects remains an open problem in medical CT imaging. Data from projections passing through objects of high density, such as metal implants, are dominated by noise and are highly affected by beam hardening and scatter. Reconstructed images become less diagnostically conclusive because of pronounced artifacts that manifest as dark and bright streaks. A new reconstruction algorithm is proposed with the aim of reducing these artifacts by incorporating information about the shape and known attenuation coefficients of a metal implant. Image reconstruction is considered as a variational optimization problem. The aforementioned prior knowledge is introduced in terms of equality constraints. An augmented Lagrangian approach is adapted in order to minimize the associated log-likelihood function for transmission CT. During the iterations, temporarily appearing artifacts are reduced with a bilateral filter and new projection values are calculated, which are used later on for the reconstruction. A detailed evaluation in cooperation with radiologists is performed on software and hardware phantoms, as well as on clinically relevant patient data of subjects with various metal implants. Results show that the proposed reconstruction algorithm is able to outperform contemporary metal artifact reduction methods such as normalized metal artifact reduction.

  12. Lateral Abdominal Wall Reconstruction

    PubMed Central

    Baumann, Donald P.; Butler, Charles E.

    2012-01-01

Lateral abdominal wall (LAW) defects can manifest as flank hernias, myofascial laxity/bulges, or full-thickness defects. These defects are quite different from anterior abdominal wall defects, and their complexity and the limited surgical options make repairing the LAW a challenge for the reconstructive surgeon. LAW reconstruction requires an understanding of the anatomy, physiologic forces, and the impact of denervation injury to design and perform successful reconstructions of hernia, bulge, and full-thickness defects. Reconstructive strategies must be tailored to address the inguinal ligament, retroperitoneum, chest wall, and diaphragm. Operative technique must focus on stabilization of the LAW to nonyielding points of fixation at the anatomic borders of the LAW far beyond the musculofascial borders of the defect itself. Thus, hernias, bulges, and full-thickness defects are approached in a similar fashion. Mesh reinforcement is uniformly required in lateral abdominal wall reconstruction. Inlay mesh placement with overlying myofascial coverage is preferred as a first-line option, as is the case in anterior abdominal wall reconstruction. However, interposition bridging repairs are often performed when the surrounding myofascial tissue precludes a dual-layered closure. The decision to place bioprosthetic or prosthetic mesh depends on surgeon preference, patient comorbidities, and clinical factors of the repair. Regardless of mesh type, the overlying soft tissue must provide stable cutaneous coverage and obliteration of dead space. In cases where the fasciocutaneous flaps surrounding the defect are inadequate for closure, regional pedicled flaps or free flaps are recruited to achieve stable soft tissue coverage. PMID:23372458

  13. Reconstructing the astronomical heritage

    NASA Astrophysics Data System (ADS)

    Planesas, Pere

    2011-06-01

    Studies of the astronomical heritage can deal with ancient astronomical knowledge, traditions and myths, as well as with old instruments and observatories. It is urgent to work for their recovery, before they are definitively forgotten, lost or destroyed. On the cultural side, the Joint ALMA Observatory is sponsoring the study of the local cosmology and sky of the indigenous people living in the region where ALMA is currently being built. In the case of ancient instruments, several success stories already exist, the most recent one being the reconstruction of the Madrid 25ft Herschel telescope. Examples of notable instruments pending reconstruction are listed.

  14. Upper Eyelid Reconstruction.

    PubMed

    Espinoza, Gabriela Mabel; Prost, Angela Michelle

    2016-05-01

    Reconstruction of the upper eyelid is complicated because the eyelid must retain mobility, flexibility, function, and a suitable mucosal surface over the delicate cornea. Defects of the upper eyelid may be congenital, result from traumatic injury, or follow oncologic resection. This article focuses on reconstruction due to loss of tissue. Multiple surgeries may be needed to reach the desired results, addressing loss of tissue and then loss of function. Each defect is unique, and the laxity and availability of surrounding tissue vary. Knowing the most common techniques for repair assists surgeons in the multifaceted planning that takes place.

  15. Reconstruction of the Intranasal Lining.

    PubMed

    Zenga, Joseph; Chi, John J

    2017-02-01

    Reconstruction of full-thickness nasal defects has been the subject of surgical inquiry and innovation for over 2,000 years. The replacement of the internal nasal lining is a critical feature of complex nasal reconstruction. Successful reconstruction can prevent cicatricial contraction, external distortion, and internal stenosis. An array of reconstructive possibilities has been described, including cutaneous, mucosal, and fascial options. The challenge to the reconstructive surgeon is to select the repair that maximizes internal stability, while maintaining a patent nasal airway, minimizing morbidity, and meeting patient expectations. This article reviews the options available for the reconstruction of the intranasal lining.

  16. Reconstructing Progressive Education

    ERIC Educational Resources Information Center

    Kaplan, Andy

    2013-01-01

    The work of Colonel Francis W. Parker, the man whom Dewey called "the father of progressive education," provides a starting point for reconstructing the loose ambiguities of progressive education into a coherent social and educational philosophy. Although progressives have claimed their approach is more humane and sensitive to children, we need…

  17. IRIS Spectrum Line Plot

    NASA Video Gallery

    This video shows a line plot of the spectrum. The spectra here are shown for various locations on the Sun. The changes in the movie are caused by differing physical conditions in the locations. Cre...

  18. Quantum Spread Spectrum Communication

    SciTech Connect

    Humble, Travis S

    2010-01-01

    We demonstrate that spectral teleportation can coherently dilate the spectral probability amplitude of a single photon. In preserving the encoded quantum information, this variant of teleportation subsequently enables a form of quantum spread spectrum communication.

  19. Autism Spectrum Disorder (ASD)

    MedlinePlus

    ... essential data on ASD, search for factors that put children at risk for ASD and possible causes, ... United States to help identify factors that may put children at risk for autism spectrum disorder (ASD) ...

  20. Autism Spectrum Disorder (ASD)

    MedlinePlus

    ... Autism Spectrum Disorder (ASD): Condition Information ... Restricted interests and repetitive behaviors ... Different people with autism can have different symptoms. For this reason, autism ...

  1. Comparison of spread spectrum and pulse signal excitation for split spectrum techniques composite imaging

    NASA Astrophysics Data System (ADS)

    Svilainis, L.; Kitov, S.; Rodríguez, A.; Vergara, L.; Dumbrava, V.; Chaziachmetovas, A.

    2012-12-01

    Ultrasonic imaging of composites was investigated. Glass and carbon fiber reinforced plastics produced by resin transfer molding and prepreg forming were analyzed. In some of the samples air bubbles were trapped during the RTM (resin transfer molding) process, and interlayer gaps were present in the prepreg technology samples. One of the most promising techniques to apply in such cases is Split Spectrum Processing. On the other hand, such signals require specific processing to reliably reconstruct the temporal position of the defect reflection. Correlation processing can be used for signal compression, or Wiener filtering can be applied for spectral content equalisation. Pulse signals are simple to generate but lack the possibility to alter the signal's spectrum shape. Spread spectrum signals offer a powerful tool for increasing signal energy over the frequency band and enhancing resolution. A CW (continuous wave) burst has high energy but lacks the bandwidth needed for SSP (spread spectrum processing). The aim of the investigation was to compare the performance of the above signals for composite imaging, when various Split Spectrum Processing techniques are used with preceding Wiener processing for spectral content compensation. The resulting composite signals and images obtained are presented. Structural noise removal performance was evaluated using Receiver Operating Characteristics (ROC).
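
    The minimization variant of Split Spectrum Processing can be sketched briefly: the received spectrum is split into overlapping sub-bands, each band is brought back to the time domain, and the pointwise minimum of the rectified outputs is kept. A toy Python example (our construction; the filter bank parameters and synthetic A-scan are illustrative, not from the paper):

```python
import numpy as np

def split_spectrum_min(signal, fs, f_lo, f_hi, n_bands=8, rel_bw=0.15):
    """Split-spectrum processing with minimization recombination:
    filter the signal through a bank of Gaussian band-pass filters and
    keep the pointwise minimum of the rectified band outputs."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    spectrum = np.fft.rfft(signal)
    centers = np.linspace(f_lo, f_hi, n_bands)
    bw = rel_bw * (f_hi - f_lo)
    outputs = []
    for fc in centers:
        h = np.exp(-((freqs - fc) ** 2) / (2 * bw ** 2))  # Gaussian band filter
        band = np.fft.irfft(spectrum * h, n)
        outputs.append(np.abs(band))
    return np.min(outputs, axis=0)

# Toy A-scan: a defect echo centered at sample 300 plus broadband grain noise.
rng = np.random.default_rng(1)
fs = 100e6
t = np.arange(1024) / fs
echo = np.exp(-((np.arange(1024) - 300) ** 2) / 50) * np.sin(2 * np.pi * 5e6 * t)
noise = 0.2 * rng.normal(size=1024)
out = split_spectrum_min(echo + noise, fs, 2e6, 8e6)
```

    Because grain noise decorrelates across sub-bands while a defect echo does not, the minimum suppresses structural noise and leaves the defect indication at its correct temporal position.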

  2. Preparing for Breast Reconstruction Surgery

    MedlinePlus

    ... Cancer Breast Reconstruction Surgery Preparing for Breast Reconstruction Surgery Your surgeon can help you know what to ... The plan for follow-up Costs Understanding your surgery costs Health insurance policies often cover most or ...

  3. Controversies in Parotid Defect Reconstruction.

    PubMed

    Tamplen, Matthew; Knott, P Daniel; Fritz, Michael A; Seth, Rahul

    2016-08-01

    Reconstruction of the parotid defect is a complex topic that encompasses restoration of both facial form and function. The reconstructive surgeon must consider facial contour, avoidance of Frey syndrome, skin coverage, tumor surveillance, potential adjuvant therapy, and facial reanimation when addressing parotid defects. With each defect there are several options within the reconstructive ladder, creating controversies regarding optimal management. This article describes surgical approaches to reconstruction of parotid defects, highlighting areas of controversy.

  4. Amplitude of primeval fluctuations from cosmological mass density reconstructions

    NASA Technical Reports Server (NTRS)

    Seljak, Uros; Bertschinger, Edmund

    1994-01-01

    We use the POTENT reconstruction of the mass density field in the nearby universe to estimate the amplitude of the density fluctuation power spectrum for various cosmological models. We find that sigma_8 Omega_m^0.6 = 1.3 (+0.4, -0.3), almost independently of the power spectrum. This value agrees well with the Cosmic Background Explorer (COBE) normalization for the standard cold dark matter model, while alternative models predict an excessive amplitude compared with COBE. Flat, low-Omega_m models and tilted models with spectral index n less than 0.8 are particularly discordant.
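
    The quoted constraint can be inverted for any assumed matter density; a quick numerical check of what sigma_8 Omega_m^0.6 = 1.3 implies (our arithmetic, not from the paper):

```python
# Invert the POTENT constraint sigma_8 * Omega_m**0.6 = 1.3
def sigma8(omega_m, amplitude=1.3):
    return amplitude / omega_m ** 0.6

# Omega_m = 1 (standard CDM) gives sigma_8 = 1.3 directly, while a
# low-density Omega_m = 0.3 universe would need sigma_8 near 2.7 --
# the "excessive amplitude" the abstract refers to.
s8_scdm = sigma8(1.0)
s8_low = sigma8(0.3)
```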

  5. Micro acoustic spectrum analyzer

    DOEpatents

    Schubert, W. Kent; Butler, Michael A.; Adkins, Douglas R.; Anderson, Larry F.

    2004-11-23

    A micro acoustic spectrum analyzer for determining the frequency components of a fluctuating sound signal comprises a microphone to pick up the fluctuating sound signal and produce an alternating current electrical signal; at least one microfabricated resonator, each resonator having a different resonant frequency, that vibrate in response to the alternating current electrical signal; and at least one detector to detect the vibration of the microfabricated resonators. The micro acoustic spectrum analyzer can further comprise a mixer to mix a reference signal with the alternating current electrical signal from the microphone to shift the frequency spectrum to a frequency range that is better matched to the resonant frequencies of the microfabricated resonators. The micro acoustic spectrum analyzer can be designed specifically for portability, size, cost, accuracy, speed, power requirements, and use in a harsh environment. The micro acoustic spectrum analyzer is particularly suited for applications where size, accessibility, and power requirements are limited, such as the monitoring of industrial equipment and processes, detection of security intrusions, or evaluation of military threats.
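
    The mixer's role, shifting the sound spectrum into the resonators' range, is ordinary heterodyning: multiplying two tones yields their sum and difference frequencies. A sketch with made-up frequencies (ours, not from the patent):

```python
import numpy as np

fs = 100_000                    # sample rate, Hz
t = np.arange(fs) / fs          # exactly one second of samples
f_sig, f_ref = 10_000, 8_000    # input tone and reference oscillator, Hz

# Mixing (multiplying) the two tones produces sum and difference frequencies:
# cos(a)cos(b) = 0.5[cos(a-b) + cos(a+b)]
mixed = np.cos(2 * np.pi * f_sig * t) * np.cos(2 * np.pi * f_ref * t)

spectrum = np.abs(np.fft.rfft(mixed))
freqs = np.fft.rfftfreq(len(mixed), 1 / fs)
peaks = freqs[np.argsort(spectrum)[-2:]]   # the two strongest bins
```

    With a 10 kHz input and an 8 kHz reference, the mixed signal carries energy at 2 kHz and 18 kHz; the difference component is the one shifted toward a lower-frequency resonator bank.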

  6. Medial Patellofemoral Ligament Reconstruction

    PubMed Central

    Palacios, Jose Antonio; Yacuzzi, Carlos; Oñativia, Jose I.; Zicaro, Juan Pablo; Costa-Paz, Matias

    2017-01-01

    Objectives: Recurrent patellofemoral dislocation is usually a multifactorial pathology. Different surgical techniques have been described according to the etiology of dislocation. In the absence of a severe malalignment or an anatomical patellofemoral dysplasia, reconstruction of the Medial Patellofemoral Ligament (MPFL) can restore the normal tracking of the patella, avoiding lateral excursion. The purpose of this study was to evaluate clinical results and complications in patients who underwent a MPFL reconstruction. Methods: We retrospectively evaluated 19 patients who underwent an anatomic MPFL reconstruction using autologous semitendinosus graft between 2007 and 2012. Exclusion criteria were patients with less than three years of follow-up and those with an associated procedure such as distal realignment or trochleoplasty. Clinical outcomes were measured using the Kujala score and return-to-sport rate. We registered the postoperative complications and recurrence rate. Results: Nine patients were men and 10 were women, with a mean age of 25 years. Average follow-up was 5.8 years. Nine patients (47.4%) returned to their previous sport level, 8 (42.1%) changed to another sport or decreased their level, and 2 (10.5%) were unable to practice any sports at all. The Kujala score improved from 62.8 preoperatively to 88.8 postoperatively. One patient decreased the Kujala score. Eighty-nine percent of patients were satisfied with their outcome. One patient had a patellar fracture and four developed arthrofibrosis and required mobilization under anesthesia. No recurrences were registered. Conclusion: Isolated MPFL reconstruction for recurrent patellofemoral dislocation is an effective alternative in the absence of severe malalignment or anatomical dysplasia. Although no recurrences were registered at minimum 3-year follow-up, almost half of the patients were not able to return to their previous sport level.

  7. Kinky tomographic reconstruction

    SciTech Connect

    Hanson, K.M.; Cunningham, G.S.; Bilisoly, R.L.

    1996-05-01

    We address the issue of how to make decisions about the degree of smoothness demanded of a flexible contour used to model the boundary of a 2D object. We demonstrate the use of a Bayesian approach to set the strength of the smoothness prior for a tomographic reconstruction problem. The Akaike Information Criterion is used to determine whether to allow a kink in the contour.
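
    The Akaike Information Criterion used above trades goodness of fit against model complexity: AIC = 2k - 2 ln L, which for least-squares fits with Gaussian errors reduces to n ln(RSS/n) + 2k. A toy model-selection example (ours; the paper applies the criterion to contour kinks, not polynomials):

```python
import numpy as np

def aic(y, y_fit, k):
    """AIC for a least-squares fit with Gaussian errors:
    n*ln(RSS/n) + 2k, where k counts free parameters."""
    n = len(y)
    rss = np.sum((y - y_fit) ** 2)
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 1 + 2 * x + 0.05 * rng.normal(size=50)   # truly linear data

# Compare an underfit (constant), the right model (line), and an overfit (sextic):
# the constant pays in residuals, the sextic pays in the 2k penalty.
scores = {}
for deg in (0, 1, 6):
    coef = np.polyfit(x, y, deg)
    scores[deg] = aic(y, np.polyval(coef, x), deg + 1)

best = min(scores, key=scores.get)
```

    The same bookkeeping decides whether adding a kink (extra parameters) to the contour is justified by the improvement in fit.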

  8. Temporal Surface Reconstruction

    DTIC Science & Technology

    1991-05-03

    Structure information can be recovered from a sequence of images through a number of visual mechanisms such as shading, motion, and stereo. Image information is commonly available in a time-continuous fashion, and this work proposes a method for estimating surface structure through stages which are repeated over time.

  9. World reconstruction in psychotherapy.

    PubMed

    Bergner, Raymond M

    2005-01-01

    The purpose of this article is to articulate how we, as psychotherapists, can transform the worlds of our clients. In part one of the article, the concept of "world," the dynamics of how worlds operate, and the clinically relevant notions of "problematic worlds" and "impossible worlds" are explicated. In part two, therapeutic recommendations for helping clients to reconstruct their worlds are presented, with special emphasis on problems of grief, post-traumatic stress disorder, and the experience of meaninglessness.

  10. Enhancement of low-quality reconstructed digital hologram images based on frequency extrapolation of large objects under the diffraction limit

    NASA Astrophysics Data System (ADS)

    Liu, Ning; Li, Weiliang; Zhao, Dongxue

    2016-06-01

    During the reconstruction of a digital hologram, the reconstructed image is usually degraded by speckle noise, which makes it hard to observe the original object pattern. In this paper, a new reconstructed image enhancement method is proposed, which first reduces the speckle noise using an adaptive Gaussian filter, then calculates the high frequencies that belong to the object pattern based on a frequency extrapolation strategy. The proposed frequency extrapolation first calculates the frequency spectrum of the Fourier-filtered image, which is originally reconstructed from the +1 order of the hologram, and then gives the initial parameters for an iterative solution. The analytic iteration is implemented by continuous gradient threshold convergence to estimate the image level and vertical gradient information. The predicted spectrum is acquired through the analytical iteration of the original spectrum and gradient spectrum analysis. Finally, the reconstructed spectrum of the restoration image is acquired from the synthetic correction of the original spectrum using the predicted gradient spectrum. We conducted our experiment very close to the diffraction limit and used low-quality equipment to prove the feasibility of our method. Detailed analysis and figure demonstrations are presented in the paper.

  11. Enhancement of low quality reconstructed digital hologram images based on frequency extrapolation of large objects under the diffraction limit

    NASA Astrophysics Data System (ADS)

    Liu, Ning; Chen, Xiaohong; Yang, Chao

    2016-11-01

    During the reconstruction of a digital hologram, the reconstructed image is usually degraded by speckle noise, which makes it hard to observe the original object pattern. In this paper, a new reconstructed image enhancement method is proposed, which first reduces the speckle noise using an adaptive Gaussian filter, then calculates the high frequencies that belong to the object pattern based on a frequency extrapolation strategy. The proposed frequency extrapolation first calculates the frequency spectrum of the Fourier-filtered image, which is originally reconstructed from the +1 order of the hologram, and then gives the initial parameters for an iterative solution. The analytic iteration is implemented by continuous gradient threshold convergence to estimate the image level and vertical gradient information. The predicted spectrum is acquired through the analytical iteration of the original spectrum and gradient spectrum analysis. Finally, the reconstructed spectrum of the restoration image is acquired from the synthetic correction of the original spectrum using the predicted gradient spectrum. We conducted our experiment very close to the diffraction limit and used low quality equipment to prove the feasibility of our method. Detailed analysis and figure demonstrations are presented in the paper.

  12. Input reconstruction of chaos sensors.

    PubMed

    Yu, Dongchuan; Liu, Fang; Lai, Pik-Yin

    2008-06-01

    Although the sensitivity of sensors can be significantly enhanced using chaotic dynamics due to its extremely sensitive dependence on initial conditions and parameters, how to reconstruct the measured signal from the distorted sensor response becomes challenging. In this paper we suggest an effective method to reconstruct the measured signal from the distorted (chaotic) response of chaos sensors. This signal reconstruction method applies neural network techniques for system structure identification and therefore does not require precise information about the sensor's dynamics. We also discuss how to improve the robustness of the reconstruction. Some examples are presented to illustrate the suggested method.

  13. Epiglottic reconstruction and subtotal laryngectomy.

    PubMed

    Schechter, G L

    1983-06-01

    Vertical hemilaryngectomy has been expanded aggressively in recent years so that, in some cases, the term subtotal laryngectomy would be more appropriate. Reconstruction after these extended resections is a problem. Intraluminal stenting has not been successful in cases where resection has been aggressive. The resulting lumen is inadequate. As a means of overcoming this problem, the epiglottic reconstruction procedure has been promoted. This paper presents experiences with 12 patients who underwent epiglottic reconstruction after subtotal laryngectomy. Indications, anatomic details, and overall results using this reconstructive technique are outlined. It is the conclusion of the author that epiglottic reconstruction is an effective procedure for preservation of function after subtotal laryngectomy.

  14. Scapholunate Ligament Reconstruction

    PubMed Central

    Ross, Mark; Loveridge, Jeremy; Cutbush, Kenneth; Couzens, Greg

    2013-01-01

    Background Scapholunate reconstruction poses a challenge to orthopedic surgeons. Materials and Methods Prospective cohort. Description of Technique Our technique for scapholunate (SL) reconstruction involves ligament reconstruction utilizing a portion of the flexor carpi radialis tendon rerouted via transosseous tunnels across the scaphoid, lunate, and triquetrum (scapholunotriquetral tenodesis). The tendon graft is secured with interference screw fixation into the triquetrum. The philosophy of this new technique is to reduce subluxation and maintain the relationship between scaphoid and lunate by placing a graft through the center of the SL articulation. This graft is then tensioned by passing it centrally through the lunate and triquetrum and secured using an interference screw in the triquetrum. Secondary stabilizers, including the dorsal intercarpal ligament, are then augmented by passing the graft back to the scaphoid, crossing from the triquetrum over the proximal capitate. This further reinforces the translational relationship between the scaphoid and the triquetrum and, therefore, augments stability of the SL articulation. Results We have utilized this technique successfully in over 40 patients since 2009. We report on a prospective consecutive series of 11 patients with over 12 months follow-up (range 12 to 24 months) demonstrating good early radiological and clinical outcomes. Conclusions In developing this technique, we aimed to take the best features of previously described techniques and address the perceived shortcomings of each. We believe there are several benefits of our technique. Moreover, few other techniques address as many of the aspects of chronic SL instability as our technique does. PMID:24436802

  15. Stepwise method based on Wiener estimation for spectral reconstruction in spectroscopic Raman imaging.

    PubMed

    Chen, Shuo; Wang, Gang; Cui, Xiaoyu; Liu, Quan

    2017-01-23

    Raman spectroscopy has demonstrated great potential in biomedical applications. However, spectroscopic Raman imaging is limited in the investigation of fast changing phenomena because of slow data acquisition. Our previous studies have indicated that spectroscopic Raman imaging can be significantly sped up using the approach of narrow-band imaging followed by spectral reconstruction. A multi-channel system was built to demonstrate the feasibility of fast wide-field spectroscopic Raman imaging using the approach of simultaneous narrow-band image acquisition followed by spectral reconstruction based on Wiener estimation in phantoms. To further improve the accuracy of reconstructed Raman spectra, we propose a stepwise spectral reconstruction method in this study, which can be combined with the earlier developed sequential weighted Wiener estimation to improve spectral reconstruction accuracy. The stepwise spectral reconstruction method first reconstructs the fluorescence background spectrum from narrow-band measurements and then the pure Raman narrow-band measurements can be estimated by subtracting the estimated fluorescence background from the overall narrow-band measurements. Thereafter, the pure Raman spectrum can be reconstructed from the estimated pure Raman narrow-band measurements. The result indicates that the stepwise spectral reconstruction method can improve spectral reconstruction accuracy significantly when combined with sequential weighted Wiener estimation, compared with the traditional Wiener estimation. In addition, qualitatively accurate cell Raman spectra were successfully reconstructed using the stepwise spectral reconstruction method from the narrow-band measurements acquired by a four-channel wide-field Raman spectroscopic imaging system. This method can potentially facilitate the adoption of spectroscopic Raman imaging to the investigation of fast changing phenomena.
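
    The traditional Wiener estimation that the stepwise method builds on can be sketched compactly: a linear estimator W = C_sm C_mm^{-1}, formed from training statistics, maps a few narrow-band measurements back to a full spectrum. A synthetic Python illustration (our construction; dimensions, filter shapes, and the training spectra are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: 500 smooth "spectra" of 64 wavelength bins,
# each a sum of a few random Gaussian peaks (stand-ins for Raman spectra).
n_train, n_bins = 500, 64
grid = np.arange(n_bins)
spectra = np.zeros((n_train, n_bins))
for i in range(n_train):
    for _ in range(3):
        c, w, a = rng.uniform(5, 59), rng.uniform(2, 6), rng.uniform(0.5, 1.5)
        spectra[i] += a * np.exp(-((grid - c) ** 2) / (2 * w ** 2))

# Four narrow-band filters (Gaussian transmission curves).
centers = [10, 26, 42, 58]
M = np.stack([np.exp(-((grid - c) ** 2) / (2 * 6.0 ** 2)) for c in centers])
measurements = spectra @ M.T                      # (n_train, 4) narrow-band readings

# Wiener estimation: W = C_sm @ C_mm^{-1}, built from training statistics.
C_sm = spectra.T @ measurements / n_train
C_mm = measurements.T @ measurements / n_train
W = C_sm @ np.linalg.inv(C_mm)

reconstructed = measurements @ W.T                # back to 64 bins
rel_err = np.linalg.norm(reconstructed - spectra) / np.linalg.norm(spectra)
```

    The stepwise refinement in the paper applies this machinery twice: once to estimate the fluorescence background from the narrow-band readings, and again on the background-subtracted readings to recover the pure Raman spectrum.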

  16. Broad spectrum solar cell

    DOEpatents

    Walukiewicz, Wladyslaw; Yu, Kin Man; Wu, Junqiao; Schaff, William J.

    2007-05-15

    An alloy having a large band gap range is used in a multijunction solar cell to enhance utilization of the solar energy spectrum. In one embodiment, the alloy is In(1-x)Ga(x)N having an energy bandgap range of approximately 0.7 eV to 3.4 eV, providing a good match to the solar energy spectrum. Multiple junctions having different bandgaps are stacked to form a solar cell. Each junction may have different bandgaps (realized by varying the alloy composition), and therefore be responsive to different parts of the spectrum. The junctions are stacked in such a manner that some bands of light pass through upper junctions to lower junctions that are responsive to such bands.
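
    The quoted 0.7-3.4 eV band gap range maps directly onto photon wavelengths via lambda(nm) ~= 1239.84/E(eV); a quick check (our arithmetic, not from the patent):

```python
# Photon wavelength corresponding to a band gap: lambda(nm) = hc/E ~= 1239.84/E(eV)
def gap_to_wavelength_nm(e_gap_ev):
    return 1239.84 / e_gap_ev

# The InN-rich end (0.7 eV) absorbs out to the near infrared (~1770 nm),
# while the GaN-rich end (3.4 eV) cuts off in the near ultraviolet (~365 nm).
lam_inn = gap_to_wavelength_nm(0.7)
lam_gan = gap_to_wavelength_nm(3.4)
```

    So a single alloy family can in principle span from the near infrared to the near ultraviolet, which is why it offers a good match to the solar spectrum when graded across stacked junctions.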

  17. An unusual meteor spectrum

    NASA Technical Reports Server (NTRS)

    Cook, A. F.; Hemenway, C. L.; Millman, P. M.; Swider, A.

    1973-01-01

    An extraordinary spectrum of a meteor at a velocity of about 18.5 ± 1.0 km/s was observed with an image orthicon camera. The radiant of the meteor was at an altitude of about 49 deg. It was first seen showing a yellow-red continuous spectrum alone at a height of 137 ± 8 km, which is ascribed to the first positive group of nitrogen bands. After the meteor had descended to 116 ± 6 km above sea level it brightened rapidly from its previous threshold brightness into a uniform continuum, the D-line of neutral sodium appeared, and at a height of 105 ± 5 km all the other lines of the spectrum also appeared. The continuum remained dominant to the end. Water of hydration and entrained carbon flakes of characteristic dimension about 0.2 micron or less are proposed as constituents of the meteoroid to explain these phenomena.

  18. Posteromedial Corner Reconstruction

    PubMed Central

    Ferrer, Gonzalo; Leon, Agustín; Wirth, Hans; Mena, Adolfo; Tuca, María José; Espinoza, Gonzalo

    2017-01-01

    Objective: Report the experience, after 1-year follow-up, of 30 patients who underwent anatomical knee reconstruction of posteromedial corner (PMC) injuries, using LaPrade's technique. Methods: Retrospective cohort study of 30 consecutive patients with PMC injuries operated between November 2010 and May 2014 by the same surgical team. Inclusion criteria: patients with clinical presentation and images (stress radiographs and MRI) compatible with PMC injury, who maintained a grade III chronic instability in spite of at least 3 months of orthopedic treatment, who were reconstructed using LaPrade's anatomical technique, and completed at least 12 months of follow-up. Exclusion criteria: discordance between clinical and image studies, grade I or II medial instability, and surgery performed through a different technique. Data were collected by reviewing the electronic files and images. Functional scores (IKDC and Lysholm) were applied and registered in the preoperative evaluation, and then 6 and 12 months after surgery. Results: Thirty patients (28 men and 2 women) met the inclusion criteria. Mean age was 43 years (24-69). The vast majority (28 patients) had a high-energy mechanism of injury. Twenty patients were diagnosed in the acute setting, while 10 had a delayed diagnosis after poor results of concomitant ligament reconstructions. With the exception of 2 patients, who presented with isolated PMC injury, the majority had associated injuries as detailed: 11 cases had PMC + anterior cruciate ligament (ACL) injury, 3 patients had PMC + posterior cruciate ligament (PCL) injury, 3 patients had PMC + meniscal tears, 9 patients had PMC + ACL + PCL injuries, and there were 2 cases of PMC + ACL + PCL + lateral collateral ligament injuries. Mean time for PMC reconstruction surgery was 5 months (range 2-32). Lysholm and IKDC scores were 18.2 (2-69) and 24.3 (9.2-52.9), respectively, in the preoperative setting, improving to 76.7 (44-94) and 70.7 (36.8-95.4) after 1-year follow-up.

  19. NREL Spectrum of Innovation

    ScienceCinema

    None

    2016-07-12

    There are many voices calling for a future of abundant clean energy. The choices are difficult and the challenges daunting. How will we get there? The National Renewable Energy Laboratory integrates the entire spectrum of innovation including fundamental science, market relevant research, systems integration, testing and validation, commercialization and deployment. The innovation process at NREL is interdependent and iterative. Many scientific breakthroughs begin in our own laboratories, but new ideas and technologies come to NREL at any point along the innovation spectrum to be validated and refined for commercial use.

  20. Spectrum of wormholes

    SciTech Connect

    Hawking, S.W. (Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Silver Street, Cambridge CB3 9EW); Page, D.N. (Department of Physics, Pennsylvania State University, University Park, PA; Theoretical Physics Institute, Department of Physics, University of Alberta, Edmonton, AB)

    1990-10-15

    Wormholes have been studied mainly in the semiclassical approximation as solutions of the classical Euclidean field equations. However, such solutions are rather special, and exist only for certain kinds of matter. On the other hand, one can represent wormholes in a more general manner as solutions of the Wheeler-DeWitt equation with appropriate boundary conditions. Minisuperspace models with massless minimal or conformal scalar fields have a discrete spectrum of these solutions. The Giddings-Strominger instanton solution corresponds to a sum of an infinite number of these solutions. Minisuperspace models with a massive scalar field also appear to have a discrete spectrum of such solutions, whose asymptotic form is given.

  1. Improving VHF Spectrum Utilization

    NASA Technical Reports Server (NTRS)

    Andro, Monty; Orr, Richard; Foore, Larry; Sheehe, Charles; Freeman, Mark; Nguyen, Thanh; Bretmersky, Steven; Laberge, Chuck; Buchanan, David

    2004-01-01

    Limited VHF communications system capacity and increasing air traffic result in congestion of the aviation VHF spectrum. Voice communication errors and delayed channel access create system congestion and air traffic delays. Regulatory subdivision of bands for specific functions limits flexibility in frequency usage. The objective of this viewgraph presentation is to identify near-, mid-, and far-term technologies to improve the performance and spectrum efficiency of current and emerging VHF communications systems, to select the technologies with the highest potential, and to perform the research and development needed to bring them to the implementation stage.

  2. NREL Spectrum of Innovation

    SciTech Connect

    2011-01-01

    There are many voices calling for a future of abundant clean energy. The choices are difficult and the challenges daunting. How will we get there? The National Renewable Energy Laboratory integrates the entire spectrum of innovation including fundamental science, market relevant research, systems integration, testing and validation, commercialization and deployment. The innovation process at NREL is interdependent and iterative. Many scientific breakthroughs begin in our own laboratories, but new ideas and technologies come to NREL at any point along the innovation spectrum to be validated and refined for commercial use.

  3. Flat-spectrum speech.

    PubMed

    Schroeder, M R; Strube, H W

    1986-05-01

    Flat-spectrum stimuli, consisting of many equal-amplitude harmonics, produce timbre sensations that can depend strongly on the phase angles of the individual harmonics. For fundamental frequencies in the human pitch range, many realizable timbres have vowel-like perceptual qualities. This observation suggests the possibility of constructing intelligible voiced speech signals that have flat-amplitude spectra. This paper describes a successful experiment of creating several different diphthongs by judicious choice of the phase angles of a flat-spectrum waveform. A possible explanation of the observed vowel timbres lies in the dependence of the short-time amplitude spectra on phase changes.
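
    Such stimuli are easy to synthesize: equal-amplitude harmonics whose phases are free parameters. A Python sketch contrasting zero phases with a quadratic (Schroeder-type) phase rule, a standard low-peak-factor choice and not necessarily the phases used in the paper:

```python
import numpy as np

def flat_spectrum_tone(f0, n_harmonics, fs, duration, phases):
    """Sum of equal-amplitude harmonics of f0 with prescribed phases.
    The amplitude spectrum is flat; only the phases (and hence the
    waveform shape, and perceptually the timbre) change."""
    t = np.arange(int(fs * duration)) / fs
    x = np.zeros_like(t)
    for n in range(1, n_harmonics + 1):
        x += np.cos(2 * np.pi * n * f0 * t + phases[n - 1])
    return x

fs, f0, N = 16000, 100, 30
# Quadratic (Schroeder-type) phases versus all-zero phases:
schroeder = np.array([np.pi * n * (n + 1) / N for n in range(1, N + 1)])
x_schroeder = flat_spectrum_tone(f0, N, fs, 1.0, schroeder)
x_cosine = flat_spectrum_tone(f0, N, fs, 1.0, np.zeros(N))

# Same flat amplitude spectrum, very different waveforms: with zero
# phases all harmonics align at t = 0, giving a much higher peak factor.
crest = lambda s: np.max(np.abs(s)) / np.sqrt(np.mean(s ** 2))
```

    Speech-like timbres then amount to finding phase patterns whose waveforms mimic vowel-shaped short-time spectra, even though the long-time amplitude spectrum stays flat.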

  4. Microwave assisted reconstruction of optical interferograms for distributed fiber optic sensing.

    PubMed

    Huang, Jie; Hua, Lei; Lan, Xinwei; Wei, Tao; Xiao, Hai

    2013-07-29

    This paper reports a distributed fiber optic sensing technique through microwave assisted separation and reconstruction of optical interferograms in spectrum domain. The approach involves sending a microwave-modulated optical signal through cascaded fiber optic interferometers. The microwave signal was used to resolve the position and reflectivity of each sensor along the optical fiber. By sweeping the optical wavelength and detecting the modulation signal, the optical spectrum of each sensor can be reconstructed. Three cascaded fiber optic extrinsic Fabry-Perot interferometric sensors were used to prove the concept. Their microwave-reconstructed interferogram matched well with those recorded individually using an optical spectrum analyzer. The application in distributed strain measurement has also been demonstrated.
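
    The position-resolving step can be sketched as follows: each reflector at round-trip delay tau contributes a cos(2*pi*f*tau) ripple to the detected response as the microwave modulation frequency f is swept, so a Fourier transform over the sweep turns delays into a distance profile. A toy Python example (our parameters, not the paper's setup):

```python
import numpy as np

c_fiber = 2e8                              # light speed in fiber, m/s (approx.)
positions = np.array([10.0, 20.0, 30.0])   # hypothetical reflector positions, m
delays = 2 * positions / c_fiber           # round-trip delays, s

# Sweep the microwave modulation frequency and record the summed response:
n, df = 1000, 1e6                          # 1000 sweep points, 1 MHz steps
f = np.arange(n) * df
response = sum(np.cos(2 * np.pi * f * tau) for tau in delays)

# A Fourier transform over the sweep turns each delay into a peak:
profile = np.abs(np.fft.fft(response))[: n // 2]
times = np.arange(n // 2) / (n * df)       # conjugate (delay) axis, s
top3 = np.sort(times[np.argsort(profile)[-3:]])
recovered = c_fiber * top3 / 2             # back to positions in meters
```

    Once each reflector is separated in the delay domain this way, its individual optical interferogram can be reassembled by repeating the measurement at each optical wavelength, which is the reconstruction the paper demonstrates.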

  5. Spread Spectrum Frequency Management

    DTIC Science & Technology

    1989-06-01

    theoretically predicted behavior of the new system. The experimental program must include field tests in real propagation and interference environments...technological developments and without adequate overall knowledge of propagation characteristics or of other important uses that might require... propagation characteristics at the different frequency levels. The history of major spectrum allocations is then a record of decisions primarily

  6. Stellar Spectrum Synthesizer

    ERIC Educational Resources Information Center

    Landegren, G. F.

    1975-01-01

    Describes a device which employs two diffraction gratings and three or four simple lenses to produce arbitrary absorption or emission spectra that may be doppler shifted and spectroscopically examined by students some distance away. It may be regarded as a sort of artificial star whose spectrum may be analyzed as an undergraduate laboratory…

  7. The Frequency Spectrum Radio.

    ERIC Educational Resources Information Center

    Howkins, John, Ed.

    1979-01-01

    This journal issue focuses on the frequency spectrum used in radio communication and on the World Administrative Radio Conference, sponsored by the International Telecommunication Union, held in Geneva, Switzerland, in the fall of 1979. Articles describe the World Administrative Radio Conference as the most important radio communication conference…

  8. Battlefield spectrum management

    NASA Astrophysics Data System (ADS)

    Sivakumar, C.

    1997-06-01

Modern tactical communications systems rely on radios to support network and user connectivity. One of the challenges for network planners and managers is to make the best use of scarce and vulnerable frequency spectrum resources to support the communication needs of war fighters. With the wide variety of Iris radio types typically deployed in the battlefield (ranging from high frequency to super high frequency), a comprehensive suite of tools is necessary to ensure that frequency interference is kept to a minimum. Without a sophisticated frequency spectrum management system, the most advanced tactical communications systems could be rendered useless, jeopardizing human life and national security. For these reasons, it is important to develop an Iris-wide battlefield spectrum management capability that takes full advantage of current frequency spectrum management research and development (R&D), related tools, and supporting technology for assigning frequencies. This session briefly describes various assignment strategies adopted in the Iris BFSM for overcoming cosite/collocated/farsite interference, along with the propagation models [from high frequency (HF) to super high frequency (SHF)] used for the assignment of frequencies. A brief thread outlining the process for generating frequency allocation/assignment requests and analyzing frequency interference is also discussed.

  9. Measuring the activity of a {sup 51}Cr neutrino source based on the gamma-radiation spectrum

    SciTech Connect

Gorbachev, V. V.; Gavrin, V. N.; Ibragimova, T. V.; Kalikhov, A. V.; Malyshkin, Yu. M.; Shikhin, A. A.

    2015-12-15

    A technique for the measurement of activities of intense β sources by measuring the continuous gamma-radiation (internal bremsstrahlung) spectra is developed. A method for reconstructing the spectrum recorded by a germanium semiconductor detector is described. A method for the absolute measurement of the internal bremsstrahlung spectrum of {sup 51}Cr is presented.

  10. Reconstructing vanished ocean basins

    NASA Astrophysics Data System (ADS)

    Müller, D.; Sdrolias, M.; Gaina, C.

    2006-05-01

The large-scale patterns of mantle convection are mainly dependent on the history of subduction. Therefore some of the primary constraints for subduction models are given by the location of subduction zones through time, and by the convergence vectors and age of subducted lithosphere. This requires the complete reconstruction of ocean floor through time, including the main ocean basins, back-arc basins, and now-subducted ocean crust, and the tying of these kinematic models to geodynamic simulations. We reconstruct paleo-oceans by creating "synthetic plates", the locations and geometry of which are established on the basis of preserved ocean crust (magnetic lineations and fracture zones), geological data, paleogeography, and the rules of plate tectonics. We use a merged moving hotspot (Late Cretaceous-present) and palaeomagnetic/fixed hotspot (Early Cretaceous) reference frame, coupled with reconstructed spreading histories of the Pacific, Phoenix and Farallon plates and the plates involved in the Tethys oceanic domain. Based on this approach we have created a set of global oceanic paleo-isochrons and paleo-oceanic age grids. The grids also provide the first complete global set of paleo-basement depth maps, including now-subducted ocean floor, for the last 130 million years based on a depth-age relationship. We show that the mid-Cretaceous sea-level highstand was primarily caused by two main factors: (1) the "supercontinent breakup effect", which resulted in the creation of the mid-Atlantic and Indian Ocean ridges at the expense of subducting old ocean floor in the Tethys, and (2) a changing age-area distribution of Pacific ocean floor through time, resulting from the subduction of the Pacific-Izanagi, Pacific-Phoenix and Pacific-Farallon ridges. These grids provide model constraints for subduction dynamics through time and represent a framework for backtracking biogeographic and sediment data from ocean drilling and for constraining the opening/closing of oceanic

  11. [EMD Time-Frequency Analysis of Raman Spectrum and NIR].

    PubMed

    Zhao, Xiao-yu; Fang, Yi-ming; Tan, Feng; Tong, Liang; Zhai, Zhe

    2016-02-01

This paper analyzes Raman and near-infrared (NIR) spectra with time-frequency methods. Empirical mode decomposition (EMD) separates each spectrum into intrinsic mode functions (IMFs); an energy-proportion calculation reveals that the Raman spectral energy is distributed uniformly across the components, whereas the low-order IMFs of the NIR spectrum carry only a small share of the primary spectroscopic information. Both real spectra and numerical experiments show that EMD treats the Raman spectrum as an amplitude-modulated signal with a high-frequency absorption property, and treats the NIR spectrum as a frequency-modulated signal, for which the first-order IMF achieves high-frequency narrow-band demodulation. The Hilbert transform of the first-order IMF reveals that modal aliasing occurs when EMD is applied to the Raman spectrum. Further time-frequency analysis of a corn leaf's NIR spectrum shows that, after EMD, cutting off the low-energy first- and second-order components and reconstructing the spectral signal from the remaining IMFs yields a root-mean-square error of 1.0011 and a correlation coefficient of 0.9813, both indicating high reconstruction accuracy. The decomposition trend term indicates that absorbance increases with decreasing wavelength in the near-infrared band, and the Hilbert transform of the characteristic modal component shows that 657 cm⁻¹ is a frequency specific to the corn leaf stress spectrum, which can be regarded as a characteristic frequency for identification.
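The decomposition-and-reconstruction step described in this record can be sketched with a minimal EMD implementation. This is an illustrative sketch, not the authors' code: production EMD libraries use more elaborate stopping rules and envelope handling, and all signal parameters below are invented for the demonstration.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, t, n_iter=12):
    # Extract one intrinsic mode function (IMF) by iterative sifting:
    # repeatedly subtract the mean of the cubic-spline envelopes that
    # pass through the local maxima and minima.
    h = x.copy()
    for _ in range(n_iter):
        mx = np.where((h[1:-1] > h[:-2]) & (h[1:-1] > h[2:]))[0] + 1
        mn = np.where((h[1:-1] < h[:-2]) & (h[1:-1] < h[2:]))[0] + 1
        if len(mx) < 4 or len(mn) < 4:      # too few extrema to build envelopes
            break
        upper = CubicSpline(t[mx], h[mx])(t)
        lower = CubicSpline(t[mn], h[mn])(t)
        h = h - 0.5 * (upper + lower)
    return h

def emd(x, t, n_imfs=3):
    # Peel off IMFs one by one; what remains is the trend (residue).
    imfs, resid = [], x.copy()
    for _ in range(n_imfs):
        imf = sift(resid, t)
        imfs.append(imf)
        resid = resid - imf
    return imfs, resid

# demo: a fast oscillation riding on a slow trend
t = np.linspace(0.0, 1.0, 2000)
hi = 0.3 * np.sin(2 * np.pi * 60 * t)      # high-frequency component
lo = np.sin(2 * np.pi * 3 * t)             # slow "trend" component
imfs, resid = emd(hi + lo, t, n_imfs=2)
```

By construction the IMFs plus residue reconstruct the input exactly, and the first IMF captures the fast component, mirroring the reconstruct-from-remaining-IMFs step the abstract evaluates with RMS error and correlation.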

  12. Reconstructing the Universe

    SciTech Connect

    Ambjoern, J.; Jurkiewicz, J.; Loll, R.

    2005-09-15

    We provide detailed evidence for the claim that nonperturbative quantum gravity, defined through state sums of causal triangulated geometries, possesses a large-scale limit in which the dimension of spacetime is four and the dynamics of the volume of the universe behaves semiclassically. This is a first step in reconstructing the universe from a dynamical principle at the Planck scale, and at the same time provides a nontrivial consistency check of the method of causal dynamical triangulations. A closer look at the quantum geometry reveals a number of highly nonclassical aspects, including a dynamical reduction of spacetime to two dimensions on short scales and a fractal structure of slices of constant time.

  13. Penile surgery and reconstruction.

    PubMed

    Perovic, Sava V; Djordjevic, Miroslav L J; Kekic, Zoran K; Djakovic, Nenad G

    2002-05-01

    This review will highlight recent advances in the field of penile reconstructive surgery in the paediatric and adult population. It is based on the work published during the year 2001. Besides the anatomical and histological studies of the penis, major contributions have been described in congenital and acquired penile anomalies. Also, a few new techniques and modifications of old procedures are described in order to improve the final functional and aesthetic outcome. The techniques for penile enlargement present a trend in the new millennium, but are still at the stage of investigation.

  14. Photometric Lunar Surface Reconstruction

    NASA Technical Reports Server (NTRS)

Nefian, Ara V.; Alexandrov, Oleg; Moratto, Zachary; Kim, Taemin; Beyer, Ross A.

    2013-01-01

    Accurate photometric reconstruction of the Lunar surface is important in the context of upcoming NASA robotic missions to the Moon and in giving a more accurate understanding of the Lunar soil composition. This paper describes a novel approach for joint estimation of Lunar albedo, camera exposure time, and photometric parameters that utilizes an accurate Lunar-Lambertian reflectance model and previously derived Lunar topography of the area visualized during the Apollo missions. The method introduced here is used in creating the largest Lunar albedo map (16% of the Lunar surface) at the resolution of 10 meters/pixel.

  15. Evolutionary tree reconstruction

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter; Kanefsky, Bob

    1990-01-01

It is described how Minimum Description Length (MDL) can be applied to the problem of DNA and protein evolutionary tree reconstruction. If there is a set of mutations that transforms a common ancestor into the set of known sequences, and this description is shorter than the information needed to encode the known sequences directly, then strong evidence for an evolutionary relationship has been found. A heuristic algorithm is described that searches for the simplest tree (smallest MDL) and finds close-to-optimal trees on the test data. Various ways of extending the MDL theory to more complex evolutionary relationships are discussed.

  16. Reconstructing the Antikythera Mechanism

    NASA Astrophysics Data System (ADS)

    Freeth, Tony

    The Antikythera Mechanism is a geared astronomical calculating machine from ancient Greece. The extraordinary nature of this device has become even more apparent in recent years as a result of research under the aegis of the Antikythera Mechanism Research Project (AMRP) - an international collaboration of scientists, historians, museum staff, engineers, and imaging specialists. Though many questions still remain, we may now be close to reconstructing the complete machine. As a technological artifact, it is unique in the ancient world. Its brilliant design conception means that it is a landmark in the history of science and technology.

  17. X-ray spectrum estimation from transmission measurements by an exponential of a polynomial model

    NASA Astrophysics Data System (ADS)

    Perkhounkov, Boris; Stec, Jessika; Sidky, Emil Y.; Pan, Xiaochuan

    2016-04-01

There has been much recent research effort directed toward spectral computed tomography (CT). An important step in realizing spectral CT is determining the spectral response of the scanning system so that the relation between material thicknesses and X-ray transmission intensity is known. We propose a few-parameter spectrum model that can accurately model the X-ray transmission curves and has a form which is amenable to simultaneous spectral CT image reconstruction and CT system spectrum calibration. While the goal is to eventually realize the simultaneous image reconstruction/spectrum estimation algorithm, in this work we investigate the effectiveness of the model on spectrum estimation from simulated transmission measurements through known thicknesses of known materials. The simulated transmission measurements employ a typical X-ray spectrum used for CT and contain noise due to the randomness in detecting finite numbers of photons. The proposed model writes the X-ray spectrum as the exponential of a polynomial (EP) expansion. The model parameters are obtained by use of a standard software implementation of the Nelder-Mead simplex algorithm. The performance of the model is measured by the relative error between the predicted and simulated transmission curves. The estimated spectrum is also compared with the model X-ray spectrum. For reference, we also employ a polynomial (P) spectrum model and show performance relative to the proposed EP model.
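The fitting procedure described here, an exponential-of-polynomial spectrum whose parameters are found with the Nelder-Mead simplex, can be sketched as follows. This is a toy reconstruction, not the paper's setup: the energy grid, attenuation coefficient, thicknesses, and ground-truth coefficients are all invented, and the simulated data are noiseless.

```python
import numpy as np
from scipy.optimize import minimize

E = np.linspace(20.0, 120.0, 101)      # energy grid in keV (illustrative)
mu = 5.0 * (30.0 / E) ** 3             # toy attenuation coefficient, 1/cm (assumed)
L = np.linspace(0.0, 4.0, 20)          # absorber thicknesses, cm

def spectrum(c):
    # exponential-of-polynomial (EP) model on a normalized energy axis
    u = (E - E.mean()) / (E.max() - E.min())
    return np.exp(np.polyval(c, u))

def transmission(c):
    # normalized transmitted intensity for each thickness in L
    s = spectrum(c)
    return (np.exp(-np.outer(L, mu)) @ s) / s.sum()

# simulate "measured" transmission from a known ground-truth EP spectrum
c_true = np.array([-8.0, 0.5, 1.0])
T_meas = transmission(c_true)

# fit the EP parameters with the Nelder-Mead simplex, as in the abstract
res = minimize(lambda c: np.sum((transmission(c) - T_meas) ** 2),
               x0=np.zeros(3), method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-14,
                        "maxiter": 20000, "maxfev": 20000})
```

The figure of merit matches the abstract's: the error between predicted and simulated transmission curves, not the spectrum itself (which a transmission fit may determine only approximately).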

  18. Testing for isometry during reconstruction of the posterior cruciate ligament. Anatomic and biomechanical considerations.

    PubMed

    Covey, D C; Sapega, A A; Sherman, G M

    1996-01-01

The change in the distance of linear separation between each pair of osseous fiber attachment sites of the posterior cruciate ligaments was measured and plotted as a function of the knee flexion angle from 0 to 120 degrees. Data were collected under four sequential test conditions that had in common quadriceps relaxation, absence of tibial rotation forces, and horizontal femoral stabilization. The posterior cruciate ligament fibers were intact or transected (excursion wires left intact) with gravitational joint distraction of the lower leg unconstrained or constrained. The small, posterior oblique fiber region was the most isometric of the four tested fiber regions. Progressively increasing deviations from isometry were seen in the posterior longitudinal, central, and anterior fiber regions, in that order. Transection of the posterior cruciate ligament, combined with unconstrained gravitational distraction of the knee joint, further increased the magnitude of deviation from isometry of the anterior and central fibers, but only changed the pattern of deviation for the more nearly isometric posterior fibers. Under simulated operative conditions, most of the posterior cruciate ligament's anatomic attachment sites exhibit nonisometric behavior, with near isometry demonstrated only by the relatively small posterior fiber attachment sites. If isometry alone is used for bone tunnel placement, the large anterior and central fiber regions will be left largely unreconstructed. Because the normal behavior of most of the fibers of the posterior cruciate ligament involves 4 to 6 mm of end-to-end length increase with progressive knee flexion, this pattern and degree of deviation from isometry should be sought to approximate an anatomic reconstruction of the anterocentral bulk of the ligament.

  19. Stardust Entry Reconstruction

    NASA Technical Reports Server (NTRS)

    Desai, Prasun N.; Qualls, Garry D.

    2008-01-01

An overview of the reconstruction analyses performed for the Stardust capsule entry is described. The results indicate that the actual entry was very close to the pre-entry predictions. The capsule landed 8.1 km north-northwest of the desired target at Utah Test and Training Range. Analyses of infrared video footage and radar range data (obtained from tracking stations) during the descent show that drogue parachute deployment was 4.8 s later than the pre-entry prediction, while main parachute deployment was 19.3 s earlier than the pre-set timer, indicating that main deployment was actually triggered by the backup baroswitch. Reconstruction of a best estimated trajectory revealed that the aerodynamic drag experienced by the capsule during hypersonic flight was within 1% of pre-entry predictions. Observations of the heatshield support the pre-entry estimates of small hypersonic angles of attack, since there was very little, if any, charring of the shoulder region or the aftbody. Through this investigation, an overall assertion can be made that all the data gathered from the Stardust capsule entry were consistent with flight performance close to nominal pre-entry predictions. Consequently, the design principles and methodologies utilized for the flight dynamics, aerodynamics, and aerothermodynamics analyses have been corroborated.

  20. Parallel ptychographic reconstruction

    PubMed Central

    Nashed, Youssef S. G.; Vine, David J.; Peterka, Tom; Deng, Junjing; Ross, Rob; Jacobsen, Chris

    2014-01-01

Ptychography is an imaging method whereby a coherent beam is scanned across an object, and an image is obtained by iterative phasing of the set of diffraction patterns. It can be used to image extended objects at a resolution limited by the scattering strength of the object and detector geometry, rather than at an optics-imposed limit. As technical advances allow larger fields to be imaged, computational challenges arise for reconstructing the correspondingly larger data volumes, yet at the same time there is also a need to deliver reconstructed images immediately so that one can evaluate the next steps to take in an experiment. Here we present a parallel method for real-time ptychographic phase retrieval. It uses a hybrid parallel strategy to divide the computation between multiple graphics processing units (GPUs) and then employs novel techniques to merge sub-datasets into a single complex phase and amplitude image. Results are shown on a simulated specimen and a real dataset from an X-ray experiment conducted at a synchrotron light source. PMID:25607174

  1. Biomaterials for craniofacial reconstruction

    PubMed Central

    Neumann, Andreas; Kevenhoerster, Kevin

    2011-01-01

Biomaterials for reconstruction of bony defects of the skull comprise osteosynthetic materials applied after osteotomies or traumatic fractures and materials to fill bony defects which result from malformation, trauma or tumor resections. Other applications concern functional augmentations for dental implants or aesthetic augmentations in the facial region. For osteosynthesis, mini- and microplates made from titanium alloys provide major advantages concerning biocompatibility, stability and individual fitting to the implant bed. The necessity of removing asymptomatic plates and screws after fracture healing is still a controversial issue. The risks and costs of secondary removal surgery must be weighed against the low rate of complications (due to corrosion products) when the material remains in situ. Resorbable osteosynthesis systems have similar mechanical stability and are especially useful in the growing skull. The huge variety of biomaterials for the reconstruction of bony defects makes it difficult to decide which material is adequate for which indication and for which site. The optimal biomaterial that meets every requirement (e.g. biocompatibility, stability, intraoperative fitting, product safety, low costs etc.) does not exist. The different material types are (autogenic) bone and many alloplastics such as metals (mainly titanium), ceramics, plastics and composites. Future developments aim to improve physical and biological properties, especially regarding surface interactions. To date, tissue engineered bone is far from routine clinical application. PMID:22073101

  2. Unfavourable results in thumb reconstruction

    PubMed Central

    Kumta, Samir M.

    2013-01-01

    The history of thumb reconstruction parallels the history of hand surgery. The attributes that make the thumb unique, and that the reconstructive surgeon must assess and try to restore when reconstructing a thumb, are: Position, stability, strength, length, motion, sensibility and appearance. Deficiency in any of these attributes can reduce the utility of the reconstructed thumb. A detailed assessment of the patient and his requirements needs to be performed before embarking on a thumb reconstruction. Most unsatisfactory results can be attributed to wrong choice of procedure. Component defects of the thumb are commonly treated by tissue from adjacent fingers, hand or forearm. With refinements in microsurgery, the foot has become a major source of tissue for component replacement in the thumb. Bone lengthening, osteoplastic reconstruction, pollicisation, and toe to hand transfers are the commonest methods of thumb reconstruction. Unfavourable results can be classified as functional and aesthetic. Some are common to all types of procedures. However each type of reconstruction has its own unique set of problems. Meticulous planning and execution is essential to give an aesthetic and functionally useful thumb. Secondary surgeries like tendon transfers, bone grafting, debulking, arthrodesis, may be required to correct deficiencies in the reconstruction. Attention needs to be paid to the donor site as well. PMID:24501466

  3. Exercises in PET Image Reconstruction

    NASA Astrophysics Data System (ADS)

    Nix, Oliver

These exercises are complementary to the theoretical lectures about positron emission tomography (PET) image reconstruction. They aim at providing some hands-on experience in PET image reconstruction and focus on demonstrating the different data preprocessing steps and reconstruction algorithms needed to obtain high quality PET images. Normalisation, geometric, attenuation and scatter correction are introduced. To explain the necessity of those, some basics about PET scanner hardware, data acquisition and organisation are reviewed. During the course the students use a software application based on the STIR (software for tomographic image reconstruction) library [1, 2] which allows them to dynamically select or deselect corrections and reconstruction methods as well as to modify their most important parameters. Following the guided tutorial, the students get an impression of the effect the individual data precorrections have on image quality and what happens if they are forgotten. Several data sets in sinogram format are provided, such as line source data, Jaszczak phantom data sets with high and low statistics and NEMA whole body phantom data. The two most frequently used reconstruction algorithms in PET image reconstruction, filtered back projection (FBP) and the iterative OSEM (ordered subset expectation maximization) approach, are used to reconstruct images. The exercise should help the students gain an understanding of what causes inferior image quality and artefacts, and how to improve quality by a clever choice of reconstruction parameters.
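The OSEM algorithm mentioned in this record can be illustrated on a toy linear system. This sketch uses an invented dense system matrix rather than real sinogram geometry, and it omits the normalisation, attenuation and scatter corrections the exercises cover; it only shows the multiplicative ordered-subset update itself.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_bins = 16, 64
A = rng.random((n_bins, n_pix))      # toy system matrix (assumed, not a real scanner)
x_true = rng.random(n_pix) + 0.5     # toy activity image
y = A @ x_true                        # noiseless projection data

def osem(A, y, n_subsets=4, n_iters=1000):
    # Ordered-subset expectation maximization: the MLEM multiplicative
    # update applied to one subset of projection rows at a time.
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iters):
        for idx in subsets:
            As = A[idx]
            ratio = y[idx] / np.maximum(As @ x, 1e-12)   # measured / predicted
            x = x * (As.T @ ratio) / np.maximum(As.sum(axis=0), 1e-12)
    return x

x_rec = osem(A, y)
```

With consistent (noiseless) data the true image is a fixed point of every subset update, which is why the reconstruction converges toward it; with noisy data OSEM instead settles into a limit cycle, one reason the choice of iteration count matters in practice.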

  4. A geomagnetic field spectrum

    NASA Technical Reports Server (NTRS)

    Langel, R. A.; Estes, R. H.

    1982-01-01

A spherical harmonic model of the earth's internal magnetic field of degree and order 23 is derived from selected Magsat data, and its power spectrum, computed with terms developed by Mauersberger (1956) and Lowes (1974), is found to exhibit a change of slope at n = 14, which is interpreted as an indication that the core field dominates at degrees below 13 while the crustal field dominates above 15. The representations obtained for the two portions of the spectrum can be used to establish order-of-magnitude inaccuracies in core field models due both to crustal fields and to the inability to observe core field wavelengths beyond n = 13, at which point they are obscured by the crustal field.
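The Mauersberger-Lowes power spectrum underlying this analysis is straightforward to compute from Gauss coefficients. The sketch below also mocks up the change of slope with an invented decaying "core" spectrum crossing a flat "crustal" level; the decay rate and levels are illustrative, not the Magsat values.

```python
import numpy as np

def lowes_spectrum(g, h):
    # Mauersberger-Lowes spectrum R_n = (n + 1) * sum_m (g_nm^2 + h_nm^2),
    # with g, h arrays of Gauss coefficients indexed [n, m] (row 0 unused).
    N = g.shape[0] - 1
    n = np.arange(1, N + 1)
    return (n + 1) * (g[1:] ** 2 + h[1:] ** 2).sum(axis=1)

# dipole-only check: g_1^0 = -30000 nT gives R_1 = 2 * 30000^2
g = np.zeros((3, 3))
h = np.zeros((3, 3))
g[1, 0] = -30000.0
R = lowes_spectrum(g, h)

# toy illustration of the change of slope: a steeply decaying "core"
# spectrum crossing a nearly flat "crustal" spectrum near degree 14
# (invented numbers for illustration only)
deg = np.arange(1, 24)
core = 0.3 ** (deg - 14.0)
crust = np.ones_like(deg, dtype=float)
crossover = deg[np.argmax(crust > core)]
```

Plotting log R_n against n for such a two-component field produces exactly the kink described in the abstract: a steep core branch at low degree and a shallow crustal branch above it.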

  5. Spread spectrum image steganography.

    PubMed

    Marvel, L M; Boncelet, C R; Retter, C T

    1999-01-01

    In this paper, we present a new method of digital steganography, entitled spread spectrum image steganography (SSIS). Steganography, which means "covered writing" in Greek, is the science of communicating in a hidden manner. Following a discussion of steganographic communication theory and review of existing techniques, the new method, SSIS, is introduced. This system hides and recovers a message of substantial length within digital imagery while maintaining the original image size and dynamic range. The hidden message can be recovered using appropriate keys without any knowledge of the original image. Image restoration, error-control coding, and techniques similar to spread spectrum are described, and the performance of the system is illustrated. A message embedded by this method can be in the form of text, imagery, or any other digital signal. Applications for such a data-hiding scheme include in-band captioning, covert communication, image tamperproofing, authentication, embedded control, and revision tracking.

  6. Radio frequency spectrum management

    NASA Astrophysics Data System (ADS)

    Sujdak, E. J., Jr.

    1980-03-01

    This thesis is a study of radio frequency spectrum management as practiced by agencies and departments of the Federal Government. After a brief introduction to the international agency involved in radio frequency spectrum management, the author concentrates on Federal agencies engaged in frequency management. These agencies include the National Telecommunications and Information Administration (NTIA), the Interdepartment Radio Advisory Committee (IRAC), and the Department of Defense (DoD). Based on an analysis of Department of Defense frequency assignment procedures, recommendations are given concerning decentralizing military frequency assignment by delegating broader authority to unified commanders. This proposal includes a recommendation to colocate the individual Service frequency management offices at the Washington level. This would result in reduced travel costs, lower manpower requirements, and a common tri-Service frequency management data base.

  7. Multitaper Spectrum Estimates

    NASA Astrophysics Data System (ADS)

    Fodor, I. K.; Stark, P. B.

    Multitapering is a statistical technique developed to improve on the notorious periodogram estimate of the power spectrum (Thomson, 1982; Percival, Walden 1993). We show how to obtain orthogonal tapers for time series observed with gaps, and how to use statistical resampling techniques (Efron, Tibshirani 1993) to calculate realistic uncertainty estimates for multitaper estimates. We introduce multisegment multitapering. Multitapering can also be extended to the 2D case. We indicate how to construct tapers that minimize the spatial leakage in estimates of the spherical harmonic decomposition of the velocity images. Spatial multitapering followed by the temporal tapering of the estimated spherical harmonic time series is expected to result in improved spectrum and subsequent solar oscillation mode parameter estimates.
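For uniformly sampled data without gaps, the basic multitaper estimate averages the eigenspectra computed with orthogonal Slepian (DPSS) tapers. The following is a minimal sketch of that baseline; the gap-adapted tapers, resampling uncertainties and 2-D spatial tapering discussed in the abstract are beyond this snippet, and the tone frequency and taper parameters are invented for the demo.

```python
import numpy as np
from scipy.signal import windows

def multitaper_psd(x, NW=4.0, n_tapers=7, fs=1.0):
    # Average the periodograms obtained with orthogonal Slepian (DPSS)
    # tapers; NW is the time-bandwidth product, and n_tapers should be
    # kept below 2*NW to limit spectral leakage.
    N = len(x)
    tapers = windows.dpss(N, NW, Kmax=n_tapers)       # shape (n_tapers, N)
    eigenspectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    freqs = np.fft.rfftfreq(N, d=1.0 / fs)
    return freqs, eigenspectra.mean(axis=0) / fs

# demo: a pure tone at 0.1 cycles per sample
t = np.arange(1024)
freqs, psd = multitaper_psd(np.sin(2 * np.pi * 0.1 * t))
```

Averaging over tapers reduces the variance of the estimate relative to the single-taper periodogram, which is the property the abstract's resampling machinery then quantifies.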

  8. The marine diversity spectrum.

    PubMed

    Reuman, Daniel C; Gislason, Henrik; Barnes, Carolyn; Mélin, Frédéric; Jennings, Simon

    2014-07-01

Distributions of species body sizes within a taxonomic group, for example, mammals, are widely studied and important because they help illuminate the evolutionary processes that produced these distributions. Distributions of the sizes of species within an assemblage delineated by geography instead of taxonomy (all the species in a region regardless of clade) are much less studied but are equally important and will illuminate a different set of ecological and evolutionary processes. We develop and test a mechanistic model of how diversity varies with body mass in marine ecosystems. The model predicts the form of the 'diversity spectrum', which quantifies the distribution of species' asymptotic body masses and is a species analogue of the classic size spectrum of individuals; we have found it to be a new and widely applicable description of diversity patterns. The marine diversity spectrum is predicted to be approximately linear across an asymptotic mass range spanning seven orders of magnitude. Slope -0.5 is predicted for the global marine diversity spectrum for all combined pelagic zones of continental shelf seas, and slopes for large regions are predicted to lie between -0.5 and -0.1. Slopes of -0.5 and -0.1 represent markedly different communities: a slope of -0.5 depicts a 10-fold reduction in diversity for every 100-fold increase in asymptotic mass; a slope of -0.1 depicts a 1.6-fold reduction. Steeper slopes are predicted for larger or colder regions, meaning fewer large species per small species for such regions. Predictions were largely validated by a global empirical analysis. Results explain for the first time a new and widespread phenomenon of biodiversity. Results have implications for estimating numbers of species of small asymptotic mass, where taxonomic inventories are far from complete. Results show that the relationship between diversity and body mass can be explained from the dependence of predation behaviour, dispersal, and life history on
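The two slopes quoted in the abstract translate into fold-reductions by simple power-law arithmetic; a one-line helper makes the calculation explicit (the function name is ours, not the paper's).

```python
def fold_reduction(slope, mass_ratio=100.0):
    # If diversity scales as (asymptotic mass)**slope, then across a
    # mass_ratio increase in mass, diversity falls by mass_ratio**(-slope):
    # slope -0.5 over a 100-fold mass range gives a 10-fold reduction,
    # slope -0.1 gives roughly a 1.6-fold reduction.
    return mass_ratio ** (-slope)
```

This reproduces the abstract's own numbers: 100**0.5 = 10 and 100**0.1 ≈ 1.58.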

  9. Radio spectrum surveillance station

    NASA Technical Reports Server (NTRS)

    Hersey, D. R.

    1979-01-01

    The paper presents a general and functional description of a low-cost surveillance station designed as the first phase of NASA's program to develop a radio spectrum surveillance capability for deep space stations for identifying radio frequency interference sources. The station described has identified several particular interferences and is yielding spectral signature data which, after cataloging, will serve as a library for rapid identification of frequently observed interference. Findings from the use of the station are discussed.

  10. Pitfalls in compressed sensing reconstruction and how to avoid them.

    PubMed

    Shchukina, Alexandra; Kasprzak, Paweł; Dass, Rupashree; Nowakowski, Michał; Kazimierczuk, Krzysztof

    2016-11-11

Multidimensional NMR can provide unmatched spectral resolution, which is crucial when dealing with samples of biological macromolecules. The resolution, however, comes at the high price of long experimental time. Non-uniform sampling (NUS) of the evolution time domain makes it possible to overcome this limitation by sampling only a small fraction of the data, but requires sophisticated algorithms to reconstruct the omitted data points. A significant group of such algorithms, known as compressed sensing (CS), is based on the assumption of sparsity of the reconstructed spectrum. Several papers on the application of CS in multidimensional NMR have been published in recent years, and the developed methods have been implemented in most spectral processing software. However, the publications rarely show the cases where NUS reconstruction does not work perfectly or explain how to solve the problem. On the other hand, everyday users of NUS develop their own rules of thumb, which help to set up the processing in an optimal way, but often without a deeper insight. In this paper, we discuss several sources of problems faced in CS reconstructions: a low sampling level, a wrong assumption of spectral sparsity, a wrong stopping criterion, and attempts to extrapolate the signal too far. As an appendix, we provide MATLAB codes of several CS algorithms used in NMR. We hope that this work will explain the mechanism of NUS reconstructions and help readers to set up acquisition and processing parameters. We also believe that it might be helpful for algorithm developers.
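As an illustration of the sparsity assumption at work, here is a minimal iterative soft-thresholding (IST) reconstruction of a non-uniformly sampled 1-D signal. It is a generic CS sketch, not one of the paper's MATLAB codes, and the peak positions, sampling level and threshold schedule are all invented for the demo.

```python
import numpy as np

def ist_reconstruct(fid, mask, n_iter=300, decay=0.98):
    # Iterative soft thresholding (IST): alternate between restoring the
    # measured time-domain points and shrinking the spectrum toward
    # sparsity, with a geometrically decreasing threshold.
    spec = np.zeros_like(fid)
    thr = np.max(np.abs(np.fft.fft(np.where(mask, fid, 0.0))))
    for _ in range(n_iter):
        est = np.fft.ifft(spec)
        est[mask] = fid[mask]                  # data consistency on sampled points
        spec = np.fft.fft(est)
        mag = np.abs(spec)
        # soft threshold: shrink magnitudes by thr, preserving phase
        spec *= np.maximum(1.0 - thr / np.maximum(mag, 1e-15), 0.0)
        thr *= decay
    return spec

# demo: a 3-peak sparse "spectrum", 35% non-uniform sampling of its FID
rng = np.random.default_rng(1)
N, peaks = 256, [30, 100, 180]
true_spec = np.zeros(N, dtype=complex)
true_spec[peaks] = [1.0, 0.8, 0.6]
fid = np.fft.ifft(true_spec)
mask = rng.random(N) < 0.35
rec = ist_reconstruct(fid, mask)
```

The failure modes the paper discusses map directly onto this sketch: too low a sampling level in `mask`, a non-sparse `true_spec`, too few iterations, or too aggressive a threshold schedule all degrade the reconstruction.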

  11. Fractal reconstruction of rough membrane surface related with membrane fouling in a membrane bioreactor.

    PubMed

    Zhang, Meijia; Chen, Jianrong; Ma, Yuanjun; Shen, Liguo; He, Yiming; Lin, Hongjun

    2016-09-01

In this paper, fractal reconstruction of a rough membrane surface with a modified Weierstrass-Mandelbrot (WM) function was conducted. The topography of the rough membrane surface was measured by atomic force microscopy (AFM), and the results showed that the membrane surface was isotropic. Accordingly, the fractal dimension and roughness of the membrane surface were calculated by the power spectrum method. The rough membrane surface was reconstructed on the MATLAB platform with the parameter values acquired from the raw AFM data. The reconstructed membrane closely resembled the real membrane morphology measured by AFM. The parameters associated with membrane morphology (including average roughness and root mean square (RMS) roughness) were calculated for the model and the real membrane, and a good match of roughness parameters between the reconstructed surface and the real membrane was found, indicating the feasibility of the newly developed method. The reconstructed membrane surface can potentially be used for interaction energy evaluation.
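A rough isotropic surface of the kind described can be synthesized from a WM-type sum. The following is a simplified sketch, not the authors' modified WM function, and every parameter value (fractal dimension D, scaling constant G, frequency ratio gamma, ridge count M) is invented for illustration rather than fitted to AFM data.

```python
import numpy as np

def wm_surface(n_pts=128, D=2.4, G=1e-2, gamma=1.5, M=10, n_max=20, seed=0):
    # Superpose ridges at M orientations with geometrically spaced spatial
    # frequencies gamma**n and amplitudes gamma**((D - 3) * n);
    # D in (2, 3) plays the role of the fractal dimension.
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n_pts)
    X, Y = np.meshgrid(x, x)
    z = np.zeros_like(X)
    for m in range(M):
        theta = np.pi * m / M                        # ridge orientation
        phase = rng.uniform(0.0, 2.0 * np.pi, n_max) # random phases
        for n in range(n_max):
            z += (gamma ** ((D - 3.0) * n)
                  * np.cos(2.0 * np.pi * gamma ** n
                           * (X * np.cos(theta) + Y * np.sin(theta))
                           + phase[n]))
    return G * z

z = wm_surface()
rms_roughness = np.sqrt(np.mean((z - z.mean()) ** 2))
```

Average and RMS roughness can then be computed from the synthetic height field exactly as the paper does for the reconstructed and AFM-measured surfaces.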

  12. Biomatrices for bladder reconstruction.

    PubMed

    Lin, Hsueh-Kung; Madihally, Sundar V; Palmer, Blake; Frimberger, Dominic; Fung, Kar-Ming; Kropp, Bradley P

    2015-03-01

    There is a demand for tissue engineering of the bladder in patients who experience a neurogenic bladder or idiopathic detrusor overactivity. To avoid the complications of augmentation cystoplasty, the field of tissue engineering seeks optimal scaffolds for bladder reconstruction. Naturally derived biomaterials as well as synthetic and natural polymers have been explored as bladder substitutes. To improve regenerative properties, these biomaterials have been conjugated with functional molecules, combined with nanotechnology, or seeded with exogenous cells. Although most studies reported complete and functional bladder regeneration in small-animal models, results from large-animal models and human clinical trials varied. For functional bladder regeneration, procedures for biomaterial fabrication, incorporation of biologically active agents, introduction of nanotechnology, and application of stem-cell technology need to be standardized. Advanced molecular and medical technologies such as next-generation sequencing and magnetic resonance imaging can be introduced for mechanistic understanding and non-invasive monitoring of regeneration processes, respectively.

  13. Metrological digital audio reconstruction

    DOEpatents

    Fadeyev, Vitaliy; Haber, Carl

    2004-02-19

    Audio information stored in the undulations of grooves in a medium such as a phonograph record may be reconstructed, with little or no contact, by measuring the groove shape using precision metrology methods coupled with digital image processing and numerical analysis. The effects of damage, wear, and contamination may be compensated, in many cases, through image processing and analysis methods. The speed and data-handling capacity of available computing hardware make this approach practical. In two examples, a general-purpose optical metrology system was used to study a 50-year-old 78 r.p.m. phonograph record, and a commercial confocal scanning probe was used to study a 1920s celluloid Edison cylinder. Comparisons are presented with stylus playback of the samples and with a digitally re-mastered version of an original magnetic recording. A more extensive implementation of this approach, with dedicated hardware and software, is also described.

  14. Reconstruction Using Witness Complexes

    PubMed Central

    Oudot, Steve Y.

    2010-01-01

    We present a novel reconstruction algorithm that, given an input point set sampled from an object S, builds a one-parameter family of complexes that approximate S at different scales. At a high level, our method is very similar in spirit to Chew’s surface meshing algorithm, with one notable difference though: the restricted Delaunay triangulation is replaced by the witness complex, which makes our algorithm applicable in any metric space. To prove its correctness on curves and surfaces, we highlight the relationship between the witness complex and the restricted Delaunay triangulation in 2d and in 3d. Specifically, we prove that both complexes are equal in 2d and closely related in 3d, under some mild sampling assumptions. PMID:21643440
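The witness condition that drives the construction is easy to state in code: an edge between two landmarks is created whenever some witness point has exactly those two landmarks as its nearest neighbours. A minimal sketch of the 1-skeleton under this strict condition (assuming Euclidean point arrays; this is not the paper's full algorithm, and the function name is invented):

```python
import numpy as np

def witness_edges(witnesses, landmarks):
    """1-skeleton of the strict witness complex: the edge (i, j) is included
    when some witness has landmarks i and j as its two nearest neighbours.

    witnesses : (n_w, d) array of sample points
    landmarks : (n_l, d) array of chosen landmark points
    """
    diff = witnesses[:, None, :] - landmarks[None, :, :]
    dist = np.linalg.norm(diff, axis=2)            # witness-to-landmark distances
    nearest_two = np.argsort(dist, axis=1)[:, :2]  # two closest landmarks per witness
    return {tuple(sorted(pair)) for pair in nearest_two.tolist()}
```

For three collinear landmarks with witnesses near the two midpoints, this yields exactly the two edges of the underlying curve, illustrating how witnesses certify adjacency.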

  15. Reconstructability analysis of epistasis.

    PubMed

    Zwick, Martin

    2011-01-01

    The literature on epistasis describes various methods to detect epistatic interactions and to classify different types of epistasis. Reconstructability analysis (RA) has recently been used to detect epistasis in genomic data. This paper shows that RA offers a classification of types of epistasis at three levels of resolution (variable-based models without loops, variable-based models with loops, state-based models). These types can be defined by the simplest RA structures that model the data without information loss; a more detailed classification can be defined by the information content of multiple candidate structures. The RA classification can be augmented with structures from related graphical modeling approaches. RA can analyze epistatic interactions involving an arbitrary number of genes or SNPs and constitutes a flexible and effective methodology for genomic analysis.

  16. Reconstructing the Alcatraz escape

    NASA Astrophysics Data System (ADS)

    Baart, F.; Hoes, O.; Hut, R.; Donchyts, G.; van Leeuwen, E.

    2014-12-01

    In the night of June 12, 1962, three inmates used a raft made of raincoats to escape the ultimate maximum-security prison island Alcatraz in San Francisco, United States. History is unclear about what happened to the escapees. At what time did they step into the water? Did they survive and, if so, where did they reach land? The fate of the escapees has been the subject of much debate: did they make landfall on Angel Island, or did the current sweep them out of the bay and into the cold Pacific Ocean? In this presentation, we try to shed light on this historic case using a visualization of a high-resolution hydrodynamic simulation of the San Francisco Bay, combined with historical tidal records. By reconstructing the hydrodynamic conditions and using a particle-based simulation of the escapees, we show possible scenarios. The interactive model is visualized using both a 3D photorealistic and a web-based visualization. The "Escape from Alcatraz" scenario demonstrates the capabilities of the 3Di platform. This platform is normally used for overland flooding (1D/2D). The model engine uses a quad-tree structure, resulting in an order-of-magnitude speedup. The subgrid approach takes detailed bathymetry information into account. The inter-model variability is tested by comparing the results with the DFlow Flexible Mesh (DFlowFM) San Francisco Bay model. Interactivity is implemented by converting the models from static programs to interactive libraries, adhering to the Basic Model Interface (BMI). Interactive models are more suitable for answering exploratory research questions such as this reconstruction effort. Although these hydrodynamic simulations only provide circumstantial evidence for solving the mystery of what happened during the foggy dark night of June 12, 1962, they can be used as guidance and provide an interesting test case for applying interactive modelling.

  17. Hybrid spread spectrum radio system

    DOEpatents

    Smith, Stephen F. (London, TN); Dress, William B. (Camas, WA)

    2010-02-09

    Systems and methods are described for hybrid spread spectrum radio systems. A method includes receiving a hybrid spread spectrum signal and applying both fast frequency-hopping demodulation and direct-sequence demodulation to the direct-sequence spread spectrum signal, wherein multiple frequency hops occur within a single data-bit time and each bit is represented by chip transmissions at multiple frequencies.
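The direct-sequence layer of such a hybrid system can be sketched as a toy baseband model (invented helper names; the frequency-hopping layer, which would additionally move each bit's chips across several carrier frequencies, is omitted):

```python
import numpy as np

def dsss_spread(bits, chip_code):
    """Spread each data bit with a chip sequence of +/-1 values."""
    symbols = 2 * np.asarray(bits) - 1           # map bits 0/1 -> -1/+1
    # each symbol is repeated over the full chip sequence
    return np.repeat(symbols, len(chip_code)) * np.tile(chip_code, len(bits))

def dsss_despread(chips, chip_code):
    """Correlate received chips with the code to recover each bit."""
    n = len(chip_code)
    corr = chips.reshape(-1, n) @ chip_code      # one correlation per bit
    return (corr > 0).astype(int)                # sign of correlation -> bit
```

Because each bit is spread over many chips, a few corrupted chips still correlate to the correct sign; this processing gain is the basic motivation for spread spectrum signalling.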

  18. Joint Electromagnetic Spectrum Management Operations

    DTIC Science & Technology

    2012-03-20

    synchronizing, and deconflicting JEMSMO actions (p. IV-7) Doctrine Update for JP 6-01, Joint Electromagnetic Spectrum Management Operations...communications system directorate of a joint staff (J-6), to support joint planning, coordination, and control of the spectrum for assigned forces. Executive...in the respective Service or joint publications. Interference Resolution To ensure critical frequencies and spectrum-dependent systems are

  19. Energy spectrum and transport in narrow HgTe quantum wells

    SciTech Connect

    Germanenko, A. V.; Minkov, G. M.; Rut, O. E.; Sherstobitov, A. A.; Dvoretsky, S. A.; Mikhailov, N. N.

    2015-01-15

    The results of an experimental study of the transport phenomena and the hole energy spectrum of two-dimensional systems in the quantum well of the zero-gap semiconductor HgTe with a normal arrangement of quantum-confinement subbands are presented. An analysis of the experimental data allows us to reconstruct the carrier energy spectrum near the hole subband extrema. The results are interpreted using the standard k·p model.

  20. An update on penile reconstruction

    PubMed Central

    Garaffa, Giulio; Raheem, Amr Abdel; Ralph, David John

    2011-01-01

    Penile reconstruction still represents a formidable challenge for the urologist. In this review, the most recent advances in penile reconstruction after trauma, excision of benign and malignant disease and in patients with micropenis, aphallia or female to male gender dysphoria are reported. PMID:21540867