Science.gov

Sample records for end-to-end spectrum reconstruction

  1. Automated End-to-End Workflow for Precise and Geo-accurate Reconstructions using Fiducial Markers

    NASA Astrophysics Data System (ADS)

    Rumpler, M.; Daftry, S.; Tscharf, A.; Prettenthaler, R.; Hoppe, C.; Mayer, G.; Bischof, H.

    2014-08-01

    Photogrammetric computer vision systems have become well established in many scientific and commercial fields over recent decades. Recent developments in image-based 3D reconstruction systems, in conjunction with the availability of affordable, high-quality digital consumer-grade cameras, have made it easy to create visually appealing 3D models. However, many of these methods require manual steps in the processing chain, and many photogrammetric applications, such as mapping, recurrent topographic surveys, or architectural and archaeological 3D documentation, require high accuracy in a geo-coordinate system, which often cannot be guaranteed. Hence, in this paper we present and advocate a fully automated end-to-end workflow for precise and geo-accurate 3D reconstructions using fiducial markers. We integrate an automatic camera calibration and georeferencing method into our image-based reconstruction pipeline, using binary-coded fiducial markers as artificial, individually identifiable landmarks in the scene. Additionally, we facilitate the use of these markers in conjunction with known ground control points (GCPs) in the bundle adjustment, and use an online feedback method that allows assessment of the final reconstruction quality in terms of image overlap, ground sampling distance (GSD) and completeness, and thus provides the flexibility to adapt the image acquisition strategy during image recording. An extensive set of experiments demonstrates that the workflow yields a highly accurate and geographically aligned reconstruction with an absolute point position uncertainty of about 1.5 times the ground sampling distance.
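    A core ingredient of such a georeferencing step, aligning the image-based model with known GCP coordinates, is a similarity (Helmert) transform estimated from matched marker/GCP positions. The sketch below is illustrative only (not the authors' code), using the Umeyama closed-form solution:

```python
import numpy as np

def similarity_transform(src, dst):
    """Closed-form (Umeyama) estimate of scale s, rotation R, translation t
    such that dst ~= s * R @ src + t, e.g. mapping model coordinates of
    detected markers onto surveyed GCP geo-coordinates."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(0), dst.mean(0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)           # cross-covariance matrix
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1                           # guard against reflections
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / src_c.var(0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t
```

    In a full pipeline a transform like this would seed or constrain the bundle adjustment rather than replace it.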

  2. Comparison of Reconstruction and Control algorithms on the ESO end-to-end simulator OCTOPUS

    NASA Astrophysics Data System (ADS)

    Montilla, I.; Béchet, C.; Lelouarn, M.; Correia, C.; Tallon, M.; Reyes, M.; Thiébaut, É.

    Extremely Large Telescopes are very challenging with respect to their Adaptive Optics requirements. Their diameters, the specifications demanded by the science for which they are being designed, and the planned use of Extreme Adaptive Optics systems imply a huge increase in the number of degrees of freedom in the deformable mirrors. It is necessary to study new reconstruction algorithms to implement the real-time control in Adaptive Optics at the required speed. We have studied the performance, applied to the case of the European ELT, of three different algorithms: the matrix-vector multiplication (MVM) algorithm, considered as a reference; the Fractal Iterative Method (FrIM); and the Fourier Transform Reconstructor (FTR). The algorithms have been tested on ESO's OCTOPUS software, which simulates the atmosphere, the deformable mirror, the sensor and the closed-loop control. The MVM is the default reconstruction and control method implemented in OCTOPUS, but it scales as O(N²) operations per loop, so it is not considered a fast algorithm for wave-front reconstruction and control on an Extremely Large Telescope. The two other methods are the fast algorithms studied in the E-ELT Design Study. The performance, as well as their response in the presence of noise and under various atmospheric conditions, has been compared using a Single Conjugate Adaptive Optics configuration for a 42 m diameter ELT with a total of 5402 actuators. These comparisons on a common simulator highlight the pros and cons of the various methods, and give us a better understanding of the type of reconstruction algorithm that an ELT demands.
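    As a point of reference for the baseline method, MVM reconstruction can be sketched in a few lines (illustrative, not OCTOPUS code; names are assumptions): the command matrix is the pseudo-inverse of a calibrated interaction matrix, and each loop iteration then costs a single O(N²) matrix-vector product.

```python
import numpy as np

def mvm_reconstructor(interaction):
    """Least-squares command matrix R = pinv(D) for slope vector s = D @ x.
    Applying R each frame is one matrix-vector multiply: O(N^2) per loop."""
    return np.linalg.pinv(interaction)

def closed_loop_step(cmd, slopes, R, gain=0.5):
    """One integrator-controller update: subtract a fraction of the
    reconstructed wave-front from the current mirror command."""
    return cmd - gain * (R @ slopes)
```

    FrIM and FTR exist precisely to avoid forming and applying this dense matrix at ELT scales.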

  3. End-to-End Commitment

    NASA Technical Reports Server (NTRS)

    Newcomb, John

    2004-01-01

    The end-to-end test would verify the complex sequence of events from lander separation to landing. Due to the large distances involved and the significant delay in sending a command and receiving verification, the lander needed to operate autonomously after it separated from the orbiter. It had to sense conditions, make decisions, and act accordingly. We were flying into a relatively unknown set of conditions: a Martian atmosphere of unknown pressure, density, and consistency, and a landing surface of unknown altitude and bearing strength.

  4. End-to-End Radiographic Systems Simulation

    SciTech Connect

    Mathews, A.; Kwan, T.; Buescher, K.; Snell, C.; Adams, K.

    1999-07-23

    This is the final report of a one-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The objective of this project was to develop a validated end-to-end radiographic model that could be applied to both x-rays and protons. The specific objectives were to link hydrodynamic, transport, and magneto-hydrodynamic simulation software for purposes of modeling radiographic systems. In addition, optimization and analysis algorithms were to be developed to validate physical models and optimize the design of radiographic facilities.

  5. Applying Trustworthy Computing to End-to-End Electronic Voting

    ERIC Educational Resources Information Center

    Fink, Russell A.

    2010-01-01

    "End-to-End (E2E)" voting systems provide cryptographic proof that the voter's intention is captured, cast, and tallied correctly. While E2E systems guarantee integrity independent of software, most E2E systems rely on software to provide confidentiality, availability, authentication, and access control; thus, end-to-end integrity is not…

  6. Standardizing an End-to-end Accounting Service

    NASA Technical Reports Server (NTRS)

    Greenberg, Edward; Kazz, Greg

    2006-01-01

    Currently there are no space system standards available for space agencies to accomplish end-to-end accounting. No such standard exists for spacecraft operations or for tracing the relationships between mission planning activities, the command sequences designed to perform those activities, the commands formulated to initiate those activities, and the mission data, specifically the mission data products, created by those activities. For space agencies to cross-support one another for data accountability/data tracing, and for inter-agency spacecraft to interoperate with each other, an international CCSDS standard for end-to-end data accountability/tracing needs to be developed. We first describe the end-to-end accounting service model and the functionality that supports the service. This model describes how science plans that are ultimately transformed into commands can be associated with the telemetry products generated as a result of their execution. Moreover, the interaction between end-to-end accounting and service management is explored. Finally, we show how the standard end-to-end accounting service can be applied to a real-life flight project, i.e., the Mars Reconnaissance Orbiter.

  7. Novel method for esophagojejunal anastomosis after laparoscopic total gastrectomy: Semi-end-to-end anastomosis

    PubMed Central

    Zhao, Yong-Liang; Su, Chong-Yu; Li, Teng-Fei; Qian, Feng; Luo, Hua-Xing; Yu, Pei-Wu

    2014-01-01

    AIM: To test a new safe and simple technique for circular-stapled esophagojejunostomy in laparoscopic total gastrectomy (LATG). METHODS: We selected 26 patients with gastric cancer who underwent LATG and Roux-en-Y gastrointestinal reconstruction with semi-end-to-end esophagojejunal anastomosis. RESULTS: LATG with semi-end-to-end esophagojejunal anastomosis was successfully performed in all 26 patients. The average operation time was 257 ± 36 min, with an average anastomosis time of 51 ± 17 min and an average intraoperative blood loss of 88 ± 46 mL. The average postoperative hospital stay was 8 ± 3 d. There were no complications and no mortality in this series. CONCLUSION: The application of semi-end-to-end esophagojejunal anastomosis after LATG is a safe and feasible procedure, which is easy to perform and has a short anastomosis time. PMID:25309086

  8. Combining Simulation Tools for End-to-End Trajectory Optimization

    NASA Technical Reports Server (NTRS)

    Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min

    2015-01-01

    Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable for assessing individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. To obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with SLS's ascent tool (POST2), and a new tool was developed to optimize the full problem by operating both simulations simultaneously.

  9. LWS/SET End-to-End Data System

    NASA Technical Reports Server (NTRS)

    Giffin, Geoff; Sherman, Barry; Colon, Gilberto (Technical Monitor)

    2002-01-01

    This paper describes the concept for the End-to-End Data System that will support NASA's Living With a Star Space Environment Testbed missions. NASA has initiated the Living With a Star (LWS) Program to develop a better scientific understanding of the aspects of the connected Sun-Earth system that affect life and society. A principal goal of the program is to bridge the gap between the science, engineering, and user application communities. The Space Environment Testbed (SET) Project is one element of LWS. The Project will enable future science, operational, and commercial objectives in space and atmospheric environments by improving engineering approaches to the accommodation and/or mitigation of the effects of solar variability on technological systems. The end-to-end data system allows investigators to access the SET control center, command their experiments, and receive data from their experiments back at their home facility, using the Internet. The logical functioning of the major components of the end-to-end data system is described, including the GSFC Payload Operations Control Center (POCC), SET payloads, the GSFC SET Simulation Lab, SET experiment PI facilities, and host systems. Host Spacecraft Operations Control Centers (SOCC) and the host spacecraft are essential links in the end-to-end data system, but are not directly under the control of the SET Project; formal interfaces will be established between these entities and elements of the SET Project. The paper describes data flow through the system, from PI facilities connecting to the SET operations center via the Internet, through communications to SET carriers and experiments via host systems, to telemetry returned to investigators from their flight experiments. It also outlines the techniques that will be used to meet mission requirements while holding development and operational costs to a minimum. Additional information is included in the original extended abstract.

  10. Measurements and analysis of end-to-end Internet dynamics

    SciTech Connect

    Paxson, V

    1997-04-01

    Accurately characterizing end-to-end Internet dynamics - the performance that a user actually obtains from the lengthy series of network links that comprise a path through the Internet - is exceptionally difficult, due to the network's immense heterogeneity. At the heart of this work is a "measurement framework" in which a number of sites around the Internet host a specialized measurement service. By coordinating "probes" between pairs of these sites one can measure end-to-end behavior along O(N²) paths for a framework consisting of N sites. Consequently, one obtains a superlinear scaling that allows measuring a rich cross-section of Internet behavior without requiring huge numbers of observation points. 37 sites participated in this study, allowing the author to measure more than 1,000 distinct Internet paths. The first part of this work looks at the behavior of end-to-end routing: the series of routers over which a connection's packets travel. Based on 40,000 measurements made using this framework, the author analyzes: routing "pathologies" such as loops, outages, and flutter; the stability of routes over time; and the symmetry of routing along the two directions of an end-to-end path. The author finds that pathologies increased significantly over the course of 1995 and that Internet paths are heavily dominated by a single route. The second part of this work studies end-to-end Internet packet dynamics. The author analyzes 20,000 TCP transfers of 100 Kbyte each to investigate the performance of both the TCP endpoints and the Internet paths. The measurements used for this part of the study are much richer than those for the first part, but require a great degree of attention to issues of calibration, which are addressed by applying self-consistency checks to the measurements whenever possible. The author finds that packet filters are capable of a wide range of measurement errors, some of which, if undetected, can significantly taint subsequent analysis.
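    The superlinear scaling rests on a simple count: N cooperating sites yield N(N-1) directed source-destination pairs. A toy sketch (hypothetical helper name):

```python
from itertools import permutations

def measurement_paths(sites):
    """All directed end-to-end paths measurable between cooperating sites:
    N sites give N*(N-1) ordered pairs, i.e. O(N^2) path coverage from
    only O(N) measurement hosts."""
    return list(permutations(sites, 2))
```

    With the study's 37 sites this gives 37 × 36 = 1,332 directed pairs, consistent with the "more than 1,000 distinct Internet paths" measured.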

  11. End-to-end network/application performance troubleshooting methodology

    SciTech Connect

    Wu, Wenji; Bobyshev, Andrey; Bowden, Mark; Crawford, Matt; Demar, Phil; Grigaliunas, Vyto; Grigoriev, Maxim; Petravick, Don; /Fermilab

    2007-09-01

    The computing models for HEP experiments are globally distributed and grid-based. Obstacles to good network performance arise from many causes and can be a major impediment to the success of the computing models for HEP experiments. Factors that affect overall network/application performance exist on the hosts themselves (application software, operating system, hardware), in the local area networks that support the end systems, and within the wide area networks. Since the computer and network systems are globally distributed, it can be very difficult to locate and identify the factors that are hurting application performance. In this paper, we present an end-to-end network/application performance troubleshooting methodology developed and in use at Fermilab. The core of our approach is to narrow down the problem scope with a divide and conquer strategy. The overall complex problem is split into two distinct sub-problems: host diagnosis and tuning, and network path analysis. After satisfactorily evaluating, and if necessary resolving, each sub-problem, we conduct end-to-end performance analysis and diagnosis. The paper will discuss tools we use as part of the methodology. The long term objective of the effort is to enable site administrators and end users to conduct much of the troubleshooting themselves, before (or instead of) calling upon network and operating system 'wizards,' who are always in short supply.

  12. Miniature modular microwave end-to-end receiver

    NASA Technical Reports Server (NTRS)

    Sukamto, Lin M. (Inventor); Cooley, Thomas W. (Inventor); Janssen, Michael A. (Inventor); Parks, Gary S. (Inventor)

    1993-01-01

    An end-to-end microwave receiver system contained in a single miniature hybrid package mounted on a single heatsink is presented. It includes an input end connected to a microwave receiver antenna and an output end which produces a digital count proportional to the amplitude of a signal of a selected microwave frequency band received at the antenna and corresponding to one of the water vapor absorption lines near frequencies of 20 GHz or 30 GHz. The hybrid package is on the order of several centimeters in length and a few centimeters in height and width. The package includes an L-shaped carrier having a base surface, a vertical wall extending up from the base surface and forming a corner therewith, and connection pins extending through the vertical wall. Modular blocks rest on the base surface against the vertical wall and support microwave monolithic integrated circuits on top surfaces thereof connected to the external connection pins. The modular blocks lie end-to-end on the base surface so as to be modularly removable by sliding along the base surface beneath the external connection pins away from the vertical wall.

  13. Euclid end-to-end straylight performance assessment

    NASA Astrophysics Data System (ADS)

    Gaspar Venancio, Luis M.; Pachot, Charlotte; Carminati, Lionel; Lorenzo Alvarez, Jose; Amiaux, Jérôme; Prieto, Eric; Bonino, Luciana; Salvignol, Jean-Christophe; Short, Alex; Boenke, Tobias; Strada, Paulo; Laureijs, Rene

    2016-07-01

    In the Euclid mission, straylight was identified at an early stage as the main driver of the final imaging quality of the telescope. The assessment by simulation of the final straylight in the focal plane of both instruments in Euclid's payload has required a complex workflow involving all stakeholders in the mission, from industry to the scientific community. The straylight is defined as a Normalized Detector Irradiance (NDI), a convenient definition that separates the contributions of the telescope and of the instruments. The end-to-end straylight of the payload is then simply the sum of the NDIs of the telescope and of each instrument. The NDIs for both instruments are presented in this paper for photometry and spectrometry.

  14. End-to-End Simulations for the EBIS Preinjector

    SciTech Connect

    Raparia, D.; Alessi, J.; Kponou, A.; Pikin, A.; Ritter, J.; Minaev, S.; Ratzinger, U.; Schempp, A.; Tiede, R.

    2007-06-25

    The EBIS Project at Brookhaven National Laboratory is in the second year of a four-year project. It will replace the Tandem Van de Graaff accelerators with an Electron Beam Ion Source, an RFQ, and one IH linac cavity as the heavy-ion preinjector for the Relativistic Heavy Ion Collider (RHIC) and for the NASA Space Radiation Laboratory (NSRL). The preinjector will provide all ion species, He to U (Q/m > 0.16), at 2 MeV/amu, at a repetition rate of 5 Hz, a pulse length of 10-40 µs, and intensities of ~2.0 mA. End-to-end simulations (from the EBIS to Booster injection) as well as error sensitivity studies are presented and physics issues are discussed.

  15. End-to-end performance of the TESAR ATR system

    NASA Astrophysics Data System (ADS)

    Rosario, Dalton S.

    2000-08-01

    The TESAR [Tactical Endurance Synthetic Aperture Radar (SAR)] system uses four algorithms in its three-stage algorithmic approach to the detection and identification of targets in continuous real-time, 1-ft-resolution, strip SAR image data. The first stage employs a multitarget detector with a built-in natural/cultural false-alarm mitigator. The second stage provides target hypotheses for the candidate targets and refines their angular pose. The third stage, consisting of two template-based algorithms, produces final target-identification decisions. This paper reviews the end-to-end ATR performance achieved by the TESAR system in preparation for a 1998 field demonstration at Aberdeen Proving Ground, Aberdeen, MD. The discussion includes an overview of the algorithm suite, the system's unique capabilities, and its overall performance against eight ground targets.

  16. Response to MRO's end-to-end data accountability challenges

    NASA Technical Reports Server (NTRS)

    Lee, Young H.

    2005-01-01

    NASA launched the Mars Reconnaissance Orbiter (MRO) on August 12, 2005. It carries six science instruments and three engineering payloads. Because MRO will produce an unprecedented number of science products, it will transmit a much higher data volume, at a higher data rate, than any other deep space mission to date. Keeping track of MRO products as well as relay products would be a daunting, expensive task without a well-planned data-product tracking strategy; a capability to perform first-order problem diagnosis is essential for MRO to answer the questions "Where is my data?" and "When will my data be available?" To respond to this challenge, the MRO project developed the End-to-End Data Accountability System, utilizing existing information available from both ground and flight elements. This paper details the approaches taken and the design and implementation of the tools, procedures, and teams that track data products from the time they are predicted until they arrive in the hands of the end users.

  17. Recirculating Linac Acceleration - End-to-End Simulation

    SciTech Connect

    Bogacz, Alex

    2010-03-01

    A conceptual design of a high-pass-number Recirculating Linear Accelerator (RLA) for muons is presented. The scheme involves three superconducting linacs (201 MHz): a single-pass linear pre-accelerator followed by a pair of multi-pass (4.5-pass) 'Dogbone' RLAs. Acceleration starts after ionization cooling at 220 MeV/c and proceeds to 12.6 GeV. The pre-accelerator captures a large muon phase space and accelerates muons to relativistic energies while adiabatically decreasing the phase-space volume, so that effective acceleration in the RLA is possible. The RLA further compresses and shapes the longitudinal and transverse phase spaces while increasing the energy. An appropriate choice of multi-pass linac optics based on FODO focusing assures a large number of passes in the RLA. The proposed 'Dogbone' configuration facilitates simultaneous acceleration of both μ± species through the requirement of mirror-symmetric optics of the return 'droplet' arcs. Finally, the presented end-to-end simulation validates the efficiency and acceptance of the accelerator system.

  18. OGC standards for end-to-end sensor network integration

    NASA Astrophysics Data System (ADS)

    Headley, K. L.; Broering, A.; O'Reilly, T. C.; Toma, D.; Del Rio, J.; Bermudez, L. E.; Zedlitz, J.; Johnson, G.; Edgington, D.

    2010-12-01

    technology, and can communicate with any sensor whose protocol can be described by a SID. The SID interpreter transfers retrieved sensor data to a Sensor Observation Service, and transforms tasks submitted to a Sensor Planning Service into actual sensor commands. The proposed SWE PUCK protocol complements SID by providing a standard way to associate a sensor with a SID, thereby completely automating the sensor integration process. PUCK protocol is implemented in sensor firmware, and provides a means to retrieve a universally unique identifier, metadata and other information from the device itself through its communication interface. Thus the SID interpreter can retrieve a SID directly from the sensor through PUCK protocol. Alternatively, the interpreter can retrieve the sensor's SID from an external source, based on the unique sensor ID provided by PUCK protocol. In this presentation, we describe the end-to-end integration of several commercial oceanographic instruments into a sensor network using PUCK, SID and SWE services. We also present a user-friendly graphical tool to generate SIDs, and tools to visualize sensor data.

  19. Data analysis pipeline for EChO end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Waldmann, Ingo P.; Pascale, E.

    2015-12-01

    Atmospheric spectroscopy of extrasolar planets is an intricate business. Atmospheric signatures typically require a photometric precision of 1×10⁻⁴ in flux over several hours. Such precision demands high instrument stability as well as an understanding of stellar variability and an optimal data reduction and removal of systematic noise. In the context of the EChO mission concept, we here discuss the data reduction and analysis pipeline developed for the EChO end-to-end simulator EChOSim. We present and discuss the step-by-step procedures required to obtain the final exoplanetary spectrum from the EChOSim 'raw data', using a simulated observation of the secondary eclipse of the hot Neptune 55 Cnc e.

  20. Experimental demonstration of software defined data center optical networks with Tbps end-to-end tunability

    NASA Astrophysics Data System (ADS)

    Zhao, Yongli; Zhang, Jie; Ji, Yuefeng; Li, Hui; Wang, Huitao; Ge, Chao

    2015-10-01

    End-to-end tunability is important for provisioning elastic channels for the bursty traffic of data center optical networks. How, then, can end-to-end tunability be achieved in elastic optical networks? A software defined networking (SDN) based end-to-end tunability solution is proposed for software defined data center optical networks, and the protocol extension and implementation procedure are designed accordingly. For the first time, flexible grid all-optical networks with a Tbps end-to-end tunable transport and switch system have been demonstrated online for data center interconnection, controlled by an OpenDaylight (ODL) based controller. The performance of the end-to-end tunable transport and switch system has been evaluated with wavelength number tuning, bit rate tuning, and transmit power tuning procedures.

  1. MRI simulation: end-to-end testing for prostate radiation therapy using geometric pelvic MRI phantoms

    NASA Astrophysics Data System (ADS)

    Sun, Jidi; Dowling, Jason; Pichler, Peter; Menk, Fred; Rivest-Henault, David; Lambert, Jonathan; Parker, Joel; Arm, Jameen; Best, Leah; Martin, Jarad; Denham, James W.; Greer, Peter B.

    2015-04-01

    To clinically implement MRI simulation or MRI-alone treatment planning requires comprehensive end-to-end testing to ensure an accurate process. The purpose of this study was to design and build a geometric phantom simulating a human male pelvis that is suitable for both CT and MRI scanning, and to use it to test geometric and dosimetric aspects of MRI simulation, including treatment planning and digitally reconstructed radiograph (DRR) generation. A liquid-filled pelvic-shaped phantom with simulated pelvic organs was scanned in a 3T MRI simulator with a dedicated radiotherapy couch-top, laser bridge and pelvic coil mounts. A second phantom with the same external shape but with an internal distortion grid was used to quantify the distortion of the MR image. Both phantoms were also CT scanned as the gold standard for both geometry and dosimetry. Deformable image registration was used to quantify the MR distortion. Dose comparison was made using a seven-field IMRT plan developed on the CT scan with the fluences copied to the MR image and recalculated using bulk electron densities. Without correction, the maximum distortion of the MR compared with the CT scan was 7.5 mm across the pelvis; this was reduced to 2.6 and 1.7 mm by the vendor's 2D and 3D correction algorithms, respectively. Within the locations of the internal organs of interest, the distortion was <1.5 and <1 mm with the 2D and 3D correction algorithms, respectively. The dose at the prostate isocentre calculated on the CT and MRI images differed by 0.01% (1.1 cGy). Positioning shifts were within 1 mm when setup was performed using MRI-generated DRRs compared to setup using CT DRRs. The MRI pelvic phantom allows end-to-end testing of the MRI simulation workflow with comparison to the gold-standard CT-based process. MRI simulation was found to be geometrically accurate, with organ dimensions, dose distributions and DRR-based setup within acceptable limits compared to CT.
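    Once landmark positions are matched between the CT gold standard and the (corrected) MR image, the distortion figures quoted above reduce to displacement statistics over the matched points; a minimal, hypothetical helper (not the study's code):

```python
import numpy as np

def distortion_stats(ct_pts, mr_pts):
    """Residual distortion (same units as the inputs, e.g. mm) between
    matched landmarks on the CT gold standard and the MR image.
    Returns (maximum, mean) displacement magnitude."""
    d = np.linalg.norm(np.asarray(mr_pts, float) - np.asarray(ct_pts, float),
                       axis=1)
    return d.max(), d.mean()
```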

  2. A Computer Program for the Distribution of End-to-End Distances in Polymer Molecules

    ERIC Educational Resources Information Center

    Doorne, William Van; And Others

    1976-01-01

    Describes a Fortran program that illustrates how the end-to-end distances in randomly coiled polymer molecules are affected by varying the number and lengths of chains and the angles between them. (MLH)
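    The same experiment is easy to reproduce today: for a freely jointed chain of N bonds of length b, sampled end-to-end distances should satisfy ⟨R²⟩ = N b². A short sketch in Python rather than the original Fortran:

```python
import numpy as np

def end_to_end_distances(n_chains, n_bonds, bond_len=1.0, seed=None):
    """Freely jointed chain model: draw random unit bond vectors and
    return the end-to-end distance |R| for each sampled chain."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=(n_chains, n_bonds, 3))
    v *= bond_len / np.linalg.norm(v, axis=-1, keepdims=True)  # unit bonds
    return np.linalg.norm(v.sum(axis=1), axis=-1)              # |sum of bonds|
```

    Histogramming the returned distances reproduces the distribution the program was written to illustrate; fixed bond angles, as in the original, would change the prefactor but not the √N scaling.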

  3. An end-to-end command and control concept for NASA data systems

    NASA Technical Reports Server (NTRS)

    Desjardins, R.

    1979-01-01

    Spacecraft command and control are currently taking on the characteristics of general computer-to-computer interprocess communication. The evolution of these systems during the 1980s will give NASA a true general-purpose end-to-end data request capability for the first time. This concept is presented in outline, with consideration of many of the detailed analyses and subsystem tradeoffs being performed as part of the NEEDS (NASA End-to-End Data System) program.

  4. End-to-End Models for Effects of System Noise on LIMS Analysis of Igneous Rocks

    SciTech Connect

    Clegg, Samuel M; Bender, Steven; Wiens, R. C.; Carmosino, Marco L; Speicher, Elly A; Dyar, M. D.

    2010-12-23

    The ChemCam instrument on the Mars Science Laboratory will be the first extraterrestrial deployment of laser-induced breakdown spectroscopy (LIBS) for remote geochemical analysis. LIBS instruments are also being proposed for future NASA missions. In quantitative LIBS applications using multivariate analysis techniques, it is essential to understand the effects of key instrument parameters and their variability on the elemental predictions. Baseline experiments were run on a laboratory instrument in conditions reproducing ChemCam performance on Mars. These experiments employed a Nd:YAG laser producing 17 mJ/pulse on target with a 200 µm FWHM spot size on the surface of a sample. The emission is collected by a telescope, imaged onto a fiber optic, and then interfaced to a demultiplexer capable of >40% transmission into each spectrometer. We report here on an integrated end-to-end system performance model that simulates the effects of output signal degradation that might result from the input signal chain, and the impact on multivariate model predictions. There are two approaches to modifying signal-to-noise ratio (SNR): degrade the signal and/or increase the noise. Ishibashi used a much smaller data set to show that the addition of noise had a significant impact, while degradation of spectral resolution had much less impact on accuracy and precision. Here, we specifically focus on aspects of remote LIBS instrument performance as they relate to various types of signal degradation. To assess the sensitivity of LIBS analysis to SNR and spectral resolution, the signal in each spectrum from a suite of 50 laboratory spectra of igneous rocks was variably degraded by increasing the peak widths (simulating misalignment) and decreasing the spectral amplitude (simulating decreases in SNR).
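    The two degradation knobs described, broadening peaks and reducing SNR, can be sketched as follows (illustrative only, not the authors' pipeline; parameter names are assumptions):

```python
import numpy as np

def degrade_spectrum(spec, fwhm_px=0.0, snr=None, seed=None):
    """Broaden peaks by Gaussian convolution (simulating resolution loss or
    misalignment) and/or add white noise scaled to a target SNR."""
    rng = np.random.default_rng(seed)
    out = np.asarray(spec, float).copy()
    if fwhm_px > 0:
        sigma = fwhm_px / 2.355                      # FWHM -> Gaussian sigma
        x = np.arange(-int(4 * sigma) - 1, int(4 * sigma) + 2)
        k = np.exp(-0.5 * (x / sigma) ** 2)
        out = np.convolve(out, k / k.sum(), mode="same")  # broaden peaks
    if snr is not None:
        out = out + rng.normal(scale=out.max() / snr, size=out.shape)
    return out
```

    Applying such a function over a grid of widths and SNRs, then re-running the multivariate regression on the degraded suite, is the kind of sensitivity sweep the abstract describes.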

  5. A NASA Climate Model Data Services (CDS) End-to-End System to Support Reanalysis Intercomparison

    NASA Astrophysics Data System (ADS)

    Carriere, L.; Potter, G. L.; McInerney, M.; Nadeau, D.; Shen, Y.; Duffy, D.; Schnase, J. L.; Maxwell, T. P.; Huffer, E.

    2014-12-01

    The NASA Climate Model Data Service (CDS) and the NASA Center for Climate Simulation (NCCS) are collaborating to provide an end-to-end system for the comparative study of the major reanalysis projects: currently ECMWF ERA-Interim, NASA/GMAO MERRA, NOAA/NCEP CFSR, NOAA/ESRL 20CR, and JMA JRA25. Components of the system include the full spectrum of Climate Model Data Services: Data, Compute Services, Data Services, Analytic Services and Knowledge Services. The Data includes standard reanalysis model output, and will be expanded to include gridded observations and gridded innovations (O-A and O-F). The NCCS High Performance Science Cloud provides the compute environment (storage, servers, and network). Data Services are provided through an Earth System Grid Federation (ESGF) data node complete with Live Access Server (LAS), Web Map Service (WMS) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) for visualization, as well as a collaborative interface through the Earth System CoG. Analytic Services include UV-CDAT for analysis and MERRA/AS, accessed via the CDS API, for computation services, both part of the CDS Climate Analytics as a Service (CAaaS). Knowledge Services include access to an ontology browser, ODISEES, for metadata search and data retrieval. The result is a system that provides the ability for both reanalysis scientists and scientists in need of reanalysis output to identify the data of interest, compare, compute, visualize, and research without the need to transfer large volumes of data, perform time-consuming format conversions, or write code for frequently run computations and visualizations.

  6. An end-to-end communications architecture for condition-based maintenance applications

    NASA Astrophysics Data System (ADS)

    Kroculick, Joseph

    2014-06-01

    This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission that aligns with the Army's Network Modernization Strategy. The Army's Network Modernization Strategy is based on rolling out network capabilities that connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that enables data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from available network capabilities.

  7. End-to-End Information System design at the NASA Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1978-01-01

    Recognizing a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote space-based sensor, an end-to-end approach to the design of information systems has been adopted at the Jet Propulsion Laboratory. The objectives of this effort are to ensure that all flight projects adequately cope with information flow problems at an early stage of system design, and that cost-effective, multi-mission capabilities are developed when capital investments are made in supporting elements. The paper reviews the End-to-End Information System (EEIS) activity at the Laboratory, and notes the ties to the NASA End-to-End Data System program.

  8. Radio science requirements and the end-to-end ranging system

    NASA Technical Reports Server (NTRS)

    Berman, A. L.

    1978-01-01

    Radio science ranging requirements negotiated between past and present flight projects and the DSN have generally focused on just the DSS and spacecraft hardware. All elements in the end-to-end system are analyzed and considered in terms of the error hierarchy. The end-to-end system is defined and examined as it applies to the generation of radio science ranging requirements. The variability of the performance levels of the system elements is emphasized with respect to the radio science experiment being performed and the DSN-spacecraft frequency band configuration.

  9. End-to-End Data System Architecture for the Space Station Biological Research Project

    NASA Technical Reports Server (NTRS)

    Mian, Arshad; Scimemi, Sam; Adeni, Kaiser; Picinich, Lou; Ramos, Rubin (Technical Monitor)

    1998-01-01

    The Space Station Biological Research Project (SSBRP) is developing hardware, referred to as the "facility", for providing life sciences research capability on the International Space Station. This hardware includes several biological specimen habitats, habitat holding racks, a centrifuge and a glovebox. An SSBRP end-to-end data system architecture has been developed to allow command and control of the facility from the ground, either with crew assistance or autonomously. The data system will be capable of handling commands, sensor data, and video from multiple cameras. The data will traverse several onboard and ground networks and processing entities, including the SSBRP and Space Station onboard and ground data systems. A large number of onboard and ground entities of the data system are being developed by the Space Station Program, other NASA centers and the International Partners. The SSBRP part of the system, which includes the habitats, holding racks, and the ground operations center, the User Operations Facility (UOF), will be developed by a multitude of geographically distributed development organizations. The SSBRP has the responsibility to define the end-to-end data and communications systems to make the interfaces manageable and verifiable with multiple contractors with widely varying development constraints and schedules. This paper provides an overview of the SSBRP end-to-end data system. Specifically, it describes the hardware, software and functional interactions of individual systems, and interface requirements among the various entities of the end-to-end system.

  10. A Robust Method to Integrate End-to-End Mission Architecture Optimization Tools

    NASA Technical Reports Server (NTRS)

    Lugo, Rafael; Litton, Daniel; Qu, Min; Shidner, Jeremy; Powell, Richard

    2016-01-01

    End-to-end mission simulations include multiple phases of flight. For example, an end-to-end Mars mission simulation may include launch from Earth, interplanetary transit to Mars and entry, descent and landing. Each phase of flight is optimized to meet specified constraints and often depends on and impacts subsequent phases. The design and optimization tools and methodologies used to combine different aspects of the end-to-end framework, and their impact on mission planning, are presented. This work focuses on a robust implementation of a Multidisciplinary Design Analysis and Optimization (MDAO) method that offers the flexibility to quickly adapt to changing mission design requirements. Different simulations tailored to the liftoff, ascent, and atmospheric entry phases of a trajectory are integrated and optimized in the MDAO program Isight, which provides the user a graphical interface to link simulation inputs and outputs. This approach provides many advantages to mission planners, as it is easily adapted to different mission scenarios and can improve the understanding of the integrated system performance within a particular mission configuration. A Mars direct entry mission using the Space Launch System (SLS) is presented as a generic end-to-end case study. For the given launch period, the SLS launch performance is traded for improved orbit geometry alignment, resulting in an optimized net payload that is comparable to that in the SLS Mission Planner's Guide.

  11. Coupling low and high trophic levels models: Towards a pathways-orientated approach for end-to-end models

    NASA Astrophysics Data System (ADS)

    Shin, Yunne-Jai; Travers, Morgane; Maury, Olivier

    2010-01-01

    Existing models of marine ecosystems address specific issues related to the bottom-up forcing of production or to the top-down effects of fishing on a limited range of the trophic spectrum. Very few existing models explicitly incorporate the dynamics from one end of the ecosystem to the other and thus allow exploration of the interplay between exploitation and climate effects. The shift to an ecosystem approach to fisheries and concerns about the ecological effects of climate change require the assembly of knowledge from the respective marine disciplines with a view to building end-to-end models of marine ecosystems. Here, with a focus on plankton and fish models, we present some issues and recommendations for the integration of models between trophic levels (vertical integration) and within functional groups (horizontal integration within trophic levels). At present, vertical coupling of plankton and fish models is mainly realized through predation processes, generally represented as a functional response. In the absence of empirical evidence and quantification, the choice of the functional response term is often made by default, and is reduced to a parameterization problem. A strategy is proposed to overcome this arbitrary choice. In addition to the vertical coupling of trophic models, the structure of end-to-end models incorporates biodiversity via horizontal integration of trophic levels. For guiding the selection of key components to be included in end-to-end models, the idea that marine food webs are structured as alternative trophic pathways is highlighted and related to observed dynamics. We suggest that an important early step in model development is the identification of major trophic pathways and bottlenecks in an ecosystem using a historical perspective.

  12. End-to-end RMS error testing on a constant bandwidth FM/FM system

    NASA Technical Reports Server (NTRS)

    Wallace, G. R.; Salter, W. E.

    1972-01-01

    End-to-end root-mean-square (rms) tests performed on a constant bandwidth FM/FM system with various settings of system parameters are reported. The testing technique employed is that of sampling, digitizing, delaying, and comparing the analog input against the sampled and digitized corresponding output. Total system error is determined by fully loading all channels with band-limited noise and conducting end-to-end rms error tests on one channel. Tests are also conducted with and without a transmission link and plots of rms errors versus receiver signal-to-noise (S/N) values are obtained. The combined effects of intermodulation, adjacent channel crosstalk, and residual system noise are determined as well as the single channel distortion of the system.
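
    The sampling-and-comparison technique described above reduces, in essence, to aligning the digitized output with the delayed input and reporting the rms error relative to full scale. The sketch below is an illustrative reconstruction under assumed names, with the end-to-end delay given as a known integer number of samples.

```python
import numpy as np

def end_to_end_rms_error(input_samples, output_samples, delay):
    """Align the system output with the digitized input by the known
    end-to-end delay (in samples), then report the RMS error as a
    percentage of the full-scale input range."""
    x = np.asarray(input_samples, float)[: len(input_samples) - delay]
    y = np.asarray(output_samples, float)[delay:]
    err = y - x
    rms = np.sqrt(np.mean(err ** 2))
    full_scale = x.max() - x.min()
    return 100.0 * rms / full_scale
```

    Fully loading all channels with band-limited noise while testing one channel, as in the paper, folds intermodulation and crosstalk into the measured error alongside residual system noise.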

  13. Sutureless end-to-end ureteral anastomosis using a new albumin stent and diode laser

    NASA Astrophysics Data System (ADS)

    Xie, Hua; Shaffer, Brian S.; Prahl, Scott A.; Gregory, Kenton W.

    1999-09-01

    Sutureless end-to-end ureteral anastomoses were successfully constructed in acute and chronic experiments. A photothermally sensitive hydrolyzable (PSH) albumin stent served both as solder and as an intraluminal support to adhere and position the ureter anastomosed in end-to-end fashion. The anastomosis seam was lased with 810 nm diode laser energy supplied through a hand-held 600 micrometer noncontact optical fiber. A continuous wave at 1 watt of power was applied for laser anastomosis. Integrity, welding strength, bursting pressures of the anastomosis, histological reaction, and radiological findings were compared to those of anastomoses constructed using a liquid soldering technique. The acute results of the two methods were equivalent in welding strength, but the liquid soldering required more energy. In the chronic study, radiological and histological examinations were performed to evaluate the complications of the anastomosis. Excellent healing and varying degrees of complications were observed. We conclude that the PSH stent shows great promise for ureteral anastomosis using laser welding.

  14. Laser welding with an albumin stent: experimental ureteral end-to-end anastomosis

    NASA Astrophysics Data System (ADS)

    Xie, Hua; Shaffer, Brian S.; Prahl, Scott A.; Gregory, Kenton W.

    2000-05-01

    Porcine ureters were anastomosed using an albumin stent and a diode laser in vitro. The albumin stent provided precise apposition for an end-to-end anastomosis and enhanced welding strength. The anastomosis seam was lased with an 810 nm diode laser using continuous-wave and pulsed light through a hand-held 600 micrometer noncontact optical fiber. Tensile strength, burst pressures, operative times, total energy, and thermal damage were measured in this study. The results demonstrated that using an albumin stent to laser weld ureteral anastomoses produces strong welds. The liquid albumin solder also provided satisfactory welding strength. There were no significant differences in tissue thermal damage between the albumin stent alone, the liquid solder alone, and the combined groups. Thermal damage to tissue depended on laser setting and energy. This study determined the appropriate laser setting parameters to perform in vivo ureteral end-to-end anastomosis.

  15. End-to-end calculation of the radiation characteristics of VVER-1000 spent fuel assemblies

    NASA Astrophysics Data System (ADS)

    Linge, I. I.; Mitenkova, E. F.; Novikov, N. V.

    2012-12-01

    The results of end-to-end calculation of the radiation characteristics of VVER-1000 spent nuclear fuel are presented. Details of formation of neutron and gamma-radiation sources are analyzed. Distributed sources of different types of radiation are considered. A comparative analysis of calculated radiation characteristics is performed with the use of nuclear data from different ENDF/B and EAF files and ANSI/ANS and ICRP standards.

  16. Endovascular management of a late saccular aortic aneurysm after end-to-end repair of coarctation.

    PubMed

    Kotoulas, Christophoros; Tzilalis, Vasileios; Spyridakis, Emmanouil; Mamareli, Ioannis

    2011-12-01

    Aneurysm formation after surgical repair of coarctation is rarely observed with the end-to-end anastomosis technique. Redo surgery is associated with high mortality and morbidity rates. Although the minimally invasive method using stent grafts has been reported in only a small number of patients, it could represent a valid alternative treatment. We present a case of successful endovascular treatment of a patient with a late saccular aneurysm after coarctation repair.

  17. NASA/DOD earth orbit shuttle traffic models based on end to end loading of payloads

    NASA Technical Reports Server (NTRS)

    Kincade, R. E.; Donahoo, M. E.; Pruett, W. R.

    1971-01-01

    An analysis of the spacecraft configurations and space missions for the Earth Orbit Shuttle traffic model based on an end-to-end loading of payloads is presented. Two possible reusable tugs are considered. The space missions are described with respect to the following: (1) number of earth orbit shuttle flights by inclination, (2) total payloads to orbit, (3) energy stages required, and (4) characteristics of reusable tug.

  18. CHEETAH: circuit-switched high-speed end-to-end transport architecture

    NASA Astrophysics Data System (ADS)

    Veeraraghavan, Malathi; Zheng, Xuan; Lee, Hyuk; Gardner, M.; Feng, Wuchun

    2003-10-01

    Leveraging the dominance of Ethernet in LANs and SONET/SDH in MANs and WANs, we propose a service called CHEETAH (Circuit-switched High-speed End-to-End Transport ArcHitecture). The service concept is to provide end hosts with high-speed, end-to-end circuit connectivity on a call-by-call shared basis, where a "circuit" consists of Ethernet segments at the ends that are mapped into Ethernet-over-SONET long-distance circuits. This paper focuses on the file-transfer application for such circuits. For this application, the CHEETAH service is proposed as an add-on to the primary Internet access service already in place for enterprise hosts. This allows an end host that is sending a file to first attempt setting up an end-to-end Ethernet/EoS circuit, and if rejected, fall back to the TCP/IP path. If the circuit setup is successful, the end host will enjoy a much shorter file-transfer delay than on the TCP/IP path. To determine the conditions under which an end host with access to the CHEETAH service should attempt circuit setup, we analyze mean file-transfer delays as a function of call blocking probability in the circuit-switched network, probability of packet loss in the IP network, round-trip times, link rates, and so on.
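
    The delay analysis described in the last sentence can be illustrated with a simple expected-delay model. The function below is a hypothetical sketch, not the paper's model: it weighs the circuit path by the call-blocking probability and approximates the TCP/IP fallback throughput with the well-known Mathis formula MSS/(RTT*sqrt(2p/3)), capped by the access link rate.

```python
import math

def mean_transfer_delay(file_bits, pb, circuit_rate, setup_time,
                        rtt, loss_p, mss_bits=12000, ip_rate=None):
    """Expected file-transfer delay for the circuit-attempt decision:
    with probability (1 - pb) the circuit is set up and the file moves
    at the circuit rate after a setup delay; with probability pb the
    transfer falls back to the TCP/IP path."""
    circuit_delay = setup_time + file_bits / circuit_rate
    # Mathis approximation for loss-limited TCP throughput (bits/s)
    tcp_tput = mss_bits / (rtt * math.sqrt(2 * loss_p / 3))
    if ip_rate is not None:
        tcp_tput = min(tcp_tput, ip_rate)
    tcp_delay = file_bits / tcp_tput
    return (1 - pb) * circuit_delay + pb * tcp_delay
```

    Comparing this expectation against the pure TCP/IP delay for given file sizes, RTTs, and blocking probabilities indicates when an end host should attempt circuit setup at all.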

  19. An end-to-end approach to developing biological and chemical detector requirements

    NASA Astrophysics Data System (ADS)

    Teclemariam, Nerayo P.; Purvis, Liston K.; Foltz, Greg W.; West, Todd; Edwards, Donna M.; Fruetel, Julia A.; Gleason, Nathaniel J.

    2009-05-01

    Effective defense against chemical and biological threats requires an "end-to-end" strategy that encompasses the entire problem space, from threat assessment and target hardening to response planning and recovery. A key element of the strategy is the definition of appropriate system requirements for surveillance and detection of threat agents. Our end-to-end approach to venue chem/bio defense is captured in the Facilities Weapons of Mass Destruction Decision Analysis Capability (FacDAC), an integrated system-of-systems toolset that can be used to generate requirements across all stages of detector development. For example, in the early stage of detector development the approach can be used to develop performance targets (e.g., sensitivity, selectivity, false positive rate) to provide guidance on what technologies to pursue. In the development phase, after a detector technology has been selected, the approach can aid in determining performance trade-offs and down-selection of competing technologies. During the application stage, the approach can be employed to design optimal defensive architectures that make the best use of available technology to maximize system performance. This presentation will discuss the end-to-end approach to defining detector requirements and demonstrate the capabilities of the FacDAC toolset using examples from a number of studies for the Department of Homeland Security.

  20. Ocean Acidification Scientific Data Stewardship: An approach for end-to-end data management and integration

    NASA Astrophysics Data System (ADS)

    Arzayus, K. M.; Garcia, H. E.; Jiang, L.; Michael, P.

    2012-12-01

    As the designated Federal permanent oceanographic data center in the United States, NOAA's National Oceanographic Data Center (NODC) has been providing scientific stewardship for national and international marine environmental and ecosystem data for over 50 years. NODC is supporting NOAA's Ocean Acidification Program and the science community by providing end-to-end scientific data management of ocean acidification (OA) data, dedicated online data discovery, and user-friendly access to a diverse range of historical and modern OA and other chemical, physical, and biological oceanographic data. This effort is being catalyzed by the NOAA Ocean Acidification Program, but the intended reach is for the broader scientific ocean acidification community. The first three years of the project will be focused on infrastructure building. A complete ocean acidification data content standard is being developed to ensure that a full spectrum of ocean acidification data and metadata can be stored and utilized for optimal data discovery and access in usable data formats. We plan to develop a data access interface capable of allowing users to constrain their search based on real-time and delayed mode measured variables, scientific data quality, their observation types, the temporal coverage, methods, instruments, standards, collecting institutions, and the spatial coverage. In addition, NODC seeks to utilize the existing suite of international standards (including ISO 19115-2 and CF-compliant netCDF) to help our data producers use those standards for their data, and help our data consumers make use of the well-standardized metadata-rich data sets. These tools will be available through our NODC Ocean Acidification Scientific Data Stewardship (OADS) web page at http://www.nodc.noaa.gov/oceanacidification. NODC also has a goal to provide each archived dataset with a unique ID, to ensure a means of providing credit to the data provider. Working with partner institutions, such as the

  1. End-to-end planning and scheduling systems technology for space operations

    NASA Technical Reports Server (NTRS)

    Moe, Karen L.

    1992-01-01

    Consideration is given to planning and scheduling operations concepts from an end-to-end perspective, through both mission operations and institutional support functions. An operations concept is proposed which is based on a flexible request language used to state resource requirements and mission constraints to a scheduling system. The language has the potential to evolve into an international standard for exchanging service request information on international space networks. The key benefit of the flexible scheduling request concept is the shift of a significant conflict resolution effort from humans to computers, reducing the time for generating a week's worth of schedules to hours instead of days.

  2. Satellite/Terrestrial Networks: End-to-End Communication Interoperability Quality of Service Experiments

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    1998-01-01

    Various issues associated with satellite/terrestrial end-to-end communication interoperability are presented in viewgraph form. Specific topics include: 1) Quality of service; 2) ATM performance characteristics; 3) MPEG-2 transport stream mapping to AAL-5; 4) Observation and discussion of compressed video tests over ATM; 5) Digital video over satellites status; 6) Satellite link configurations; 7) MPEG-2 over ATM with binomial errors; 8) MPEG-2 over ATM channel characteristics; 9) MPEG-2 over ATM over emulated satellites; 10) MPEG-2 transport stream with errors; and 11) a dual decoder test.

  3. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Bolcar, Matt; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large aperture, segmented Ultraviolet Optical Infrared (UVOIR) Telescope capable of performing a spectroscopic survey of hundreds of Exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage based stable segmented telescope, a control architecture that minimizes the amount of new technologies, and an Exo-Earth yield assessment to evaluate potential performance.

  4. Screening California Current fishery management scenarios using the Atlantis end-to-end ecosystem model

    NASA Astrophysics Data System (ADS)

    Kaplan, Isaac C.; Horne, Peter J.; Levin, Phillip S.

    2012-09-01

    End-to-end marine ecosystem models link climate and oceanography to the food web and human activities. These models can be used as forecasting tools, to strategically evaluate management options and to support ecosystem-based management. Here we report the results of such forecasts in the California Current, using an Atlantis end-to-end model. We worked collaboratively with fishery managers at NOAA’s regional offices and staff at the National Marine Sanctuaries (NMS) to explore the impact of fishery policies on management objectives at different spatial scales, from single Marine Sanctuaries to the entire Northern California Current. In addition to examining Status Quo management, we explored the consequences of several gear switching and spatial management scenarios. Of the scenarios that involved large scale management changes, no single scenario maximized all performance metrics. Any policy choice would involve trade-offs between stakeholder groups and policy goals. For example, a coast-wide 25% gear shift from trawl to pot or longline appeared to be one possible compromise between an increase in spatial management (which sacrificed revenue) and scenarios such as the one consolidating bottom impacts to deeper areas (which did not perform substantially differently from Status Quo). Judged on a coast-wide scale, most of the scenarios that involved minor or local management changes (e.g. within Monterey Bay NMS only) yielded results similar to Status Quo. When impacts did occur in these cases, they often involved local interactions that were difficult to predict a priori based solely on fishing patterns. However, judged on the local scale, deviation from Status Quo did emerge, particularly for metrics related to stationary species or variables (i.e. habitat and local metrics of landed value or bycatch). We also found that isolated management actions within Monterey Bay NMS would cause local fishers to pay a cost for conservation, in terms of reductions in landed

  5. End-to-end microvascular anastomoses with a 1.9-µm diode laser

    NASA Astrophysics Data System (ADS)

    Mordon, Serge R.; Martinot, Veronique L.; Mitchell, Valerie A.

    1996-01-01

    This in-vivo study examines the potential of vessel anastomosis with a 1.9 µm diode laser. Ten end-to-end carotid anastomoses and 10 end-to-end jugular anastomoses were performed in Wistar rats. The technique requires brief applications (20 to 25 spots) with a diode laser (λ = 1.9 µm, φ = 220 µm, P = 60 mW, t = 0.7 s, F = 110 J/cm²) after placement of three equidistant stay sutures. The macroscopic aspect and patency are evaluated at different post-operative intervals. Vessel histology is performed at 15, 21, and 30 days after the procedure. These anastomoses reveal minimal thermal damage, confined to the adventitial layer, to a depth of 200 µm. No medial or intimal thermal damage is identified. No thrombosis is observed, giving a patency of 100% for both arteries and veins. The mean clamping time is 9 ± 3 min. At 1.9 µm, the water extinction length is 0.15 mm. The welded thickness is comparable to the extinction length at this wavelength, consequently giving a weld strength of 4 × 10^6 dynes/cm², comparable to the strength of suture repairs: 5-6 × 10^6 dynes/cm². These findings suggest that a low-energy 1.9 µm diode laser has potential clinical application for anastomosis of small vessels.

  6. An End-To-End Test of A Simulated Nuclear Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    VanDyke, Melissa; Hrbud, Ivana; Goddfellow, Keith; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    The Safe Affordable Fission Engine (SAFE) test series addresses Phase I Space Fission Systems issues, in particular non-nuclear testing and system integration issues, leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shut-downs), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, a conversion system and a thruster, where the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  7. A Bottom-up Route to a Chemically End-to-End Assembly of Nanocellulose Fibers.

    PubMed

    Yang, Han; van de Ven, Theo G M

    2016-06-13

    In this work, we take advantage of the rod-like structure of electrosterically stabilized nanocrystalline cellulose (ENCC, with a width of about 7 nm and a length of about 130 nm), which has dicarboxylated cellulose (DCC) chains protruding from both ends, providing electrosteric stability for ENCC particles, to chemically assemble these particles end-to-end into nanocellulose fibers. ENCC with shorter DCC chains can be obtained by mild hydrolysis of ENCC with HCl, and the hydrolyzed ENCC (HENCC, with a width of about 6 nm and a length of about 120 nm) is then suitable for assembly into high-aspect-ratio nanofibers by chemically cross-linking HENCC from one end to another. Two sets of HENCC were prepared by carbodiimide-mediated formation of an alkyne and an azide derivative, respectively. Cross-linking these two sets of HENCC was performed by a click reaction. HENCCs were also end-to-end cross-linked by a bioconjugation reaction with a diamine. From atomic force microscopy (AFM) images, about ten HENCC nanoparticles were cross-linked, forming high-aspect-ratio nanofibers with a width of about 6 nm and a length of more than 1 μm. PMID:27211496

  8. The Kepler End-to-End Data Pipeline: From Photons to Far Away Worlds

    NASA Technical Reports Server (NTRS)

    Cooke, Brian; Thompson, Richard; Standley, Shaun

    2012-01-01

    Launched by NASA on 6 March 2009, the Kepler Mission has been observing more than 100,000 targets in a single patch of sky between the constellations Cygnus and Lyra almost continuously for the last two years, looking for planetary systems using the transit method. As of October 2011, the Kepler spacecraft has collected and returned to Earth just over 290 GB of data, identifying 1235 planet candidates, with 25 of these candidates confirmed as planets via ground observation. Extracting the telltale signature of a planetary system from stellar photometry, where valid signal transients can be as small as 40 ppm, is a difficult and exacting task. The end-to-end processing of determining planetary candidates from noisy, raw photometric measurements is discussed.
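
    To illustrate why a 40 ppm transient is exacting, a minimal transit-depth estimate from normalized photometry can be written as follows. This is a toy sketch, not the Kepler pipeline: the function name, the simple mean-difference estimator, and the assumption of a known in-transit mask are all illustrative.

```python
import numpy as np

def transit_depth_snr(flux, in_transit_mask):
    """Estimate the fractional transit depth from normalized photometry
    (mean out-of-transit flux minus mean in-transit flux) and its
    detection SNR (depth over the standard error of the estimate)."""
    out = flux[~in_transit_mask]
    inn = flux[in_transit_mask]
    depth = out.mean() - inn.mean()
    sigma = np.sqrt(out.var(ddof=1) / out.size + inn.var(ddof=1) / inn.size)
    return depth, depth / sigma
```

    Even with per-point noise far larger than the dip, averaging many in-transit points recovers a 40 ppm depth; this is the statistical leverage that long, nearly continuous monitoring provides.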

  9. End-to-End Network Simulation Using a Site-Specific Radio Wave Propagation Model

    SciTech Connect

    Djouadi, Seddik M; Kuruganti, Phani Teja; Nutaro, James J

    2013-01-01

    The performance of systems that rely on a wireless network depends on the propagation environment in which that network operates. To predict how these systems and their supporting networks will perform, simulations must take into consideration the propagation environment and how it affects the performance of the wireless network. Network simulators typically use empirical models of the propagation environment. However, these models are not intended for, and cannot be used for, predicting how a wireless system will perform in a specific location, e.g., in the center of a particular city or the interior of a specific manufacturing facility. In this paper, we demonstrate how a site-specific propagation model and the NS3 simulator can be used to predict the end-to-end performance of a wireless network.
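
    To illustrate the distinction between empirical and site-specific models, the sketch below implements the standard log-distance path-loss form that empirical models take; a site-specific model would replace the fitted exponent with ray-traced losses for the actual city or facility. This is a standalone illustration, not NS3 code, and the function name and defaults are assumptions.

```python
import math

def received_power_dbm(tx_dbm, freq_hz, dist_m, exponent=3.0, ref_m=1.0):
    """Log-distance path-loss model: free-space (Friis) loss at a
    reference distance, plus an environment-dependent exponent beyond it.
    exponent = 2.0 reproduces pure free-space propagation."""
    c = 299792458.0
    # Free-space path loss at the reference distance (Friis, in dB)
    fspl_ref = 20 * math.log10(4 * math.pi * ref_m * freq_hz / c)
    pl = fspl_ref + 10 * exponent * math.log10(dist_m / ref_m)
    return tx_dbm - pl
```

    Feeding per-link received powers from such a model into a network simulator is what couples the propagation environment to end-to-end metrics like throughput and packet loss.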

  10. End-to-end communication test on variable length packet structures utilizing AOS testbed

    NASA Technical Reports Server (NTRS)

    Miller, Warner H.; Sank, V.; Fong, Wai; Miko, J.; Powers, M.; Folk, John; Conaway, B.; Michael, K.; Yeh, Pen-Shu

    1994-01-01

    This paper describes a communication test that successfully demonstrated the transfer of losslessly compressed images in an end-to-end system. These compressed images were first formatted into variable length Consultative Committee for Space Data Systems (CCSDS) packets in the Advanced Orbiting System Testbed (AOST). The CCSDS data structures were transferred from the AOST to the Radio Frequency Simulations Operations Center (RFSOC) via a fiber optic link, where the data was then transmitted through the Tracking and Data Relay Satellite System (TDRSS). The received data acquired at the White Sands Complex (WSC) was transferred back to the AOST, where the data was captured and decompressed back to the original images. This paper describes the compression algorithm, the AOST configuration, key flight components, data formats, and the communication link characteristics and test results.

  11. End to end numerical simulations of the MAORY multiconjugate adaptive optics system

    NASA Astrophysics Data System (ADS)

    Arcidiacono, C.; Schreiber, L.; Bregoli, G.; Diolaiti, E.; Foppiani, I.; Cosentino, G.; Lombini, M.; Butler, R. C.; Ciliegi, P.

    2014-08-01

    MAORY is the adaptive optics module of the E-ELT that will feed the MICADO imaging camera through a gravity-invariant exit port. MAORY is foreseen to implement MCAO correction through three high-order deformable mirrors driven by the reference signals of six Laser Guide Stars (LGSs) feeding as many Shack-Hartmann wavefront sensors. A three Natural Guide Star (NGS) system will provide the low-order correction. We have developed a code for the end-to-end simulation of the MAORY adaptive optics (AO) system in order to obtain high-fidelity modeling of the system performance. It is based on the IDL language and makes extensive use of GPUs. Here we present the architecture of the simulation tool and its achieved and expected performance.

  12. End-to-end assessment of a large aperture segmented ultraviolet optical infrared (UVOIR) telescope architecture

    NASA Astrophysics Data System (ADS)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Olivier; Stark, Chris; Arenberg, Jon

    2016-07-01

    Key challenges of a future large-aperture, segmented Ultraviolet Optical Infrared (UVOIR) telescope capable of performing a spectroscopic survey of hundreds of exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield exo-earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the number of new technologies, and an exo-earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and exo-earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling these missions.

  13. Enhancing End-to-End Performance of Information Services Over Ka-Band Global Satellite Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul B.; Glover, Daniel R.; Ivancic, William D.; vonDeak, Thomas C.

    1997-01-01

    The Internet has been growing at a rapid rate as the key medium for information services such as e-mail, the WWW, and multimedia; however, its global reach is limited. Ka-band communication satellite networks are being developed to increase the accessibility of information services via the Internet on a global scale. There is a need to assess satellite networks in their ability to provide these services and to interconnect seamlessly with existing and proposed terrestrial telecommunication networks. In this paper, the significant issues and requirements in providing end-to-end high performance for the delivery of information services over satellite networks are identified, based on the various layers of the OSI reference model. Key experiments have been performed to evaluate the performance of digital video and the Internet over satellite-like testbeds. The results of early developments in ATM and TCP protocols over satellite networks are summarized.

  14. End-to-end performance measurement of Internet based medical applications.

    PubMed Central

    Dev, P.; Harris, D.; Gutierrez, D.; Shah, A.; Senger, S.

    2002-01-01

    We present a method to obtain an end-to-end characterization of the performance of an application over a network. This method is not dependent on any specific application or type of network. The method requires characterization of network parameters, such as latency and packet loss, between the expected server or client endpoints, as well as characterization of the application's constraints on these parameters. A subjective metric is presented that integrates these characterizations and that operates over a wide range of applications and networks. We believe that this method may be of wide applicability as research and educational applications increasingly make use of computation and data servers that are distributed over the Internet. PMID:12463816
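
    The idea of integrating per-parameter network characterizations into a single application-level score can be sketched as follows. The scoring rule, parameter names, and the geometric-mean combination here are illustrative assumptions, not the paper's actual metric:

    ```python
    def performance_score(measured, constraints):
        """Combine measured network parameters against an application's
        constraints into a single score in [0, 1]; 1 means every constraint
        is met.  (Illustrative rule, not the published metric.)"""
        scores = []
        for param, limit in constraints.items():
            value = measured[param]
            # meeting the limit scores 1; exceeding it decays as limit/value
            scores.append(min(1.0, limit / value) if value > 0 else 1.0)
        # geometric mean, so one badly failing parameter dominates the result
        product = 1.0
        for s in scores:
            product *= s
        return product ** (1.0 / len(scores))

    ok = performance_score({"latency_ms": 40.0, "loss_pct": 0.5},
                           {"latency_ms": 80.0, "loss_pct": 1.0})
    bad = performance_score({"latency_ms": 160.0, "loss_pct": 2.0},
                            {"latency_ms": 80.0, "loss_pct": 1.0})
    ```

    An interactive application would set a tight latency constraint, while a bulk-transfer application would weight packet loss more heavily; the same scoring function then applies across both, which is the portability the abstract claims.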

  15. Sutureless vascular end-to-end anastomosis. Final technical report Jan 82-Dec 83

    SciTech Connect

    Wozniak, J.J.

    1984-03-22

    The objective of this project was to develop a means of rejoining severed vessels (end-to-end anastomosis) without using sutures. Two essential elements of the concept were developed: an instrument to evert the vessel and a biocompatible, low-temperature (130 F/54 C) heat-shrink sleeve. The sleeve, which contracts to accomplish the anastomosis, was made by crosslinking synthetic trans-1,4 polyisoprene with ionizing gamma radiation. The crosslinked polymer was subjected to an acute toxicity screening program and proved to be highly biocompatible. The sutureless anastomosis technique was tested in vitro on freshly excised pig carotid arteries; however, there was insufficient funding available to provide for an evaluation of the technique in laboratory animals.

  16. End-to-end interoperability and workflows from building architecture design to one or more simulations

    DOEpatents

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
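
    One way to picture the data definition language step described above, which materializes the data model's entities and relationships as a database table schema, is this minimal sketch; the table and column names are invented for illustration, not taken from the patent:

    ```python
    import sqlite3

    # A data model of entities and entity relationships, realized as tables.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE entity (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        entity_type TEXT NOT NULL          -- e.g. 'zone', 'wall', 'sensor'
    );
    CREATE TABLE entity_relationship (
        parent_id INTEGER REFERENCES entity(id),
        child_id  INTEGER REFERENCES entity(id),
        relation  TEXT NOT NULL            -- e.g. 'contains', 'feeds'
    );
    """)
    conn.execute("INSERT INTO entity VALUES (1, 'Zone-A', 'zone')")
    conn.execute("INSERT INTO entity VALUES (2, 'Wall-1', 'wall')")
    conn.execute("INSERT INTO entity_relationship VALUES (1, 2, 'contains')")

    # A data management service would expose queries like this one:
    walls_in_zone = conn.execute(
        "SELECT e.name FROM entity e JOIN entity_relationship r"
        " ON e.id = r.child_id WHERE r.parent_id = 1"
    ).fetchall()
    ```

    The patent's data management services and web services would sit in front of such a schema, so that simulation tools query entities and relationships without depending on the storage layout.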

  17. Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Hines, Glenn D.; Paschall, Stephen, II; Cohanim, Babak E.; Fill, Thomas; Johnson, Michael C.; Bishop, Robert H.; DeMars, Kyle J.; Sostaric, Ronald r.; Johnson, Andrew E.

    2008-01-01

    Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining design and integration capability and system performance of the lunar descent and landing system and environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. The POST2 simulation provides a six degree-of-freedom capability necessary to test, design and operate a descent and landing system for successful lunar landing. This paper presents advances in the development and model-implementation of the POST2 simulation, as well as preliminary system performance analysis, used for the testing and evaluation of ALHAT project system models.

  18. End-to-end operations at the National Radio Astronomy Observatory

    NASA Astrophysics Data System (ADS)

    Radziwill, Nicole M.

    2008-07-01

    In 2006 NRAO launched a formal organization, the Office of End to End Operations (OEO), to broaden access to its instruments (VLA/EVLA, VLBA, GBT and ALMA) in the most cost-effective ways possible. The VLA, VLBA and GBT are mature instruments, and the EVLA and ALMA are currently under construction, which presents unique challenges for integrating software across the Observatory. This article 1) provides a survey of the new developments over the past year, and those planned for the next year, 2) describes the business model used to deliver many of these services, and 3) discusses the management models being applied to ensure continuous innovation in operations, while preserving the flexibility and autonomy of telescope software development groups.

  19. End-to-end automated microfluidic platform for synthetic biology: from design to functional analysis

    DOE PAGESBeta

    Linshiz, Gregory; Jensen, Erik; Stawski, Nina; Bi, Changhao; Elsbree, Nick; Jiao, Hong; Kim, Jungkyu; Mathies, Richard; Keasling, Jay D.; Hillson, Nathan J.

    2016-02-02

    Synthetic biology aims to engineer biological systems for desired behaviors. The construction of these systems can be complex, often requiring genetic reprogramming, extensive de novo DNA synthesis, and functional screening. Here, we present a programmable, multipurpose microfluidic platform and associated software and apply the platform to major steps of the synthetic biology research cycle: design, construction, testing, and analysis. We show the platform’s capabilities for multiple automated DNA assembly methods, including a new method for Isothermal Hierarchical DNA Construction, and for Escherichia coli and Saccharomyces cerevisiae transformation. The platform enables the automated control of cellular growth, gene expression induction, and proteogenic and metabolic output analysis. Taken together, we demonstrate the microfluidic platform’s potential to provide end-to-end solutions for synthetic biology research, from design to functional analysis.

  20. End-to-End Assessment of a Large Aperture Segmented Ultraviolet Optical Infrared (UVOIR) Telescope Architecture

    NASA Technical Reports Server (NTRS)

    Feinberg, Lee; Rioux, Norman; Bolcar, Matthew; Liu, Alice; Guyon, Oliver; Stark, Chris; Arenberg, Jon

    2016-01-01

    Key challenges of a future large-aperture, segmented Ultraviolet Optical Infrared (UVOIR) telescope capable of performing a spectroscopic survey of hundreds of exoplanets will be sufficient stability to achieve 10^-10 contrast measurements and sufficient throughput and sensitivity for high-yield Exo-Earth spectroscopic detection. Our team has collectively assessed an optimized end-to-end architecture including a high-throughput coronagraph capable of working with a segmented telescope, a cost-effective and heritage-based stable segmented telescope, a control architecture that minimizes the number of new technologies, and an Exo-Earth yield assessment to evaluate potential performance. These efforts are combined through integrated modeling, coronagraph evaluations, and Exo-Earth yield calculations to assess the potential performance of the selected architecture. In addition, we discuss the scalability of this architecture to larger apertures and the technological tall poles to enabling it.

  1. End-to-end system test for solid-state microdosemeters.

    PubMed

    Pisacane, V L; Dolecek, Q E; Malak, H; Dicello, J F

    2010-08-01

    The gold standard in microdosemeters has been the tissue equivalent proportional counter (TEPC) that utilises a gas cavity. An alternative is the solid-state microdosemeter that replaces the gas with a condensed phase (silicon) detector with microscopic sensitive volumes. Calibrations of gas and solid-state microdosemeters are generally carried out using radiation sources built into the detector that impose restrictions on their handling, transportation and licensing in accordance with the regulations from international, national and local nuclear regulatory bodies. Here a novel method is presented for carrying out a calibration and end-to-end system test of a microdosemeter using low-energy photons as the initiating energy source, thus obviating the need for a regulated ionising radiation source. This technique may be utilised to calibrate both a solid-state microdosemeter and, with modification, a TEPC with the higher average ionisation energy of a gas.
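
    The quantity such microdosemeters report can be made concrete with the standard microdosimetric definitions: lineal energy y is the energy imparted divided by the mean chord length of the sensitive volume, and for a convex volume the mean chord length is 4V/S (Cauchy's theorem). The numerical values below are illustrative, not from the paper:

    ```python
    import math

    def mean_chord_um(volume_um3, surface_um2):
        """Cauchy's theorem: the mean chord length of a convex body under
        uniform isotropic irradiation is 4V/S."""
        return 4.0 * volume_um3 / surface_um2

    def lineal_energy(energy_keV, chord_um):
        """Lineal energy y = energy imparted / mean chord length (keV/um)."""
        return energy_keV / chord_um

    # Illustrative spherical sensitive volume, 1 um diameter: l_bar = 2d/3
    d = 1.0
    l_bar = mean_chord_um(math.pi * d**3 / 6.0, math.pi * d**2)
    y = lineal_energy(1.0, l_bar)   # 1 keV deposited in this volume
    ```

    A calibration, whether with a built-in radioactive source or with the low-energy photons proposed here, ultimately ties the detector's pulse-height scale to this lineal energy axis.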

  2. The End-to-End Pipeline for HST Slitless Spectra PHLAG

    NASA Astrophysics Data System (ADS)

    Kümmel, M.; Albrecht, R.; Fosbury, R.; Freudling, W.; Haase, J.; Hook, R. N.; Kuntschner, H.; Micol, A.; Rosa, M. R.; Walsh, J. R.

    The Space Telescope-European Coordinating Facility (ST-ECF) is undertaking a joint project with the Canadian Astronomy Data Centre and the Space Telescope Science Institute to build a Hubble Legacy Archive (HLA) that contains science-ready, high-level data products to be used in the Virtual Observatory (VO). The ST-ECF will provide extracted slitless spectra to the HLA, and for this purpose has developed the Pipeline for Hubble Legacy Archive Grism data (PHLAG). PHLAG is an end-to-end pipeline that performs an unsupervised reduction of slitless data taken with the Advanced Camera for Surveys (ACS) or the Near Infrared Camera and Multi Object Spectrometer (NICMOS) and ingests the VO-compatible spectra into the HLA. PHLAG is a modular pipeline, and the various modules and their roles are discussed. In a pilot study, PHLAG is applied to NICMOS data taken with the G141 grism, and the first results of a run on all available data are shown.

  3. Establishing end-to-end security in a nationwide network for telecooperation.

    PubMed

    Staemmler, Martin; Walz, Michael; Weisser, Gerald; Engelmann, Uwe; Weininger, Robert; Ernstberger, Antonio; Sturm, Johannes

    2012-01-01

    Telecooperation is used to support care for trauma patients by facilitating a mutual exchange of treatment and image data in use-cases such as emergency consultation, second opinion, transfer, rehabilitation, and out-patient follow-up treatment. To comply with data protection legislation, a two-factor authentication using ownership and knowledge has been implemented to assure personalized access rights. End-to-end security is achieved by symmetric encryption in combination with external trusted services which provide the symmetric key solely at runtime. Telecooperation partners may be chosen at the departmental level, but only individuals of that department, as verified against the organizational assignments maintained by LDAP services, are granted access. The data protection officers of a federal state have approved the data protection measures. The telecooperation platform is in routine operation and is designed to serve up to 800 trauma centers in Germany, organized in more than 50 trauma networks.

  4. End-to-End QoS for Differentiated Services and ATM Internetworking

    NASA Technical Reports Server (NTRS)

    Su, Hongjun; Atiquzzaman, Mohammed

    2001-01-01

    The Internet was initially designed for non-real-time data communications and hence does not provide any Quality of Service (QoS). The next-generation Internet will be characterized by high speed and QoS guarantees. The aim of this paper is to develop a prioritized early packet discard (PEPD) scheme for ATM switches to provide service differentiation and QoS guarantees to end applications running over the next-generation Internet. The proposed PEPD scheme differs from previous schemes by taking into account the priority of packets generated from different applications. We develop a Markov chain model for the proposed scheme and verify the model with simulation. Numerical results show that the results from the model and computer simulation are in close agreement. Our PEPD scheme provides service differentiation to end-to-end applications.
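
    The admission decision in a prioritized early packet discard scheme can be sketched as follows; the threshold values and class names are assumptions for illustration, and the paper's Markov chain analysis is not reproduced here:

    ```python
    def pepd_admit(queue_len, capacity, priority, thresholds):
        """Prioritized early packet discard (sketch): admit the first cell
        of a new packet only while buffer occupancy is below that priority
        class's threshold, so lower-priority packets are discarded earlier
        under congestion.  Discarding whole packets early avoids carrying
        cells of packets that would be dropped anyway."""
        return queue_len < thresholds.get(priority, capacity)

    # Illustrative cutoffs for a 100-cell buffer
    thresholds = {"low": 60, "high": 90}
    ```

    With these example thresholds, a buffer 70 cells full still admits high-priority packets but turns low-priority ones away, which is the service differentiation the scheme provides.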

  5. Development of a Dynamic, End-to-End Free Piston Stirling Convertor Model

    NASA Technical Reports Server (NTRS)

    Regan, Timothy F.; Gerber, Scott S.; Roth, Mary Ellen

    2004-01-01

    A dynamic model for a free-piston Stirling convertor is being developed at the NASA Glenn Research Center. The model is an end-to-end system model that includes the cycle thermodynamics, the dynamics, and the electrical aspects of the system. The subsystems of interest are the heat source, the springs, the moving masses, the linear alternator, the controller, and the end-user load. The envisioned use of the model will be in evaluating how changes in a subsystem could affect the operation of the convertor. The model under development will speed the evaluation of improvements to a subsystem and aid in determining areas in which the most significant improvements may be found. One of the first uses of the end-to-end model will be in the development of controller architectures. Another related area is in evaluating changes to details in the linear alternator.

  6. Portable end-to-end ground system for low-cost mission support

    NASA Astrophysics Data System (ADS)

    Lam, Barbara

    1996-11-01

    This paper presents a revolutionary architecture of the end-to-end ground system to reduce overall mission support costs. The present ground system of the Jet Propulsion Laboratory (JPL) is costly to operate, maintain, deploy, reproduce, and document. In the present climate of shrinking NASA budgets, this proposed architecture takes on added importance as it should dramatically reduce all of the above costs. Currently, the ground support functions (i.e., receiver, tracking, ranging, telemetry, command, monitor and control) are distributed among several subsystems that are housed in individual rack-mounted chassis. These subsystems can be integrated into one portable laptop system using established Multi Chip Module (MCM) packaging technology and object-based software libraries. The large-scale integration of subsystems into a small portable system connected to the World Wide Web (WWW) will greatly reduce operations, maintenance, and reproduction costs. Several of the subsystems can be implemented using Commercial Off-The-Shelf (COTS) products, further decreasing non-recurring engineering costs. The inherent portability of the system will open up new ways of using the ground system at the "point-of-use" site as opposed to maintaining several large centralized stations. This eliminates the propagation delay of the data to the Principal Investigator (PI), enabling the capture of data in real time and the performance of multiple tasks concurrently from any location in the world. Sample applications are to use the portable ground system in remote areas or on mobile vessels for real-time correlation of satellite data with earth-bound instruments, thus allowing near real-time feedback and control of scientific instruments. This end-to-end portable ground system will undoubtedly create opportunities for better scientific observation and data acquisition.

  7. Integrating end-to-end threads of control into object-oriented analysis and design

    NASA Technical Reports Server (NTRS)

    Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.

    1993-01-01

    Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than looking at the individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.

  8. Kinetics of end-to-end collision in short single-stranded nucleic acids.

    PubMed

    Wang, Xiaojuan; Nau, Werner M

    2004-01-28

    A novel fluorescence-based method, which entails contact quenching of the long-lived fluorescent state of 2,3-diazabicyclo[2.2.2]-oct-2-ene (DBO), was employed to measure the kinetics of end-to-end collision in short single-stranded oligodeoxyribonucleotides of the type 5'-DBO-(X)n-dG with X = dA, dC, dT, or dU and n = 2 or 4. The fluorophore was covalently attached to the 5' end and dG was introduced as an efficient intrinsic quencher at the 3' terminus. The end-to-end collision rates, which can be directly related to the efficiency of intramolecular fluorescence quenching, ranged from 0.1 × 10^6 to 9.0 × 10^6 s^-1. They were strongly dependent on the strand length, the base sequence, and the temperature. Oligonucleotides containing dA in the backbone displayed much slower collision rates and significantly higher positive activation energies than strands composed of pyrimidine bases, suggesting a higher intrinsic rigidity of oligoadenylate. Comparison of the measured collision rates in short single-stranded oligodeoxyribonucleotides with the previously reported kinetics of hairpin formation indicates that the intramolecular collision is significantly faster than the nucleation step of hairpin closing. This is consistent with the configurational diffusion model suggested by Ansari et al. (Ansari, A.; Kuznetsov, S. V.; Shen, Y. Proc. Natl. Acad. Sci. USA 2001, 98, 7771-7776), in which the formation of misfolded loops is thought to slow hairpin formation. PMID:14733555
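
    The link between quenching efficiency and collision rate mentioned in the abstract follows from the standard lifetime relation for intramolecular quenching, k = 1/τ_q − 1/τ_0. The lifetime values below are illustrative, not measured data from the paper:

    ```python
    def collision_rate_per_s(tau_quenched_ns, tau_unquenched_ns):
        """End-to-end collision (quenching) rate from fluorescence lifetimes:
        k = 1/tau_q - 1/tau_0, with lifetimes in ns and the rate in s^-1.
        The long unquenched lifetime of DBO is what makes slow collisions
        (~10^6 s^-1) measurable at all."""
        return 1e9 / tau_quenched_ns - 1e9 / tau_unquenched_ns

    # Illustrative lifetimes: 100 ns with the dG quencher, 300 ns without
    k = collision_rate_per_s(100.0, 300.0)
    ```

    With these example numbers k is about 6.7 × 10^6 s^-1, inside the 0.1–9.0 × 10^6 s^-1 range the study reports.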

  10. Verifying end-to-end system performance with the transformational information extraction model

    NASA Astrophysics Data System (ADS)

    Mauck, Alisha; Roszyk, Greg

    2006-05-01

    In the intelligence community, the volume of imagery data threatens to overwhelm the traditional process of information extraction. Satellite systems are capable of producing large quantities of imagery data every day. Traditionally, intelligence analysts have the arduous task of manually reviewing satellite imagery data and generating information products. In a time of increasing imagery data, this manual approach is not consistent with the goal of a timely and highly responsive system. These realities are key factors in Booz Allen Hamilton's transformational approach to information extraction. This approach employs information services and value-added processes (VAPs) to reduce the amount of data being manually reviewed. Booz Allen has utilized a specialization/generalization hierarchy to aggregate hundreds of thousands of imagery intelligence needs into sixteen information services. Information services are automated by employing value-added processes, which extract the information from the imagery data and generate information products. While the intelligence needs and information services remain relatively static in time, the VAPs have the ability to evolve rapidly with advancing technologies. The Booz Allen Transformational Information Extraction Model validates this automated approach by simulating realistic system parameters. The functional flow model includes image formation, three information services, six VAPs, and reduced manual intervention. Adjustable model variables for VAP time, VAP confidence, number of intelligence analysts, and time for analyst review provide a flexible framework for modeling different system cases. End-to-end system metrics such as intelligence-need satisfaction, end-to-end timeliness, and sensitivity to the number of analysts and to VAP variables quantify the system performance.
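
    A toy version of the trade-off in such a model, automated VAP processing with fallback to manual review, might look like this; the formula and parameter values are assumptions for illustration, not Booz Allen's actual model:

    ```python
    def expected_timeliness_hr(vap_time_hr, vap_confidence, review_time_hr,
                               n_analysts, n_images):
        """Expected end-to-end time for a batch: every image runs through
        the automated value-added process (VAP); the fraction the VAP
        cannot resolve (1 - confidence) falls back to manual review,
        shared evenly across the available analysts.
        (Assumed formula, for illustration only.)"""
        manual_images = n_images * (1.0 - vap_confidence)
        return vap_time_hr + manual_images * review_time_hr / n_analysts

    t = expected_timeliness_hr(vap_time_hr=1.0, vap_confidence=0.9,
                               review_time_hr=2.0, n_analysts=5, n_images=100)
    ```

    Even this crude model shows the sensitivity the abstract measures: raising VAP confidence shrinks the manual queue, while adding analysts only divides it.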

  11. End-to-end test of spatial accuracy in Gamma Knife treatments for trigeminal neuralgia

    SciTech Connect

    Brezovich, Ivan A. Wu, Xingen; Duan, Jun; Popple, Richard A.; Shen, Sui; Benhabib, Sidi; Huang, Mi; Christian Dobelbower, M.; Fisher III, Winfield S.

    2014-11-01

    Purpose: Spatial accuracy is most crucial when small targets like the trigeminal nerve are treated. Although current quality assurance procedures typically verify that individual apparatus, like the MRI scanner, CT scanner, Gamma Knife, etc., are meeting specifications, the cumulative error of all equipment and procedures combined may exceed safe margins. This study uses an end-to-end approach to assess the overall targeting errors that may have occurred in individual patients previously treated for trigeminal neuralgia. Methods: The trigeminal nerve is simulated by a 3 mm long, 3.175 mm (1/8 in.) diameter MRI-contrast filled cavity embedded within a PMMA plastic capsule. The capsule is positioned within the head frame such that the location of the cavity matches the Gamma Knife coordinates of an arbitrarily chosen, previously treated patient. Gafchromic EBT2 film is placed at the center of the cavity in coronal and sagittal orientations. The films are marked with a pinprick to identify the cavity center. Treatments are planned for radiation delivery with 4 mm collimators according to MRI and CT scans using the clinical localizer boxes and acquisition protocols. Shots are planned so that the 50% isodose surface encompasses the cavity. Following irradiation, the films are scanned and analyzed. Targeting errors are defined as the distance between the pinprick, which represents the intended target, and the centroid of the 50% isodose line, which is the center of the radiation field that was actually delivered. Results: Averaged over ten patient simulations, targeting errors along the x, y, and z coordinates (patient’s left-to-right, posterior-to-anterior, and head-to-foot) were, respectively, −0.060 ± 0.363, −0.350 ± 0.253, and 0.348 ± 0.204 mm when MRI was used for treatment planning. Planning according to CT exhibited generally smaller errors, namely, 0.109 ± 0.167, −0.191 ± 0.144, and 0.211 ± 0.094 mm. The largest errors along individual axes in MRI
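
    The paper's definition of targeting error, the distance from the intended target to the centroid of the delivered 50% isodose region, can be sketched on a voxelized dose grid; this is a simplified stand-in for the film analysis, with an invented toy dose distribution:

    ```python
    import numpy as np

    def targeting_error_mm(dose, voxel_mm, target_idx):
        """Distance between the intended target (the pinprick position, in
        voxel indices) and the centroid of the region enclosed by the 50%
        isodose surface of the delivered dose."""
        mask = dose >= 0.5 * dose.max()
        centroid = np.argwhere(mask).mean(axis=0)
        return float(np.linalg.norm((centroid - np.asarray(target_idx)) * voxel_mm))

    # Toy dose grid: a uniform hot block whose centre is offset from the target
    dose = np.zeros((21, 21, 21))
    dose[11:14, 9:12, 9:12] = 1.0                      # centroid at (12, 10, 10)
    err = targeting_error_mm(dose, 0.5, (10, 10, 10))  # 2 voxels * 0.5 mm
    ```

    On the real films the same calculation runs in 2D per film plane, with the pinprick supplying `target_idx` and the scanned optical density supplying `dose`.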

  12. Internet end-to-end performance monitoring for the High Energy Nuclear and Particle Physics community

    SciTech Connect

    Matthews, W.

    2000-02-22

    Modern High Energy Nuclear and Particle Physics (HENP) experiments at laboratories around the world present a significant challenge to wide area networks. Petabytes (10^15 bytes) or exabytes (10^18 bytes) of data will be generated during the lifetime of the experiment. Much of this data will be distributed via the Internet to the experiment's collaborators at universities and institutes throughout the world for analysis. In order to assess the feasibility of the computing goals of these and future experiments, the HENP networking community is actively monitoring performance across a large part of the Internet used by its collaborators. Since 1995, the PingER project has been collecting data on ping packet loss and round trip times. In January 2000, there are 28 monitoring sites in 15 countries gathering data on over 2,000 end-to-end pairs. HENP labs such as SLAC, Fermilab and CERN are using Advanced Network's Surveyor project and monitoring performance from one-way delay of UDP packets. More recently several HENP sites have become involved with NLANR's active measurement program (AMP). In addition, SLAC and CERN are part of the RIPE test-traffic project and SLAC is home to a NIMI machine. The large end-to-end performance monitoring infrastructure allows the HENP networking community to chart long-term trends and closely examine short-term glitches across a wide range of networks and connections. The different methodologies provide opportunities to compare results based on different protocols and statistical samples. Understanding agreement and discrepancies between results provides particular insight into the nature of the network. This paper will highlight the practical side of monitoring by reviewing the special needs of High Energy Nuclear and Particle Physics experiments and provide an overview of the experience of measuring performance across a large number of interconnected networks throughout the world with various methodologies.
In particular, results from each project
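
    A PingER-style monitor ultimately reduces raw ping samples for each end-to-end pair to loss and round-trip-time statistics; a minimal sketch (treating `None` as a lost packet, with invented sample values):

    ```python
    def summarize_pings(rtts_ms):
        """Reduce a series of ping results to (loss fraction, median RTT).
        None marks a lost packet; the median is robust to the occasional
        extreme RTT that would skew a mean."""
        lost = sum(1 for r in rtts_ms if r is None)
        ok = sorted(r for r in rtts_ms if r is not None)
        loss = lost / len(rtts_ms)
        median = ok[len(ok) // 2] if ok else None
        return loss, median

    loss, med = summarize_pings([12.1, 19.8, 30.4, None])
    ```

    Collected at regular intervals over years, exactly such per-pair summaries are what make the long-term trend and short-term glitch analyses described above possible.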

  13. Towards end-to-end models for investigating the effects of climate and fishing in marine ecosystems

    NASA Astrophysics Data System (ADS)

    Travers, M.; Shin, Y.-J.; Jennings, S.; Cury, P.

    2007-12-01

    End-to-end models that represent ecosystem components from primary producers to top predators, linked through trophic interactions and affected by the abiotic environment, are expected to provide valuable tools for assessing the effects of climate change and fishing on ecosystem dynamics. Here, we review the main process-based approaches used for marine ecosystem modelling, focusing on the extent of the food web modelled, the forcing factors considered, the trophic processes represented, as well as the potential use and further development of the models. We consider models of a subset of the food web, models which represent the first attempts to couple low and high trophic levels, integrated models of the whole ecosystem, and size spectrum models. Comparisons within and among these groups of models highlight the preferential use of functional groups at low trophic levels and species at higher trophic levels and the different ways in which the models account for abiotic processes. The model comparisons also highlight the importance of choosing an appropriate spatial dimension for representing organism dynamics. Many of the reviewed models could be extended by adding components and by ensuring that the full life cycles of species components are represented, but end-to-end models should provide full coverage of ecosystem components, the integration of physical and biological processes at different scales and two-way interactions between ecosystem components. We suggest that this is best achieved by coupling models, but there are very few existing cases where the coupling supports true two-way interaction. The advantages of coupling models are that the extent of discretization and representation can be targeted to the part of the food web being considered, making their development time- and cost-effective. Processes such as predation can be coupled to allow the propagation of forcing factors effects up and down the food web. 
However, there needs to be a stronger focus

  14. End-To-End Simulation of Launch Vehicle Trajectories Including Stage Separation Dynamics

    NASA Technical Reports Server (NTRS)

    Albertson, Cindy W.; Tartabini, Paul V.; Pamadi, Bandu N.

    2012-01-01

    The development of methodologies, techniques, and tools for analysis and simulation of stage separation dynamics is critically needed for successful design and operation of multistage reusable launch vehicles. As a part of this activity, the Constraint Force Equation (CFE) methodology was developed and implemented in the Program to Optimize Simulated Trajectories II (POST2). The objective of this paper is to demonstrate the capability of POST2/CFE to simulate a complete end-to-end mission. The vehicle configuration selected was the Two-Stage-To-Orbit (TSTO) Langley Glide Back Booster (LGBB) bimese configuration, an in-house concept consisting of a reusable booster and an orbiter having identical outer mold lines. The proximity and isolated aerodynamic databases used for the simulation were assembled using wind-tunnel test data for this vehicle. POST2/CFE simulation results are presented for the entire mission, from lift-off, through stage separation, orbiter ascent to orbit, and booster glide back to the launch site. Additionally, POST2/CFE stage separation simulation results are compared with results from industry standard commercial software used for solving dynamics problems involving multiple bodies connected by joints.

  15. SPOKES: An end-to-end simulation facility for spectroscopic cosmological surveys

    NASA Astrophysics Data System (ADS)

    Nord, B.; Amara, A.; Réfrégier, A.; Gamper, La.; Gamper, Lu.; Hambrecht, B.; Chang, C.; Forero-Romero, J. E.; Serrano, S.; Cunha, C.; Coles, O.; Nicola, A.; Busha, M.; Bauer, A.; Saunders, W.; Jouvel, S.; Kirk, D.; Wechsler, R.

    2016-04-01

    The nature of dark matter, dark energy and large-scale gravity pose some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is the development of a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is based on an integrated infrastructure, modular function organization, coherent data handling and fast data access. These key features allow reproducibility of pipeline runs, enable ease of use and provide flexibility to update functions within the pipeline. The cyclic nature of the pipeline offers the possibility to make the science output an efficient measure for design optimization and feasibility testing. We present the architecture, first science results, and computational performance of the simulation pipeline. The framework is general, but for the benchmark tests, we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming project, the Dark Energy Spectroscopic Instrument (DESI). We discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements.

  16. Forming End-to-End Oligomers of Gold Nanorods Using Porphyrins and Phthalocyanines.

    PubMed

    Stewart, Alexander F; Gagnon, Brandon P; Walker, Gilbert C

    2015-06-23

    The illumination of aggregated metal nanospecies can create strong local electric fields that brighten Raman scattering. This study describes a procedure to self-assemble gold nanorods (NRs) through the use of porphyrin and phthalocyanine agents to create reproducibly stable and robust NR aggregates in the form of end-to-end oligomers. Narrow inter-rod gaps result, creating electric field "hot spots" between the NRs. The organic linker molecules themselves are potential Raman-based optical labels, and the result is significant numbers of Raman-active species located in the hot spots. NR polymerization was quenched by phospholipid encapsulation, which allowed control of the polydispersity of the aggregate solution to optimize the surface-enhanced Raman scattering (SERS) enhancement, and preserved the aqueous solubility of the aggregates. The increased presence of Raman-active species in the hot spots and the optimized solution polydispersity resulted in the observation of scattering enhancements by encapsulated porphyrins/phthalocyanines of up to 3500-fold over molecular chromophores lacking the NR oligomer host.

  17. Development of an End-to-End Model for Free-Space Optical Communications

    NASA Astrophysics Data System (ADS)

    Hemmati, H.

    2005-05-01

    Through funding by NASA's Exploration Systems Research and Technology (ESR&T) Program and the Advanced Space Technology Program (ASTP), a team including JPL, Boeing, NASA-Glenn, and the Georgia Institute of Technology will develop an end-to-end modeling tool for rapid architecture trade-offs of high-data-rate laser communications from lunar, martian, and outer planetary ranges. An objective of the modeling tool is to reduce the inefficient reliance on modeling of discrete subsystems or sequential development of multiple expensive and time-consuming hardware units, thereby saving significant cost and time. This dynamic, time-domain modeling tool will accept measured component and subsystem data inputs and generate "difficult to measure" characteristics required for the performance evaluation of different designs and architectural choices. The planned modeling tool will incorporate actual subsystem performance data to reduce the develop-build-evaluate-refine production cycle. The list of high-level objectives of the program includes (1) development of a bidirectional global link analysis backbone software encompassing all optical communication subsystem parameters; (2) development of a bidirectional global link simulation model encompassing all optical communication parameters; (3) interoperability of the link analysis tool with all relevant detailed subsystem design models; and (4) a model validated against known experimental data at the subsystem and system levels.

  18. Engineered salt-insensitive alpha-defensins with end-to-end circularized structures.

    PubMed

    Yu, Q; Lehrer, R I; Tam, J P

    2000-02-11

    We designed a retro-isomer and seven circularized "beta-tile" peptide analogs of a typical rabbit alpha-defensin, NP-1. The analogs retained defensin-like architecture after the characteristic end-to-end, Cys(3,31) (C I:C VI), alpha-defensin disulfide bond was replaced by a backbone peptide bond. The retro-isomer of NP-1 was as active as the parent compound, suggesting that overall topology and amphipathicity governed its antimicrobial activity. A beta-tile design with or without a single cross-bracing disulfide bond sufficed for antimicrobial activity, and some of the analogs retained activity against Escherichia coli and Salmonella typhimurium in NaCl concentrations that rendered NP-1 inactive. The new molecules had clustered positive charges resembling those in protegrins and tachyplesins, but were less cytotoxic. Such simplified alpha-defensin analogs minimize problems encountered during the oxidative folding of three-disulfide defensins. In addition, they are readily accessible to a novel thia zip cyclization procedure applicable to large unprotected peptide precursors of 31 amino acids in aqueous solutions. Collectively, these findings provide new and improved methodology to create salt-insensitive defensin-like peptides for application against bacterial diseases. PMID:10660548

  19. End-to-end differential contactless conductivity sensor for microchip capillary electrophoresis.

    PubMed

    Fercher, Georg; Haller, Anna; Smetana, Walter; Vellekoop, Michael J

    2010-04-15

    In this contribution, a novel measurement approach for miniaturized capillary electrophoresis (CE) devices is presented: End-to-end differential capacitively coupled contactless conductivity measurement. This measurement technique is applied to a miniaturized CE device fabricated in low-temperature cofired ceramics (LTCC) multilayer technology. The working principle is based on the placement of two distinct detector areas near both ends of the fluid inlet and outlet of the separation channel. Both output signals are subtracted from each other, and the resulting differential signal is amplified and measured. This measurement approach has several advantages over established, single-end detectors: The high baseline level resulting from parasitic stray capacitance and buffer conductivity is reduced, leading to better signal-to-noise ratio and hence higher measurement sensitivity. Furthermore, temperature and, thus, baseline drift effects are diminished owing to the differentiating nature of the system. By comparing the peak widths measured with both detectors, valuable information about zone dispersion effects arising during the separation is obtained. Additionally, the novel measurement scheme allows the determination of dispersion effects that occur at the time of sample injection. Optical means of dispersion evaluation are ineffective because of the opaque LTCC substrate. Electrophoretic separation experiments of inorganic ions show sensitivity enhancements by about a factor of 30-60 compared to the single-end measurement scheme. PMID:20337422
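The gain from subtracting the two detector outputs can be illustrated numerically: a large common-mode baseline (stray capacitance, buffer conductivity) and slow thermal drift cancel, while an analyte zone passing a single detector survives. A minimal sketch with invented signal values, not data from the paper:

```python
# Two detectors near the channel's inlet and outlet share a common-mode
# baseline and drift; only the inlet-side detector currently sees an
# analyte zone. All values are arbitrary illustration units.
baseline = 1000.0   # large offset from stray capacitance and buffer conductivity
drift = 5.0         # slow thermal drift, common to both detectors
peak = 2.0          # small conductivity change from a passing analyte zone

inlet_detector = baseline + drift + peak
outlet_detector = baseline + drift

# Subtracting the outputs removes baseline and drift entirely,
# leaving only the small analyte peak on a near-zero background.
differential = inlet_detector - outlet_detector
```

Because the baseline term never reaches the amplifier, the available gain (and hence sensitivity) can be much higher than in a single-end scheme, consistent with the 30-60x enhancement reported above.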

  20. Advanced end-to-end fiber optic sensing systems for demanding environments

    NASA Astrophysics Data System (ADS)

    Black, Richard J.; Moslehi, Behzad

    2010-09-01

    Optical fibers are small in diameter, light in weight, immune to electromagnetic interference, electrically passive, chemically inert, flexible, embeddable into different materials, and distributed-sensing enabling, and can be temperature and radiation tolerant. With appropriate processing and/or packaging, they can be very robust and well suited to demanding environments. In this paper, we review a range of complete end-to-end fiber optic sensor systems that IFOS has developed, comprising not only (1) packaged sensors and mechanisms for integration with demanding environments, but also (2) ruggedized sensor interrogators and (3) intelligent decision-aid algorithm software systems. We examine the following examples. (1) Fiber Bragg Grating (FBG) optical sensor systems supporting arrays of environmentally conditioned multiplexed FBG point sensors on single or multiple optical fibers: in conjunction with advanced signal processing, decision aid algorithms and reasoners, FBG-based structural health monitoring (SHM) systems are expected to play an increasing role in extending the life and reducing costs of new generations of aerospace systems. Further, FBG-based structural state sensing systems have the potential to considerably enhance the performance of dynamic structures interacting with their environment (including jet aircraft, unmanned aerial vehicles (UAVs), and medical or extravehicular space robots). (2) Raman-based distributed temperature sensing systems: the complete length of optical fiber acts as a very long distributed sensor which may be placed down an oil well or wrapped around a cryogenic tank.
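A typical FBG interrogation step converts a measured Bragg wavelength shift into strain. The gauge factor below is a commonly quoted textbook approximation for silica fiber, not a value from this paper, and temperature cross-sensitivity is ignored for simplicity:

```python
# Approximate FBG strain sensing: delta_lambda / lambda0 = k_eps * strain,
# with k_eps ~= 0.78 for silica fiber (textbook approximation; the paper
# does not specify its calibration constants).
K_EPSILON = 0.78

def strain_from_shift(lambda0_nm: float, delta_lambda_nm: float) -> float:
    """Strain (dimensionless) inferred from a Bragg wavelength shift,
    neglecting the temperature term of the full FBG response."""
    return (delta_lambda_nm / lambda0_nm) / K_EPSILON

# A 1550 nm grating shifted by 12 pm corresponds to roughly 10 microstrain:
eps = strain_from_shift(1550.0, 0.012)
```

In a real SHM system the temperature term would be separated out, e.g. with an unstrained reference grating, before applying this relation.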

  1. The stapled functional end-to-end anastomosis following colonic resection.

    PubMed

    Kyzer, S; Gordon, P H

    1992-09-01

    To determine the results of our experience with the use of staples for construction of anastomoses following colonic resection, a series of 223 anastomoses performed in 205 patients was reviewed. Indications for operation included malignancy, benign neoplasms, inflammatory bowel disease, and several miscellaneous entities. A functional end-to-end anastomosis using the standard GIA cartridge and the TA 55 instruments was performed. The operative mortality was 1.5% with none of the deaths related to the anastomosis. Intraoperative complications encountered included bleeding (21), leak (1), tissue fracture (1), instrument failure (4), and technical error (3). Early postoperative complications related or potentially related to the anastomosis included bleeding (5), pelvic abscess (1), fistula (1), peritonitis (2), ischemia of anastomosis (1). Late complications included five patients with small bowel obstruction, two of whom required operation. Anastomotic recurrences developed in 5.9% of patients. Our experience gained with stapling instruments has shown them to be a reliable method for performing anastomoses in the colon in a safe and expeditious manner. PMID:1402308

  2. Telemetry Ranging: Laboratory Validation Tests and End-to-End Performance

    NASA Astrophysics Data System (ADS)

    Hamkins, J.; Kinman, P.; Xie, H.; Vilnrotter, V.; Dolinar, S.; Adams, N.; Sanchez, E.; Millard, W.

    2016-08-01

    This article reports on a set of laboratory tests of telemetry ranging conducted at Development Test Facility 21 (DTF-21) in Monrovia, California. An uplink pseudorandom noise (PN) ranging signal was generated by DTF-21, acquired by the Frontier Radio designed and built at the Johns Hopkins University Applied Physics Laboratory, and downlink telemetry frames from the radio were recorded by an open-loop receiver. In four of the tests, the data indicate that telemetry ranging can resolve the two-way time delay to a standard deviation of 2.1-3.4 ns, corresponding to about 30 to 51 cm in (one-way) range accuracy, when 30 s averaging of timing estimates is used. Other tests performed worse because of unsatisfactory receiver sampling rate, quantizer resolution, dc bias, improper configuration, or other reasons. The article also presents an analysis of the expected end-to-end performance of the telemetry ranging system. In one case considered, the theoretically-predicted performance matches the test results, within 10 percent, which provides a reasonable validation that the expected performance was achieved by the test. The analysis also shows that in one typical ranging scenario, one-way range accuracy of 1 m can be achieved with telemetry ranging when the data rate is above 2 kbps.
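As a check on the figures above, a two-way timing standard deviation maps to one-way range uncertainty through half the speed of light. A minimal sketch using the 2.1-3.4 ns spread reported in the tests:

```python
# One-way range r = c * t / 2 for a two-way delay t, so the range
# standard deviation is sigma_r = c * sigma_t / 2.
C = 299_792_458.0  # speed of light in vacuum, m/s

def one_way_range_sigma(sigma_t_seconds: float) -> float:
    """One-way range std dev implied by a two-way timing std dev."""
    return C * sigma_t_seconds / 2.0

lo = one_way_range_sigma(2.1e-9)  # ~0.31 m
hi = one_way_range_sigma(3.4e-9)  # ~0.51 m
```

These reproduce the quoted "about 30 to 51 cm" one-way range accuracy.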

  3. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites

    PubMed Central

    Gutierrez-Heredia, Luis; Benzoni, Francesca; Murphy, Emma; Reynaud, Emmanuel G.

    2016-01-01

    Coral reefs host nearly 25% of all marine species and provide food sources for half a billion people worldwide, yet only a very small percentage have been surveyed. Advances in technology and processing, along with affordable underwater cameras and Internet availability, give us the possibility to provide tools and software to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also single colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determine percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies ranging in size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high-quality texture mapping and detailed topographic morphology were produced, and Surface Area and Volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony- and polyp-scale analysis. PMID:26901845

  4. A Workflow-based Intelligent Network Data Movement Advisor with End-to-end Performance Optimization

    SciTech Connect

    Zhu, Michelle M.; Wu, Chase Q.

    2013-11-07

    Next-generation eScience applications often generate large amounts of simulation, experimental, or observational data that must be shared and managed by collaborative organizations. Advanced networking technologies and services have been rapidly developed and deployed to facilitate such massive data transfer. However, these technologies and services have not been fully utilized, mainly because their use typically requires significant domain knowledge and in many cases application users are not even aware of their existence. By leveraging the functionalities of an existing Network-Aware Data Movement Advisor (NADMA) utility, we propose a new Workflow-based Intelligent Network Data Movement Advisor (WINDMA) with end-to-end performance optimization for this DOE-funded project. This WINDMA system integrates three major components: resource discovery, data movement, and status monitoring, and supports the sharing of common data movement workflows through account and database management. The system provides a web interface and interacts with existing data/space management and discovery services such as Storage Resource Management, transport methods such as GridFTP and GlobusOnline, and network resource provisioning brokers such as ION and OSCARS. We demonstrate the efficacy of the proposed transport-support workflow system in several use cases based on its implementation and deployment in DOE wide-area networks.

  5. End-to-end performance modeling of passive remote sensing systems

    SciTech Connect

    Smith, B.W.; Borel, C.C.; Clodius, W.B.; Theiler, J.; Laubscher, B.; Weber, P.G.

    1996-07-01

    The ultimate goal of end-to-end system modeling is to simulate all known physical effects which determine the content of the data, before flying an instrument system. In remote sensing, one begins with a scene, viewed either statistically or dynamically, computes the radiance in each spectral band, renders the scene, transfers it through representative atmospheres to create the radiance field at an aperture, and integrates over sensor pixels. We have simulated a comprehensive sequence of realistic instrument hardware elements and the transfer of simulated data to an analysis system. This analysis package is the same as that intended for use of data collections from the real system. By comparing the analyzed image to the original scene, the net effect of nonideal system components can be understood. Iteration yields the optimum values of system parameters to achieve performance targets. We have used simulation to develop and test improved multispectral algorithms for (1) the robust retrieval of water surface temperature, water vapor column, and other quantities; (2) the preservation of radiometric accuracy during atmospheric correction and pixel registration on the ground; and (3) exploitation of on-board multispectral measurements to assess the atmosphere between ground and aperture.

  6. End-to-end simulation of bunch merging for a muon collider

    SciTech Connect

    Bao, Yu; Stratakis, Diktys; Hanson, Gail G.; Palmer, Robert B.

    2015-05-03

    Muon accelerator beams are commonly produced indirectly through pion decay by interaction of a charged particle beam with a target. Efficient muon capture requires the muons to be first phase-rotated by rf cavities into a train of 21 bunches with much reduced energy spread. Since luminosity is proportional to the square of the number of muons per bunch, it is crucial for a Muon Collider to use relatively few bunches with many muons per bunch. In this paper we will describe a bunch merging scheme that should achieve this goal. We present for the first time a complete end-to-end simulation of a 6D bunch merger for a Muon Collider. The 21 bunches arising from the phase-rotator, after some initial cooling, are merged in longitudinal phase space into seven bunches, which then go through seven paths with different lengths and reach the final collecting "funnel" at the same time. The final single bunch has a transverse and a longitudinal emittance that matches well with the subsequent 6D rectilinear cooling scheme.
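The motivation for merging follows directly from the scaling stated above: luminosity is proportional to the square of the muons per bunch, so for a fixed total number of muons N split over n_b bunches, the summed single-bunch contribution scales as N^2/n_b. A small sketch with arbitrary normalization (illustration only, not the collider's actual luminosity formula):

```python
def relative_luminosity(total_muons: float, n_bunches: int) -> float:
    """Sum of per-bunch N_b**2 contributions, arbitrary units.
    Algebraically equal to total_muons**2 / n_bunches."""
    per_bunch = total_muons / n_bunches
    return n_bunches * per_bunch ** 2

N = 1.0e12
l21 = relative_luminosity(N, 21)  # the initial 21-bunch train
l7 = relative_luminosity(N, 7)    # after merging into seven bunches
l1 = relative_luminosity(N, 1)    # fully merged into a single bunch
# l1 / l21 == 21: each merge step trades bunch count for per-bunch charge.
```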

  7. End to End Digitisation and Analysis of Three-Dimensional Coral Models, from Communities to Corallites.

    PubMed

    Gutierrez-Heredia, Luis; Benzoni, Francesca; Murphy, Emma; Reynaud, Emmanuel G

    2016-01-01

    Coral reefs hosts nearly 25% of all marine species and provide food sources for half a billion people worldwide while only a very small percentage have been surveyed. Advances in technology and processing along with affordable underwater cameras and Internet availability gives us the possibility to provide tools and softwares to survey entire coral reefs. Holistic ecological analyses of corals require not only the community view (10s to 100s of meters), but also the single colony analysis as well as corallite identification. As corals are three-dimensional, classical approaches to determine percent cover and structural complexity across spatial scales are inefficient, time-consuming and limited to experts. Here we propose an end-to-end approach to estimate these parameters using low-cost equipment (GoPro, Canon) and freeware (123D Catch, Meshmixer and Netfabb), allowing every community to participate in surveys and monitoring of their coral ecosystem. We demonstrate our approach on 9 species of underwater colonies in ranging size and morphology. 3D models of underwater colonies, fresh samples and bleached skeletons with high quality texture mapping and detailed topographic morphology were produced, and Surface Area and Volume measurements (parameters widely used for ecological and coral health studies) were calculated and analysed. Moreover, we integrated collected sample models with micro-photogrammetry models of individual corallites to aid identification and colony and polyp scale analysis. PMID:26901845

  8. An end-to-end analysis of drought from smallholder farms in southwest Jamaica

    NASA Astrophysics Data System (ADS)

    Curtis, W. R. S., III; Gamble, D. W.; Popke, J.

    2015-12-01

    Drought can be defined in many ways: meteorological, hydrological, agricultural, and socio-economic. Another way to approach drought is from a "perception" perspective, where individuals whose livelihood is highly dependent on precipitation take adaptive actions. In this study we use two years of data collected from twelve smallholder farms in southern St. Elizabeth, Jamaica to undertake an end-to-end analysis of drought. At each farm, 6-hour temperature and soil moisture readings and tipping-bucket rainfall were recorded from June 2013 to June 2015, and twice monthly farmers indicated whether they were experiencing drought and whether they irrigated (hand-watering, drip irrigation, or pipe and sprinkler). In many cases half of the farmers considered themselves in a drought while the others did not, even though the largest separation among farms was about 20 km. This study will use analysis of variance to test the following hypotheses: drought perception is related to (a) absolute amounts of precipitation at the time, (b) other environmental cues at the time (soil moisture, temperature), or (c) relative amounts of precipitation as compared to the same time last year. Irrigation actions and water use following the perception of drought will also be examined.
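The planned analysis of variance can be sketched in a few lines: a one-way F-statistic compares between-group to within-group variance in a precipitation cue for farmers who perceived drought versus those who did not. A pure-Python sketch with invented rainfall values (not the study's data):

```python
def one_way_anova_f(*groups):
    """One-way ANOVA F-statistic for two or more groups of observations."""
    all_obs = [x for g in groups for x in g]
    grand_mean = sum(all_obs) / len(all_obs)
    # Between-group sum of squares: group means vs. the grand mean
    ssb = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: observations vs. their own group mean
    ssw = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_obs) - len(groups)
    return (ssb / df_between) / (ssw / df_within)

# Invented biweekly rainfall totals (mm) for the two perception groups:
perceived_drought = [12.0, 8.0, 15.0, 10.0]
no_drought = [30.0, 25.0, 28.0, 33.0]
f_stat = one_way_anova_f(perceived_drought, no_drought)
# A large F relative to the F-distribution's critical value would support
# hypothesis (a): perception tracks absolute precipitation amounts.
```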

  9. Optimization and automation of an end-to-end high throughput microscale transient protein production process.

    PubMed

    Bos, Aaron B; Luan, Peng; Duque, Joseph N; Reilly, Dorothea; Harms, Peter D; Wong, Athena W

    2015-09-01

    High throughput protein production from transient transfection of mammalian cells is used in multiple facets of research and development studies. Commonly used formats for these high number expressions are 12-, 24- and 96-well plates at various volumes. However there are no published examples of a 96-deep well plate microscale (1,000 μL) suspension process for mammalian transient expression. For this reason, we aimed to determine the optimal operating conditions for a high producing, microscale HEK293 transient system. We evaluated the hydrodynamic flow and measured the oxygen transfer rate (OTR) and transient protein expression for 96-deep well plates of different well geometries filled at 600-1,000 μL working volumes and agitated at various speeds and orbital diameters. Ultimately, a round well-round bottom (RR) 96-deep well plate with a working volume of 1,000 µL agitated at 1,000 RPM and a 3 mm orbital diameter yielded the highest and most consistent total transient protein production. As plate cultures are subject to evaporation, water loss from different plate seals was measured to identify an optimal plate sealing method. Finally, to enable higher capacity protein production, both expression and purification processes were automated. Functionality of this end-to-end automation workflow was demonstrated with the generation of high levels of human IgG1 antibodies (≥360 µg/mL) with reproducible productivity, product quality and ≥78% purification recovery.

  10. Availability and End-to-end Reliability in Low Duty Cycle Multihop Wireless Sensor Networks

    PubMed Central

    Suhonen, Jukka; Hämäläinen, Timo D.; Hännikäinen, Marko

    2009-01-01

    A wireless sensor network (WSN) is an ad-hoc technology that may consist of even thousands of nodes, which necessitates autonomic, self-organizing and multihop operations. A typical WSN node is battery powered, which makes network lifetime the primary concern. The highest energy efficiency is achieved with low duty cycle operation; however, this alone is not enough. WSNs are deployed for different uses, each requiring an acceptable Quality of Service (QoS). Due to the unique characteristics of WSNs, such as dynamic wireless multihop routing and resource constraints, the legacy QoS metrics are not feasible as such. We give a new definition to measure and implement QoS in low duty cycle WSNs, namely availability and reliability. Then, we analyze the effect of duty cycling on achieving availability and reliability. The results are obtained by simulations with ZigBee and the proprietary TUTWSN protocols. Based on the results, we also propose a data forwarding algorithm suitable for resource-constrained WSNs that guarantees end-to-end reliability while adding a small overhead that is relative to the packet error rate (PER). The forwarding algorithm guarantees reliability up to 30% PER. PMID:22574002
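The trade-off that motivates such a forwarding algorithm can be made concrete: without per-hop recovery, end-to-end delivery decays as (1-PER)^hops, while with unlimited per-hop retransmissions the expected overhead per hop is 1/(1-PER). These are standard textbook formulas, not TUTWSN's actual algorithm; the 30% PER figure is from the abstract:

```python
def e2e_success_no_recovery(per: float, hops: int) -> float:
    """End-to-end delivery probability with no per-hop retransmissions."""
    return (1.0 - per) ** hops

def expected_tx_per_hop(per: float) -> float:
    """Expected transmissions per hop with unlimited retransmissions
    (geometric distribution): overhead grows with the error rate."""
    return 1.0 / (1.0 - per)

# At the 30% PER bound over a 5-hop path, raw delivery collapses...
raw = e2e_success_no_recovery(0.30, 5)   # ~0.17
# ...yet the average per-hop retransmission overhead stays modest:
overhead = expected_tx_per_hop(0.30)     # ~1.43 transmissions per hop
```

This is why hop-by-hop recovery with overhead proportional to PER can still guarantee end-to-end reliability at high loss rates.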

  11. Telecommunications end-to-end systems monitoring on TOPEX/Poseidon: Tools and techniques

    NASA Technical Reports Server (NTRS)

    Calanche, Bruno J.

    1994-01-01

    The TOPEX/Poseidon Project Satellite Performance Analysis Team's (SPAT) roles and responsibilities have grown to include functions that are typically performed by other teams on JPL flight projects. In particular, SPAT Telecommunications' role has expanded beyond the nominal function of monitoring, assessing, characterizing, and trending the spacecraft (S/C) RF/Telecom subsystem to one of End-to-End Information Systems (EEIS) monitoring. This has been accomplished by taking advantage of the spacecraft and ground data system structures and protocols. By processing both the received spacecraft telemetry minor frame ground-generated CRC flags and the NASCOM block poly error flags, bit error rates (BER) for each link segment can be determined. This provides the capability to characterize the separate link segments, determine science data recovery, and perform fault/anomaly detection and isolation. By monitoring and managing the links, TOPEX has successfully recovered approximately 99.9 percent of the science data with an integrity (BER) of better than 1 x 10(exp -8). This paper presents the algorithms used to process the above flags and the techniques used for EEIS monitoring.
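The per-segment BER estimation described above can be sketched as follows: counts of flagged (CRC-failed) frames give a frame error rate, which maps to a bit error rate under an independent-bit-error assumption. The frame length and counts below are invented for illustration; the paper's actual flag-processing algorithms are not reproduced here:

```python
def ber_from_frame_errors(errored_frames: int, total_frames: int,
                          bits_per_frame: int) -> float:
    """Estimate BER from a frame error rate, assuming independent bit
    errors: FER = 1 - (1 - BER)**n  =>  BER = 1 - (1 - FER)**(1/n)."""
    fer = errored_frames / total_frames
    return 1.0 - (1.0 - fer) ** (1.0 / bits_per_frame)

# Invented example: 2 CRC-flagged frames in 10 million 8800-bit frames.
ber = ber_from_frame_errors(2, 10_000_000, 8800)
# ber lands around 2e-11, comfortably better than a 1e-8 requirement.
```

Applying the same estimator separately to the spacecraft-generated CRC flags and the NASCOM poly error flags is what allows the space link and ground link segments to be characterized independently.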

  12. Semantic Complex Event Processing over End-to-End Data Flows

    SciTech Connect

    Zhou, Qunzhi; Simmhan, Yogesh; Prasanna, Viktor K.

    2012-04-01

    Emerging Complex Event Processing (CEP) applications in cyber-physical systems like Smart Power Grids present novel challenges for end-to-end analysis over events flowing from heterogeneous information sources to persistent knowledge repositories. CEP for these applications must support two distinctive features - easy specification of patterns over diverse information streams, and integrated pattern detection over realtime and historical events. Existing work on CEP has been limited to relational query patterns, and engines that match events arriving after the query has been registered. We propose SCEPter, a semantic complex event processing framework which uniformly processes queries over continuous and archived events. SCEPter is built around an existing CEP engine with innovative support for semantic event pattern specification and allows their seamless detection over past, present and future events. Specifically, we describe a unified semantic query model that can operate over data flowing through event streams to event repositories. Compile-time and runtime semantic patterns are distinguished and addressed separately for efficiency. Query rewriting is examined and analyzed in the context of temporal boundaries that exist between event streams and their repository to avoid duplicate or missing results. The design and prototype implementation of SCEPter are analyzed using latency and throughput metrics for scenarios from the Smart Grid domain.

  13. SPoRT - An End-to-End R2O Activity

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.

    2009-01-01

    Established in 2002 to demonstrate the weather and forecasting application of real-time EOS measurements, the Short-term Prediction Research and Transition (SPoRT) program has grown to be an end-to-end research-to-operations activity focused on the use of advanced NASA modeling and data assimilation approaches, nowcasting techniques, and unique high-resolution multispectral observational data applications from EOS satellites to improve short-term weather forecasts on a regional and local scale. SPoRT currently partners with several universities and other government agencies for access to real-time data and products, and works collaboratively with them and operational end users at 13 WFOs to develop and test new products and capabilities in a "test-bed" mode. The test-bed simulates key aspects of the operational environment without putting constraints on the forecaster workload. Products and capabilities which show utility in the test-bed environment are then transitioned experimentally into the operational environment for further evaluation and assessment. SPoRT focuses on a suite of data and products from MODIS, AMSR-E, and AIRS on the NASA Terra and Aqua satellites, and total lightning measurements from ground-based networks. Some of the observations are assimilated into or used with various versions of the WRF model to provide supplemental forecast guidance to operational end users. SPoRT is enhancing partnerships with NOAA/NESDIS for new product development and data access to exploit the remote sensing capabilities of instruments on the NPOESS satellites to address short-term weather forecasting problems. The VIIRS and CrIS instruments on the NPP and follow-on NPOESS satellites provide observing capabilities similar to the MODIS and AIRS instruments on Terra and Aqua. SPoRT will be transitioning existing and new capabilities into the AWIPS II environment to ensure the continuity of its activities.

  14. SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.

    PubMed

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani

    2016-01-01

    Monitoring life-long diseases requires continuous measurements and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained devices resources. The integration of the architecture's components provides information about diseases' recurrences as soon as they occur to expedite taking necessary actions, and thus prevent severe consequences. Our architecture system is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture to be easy to deploy and integrate, and supported by Cloud infrastructure and services to allow high scalability, availability of processes and data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed

  15. End-to-End Performance of the Future MOMA Instrument Aboard the ExoMars Mission

    NASA Astrophysics Data System (ADS)

    Pinnick, V. T.; Buch, A.; Szopa, C.; Grand, N.; Danell, R.; Grubisic, A.; van Amerom, F. H. W.; Glavin, D. P.; Freissinet, C.; Coll, P. J.; Stalport, F.; Humeau, O.; Arevalo, R. D., Jr.; Brinckerhoff, W. B.; Steininger, H.; Goesmann, F.; Raulin, F.; Mahaffy, P. R.

    2015-12-01

    Following the SAM experiment aboard the Curiosity rover, the Mars Organic Molecule Analyzer (MOMA) experiment aboard the 2018 ExoMars mission will continue the search for organic matter on the Martian surface. One advancement of the ExoMars mission is that samples will be extracted from as deep as 2 meters below the Martian surface to minimize the effects of radiation and oxidation on organic materials. To analyze the wide range of organic composition (volatile and non-volatile compounds) of the Martian soil, MOMA is equipped with a dual-ion-source ion trap mass spectrometer utilizing UV laser desorption/ionization (LDI) and pyrolysis gas chromatography (pyr-GC). In order to analyze refractory organic compounds and chiral molecules during GC-ITMS analysis, samples may be subjected to a derivatization process, consisting of the reaction of the sample components with specific reactants (MTBSTFA [1], DMF-DMA [2] or TMAH [3]). Previous experimental reports have focused on coupling campaigns between the breadboard versions of the GC, provided by the French team (LISA, LATMOS, CentraleSupelec), and the MS, provided by the US team (NASA-GSFC). This work focuses on the performance verification and optimization of the GC-ITMS experiment using the Engineering Test Unit (ETU) models, which are representative of the form, fit and function of the flight instrument, including a flight-like pyrolysis oven and tapping station provided by the German team (MPS). The results obtained demonstrate the current status of the end-to-end performance of the gas chromatography-mass spectrometry mode of operation. References: [1] Buch, A. et al. (2009) J Chrom. A, 43, 143-151. [2] Freissinet et al. (2011) J Chrom A, 1306, 59-71. [3] Geffroy-Rodier, C. et al. (2009) JAAP, 85, 454-459.

  16. In vivo laser assisted end-to-end anastomosis with ICG-infused chitosan patches

    NASA Astrophysics Data System (ADS)

    Rossi, Francesca; Matteini, Paolo; Esposito, Giuseppe; Scerrati, Alba; Albanese, Alessio; Puca, Alfredo; Maira, Giulio; Rossi, Giacomo; Pini, Roberto

    2011-07-01

    Laser-assisted vascular repair is a new optimized technique based on the use of an ICG-infused chitosan patch to close a vessel wound, with or even without a few supporting single stitches. We present an in vivo experimental study of an innovative end-to-end laser-assisted vascular anastomotic (LAVA) technique, performed with the application of ICG-infused chitosan patches. The photostability and mechanical properties of ICG-infused chitosan films were preliminarily measured. The in vivo study was performed in 10 New Zealand rabbits. After anesthesia, a 3-cm segment of the right common carotid artery was exposed and clamped proximally and distally. The artery was then interrupted by means of a full-thickness cut. Three single microsutures were used to approximate the two vessel edges. The ICG-infused chitosan patch was rolled all over the anastomotic site and welded by the use of a diode laser emitting at 810 nm and equipped with a 300 μm diameter optical fiber. Welding was obtained by delivering single laser spots to induce local patch/tissue adhesion. The result was an immediate closure of the anastomosis, with no bleeding at clamp release. The animals then underwent different follow-up periods in order to evaluate the welded vessels over time. At follow-up examinations, all the anastomoses were patent and no bleeding signs were documented. Samples of welded vessels underwent histological examination. Results showed that this technique offers several advantages over conventional suturing methods: simplification of the surgical procedure, shortening of the operative time, better re-endothelialization and an optimal vascular healing process.

  17. SME2EM: Smart mobile end-to-end monitoring architecture for life-long diseases.

    PubMed

    Serhani, Mohamed Adel; Menshawy, Mohamed El; Benharref, Abdelghani

    2016-01-01

    Monitoring life-long diseases requires continuous measurement and recording of physical vital signs. Most of these diseases are manifested through unexpected and non-uniform occurrences and behaviors. It is impractical to keep patients in hospitals, health-care institutions, or even at home for long periods of time. Monitoring solutions based on smartphones combined with mobile sensors and wireless communication technologies are a potential candidate to support complete mobility-freedom, not only for patients, but also for physicians. However, existing monitoring architectures based on smartphones and modern communication technologies are not suitable to address some challenging issues, such as intensive and big data, resource constraints, data integration, and context awareness, in an integrated framework. This manuscript provides a novel mobile-based end-to-end architecture for live monitoring and visualization of life-long diseases. The proposed architecture provides smartness features to cope with continuous monitoring, data explosion, dynamic adaptation, unlimited mobility, and constrained device resources. The integration of the architecture's components provides information about diseases' recurrences as soon as they occur to expedite taking necessary actions, and thus prevent severe consequences. Our architecture is formally model-checked to automatically verify its correctness against designers' desirable properties at design time. Its components are fully implemented as Web services with respect to the SOA architecture so as to be easy to deploy and integrate, and are supported by Cloud infrastructure and services to allow high scalability and availability of the processes and data being stored and exchanged. The architecture's applicability is evaluated through concrete experimental scenarios on monitoring and visualizing states of epileptic diseases. The obtained theoretical and experimental results are very promising and efficiently satisfy the proposed

  18. SU-E-T-150: End-to-End Tests on the First Clinical EDGE™

    SciTech Connect

    Scheib, S; Schmelzer, P; Vieira, S; Greco, C

    2014-06-01

    Purpose: To quantify the sub-millimeter overall accuracy of EDGE™, the dedicated linac-based SRS/SABR treatment platform from Varian, using a novel End-to-End (E2E) test phantom. Methods: The new E2E test phantom developed by Varian consists of a cube with an outer dimension of 15×15×15 cm³. The phantom is equipped with an exchangeable inner cube (7×7×7 cm³) to hold radiochromic films or a tungsten ball (diameter = 5 mm) for Winston-Lutz tests. 16 ceramic balls (diameter = 5 mm) are embedded in the outer cube. Three embedded Calypso transponders allow for Calypso-based monitoring. The outer surface of the phantom is tracked using the Optical Surface Monitoring System (OSMS). The phantom is positioned using kV, MV and CBCT images. A simCT of the phantom was acquired and SRS/SABR plans were treated using the new phantom on the first clinically installed EDGE™. As a first step, a series of EPID-based Winston-Lutz tests was performed. As a second step, the calculated dose distribution applied to the phantom was verified with radiochromic films in orthogonal planes. The measured dose distribution is compared with the calculated (Eclipse) one based on the known isocenter of both dose distributions. The geometrical shift needed to match both dose distributions is the overall accuracy and is determined using dose profiles, isodose lines or gamma pass rates (3%, 1 mm). Results: Winston-Lutz tests using the central tungsten ball demonstrated a targeting accuracy of 0.44±0.18 mm for jaw-defined (2 cm × 2 cm), 0.39±0.19 mm for MLC-defined (2 cm × 2 cm) and 0.37±0.15 mm for cone-defined (12.5 mm) fields. A treated patient plan (spinal metastasis lesion with integrated boost) showed a dosimetric dose localization accuracy of 0.6 mm. Conclusion: Geometric and dosimetric E2E tests on EDGE™ show sub-millimeter E2E targeting and dose localization accuracy.
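The (3%, 1 mm) gamma pass-rate criterion used above can be illustrated with a toy calculation. The sketch below is not the authors' analysis code; the profile data, grid spacing, and 1-D simplification are assumptions for illustration of the global gamma index:

```python
import numpy as np

def gamma_1d(ref, eval_, spacing_mm, dose_tol=0.03, dta_mm=1.0):
    """Simplistic 1-D global gamma index (3%/1 mm by default).

    ref, eval_ : dose profiles sampled on the same grid.
    spacing_mm : grid spacing in millimeters.
    Returns the gamma value at each reference point.
    """
    ref = np.asarray(ref, float)
    eval_ = np.asarray(eval_, float)
    positions = np.arange(len(ref)) * spacing_mm
    dose_norm = dose_tol * ref.max()          # global dose normalization
    gamma = np.empty_like(ref)
    for i, (r, x) in enumerate(zip(ref, positions)):
        dd = (eval_ - r) / dose_norm          # dose-difference term
        dta = (positions - x) / dta_mm        # distance-to-agreement term
        gamma[i] = np.sqrt(dd**2 + dta**2).min()
    return gamma

# Pass rate = fraction of points with gamma <= 1 (toy Gaussian profiles).
ref = np.exp(-np.linspace(-3, 3, 61)**2)
shifted = np.exp(-(np.linspace(-3, 3, 61) - 0.05)**2)
g = gamma_1d(ref, shifted, spacing_mm=0.5)
pass_rate = np.mean(g <= 1.0)
```

A real film comparison would use the measured and Eclipse-calculated 2-D dose planes in place of the toy profiles.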

  19. An End-to-End System to Enable Quick, Easy and Inexpensive Deployment of Hydrometeorological Stations

    NASA Astrophysics Data System (ADS)

    Celicourt, P.; Piasecki, M.

    2014-12-01

    The high cost of hydro-meteorological data acquisition, communication and publication systems, along with limited qualified human resources, is considered the main reason why hydro-meteorological data collection remains a challenge, especially in developing countries. Despite significant advances in sensor network technologies, which in the last two decades gave birth to open hardware and software and to low-cost (less than $50), low-power (on the order of a few milliwatts) sensor platforms, sensor and sensor network deployment remains a labor-intensive, time-consuming, cumbersome, and thus expensive task. These factors give rise to the need for an affordable, simple-to-deploy, scalable and self-organizing end-to-end (from sensor to publication) system suitable for deployment in such countries. The design of the envisioned system consists of a few Sensed-And-Programmed Arduino-based sensor nodes with low-cost sensors measuring parameters relevant to hydrological processes, and a Raspberry Pi micro-computer hosting the in-the-field back-end data management. The latter comprises the Python/Django model of the CUAHSI Observations Data Model (ODM), namely DjangODM, backed by a PostgreSQL database server. We are also developing a Python-based data processing script which will be paired with the data autoloading capability of Django to populate the DjangODM database with the incoming data. To publish the data, we will use WOFpy (WaterOneFlow Web Services in Python), developed by the Texas Water Development Board for 'Water Data for Texas', which can produce WaterML web services from a variety of back-end database installations such as SQLite, MySQL, and PostgreSQL. A step further would be the development of an appealing online visualization tool using Python statistics and analytics tools (Scipy, Numpy, Pandas) showing the spatial distribution of variables across an entire watershed as a time-variant layer on top of a basemap.
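The in-the-field data store described above (an ODM-style database populated by an autoloading script) can be sketched with an in-memory database. The table layout, site code, and variable code below are hypothetical illustrations, not the actual DjangODM schema:

```python
import sqlite3

# Minimal sketch of an ODM-like "observations" table populated by an
# autoloading routine. Names are invented for illustration only.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE observations (
    site_code TEXT, variable_code TEXT,
    utc_time TEXT, value REAL)""")

def autoload(rows):
    """Bulk-insert incoming (site, variable, time, value) tuples."""
    con.executemany("INSERT INTO observations VALUES (?,?,?,?)", rows)
    con.commit()

autoload([("HAITI01", "RainfallTot", "2014-10-01T00:05Z", 0.2)])
```

In the actual system the inserts would go through the Django ORM into PostgreSQL, with WOFpy serving the stored observations as WaterML.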

  20. IDENTIFYING ELUSIVE ELECTROMAGNETIC COUNTERPARTS TO GRAVITATIONAL WAVE MERGERS: AN END-TO-END SIMULATION

    SciTech Connect

    Nissanke, Samaya; Georgieva, Alexandra; Kasliwal, Mansi

    2013-04-20

    Combined gravitational wave (GW) and electromagnetic (EM) observations of compact binary mergers should enable detailed studies of astrophysical processes in the strong-field gravity regime. This decade, ground-based GW interferometers promise to routinely detect compact binary mergers. Unfortunately, networks of GW interferometers have poor angular resolution on the sky and their EM signatures are predicted to be faint. Therefore, a challenging goal will be to unambiguously pinpoint the EM counterparts of GW mergers. We perform the first comprehensive end-to-end simulation that focuses on: (1) GW sky localization, distance measures, and volume errors with two compact binary populations and four different GW networks; (2) subsequent EM detectability by a slew of multiwavelength telescopes; and (3) final identification of the merger counterpart amidst a sea of possible astrophysical false positives. First, we find that double neutron star binary mergers can be detected out to a maximum distance of 400 Mpc (or 750 Mpc) by three (or five) detector GW networks, respectively. Neutron-star-black-hole binary mergers can be detected a factor of 1.5 further out; their median to maximum sky localizations are 50-170 deg² (or 6-65 deg²) for a three (or five) detector GW network. Second, by optimizing depth, cadence, and sky area, we quantify relative fractions of optical counterparts that are detectable by a suite of different aperture-size telescopes across the globe. Third, we present five case studies to illustrate the diversity of scenarios in secure identification of the EM counterpart. We discuss the case of a typical binary, neither beamed nor nearby, and the challenges associated with identifying an EM counterpart at both low and high Galactic latitudes. For the first time, we demonstrate how construction of low-latency GW volumes in conjunction with local universe galaxy catalogs can help solve the problem of false positives. We conclude with strategies

  1. Astra: Interdisciplinary study on enhancement of the end-to-end accuracy for spacecraft tracking techniques

    NASA Astrophysics Data System (ADS)

    Iess, Luciano; Di Benedetto, Mauro; James, Nick; Mercolino, Mattia; Simone, Lorenzo; Tortora, Paolo

    2014-02-01

    Navigation of deep-space probes is accomplished through a variety of different radio observables, namely Doppler, ranging and Delta-Differential One-Way Ranging (Delta-DOR). The particular mix of observations used for navigation mainly depends on the available on-board radio system, the mission phase and orbit determination requirements. The accuracy of current ESA and NASA tracking systems is at the level of 0.1 mm/s at 60 s integration time for Doppler, 1-5 m for ranging and 6-15 nrad for Delta-DOR measurements in a wide range of operational conditions. The ASTRA study, funded under ESA's General Studies Programme (GSP), addresses ways to improve the end-to-end accuracy of Doppler, ranging and Delta-DOR systems by roughly a factor of 10. The target accuracies were set to 0.01 mm/s at 60 s integration time for Doppler, 20 cm for ranging and 1 nrad for Delta-DOR. The companies and universities that took part in the study were the University of Rome Sapienza, ALMASpace, BAE Systems and Thales Alenia Space Italy. The analysis of an extensive data set of radio-metric observables and dedicated tests of the ground station allowed consolidating the error budget for each measurement technique. The radio-metric data set comprises X/X, X/Ka and Ka/Ka range and Doppler observables from the Cassini and Rosetta missions. It also includes measurements from the Advanced Media Calibration System (AMCS) developed by JPL for the radio science experiments of the Cassini mission. The error budget for the three radio-metric observables was consolidated by comparing the statistical properties of the data set with the expected error models. The analysis confirmed the contribution from some error sources, but also revealed some discrepancies and ultimately led to improved error models. The error budget reassessment provides adequate information for building guidelines and strategies to effectively improve the navigation accuracies of future deep space missions. We report both on updated

  2. Achieving End-to-End QoS in the Next Generation Internet: Integrated Services Over Differentiated Service Networks

    NASA Technical Reports Server (NTRS)

    Bai, Haowei; Atiquzzaman, Mohammed; Ivancic, William

    2001-01-01

    Currently there are two approaches to provide Quality of Service (QoS) in the next generation Internet: An early one is the Integrated Services (IntServ) with the goal of allowing end-to-end QoS to be provided to applications; the other one is the Differentiated Services (DiffServ) architecture providing QoS in the backbone. In this context, a DiffServ network may be viewed as a network element in the total end-to-end path. The objective of this paper is to investigate the possibility of providing end-to-end QoS when IntServ runs over DiffServ backbone in the next generation Internet. Our results show that the QoS requirements of IntServ applications can be successfully achieved when IntServ traffic is mapped to the DiffServ domain in next generation Internet.
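Mapping IntServ traffic into a DiffServ domain, as investigated above, amounts to choosing a per-hop behavior for each IntServ service class at the domain edge. The mapping below is an illustrative assumption (the paper does not prescribe these exact choices); the DSCP code points themselves come from RFC 3246 (EF) and RFC 2597 (AF):

```python
# DSCP values for some DiffServ per-hop behaviors (RFC 3246 / RFC 2597).
DSCP = {
    "EF": 0b101110,    # Expedited Forwarding: low delay, low jitter
    "AF11": 0b001010,  # Assured Forwarding class 1, low drop precedence
    "BE": 0b000000,    # default best effort
}

def map_intserv_to_dscp(service: str) -> int:
    """Sketch of an edge-router mapping from IntServ service type to DSCP.

    Guaranteed Service needs hard delay bounds -> EF; Controlled-Load
    tolerates light congestion -> AF; everything else is best effort.
    """
    if service == "guaranteed":
        return DSCP["EF"]
    if service == "controlled-load":
        return DSCP["AF11"]
    return DSCP["BE"]
```

An edge router would then write this value into the DS field of each packet belonging to the corresponding IntServ flow.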

  4. End-to-End Self-Assembly of Semiconductor Nanorods in Water by Using an Amphiphilic Surface Design.

    PubMed

    Taniguchi, Yuki; Takishita, Takao; Kawai, Tsuyoshi; Nakashima, Takuya

    2016-02-01

    One-dimensional (1D) self-assemblies of nanocrystals are of interest because of their vectorial and polymer-like dynamic properties. Herein, we report a simple method to prepare elongated assemblies of semiconductor nanorods (NRs) through end-to-end self-assembly. Short-chained water-soluble thiols were employed as surface ligands for CdSe NRs having a wurtzite crystal structure. The site-specific capping of NRs with these ligands rendered the surface of the NRs amphiphilic. The amphiphilic CdSe NRs self-assembled to form elongated wires by end-to-end attachment driven by the hydrophobic effect operating between uncapped NR ends. The end-to-end assembly technique was further applied to CdS NRs and CdSe tetrapods (TPs) with a wurtzite structure. PMID:26836341

  5. Common Patterns with End-to-end Interoperability for Data Access

    NASA Astrophysics Data System (ADS)

    Gallagher, J.; Potter, N.; Jones, M. B.

    2010-12-01

    At first glance, using common storage formats and open standards should be enough to ensure interoperability between data servers and client applications, but that is often not the case. In the REAP (Realtime Environment for Analytical Processing; NSF #0619060) project we integrated access to data from OPeNDAP servers into the Kepler workflow system and found that, as in previous cases, we spent the bulk of our effort addressing the twin issues of data model compatibility and integration strategies. Implementing seamless data access between a remote data source and a client application (data sink) can be broken down into two kinds of issues. First, the solution must address any differences in the data models used by the data source (OPeNDAP) and the data sink (the Kepler workflow system). If these models match completely, there is little work to be done. However, that is rarely the case. To map OPeNDAP's data model to Kepler's, we used two techniques (ignoring trivial conversions): On-the-fly type mapping and out-of-band communication. Type conversion takes place both for data and metadata because Kepler requires a priori knowledge of some aspects (e.g., syntactic metadata) of the data to build a workflow. In addition, OPeNDAP's constraint expression syntax was used to send out-of-band information to restrict the data requested from the server, facilitating changes in the returned data's type. This technique provides a way for users to exert fine-grained control over the data request, a potentially useful technique, at the cost of requiring that users understand a little about the data source's processing capabilities. The second set of issues for end-to-end data access are integration strategies. OPeNDAP provides several different tools for bringing data into an application: C++, C and Java libraries that provide functions for newly written software; The netCDF library which enables existing applications to read from servers using an older interface; and simple
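The constraint-expression mechanism described above works by appending a projection (and optional selection) to the dataset URL after a `?`. A minimal sketch of building such a request, with a hypothetical server URL and variable name:

```python
# Sketch of OPeNDAP (DAP2) constraint-expression syntax. The dataset
# URL and variable name here are hypothetical examples.
base = "http://example.org/opendap/sst.nc"

def constrained_url(base, var, lat_range, lon_range):
    """Request a hyperslab of `var` using array-index subsetting."""
    (la0, la1), (lo0, lo1) = lat_range, lon_range
    ce = f"{var}[{la0}:{la1}][{lo0}:{lo1}]"
    return f"{base}.dods?{ce}"

url = constrained_url(base, "sea_surface_temperature", (10, 20), (100, 120))
```

Because the server applies the constraint before responding, a client such as Kepler can reshape or reduce the data it receives without any server-side customization.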

  6. Unidata's Vision for Providing Comprehensive and End-to-end Data Services

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.

    2009-05-01

    This paper presents Unidata's vision for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are or how they are connected to the Internet — will be able to find and access a plethora of geosciences data and use Unidata-provided tools and services both productively and creatively in their research and education. What that vision means for the Unidata community is elucidated by drawing a simple analogy. Most users are familiar with the Amazon and eBay e-commerce sites and content sharing sites like YouTube and Flickr. On the eBay marketplace, people can sell practically anything at any time and buyers can share their experience of purchasing a product or the reputation of a seller. Likewise, at Amazon, thousands of merchants sell their goods and millions of customers not only buy those goods, but provide a review or opinion of the products they buy and share their experiences as purchasers. Similarly, YouTube and Flickr are sites tailored to video- and photo-sharing, respectively, where users can upload their own content and share it with millions of other users, including family and friends. What all these sites, together with social-networking applications like MySpace and Facebook, have enabled is a sense of a virtual community in which users can search and browse products or content, comment on and rate those products from anywhere, at any time, and via any Internet-enabled device like an iPhone, laptop, or desktop computer. In essence, these enterprises have fundamentally altered people's buying modes and behavior toward purchases. Unidata believes that similar approaches, appropriately tailored to meet the needs of the scientific

  7. A vision for end-to-end data services to foster international partnerships through data sharing

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M.; Yoksas, T.

    2009-04-01

    Increasingly, the conduct of science requires scientific partnerships and sharing of knowledge, information, and other assets. This is particularly true in our field where the highly-coupled Earth system and its many linkages have heightened the importance of collaborations across geographic, disciplinary, and organizational boundaries. The climate system, for example, is far too complex a puzzle to be unraveled by individual investigators or nations. As articulated in the NSF Strategic Plan: FY 2006-2011, "…discovery increasingly requires expertise of individuals from different disciplines, with diverse perspectives, and often from different nations, working together to accommodate the extraordinary complexity of today's science and engineering challenges." The Nobel Prize winning IPCC assessments are a prime example of such an effort. Earth science education is also uniquely suited to drawing connections between the dynamic Earth system and societal issues. Events like the 2004 Indian Ocean tsunami and Hurricane Katrina provide ample evidence of this relevance, as they underscore the importance of timely and interdisciplinary integration and synthesis of data. Our success in addressing such complex problems and advancing geosciences depends on the availability of a state-of-the-art and robust cyberinfrastructure, transparent and timely access to high-quality data from diverse sources, and requisite tools to integrate and use the data effectively, toward creating new knowledge. To that end, Unidata's vision calls for providing comprehensive, well-integrated, and end-to-end data services for the geosciences. These include an array of functions for collecting, finding, and accessing data; data management tools for generating, cataloging, and exchanging metadata; and submitting or publishing, sharing, analyzing, visualizing, and integrating data. When this vision is realized, users — no matter where they are, how they are connected to the Internet, or what

  8. SensorKit: An End-to-End Solution for Environmental Sensor Networking

    NASA Astrophysics Data System (ADS)

    Silva, F.; Graham, E.; Deschon, A.; Lam, Y.; Goldman, J.; Wroclawski, J.; Kaiser, W.; Benzel, T.

    2008-12-01

    Modern day sensor network technology has shown great promise to transform environmental data collection. However, despite the promise, these systems have remained the purview of the engineers and computer scientists who design them rather than a useful tool for the environmental scientists who need them. SensorKit is conceived of as a way to make wireless sensor networks accessible to The People: it is an advanced, powerful tool for sensor data collection that does not require advanced technological know-how. We are aiming to make wireless sensor networks for environmental science as simple as setting up a standard home computer network by providing simple, tested configurations of commercially-available hardware, free and easy-to-use software, and step-by-step tutorials. We designed and built SensorKit using a simplicity-through-sophistication approach, supplying users a powerful sensor-to-database end-to-end system with a simple and intuitive user interface. Our objective in building SensorKit was to make the prospect of using environmental sensor networks as simple as possible. We built SensorKit from off-the-shelf hardware components, using the Compact RIO platform from National Instruments for data acquisition due to its modular architecture and flexibility to support a large number of sensor types. In SensorKit, we support various types of analog, digital and networked sensors. Our modular software architecture allows us to abstract sensor details and provide users a common way to acquire data and to command different types of sensors. SensorKit is built on top of the Sensor Processing and Acquisition Network (SPAN), a modular framework for acquiring data in the field, moving it reliably to the scientist's institution, and storing it in an easily-accessible database. SPAN allows real-time access to the data in the field by providing various options for long haul communication, such as cellular and satellite links. Our system also features reliable data storage

  9. On the importance of risk knowledge for an end-to-end tsunami early warning system

    NASA Astrophysics Data System (ADS)

    Post, Joachim; Strunz, Günter; Riedlinger, Torsten; Mück, Matthias; Wegscheider, Stephanie; Zosseder, Kai; Steinmetz, Tilmann; Gebert, Niklas; Anwar, Herryal

    2010-05-01

    context has been worked out. The generated results contribute significantly in the fields of (1) warning decision and warning levels, (2) warning dissemination and warning message content, (3) early warning chain planning, (4) increasing response capabilities and protective systems, (5) emergency relief and (6) enhancing communities' awareness and preparedness towards tsunami threats. Additionally, examples will be given of the potential operational use of risk information in early warning systems, as first experiences exist for the tsunami early warning center in Jakarta, Indonesia. Besides this, the importance of linking national-level early warning information with tsunami risk information available at the local level (e.g. linking warning message information on expected intensity with respective tsunami hazard zone maps at community level for effective evacuation) will be demonstrated through experiences gained in three pilot areas in Indonesia. The presentation seeks to provide new insights on the benefits of using risk information in early warning and will provide further evidence that practical use of risk information is an important and indispensable component of end-to-end early warning.

  10. End-to-end delay reduction in narrow bandwidth real-time multimedia tele-education applications

    NASA Astrophysics Data System (ADS)

    Algra, Theo

    1993-02-01

    The transmission of multimedia page sequences in real-time tele-education presentations via narrowband links results in unacceptable end-to-end delays. A time-shifting method, referred to as pretransfer, which transfers presentation data in the background during the session without user involvement, is proposed. Point-to-point and multipoint protocols are discussed. For multicast situations, an effective page scheduling method is developed.
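The pretransfer idea can be sketched as a background-transfer schedule. The model below (earliest-displayed-first ordering, pure serialization delay, made-up page sizes and link rate) is an illustrative assumption, not the paper's actual scheduling method:

```python
# Illustrative sketch of "pretransfer": presentation pages are pushed
# over the narrowband link in the background, earliest-needed first,
# so each page is resident before its display time.

def pretransfer_schedule(pages, bandwidth_kbps):
    """pages: list of (display_time_s, size_kbit) tuples.
    Returns (page_index, transfer_finish_s) pairs in transfer order."""
    order = sorted(range(len(pages)), key=lambda i: pages[i][0])
    t = 0.0
    finish = []
    for i in order:
        t += pages[i][1] / bandwidth_kbps  # serialization delay only
        finish.append((i, t))
    return finish

# Two 64-kbit pages over a 64 kbit/s link: ready after 1 s and 2 s,
# well before their 5 s and 10 s display times.
sched = pretransfer_schedule([(5.0, 64), (10.0, 64)], bandwidth_kbps=64)
```

The perceived end-to-end delay at page-flip time then drops to near zero, because the page bits have already crossed the link.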

  11. Integration proposal through standard-based design of an end-to-end platform for p-Health environments.

    PubMed

    Martínez, I; Trigo, J D; Martínez-Espronceda, M; Escayola, J; Muñoz, P; Serrano, L; García, J

    2009-01-01

    Interoperability among medical devices and compute engines in the personal environment of the patient, and with healthcare information systems in the remote monitoring and management process, is a key need that requires developments supported by standard-based design. Even though there have been some international initiatives to combine different standards, the vision of an entire end-to-end standard-based system is the next challenge. This paper presents the implementation guidelines of a ubiquitous platform for Personal Health (p-Health). It is standard-based, using the two main medical standards in this context: ISO/IEEE 11073 in the patient environment for medical device interoperability, and EN13606 to allow the interoperable communication of the Electronic Healthcare Record of the patient. Furthermore, a new protocol for End-to-End Standard Harmonization (E2ESHP) is proposed in order to make end-to-end standard integration possible. The platform has been designed to comply with the latest available versions of ISO/IEEE 11073 and EN13606, and tested in a laboratory environment as a proof of concept to illustrate its feasibility as an end-to-end standard-based solution.

  12. Wiener restoration of sampled image data - End-to-end analysis

    NASA Technical Reports Server (NTRS)

    Fales, Carl L.; Huck, Friedrich O.; Mccormick, Judith A.; Park, Stephen K.

    1988-01-01

    The Wiener filter is formulated as a function of the basic image-gathering and image-reconstruction constraints, thereby providing a method for minimizing the mean-squared error between the (continuous-input) radiance field and its restored (continuous-output) representation. This formulation of the Wiener filter is further extended to the Wiener-characteristic filter, which provides a method for explicitly specifying the desired representation. Two specific examples of Wiener filters are presented.
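The minimum mean-squared-error restoration referred to above has a familiar frequency-domain form. As a sketch in generic notation (the symbols here are not necessarily the paper's exact notation, and the paper's sampled-data formulation additionally accounts for aliasing, which is omitted):

```latex
% Textbook Wiener restoration filter:
%   \hat\tau : image-gathering transfer function (OTF)
%   \Phi_L   : power spectral density of the radiance field
%   \Phi_N   : power spectral density of the noise
\[
  \hat{W}(\nu) \;=\;
  \frac{\hat{\tau}^{*}(\nu)\,\Phi_L(\nu)}
       {\lvert \hat{\tau}(\nu)\rvert^{2}\,\Phi_L(\nu) + \Phi_N(\nu)}
\]
```

At frequencies where the signal dominates the noise this reduces to the inverse filter $1/\hat{\tau}(\nu)$, and where noise dominates it rolls off to zero, which is what makes the restoration error-minimizing.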

  14. POST2 End-To-End Descent and Landing Simulation for the Autonomous Landing and Hazard Avoidance Technology Project

    NASA Technical Reports Server (NTRS)

    Fisher, Jody L.; Striepe, Scott A.

    2007-01-01

    The Program to Optimize Simulated Trajectories II (POST2) is used as the basis for an end-to-end descent and landing trajectory simulation that is essential in determining the design and performance capability of lunar descent and landing system models and lunar environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This POST2-based ALHAT simulation provides descent and landing simulation capability by integrating lunar environment and lander system models (including terrain, sensor, guidance, navigation, and control models), along with the data necessary to design and operate a landing system for robotic, human, and cargo lunar-landing success. This paper presents the current and planned development and model validation of the POST2-based end-to-end trajectory simulation used for the testing and performance evaluation of ALHAT project systems and models.

  15. End-to-End Demonstrator of the Safe Affordable Fission Engine (SAFE) 30: Power Conversion and Ion Engine Operation

    NASA Technical Reports Server (NTRS)

    Hrbud, Ivana; VanDyke, Melissa; Houts, Mike; Goodfellow, Keith; Schafer, Charles (Technical Monitor)

    2001-01-01

    The Safe Affordable Fission Engine (SAFE) test series addresses Phase 1 Space Fission Systems issues, in particular the non-nuclear testing and system integration issues leading to the testing and non-nuclear demonstration of a 400-kW fully integrated flight unit. The first part of the SAFE 30 test series demonstrated operation of the simulated nuclear core and heat pipe system. Experimental data acquired in a number of different test scenarios will validate existing computational models, demonstrate system flexibility (fast start-ups, multiple start-ups/shut-downs), and simulate predictable failure modes and operating environments. The objective of the second part is to demonstrate an integrated propulsion system consisting of a core, a conversion system, and a thruster, in which the system converts thermal heat into jet power. This end-to-end system demonstration sets a precedent for ground testing of nuclear electric propulsion systems. The paper describes the SAFE 30 end-to-end system demonstration and its subsystems.

  16. End-to-end delay reduction in narrow bandwidth real-time multimedia tele-education applications

    NASA Astrophysics Data System (ADS)

    Algra, Theo

    1993-10-01

    The transmission of multimedia page sequences in real-time tele-education presentations over narrowband links results in unacceptable end-to-end delays. This paper proposes a time-shifting method, referred to as pretransfer, which transfers presentation data in the background during the session, without user involvement. Point-to-point and multipoint protocols are discussed. For multicast situations an effective page-scheduling method is developed.
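
    The pretransfer idea can be sketched as a deadline-driven background scheduler. The sketch below is illustrative only; the page sizes, link rate, lead time, and the `schedule_pretransfer` helper are assumptions, not details from the paper:

```python
def schedule_pretransfer(pages, bandwidth_bps, lead_time_s):
    """Order background ('pretransfer') page downloads so that each page
    finishes arriving at least lead_time_s before its display time.

    pages: list of (display_time_s, size_bits) tuples.
    Returns a list of (start_s, finish_s) transfer intervals.
    """
    schedule = []
    t = 0.0
    for display_time, size_bits in sorted(pages):
        start, finish = t, t + size_bits / bandwidth_bps
        if finish > display_time - lead_time_s:
            raise RuntimeError(f"link too slow to meet deadline at t={display_time}s")
        schedule.append((start, finish))
        t = finish
    return schedule

# Three ~64 kbit pages over a 64 kbit/s narrowband link (assumed numbers)
pages = [(10.0, 64_000), (20.0, 64_000), (30.0, 64_000)]
plan = schedule_pretransfer(pages, bandwidth_bps=64_000, lead_time_s=2.0)
```

    Each page takes one second to transfer, so all three arrive long before their display deadlines; a multipoint variant would additionally coordinate which receivers already hold each page.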

  17. Minimizing End-to-End Interference in I/O Stacks Spanning Shared Multi-Level Buffer Caches

    ERIC Educational Resources Information Center

    Patrick, Christina M.

    2011-01-01

    This thesis presents a uniquely designed, high-performance I/O stack that minimizes end-to-end interference while spanning multi-level shared buffer cache hierarchies accessing shared I/O servers. In this thesis, I show that I can build a superior I/O stack which minimizes the inter-application interference…

  18. End-to-end testing. [to verify electrical equipment failure due to carbon fibers released in aircraft-fuel fires

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1979-01-01

    The principal objective of the demonstration tests discussed is to verify whether carbon fibers released by burning composite parts in aircraft-fuel fires can produce failures in electrical equipment. A secondary objective is to experimentally validate the analytical models for some of the key elements in the risk analysis. The approach to this demonstration testing is twofold: limited end-to-end tests are to be conducted in a shock tube, and planning for some large outdoor burn tests is being done.

  19. End-to-End Study of the Transfer of Energy from Magnetosheath Ion Precipitation to the Cusp

    NASA Technical Reports Server (NTRS)

    Coffey, V. N.; Chandler, M. O.; Singh, Nagendra; Avanov, Levon

    2005-01-01

    This paper describes a study of the effects of unstable magnetosheath distributions on the cusp ionosphere. An end-to-end numerical model was used to study, first, the evolved distributions from precipitation due to reconnection and, second, the energy transfer into the high-latitude ionosphere based on these solar wind/magnetosheath inputs. Using several representative examples of magnetosheath injections as inputs, waves were generated at the lower hybrid frequency and energy was transferred to the ionospheric electrons and ions. The resulting wave spectra and ion and electron particle heating were analyzed. Keywords: ion heating; magnetosheath/ionosphere coupling; particle/wave interactions; simulations.
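
    The lower hybrid frequency at which such waves are generated follows from the electron and ion cyclotron frequencies and the ion plasma frequency. The sketch below uses the standard cold-plasma expression; the field strength, density, and O+ composition are assumed cusp-like values, not numbers from the paper:

```python
import math

# Physical constants (SI)
E_CHARGE = 1.602176634e-19     # elementary charge, C
M_ELECTRON = 9.1093837015e-31  # electron mass, kg
M_PROTON = 1.67262192369e-27   # proton mass, kg
EPS0 = 8.8541878128e-12        # vacuum permittivity, F/m

def lower_hybrid_frequency(B, n_e, ion_mass):
    """Lower hybrid frequency (Hz) from 1/w_LH^2 = 1/(w_ci*w_ce) + 1/w_pi^2."""
    w_ce = E_CHARGE * B / M_ELECTRON               # electron cyclotron, rad/s
    w_ci = E_CHARGE * B / ion_mass                 # ion cyclotron, rad/s
    w_pi2 = n_e * E_CHARGE**2 / (EPS0 * ion_mass)  # ion plasma freq squared
    w_lh2 = 1.0 / (1.0 / (w_ci * w_ce) + 1.0 / w_pi2)
    return math.sqrt(w_lh2) / (2.0 * math.pi)

# Assumed high-latitude values: B = 50 uT, n_e = 1e10 m^-3, O+ ions
f_lh = lower_hybrid_frequency(B=50e-6, n_e=1e10, ion_mass=16 * M_PROTON)
```

    For these assumed values the result lands in the few-kHz range typical of ionospheric lower hybrid waves.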

  20. The Kepler End-to-End Model: Creating High-Fidelity Simulations to Test Kepler Ground Processing

    NASA Technical Reports Server (NTRS)

    Bryson, Stephen T.; Jenkins, Jon M.; Peters, Dan J.; Tenenbaum, Peter P.; Klaus, Todd C.; Gunter, Jay P.; Cote, Miles T.; Caldwell, Douglas A.

    2010-01-01

    The Kepler mission is designed to detect the transit of Earth-like planets around Sun-like stars by observing 100,000 stellar targets. Developing and testing the Kepler ground-segment processing system, in particular the data analysis pipeline, requires high-fidelity simulated data. This simulated data is provided by the Kepler End-to-End Model (ETEM). ETEM simulates the astrophysics of planetary transits and other phenomena, properties of the Kepler spacecraft and the format of the downlinked data. Major challenges addressed by ETEM include the rapid production of large amounts of simulated data, extensibility and maintainability.

  1. Effect of swirling flow on platelet concentration distribution in small-caliber artificial grafts and end-to-end anastomoses

    NASA Astrophysics Data System (ADS)

    Zhan, Fan; Fan, Yu-Bo; Deng, Xiao-Yan

    2011-10-01

    Platelet concentration near the blood vessel wall is one of the major factors in the adhesion of platelets to the wall. In our previous studies, it was found that swirling flows could suppress platelet adhesion in small-caliber artificial grafts and end-to-end anastomoses. In order to better understand the beneficial effect of the swirling flow, we numerically analyzed the near-wall concentration distribution of platelets in a straight tube and a sudden tubular expansion tube under both swirling flow and normal flow conditions. The numerical models were created based on our previous experimental studies. The simulation results revealed that when compared with the normal flow, the swirling flow could significantly reduce the near-wall concentration of platelets in both the straight tube and the expansion tube. The present numerical study therefore indicates that the reduction in platelet adhesion under swirling flow conditions in small-caliber arterial grafts, or in end-to-end anastomoses as observed in our previous experimental study, was possibly through a mechanism of platelet transport, in which the swirling flow reduced the near-wall concentration of platelets.

  2. Self-assembled nanogaps via seed-mediated growth of end-to-end linked gold nanorods.

    PubMed

    Jain, Titoo; Westerlund, Fredrik; Johnson, Erik; Moth-Poulsen, Kasper; Bjørnholm, Thomas

    2009-04-28

    Gold nanorods (AuNRs) are of interest for a wide range of applications, ranging from imaging to molecular electronics, and they have been studied extensively for the past decade. An important issue in AuNR applications is the ability to self-assemble the rods into predictable structures on the nanoscale. We present here a new way to link AuNRs end-to-end with a single linker molecule or a few linker molecules. Whereas methods reported in the literature so far rely on modification of the AuNRs after the synthesis, we dimerize gold nanoparticle seeds with a water-soluble dithiol-functionalized polyethylene glycol linker and expose the linked seeds to growth conditions identical to the synthesis of unlinked AuNRs. Doing so, we obtain a large fraction of end-to-end linked rods, and transmission electron microscopy provides evidence of a 1-2 nm wide gap between the AuNRs. Flow linear dichroism demonstrates that a large fraction of the rods are flexible around the hinging molecule in solution, as expected for a molecularly linked nanogap. By using an excess of gold nanoparticles relative to the linking dithiol molecule, this method can provide a high probability that a single molecule connects the two rods. In essence, our method demonstrates the fabrication of a nanostructure with a molecule connected to two nanoelectrodes by bottom-up chemical assembly.

  3. Image gathering, coding, and processing: End-to-end optimization for efficient and robust acquisition of visual information

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.

    1990-01-01

    Researchers are concerned with the end-to-end performance of image gathering, coding, and processing. The applications range from high-resolution television to vision-based robotics, wherever the resolution, efficiency and robustness of visual information acquisition and processing are critical. For the presentation at this workshop, it is convenient to divide research activities into the following two overlapping areas: The first is the development of focal-plane processing techniques and technology to effectively combine image gathering with coding, with an emphasis on low-level vision processing akin to the retinal processing in human vision. The approach includes the familiar Laplacian pyramid, the new intensity-dependent spatial summation, and parallel sensing/processing networks. Three-dimensional image gathering is attained by combining laser ranging with sensor-array imaging. The second is the rigorous extension of information theory and optimal filtering to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing.

  4. Performances of the fractal iterative method with an internal model control law on the ESO end-to-end ELT adaptive optics simulator

    NASA Astrophysics Data System (ADS)

    Béchet, C.; Le Louarn, M.; Tallon, M.; Thiébaut, É.

    2008-07-01

    Adaptive optics systems under study for the Extremely Large Telescopes have given rise to a new generation of algorithms for both wavefront reconstruction and the control law. First, the large number of controlled actuators imposes the use of computationally efficient methods. Second, the performance criterion is no longer based solely on nulling residual measurements: priors on the turbulence must be inserted. To satisfy these two requirements, we proposed associating the Fractal Iterative Method for the estimation step with an Internal Model Control law. This combination has now been tested on an end-to-end adaptive optics numerical simulator at ESO, named Octopus. Results are presented here and the performance of our method is compared to the classical Matrix-Vector Multiplication combined with a pure integrator. In the light of a theoretical analysis of our control algorithm, we investigate the influence of several error contributions on our simulations. The reconstruction error varies with the signal-to-noise ratio but is limited by the use of priors. The ratio between the system loop delay and the wavefront coherence time also impacts the reachable Strehl ratio. Whereas no instabilities are observed, correction quality is clearly affected at low flux, when subaperture extinctions are frequent. Last but not least, the simulations have demonstrated the robustness of the method with respect to sensor modeling errors and actuator misalignments.
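
    The baseline against which the paper compares, a Matrix-Vector Multiplication reconstructor driven by a pure integrator, can be sketched on a toy closed loop. The interaction matrix, dimensions, and gain below are arbitrary assumptions for illustration, not the ESO simulator's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy wavefront sensor model: slopes = A @ actuator-space wavefront (assumed)
n_slopes, n_act = 40, 10
A = rng.standard_normal((n_slopes, n_act))   # interaction matrix
R = np.linalg.pinv(A)                        # least-squares MVM reconstructor

phi_turb = rng.standard_normal(n_act)        # static turbulent wavefront
cmd = np.zeros(n_act)                        # deformable-mirror commands
gain = 0.5                                   # pure-integrator loop gain

for _ in range(50):
    residual = phi_turb - cmd                # wavefront after correction
    slopes = A @ residual                    # noiseless sensor measurement
    cmd = cmd + gain * (R @ slopes)          # integrator update

rms_residual = np.linalg.norm(phi_turb - cmd)
```

    With noiseless measurements and static turbulence the residual shrinks by a factor (1 - gain) per iteration; noise, loop delay, and priors are exactly what the paper's iterative method adds on top of this baseline.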

  5. End-To-End Risk Assessment: From Genes and Protein to Acceptable Radiation Risks for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Schimmerling, Walter

    2000-01-01

    The human exploration of Mars will impose unavoidable health risks from galactic cosmic rays (GCR) and possibly solar particle events (SPE). It is the goal of NASA's Space Radiation Health Program to develop the capability to predict health risks with significant accuracy, to ensure that risks are well below acceptable levels and to allow mitigation approaches to be effective at reasonable costs. End-to-end risk assessment is the approach being followed to understand proton and heavy ion damage at the molecular, cellular, and tissue levels in order to predict the probability of the major health risks, including cancer, neurological disorders, hereditary effects, cataracts, and acute radiation sickness, and to develop countermeasures for mitigating these risks.

  6. HITSZ_CDR: an end-to-end chemical and disease relation extraction system for BioCreative V

    PubMed Central

    Li, Haodi; Tang, Buzhou; Chen, Qingcai; Chen, Kai; Wang, Xiaolong; Wang, Baohua; Wang, Zhe

    2016-01-01

    In this article, an end-to-end system is proposed for the challenge task of disease named entity recognition (DNER) and chemical-induced disease (CID) relation extraction in BioCreative V, where DNER includes disease mention recognition (DMR) and normalization (DN). Evaluation on the challenge corpus showed that our system achieved F1-scores of 86.93% on DMR, 84.11% on DN, and 43.04% on CID relation extraction. The F1-score on DMR is higher than our previous one reported by the challenge organizers (86.76%), the highest F1-score of the challenge. Database URL: http://database.oxfordjournals.org/content/2016/baw077 PMID:27270713
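
    The F1-scores quoted above combine precision and recall in the usual way. A minimal sketch of the metric; the true/false positive and false negative counts are hypothetical, not the challenge data:

```python
def f1_score(tp, fp, fn):
    """F1 as the harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical entity-level counts for a recognizer
f1 = f1_score(tp=870, fp=120, fn=140)
```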

  7. The Mars Pathfinder end-to-end information system: A pathfinder for the development of future NASA planetary missions

    NASA Technical Reports Server (NTRS)

    Cook, Richard A.; Kazz, Greg J.; Tai, Wallace S.

    1996-01-01

    The development of the Mars Pathfinder is considered, with emphasis on the End-to-End Information System (EEIS) development approach. The primary mission objective is to successfully develop and deliver a single flight system to the Martian surface, demonstrating entry, descent and landing. The EEIS is a set of functions distributed throughout the flight, ground and Mission Operation Systems (MOS) that interoperate in order to control, collect, transport, process, store and analyze the uplink and downlink information flows of the mission. Coherence between the mission systems is achieved through the EEIS architecture. The key characteristics of the system are: a concurrent engineering approach for the development of flight, ground and mission operation systems; the fundamental EEIS architectural heuristics; a phased, incremental EEIS development and test approach; and an EEIS design deploying flight, ground and MOS operability features, including integrated ground- and flight-based toolsets.

  8. End-To-End Risk Assessment: From Genes and Protein to Acceptable Radiation Risks for Mars Exploration

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis A.; Schimmerling, Walter

    2000-07-01

    The human exploration of Mars will impose unavoidable health risks from galactic cosmic rays (GCR) and possibly solar particle events (SPE). It is the goal of NASA's Space Radiation Health Program to develop the capability to predict health risks with significant accuracy, to ensure that risks are well below acceptable levels and to allow mitigation approaches to be effective at reasonable costs. End-to-end risk assessment is the approach being followed to understand proton and heavy ion damage at the molecular, cellular, and tissue levels in order to predict the probability of the major health risks, including cancer, neurological disorders, hereditary effects, cataracts, and acute radiation sickness, and to develop countermeasures for mitigating these risks.

  9. NASA End-to-End Data System /NEEDS/ information adaptive system - Performing image processing onboard the spacecraft

    NASA Technical Reports Server (NTRS)

    Kelly, W. L.; Howle, W. M.; Meredith, B. D.

    1980-01-01

    The Information Adaptive System (IAS) is an element of the NASA End-to-End Data System (NEEDS) Phase II and is focused toward onboard image processing. Since the IAS is a data preprocessing system which is closely coupled to the sensor system, it serves as a first step in providing a 'smart' imaging sensor. Some of the functions planned for the IAS include sensor response nonuniformity correction, geometric correction, data set selection, data formatting, packetization, and adaptive system control. The inclusion of these sensor data preprocessing functions onboard the spacecraft will significantly improve the extraction of information from the sensor data in a timely and cost-effective manner and provide the opportunity to design sensor systems which can be reconfigured in near real time for optimum performance. The purpose of this paper is to present the preliminary design of the IAS and the plans for its development.
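
    Sensor response nonuniformity correction of the kind planned for the IAS is commonly a two-point (dark frame / flat field) per-pixel calibration. The sketch below is a generic illustration under that assumption, on synthetic data, and is not a NEEDS algorithm:

```python
import numpy as np

def correct_nonuniformity(raw, dark, flat):
    """Two-point (dark frame / flat field) per-pixel response correction."""
    gain = flat - dark                      # per-pixel response to unit flux
    gain = np.where(gain == 0, 1.0, gain)   # guard against dead pixels
    return (raw - dark) / gain

# Synthetic 4x4 sensor with per-pixel gain and offset (illustrative only)
rng = np.random.default_rng(1)
scene = np.full((4, 4), 100.0)                  # true uniform radiance
pixel_gain = rng.uniform(0.8, 1.2, (4, 4))      # response nonuniformity
pixel_offset = rng.uniform(0.0, 5.0, (4, 4))    # dark-level nonuniformity
raw = pixel_gain * scene + pixel_offset         # what the sensor reports
dark = pixel_offset                             # frame with shutter closed
flat = pixel_gain * 1.0 + pixel_offset          # frame under unit illumination

corrected = correct_nonuniformity(raw, dark, flat)
```

    After correction the uniform scene is recovered exactly, even though the raw frame varies pixel to pixel by tens of counts.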

  10. High fundamental spatial frequencies and edges have different perceptual consequences in the 'group/end-to-end' movement phenomenon.

    PubMed

    Petersik, J T; Grassmuck, J

    1981-01-01

    A subject viewing two alternating frames, each containing, say, three vertical stripes in a horizontal row, displaced laterally by one cycle in one frame with respect to the other, perceives either the three stripes moving left-right-left in unison (group movement) or one stripe moving from one end of the display to the other while the two overlapping stripes remain stationary (end-to-end movement). At suitable temporal parameters of presentation (frame duration, interstimulus interval) the perception of the display is bistable. Experiments have shown that the relative strengths of these alternative movement sensations depend upon the fundamental spatial frequency of the display and upon the stimulus waveform. Square-wave stimuli, which have energy at high spatial frequencies, had effects opposite to those produced by increases in fundamental spatial frequency. Amblyopes differed from normal viewers only in the perception of the square-wave stimuli. PMID:7335436

  11. End-to-End Trajectory for Conjunction Class Mars Missions Using Hybrid Solar-Electric/Chemical Transportation System

    NASA Technical Reports Server (NTRS)

    Chai, Patrick R.; Merrill, Raymond G.; Qu, Min

    2016-01-01

    NASA's Human Spaceflight Architecture Team is developing a reusable hybrid transportation architecture in which both chemical and solar-electric propulsion systems are used to deliver crew and cargo to exploration destinations. By combining chemical and solar-electric propulsion into a single spacecraft and applying each where it is most effective, the hybrid architecture enables a series of Mars trajectories that are more fuel-efficient than an all-chemical propulsion architecture without significant increases in trip time. The architecture calls for the aggregation of exploration assets in cislunar space prior to departure for Mars and utilizes high-energy lunar-distant high Earth orbits for the final staging prior to departure. This paper presents a detailed analysis of various cislunar operations for the Evolvable Mars Campaign (EMC) hybrid architecture, as well as the results of higher-fidelity end-to-end trajectory analysis, to understand the implications of the design choices on the Mars exploration campaign.

  12. Development of the 4D Phantom for patient-specific, end-to-end radiation therapy QA

    NASA Astrophysics Data System (ADS)

    Malinowski, K.; Noel, C.; Lu, W.; Lechleiter, K.; Hubenschmidt, J.; Low, D.; Parikh, P.

    2007-03-01

    In many patients, respiratory motion causes motion artifacts in CT images, thereby inhibiting precise treatment planning and lowering the ability to target radiation to tumors. The 4D Phantom, which includes a 3D stage and a 1D stage that are each capable of arbitrary motion and timing, was developed to serve as an end-to-end radiation therapy QA device that can be used throughout CT imaging, radiation therapy treatment planning, and radiation therapy delivery. The dynamic accuracy of the system was measured with a camera system. The positional error was found to be equally likely to occur in the positive and negative directions for each axis, and the stage was within 0.1 mm of the desired position 85% of the time. In an experiment designed to use the 4D Phantom's encoders to measure trial-to-trial precision of the system, the 4D Phantom reproduced the motion during variable bag ventilation of a transponder that had been bronchoscopically implanted in a canine lung. In this case, the encoder readout indicated that the stage was within 10 microns of the commanded position 94% of the time and that the RMS error was 7 microns. Motion artifacts were clearly visible in 3D and respiratory-correlated (4D) CT scans of phantoms reproducing tissue motion. In 4D CT scans, apparent volume was found to be directly correlated with instantaneous velocity. The system is capable of reproducing individual patient-specific tissue trajectories with a high degree of accuracy and precision and will be useful for end-to-end radiation therapy QA.
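
    Accuracy figures of the kind quoted (RMS error, fraction of samples within a tolerance of the commanded position) can be computed directly from logged stage positions. The readings below are synthetic, for illustration only:

```python
def motion_qa_stats(desired, measured, tol_mm=0.1):
    """Positional-error summary for a motion stage: RMS error (mm) and
    the fraction of samples within tol_mm of the commanded position."""
    errors = [m - d for d, m in zip(desired, measured)]
    rms = (sum(e * e for e in errors) / len(errors)) ** 0.5
    within = sum(abs(e) <= tol_mm for e in errors) / len(errors)
    return rms, within

desired = [0.0, 1.0, 2.0, 3.0, 4.0]        # commanded positions, mm
measured = [0.02, 1.05, 1.95, 3.20, 4.01]  # synthetic camera readings, mm
rms, within = motion_qa_stats(desired, measured)
```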

  13. Evaluation of Techniques to Detect Significant Network Performance Problems using End-to-End Active Network Measurements

    SciTech Connect

    Cottrell, R.Les; Logg, Connie; Chhaparia, Mahesh; Grigoriev, Maxim; Haro, Felipe; Nazir, Fawad; Sandford, Mark

    2006-01-25

    End-to-end fault and performance problem detection in wide-area production networks is becoming increasingly hard as the complexity of the paths, the diversity of the performance, and dependency on the network increase. Several monitoring infrastructures have been built to monitor different network metrics and collect monitoring information from thousands of hosts around the globe. Typically there are hundreds to thousands of time-series plots of network metrics which need to be looked at to identify network performance problems or anomalous variations in the traffic. Furthermore, most commercial products rely on a comparison with user-configured static thresholds and often require access to SNMP-MIB information, to which a typical end user does not usually have access. In this paper we propose new techniques to detect network performance problems proactively in close to real time, without relying on static thresholds or SNMP-MIB information. We describe and compare several different algorithms that we have implemented to detect persistent network problems using anomalous-variation analysis in real end-to-end Internet performance measurements. We also provide methods and guidance for how to set the user-settable parameters. The measurements are based on active probes running on 40 production network paths with bottlenecks varying from 0.5 Mbit/s to 1000 Mbit/s. For well-behaved data (no missed measurements and no very large outliers) with small seasonal changes, most algorithms identify similar events. We compare the algorithms' robustness with respect to false positives and missed events, especially when there are large seasonal effects in the data. Our proposed techniques cover a wide variety of network paths and traffic patterns. We also discuss the applicability of the algorithms in terms of their intuitiveness, their speed of execution as implemented, and their areas of applicability. Our encouraging results compare and evaluate the accuracy of our detection
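
    One simple detector in the spirit described, adaptive rather than statically thresholded, tracks an exponentially weighted moving average with a running variance estimate and flags large deviations. This sketch is illustrative only and is not one of the paper's algorithms:

```python
def ewma_anomalies(series, alpha=0.2, k=4.0, warmup=10):
    """Flag points deviating by more than k adaptive standard deviations
    from an exponentially weighted moving average (no static thresholds)."""
    mean, var = series[0], 0.0
    flags = []
    for i, x in enumerate(series):
        dev = x - mean
        sigma = var ** 0.5
        is_anomaly = i >= warmup and sigma > 0 and abs(dev) > k * sigma
        flags.append(is_anomaly)
        if not is_anomaly:  # update estimates only with normal points
            mean += alpha * dev
            var = (1 - alpha) * (var + alpha * dev * dev)
    return flags

# Synthetic round-trip-time series (ms): stable ~50 with mild jitter,
# followed by a persistent step change to 120
series = [50.0] * 30 + [50.0 + (i % 3) * 0.5 for i in range(30)] + [120.0] * 5
flags = ewma_anomalies(series)
```

    The step change is flagged because the threshold scales with the recent variability of each path instead of a fixed limit; handling large seasonal effects is exactly where the paper's more elaborate algorithms differ.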

  14. End-to-End Information System design at the NASA Jet Propulsion Laboratory. [data transmission between user and space-based sensor

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1978-01-01

    In recognition of a pressing need of the 1980s to optimize the two-way flow of information between a ground-based user and a remote space-based sensor, an end-to-end approach to the design of information systems has been adopted at JPL. This paper reviews End-to-End Information System (EEIS) activity at JPL, with attention given to the scope of the EEIS transfer function and the functional and physical elements of the EEIS. The relationship between the EEIS and the NASA End-to-End Data System program is discussed.

  15. Bayesian optimal reconstruction of the primordial power spectrum

    NASA Astrophysics Data System (ADS)

    Bridges, M.; Feroz, F.; Hobson, M. P.; Lasenby, A. N.

    2009-12-01

    The form of the primordial power spectrum has the potential to differentiate strongly between competing models of perturbation generation in the early universe and so is of considerable importance. The recent release of five years of Wilkinson Microwave Anisotropy Probe observations has confirmed the general picture of the primordial power spectrum as deviating slightly from scale invariance, with a spectral tilt parameter of ns ~ 0.96. Nonetheless, many attempts have been made to isolate further features such as breaks and cut-offs using a variety of methods, some employing more than ~10 varying parameters. In this paper, we apply the robust technique of Bayesian model selection to reconstruct the optimal degree of structure in the spectrum. We model the spectrum simply and generically as piecewise linear in ln k between 'nodes' in k space whose amplitudes are allowed to vary. The number of nodes and their k-space positions are chosen by the Bayesian evidence, so that we can identify both the complexity and the location of any detected features. Our optimal reconstruction contains perhaps surprisingly few features, the data preferring just three nodes. This reconstruction allows for a degree of scale dependence of the tilt, with the 'turn-over' scale occurring around k ~ 0.016 Mpc^-1. More structure is penalized by the evidence as overfitting the data, so there is currently little point in attempting reconstructions that are more complex.
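
    The parametrization described, piecewise linear in ln k between movable nodes, can be sketched directly. The node positions and amplitudes below are hypothetical placeholders, not the paper's fitted values:

```python
import math

def ln_power(lnk, nodes):
    """ln P interpolated piecewise-linearly in ln k between (ln k, ln P) nodes;
    outside the node range, hold the end values."""
    pts = sorted(nodes)
    if lnk <= pts[0][0]:
        return pts[0][1]
    if lnk >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= lnk <= x1:
            t = (lnk - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

# Three hypothetical nodes with a change of slope ('turn-over') near
# k ~ 0.016 Mpc^-1; amplitudes are illustrative, not a fit
nodes = [
    (math.log(1e-4), -19.8),
    (math.log(0.016), -19.9),
    (math.log(1.0), -20.1),
]
P_turnover = math.exp(ln_power(math.log(0.016), nodes))
```

    In the paper's framework, the Bayesian evidence compares models with different numbers of such nodes, penalizing extra nodes that merely overfit the data.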

  16. Profiling wind and greenhouse gases by infrared-laser occultation: results from end-to-end simulations in windy air

    NASA Astrophysics Data System (ADS)

    Plach, A.; Proschek, V.; Kirchengast, G.

    2015-07-01

    The new mission concept of microwave and infrared-laser occultation between low-Earth-orbit satellites (LMIO) is designed to provide accurate and long-term stable profiles of atmospheric thermodynamic variables, greenhouse gases (GHGs), and line-of-sight (l.o.s.) wind speed, with a focus on the upper troposphere and lower stratosphere (UTLS). While the unique quality of GHG retrievals enabled by LMIO over the UTLS has recently been demonstrated based on end-to-end simulations, the promise of l.o.s. wind retrieval, and of joint GHG and wind retrieval, has not yet been analyzed in any realistic simulation setting. Here we use a newly developed l.o.s. wind retrieval algorithm, which we embedded in an end-to-end simulation framework that also includes the retrieval of thermodynamic variables and GHGs, and analyze the performance of both stand-alone wind retrieval and joint wind and GHG retrieval. The wind algorithm utilizes LMIO laser signals placed on the inflection points at the wings of the highly symmetric C18OO absorption line near 4767 cm^-1 and exploits transmission differences from a wind-induced Doppler shift. Based on realistic example cases for a diversity of atmospheric conditions, ranging from tropical to high-latitude winter, we find that the retrieved l.o.s. wind profiles are of high quality over the lower stratosphere under all conditions, i.e., unbiased and accurate to within about 2 m s^-1 over about 15 to 35 km. The wind accuracy degrades into the upper troposphere due to the decreasing signal-to-noise ratio of the wind-induced differential transmission signals. The GHG retrieval in windy air is not vulnerable to wind speed uncertainties up to about 10 m s^-1 but is found to benefit in the case of higher speeds from the integrated wind retrieval, which enables correction of the wind-induced Doppler shift of GHG signals. Overall, both the l.o.s. wind and GHG retrieval results are strongly encouraging towards further development and implementation of an LMIO mission.
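
    The retrieval principle, transmission differences between the two inflection-point channels of a Doppler-shifted absorption line, can be illustrated with a toy Gaussian line. All line parameters below are assumptions for illustration, not the real C18OO line data or the paper's algorithm:

```python
import math

C = 299792458.0  # speed of light, m/s

def transmission(nu, center, tau0, width):
    """Beer-Lambert transmission through a Gaussian absorption line."""
    tau = tau0 * math.exp(-((nu - center) ** 2) / (2 * width ** 2))
    return math.exp(-tau)

# Hypothetical line parameters (illustrative only)
nu0 = 4767.0    # unshifted line center, cm^-1
width = 0.01    # Gaussian width, cm^-1; inflection points sit at nu0 +/- width
tau0 = 1.0      # peak optical depth

def differential_transmission(v_los):
    """Transmission difference between the two inflection-point channels
    when the line is Doppler-shifted by line-of-sight wind v_los (m/s)."""
    center = nu0 * (1.0 + v_los / C)
    return (transmission(nu0 + width, center, tau0, width)
            - transmission(nu0 - width, center, tau0, width))

# Linear retrieval: calibrate the small-shift sensitivity, then invert it
eps = 1.0  # m/s
slope = (differential_transmission(eps) - differential_transmission(-eps)) / (2 * eps)

v_true = 10.0  # m/s
v_est = differential_transmission(v_true) / slope
```

    Because a 10 m s^-1 wind shifts the line by far less than its width, the differential signal is very nearly linear in the wind speed, which is why a simple linear inversion recovers it well here.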

  17. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: Unidata's Plans and Directions

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M.

    2005-12-01

    work together in a fundamentally different way. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational drivers and discuss recent developments in cyberinfrastructure and Unidata's role and directions in providing robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  18. End-to-end Cyberinfrastructure and Data Services for Earth System Science Education and Research: A vision for the future

    NASA Astrophysics Data System (ADS)

    Ramamurthy, M. K.

    2006-05-01

    yet revolutionary way of building applications and methods to connect and exchange information over the Web. This new approach, based on XML - a widely accepted format for exchanging data and corresponding semantics over the Internet - enables applications, computer systems, and information processes to work together in fundamentally different ways. Likewise, the advent of digital libraries, grid computing platforms, interoperable frameworks, standards and protocols, open-source software, and community atmospheric models have been important drivers in shaping the use of a new generation of end-to-end cyberinfrastructure for solving some of the most challenging scientific and educational problems. In this talk, I will present an overview of the scientific, technological, and educational landscape, discuss recent developments in cyberinfrastructure, and Unidata's role in and vision for providing easy-to use, robust, end-to-end data services for solving geoscientific problems and advancing student learning.

  19. Hardware and Methods of the Optical End-to-End Test of the Far Ultraviolet Spectroscopic Explorer (FUSE)

    NASA Technical Reports Server (NTRS)

    Conard, Steven J.; Redman, Kevin W.; Barkhouser, Robert H.; McGuffey, Doug B.; Smee, Stephen; Ohl, Raymond G.; Kushner, Gary

    1999-01-01

    The Far Ultraviolet Spectroscopic Explorer (FUSE), currently being tested and scheduled for a 1999 launch, is an astrophysics satellite designed to provide high spectral resolving power (λ/Δλ = 24,000-30,000) over the interval 90.5-118.7 nm. The FUSE optical path consists of four co-aligned, normal incidence, off-axis parabolic, primary mirrors which illuminate separate Rowland circle spectrograph channels equipped with holographic gratings and delay line microchannel plate detectors. We describe the hardware and methods used for the optical end-to-end test of the FUSE instrument during satellite integration and test. Cost and schedule constraints forced us to devise a simplified version of the planned optical test which occurred in parallel with satellite thermal-vacuum testing. The optical test employed a collimator assembly which consisted of four co-aligned, 15" Cassegrain telescopes which were positioned above the FUSE instrument, providing a collimated beam for each optical channel. A windowed UV light source, remotely adjustable in three axes, was mounted at the focal plane of each collimator. Problems with the UV light sources, including high F-number and window failures, were the only major difficulties encountered during the test. The test succeeded in uncovering a significant problem with the secondary structure used for the instrument closeout cavity and, furthermore, showed that the mechanical solution was successful. The hardware was also used extensively for simulations of science observations, providing both UV light for spectra and visible light for the fine error sensor camera.

  20. An anthropomorphic multimodality (CT/MRI) head phantom prototype for end-to-end tests in ion radiotherapy.

    PubMed

    Gallas, Raya R; Hünemohr, Nora; Runz, Armin; Niebuhr, Nina I; Jäkel, Oliver; Greilich, Steffen

    2015-12-01

    With the increasing complexity of external beam therapy, "end-to-end" tests are intended to cover every step from therapy planning through to follow-up in order to fulfill the higher demands on quality assurance. As magnetic resonance imaging (MRI) has become an important part of the treatment process, established phantoms such as the Alderson head cannot be fully used for those tests, and novel phantoms have to be developed. Here, we present a feasibility study of a customizable multimodality head phantom. It is initially intended for ion radiotherapy but may also be used in photon therapy. As the basis for the anthropomorphic head shape we used a set of patient computed tomography (CT) images. The phantom container, consisting of epoxy resin, was produced using a 3D printer. It includes a nasal air cavity, a cranial bone surrogate (based on dipotassium phosphate), a brain surrogate (based on agarose gel), and a surrogate for cerebrospinal fluid (based on distilled water). Furthermore, a volume filled with normoxic dosimetric gel mimicked a tumor. The entire proton therapy workflow could be successfully applied to the phantom. CT measurements revealed CT numbers agreeing with reference values for all surrogates in the range from 2 HU to 978 HU (120 kV). MRI showed the desired contrasts between the different phantom materials, especially in T2-weighted images (except for the bone surrogate). T2-weighted readout of the polymerization gel dosimeter allowed approximate range verification.
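The Hounsfield scale behind the reported CT numbers is a fixed affine map of the linear attenuation coefficient; a minimal sketch (the attenuation values below are illustrative, not taken from the paper):

```python
def hounsfield_units(mu, mu_water, mu_air=0.0):
    """Map a linear attenuation coefficient to Hounsfield units (HU)."""
    return 1000.0 * (mu - mu_water) / (mu_water - mu_air)

# Hypothetical attenuation coefficients (cm^-1), for illustration only.
mu_water = 0.171
print(hounsfield_units(0.171, mu_water))       # water -> 0.0 HU by definition
print(round(hounsfield_units(0.0, mu_water)))  # air   -> -1000 HU
```

By this definition, the 2 HU to 978 HU span quoted above runs from material nearly indistinguishable from water up to dense bone-like surrogates.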

  1. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    NASA Astrophysics Data System (ADS)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built on a model of human-driven execution. Another challenge is to orchestrate the various components in the publishing and print production pipeline so that they work seamlessly, enabling the system to detect potential failures automatically and take corrective actions proactively. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines and automates the process as a whole in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  2. End-to-end simulation of high-contrast imaging systems: methods and results for the PICTURE mission family

    NASA Astrophysics Data System (ADS)

    Douglas, Ewan S.; Hewasawam, Kuravi; Mendillo, Christopher B.; Cahoy, Kerri L.; Cook, Timothy A.; Finn, Susanna C.; Howe, Glenn A.; Kuchner, Marc J.; Lewis, Nikole K.; Marinan, Anne D.; Mawet, Dimitri; Chakrabarti, Supriya

    2015-09-01

    We describe a set of numerical approaches to modeling the performance of space flight high-contrast imaging payloads. Mission design for high-contrast imaging requires numerical wavefront error propagation to ensure accurate component specifications. For constructed instruments, wavelength- and angle-dependent throughput and contrast models allow detailed simulations of science observations, allowing mission planners to select the most productive science targets. The PICTURE family of missions seeks to quantify the optical brightness of scattered light from extrasolar debris disks via several high-contrast imaging techniques: sounding rocket (the Planet Imaging Concept Testbed Using a Rocket Experiment) and balloon flights of a visible nulling coronagraph, as well as a balloon flight of a vector vortex coronagraph (the Planetary Imaging Concept Testbed Using a Recoverable Experiment - Coronagraph, PICTURE-C). The rocket mission employs an on-axis 0.5 m Gregorian telescope, while the balloon flights will share an unobstructed off-axis 0.6 m Gregorian. This work details the flexible approach to polychromatic, end-to-end physical optics simulations used for both the balloon vector vortex coronagraph and rocket visible nulling coronagraph missions. We show the preliminary PICTURE-C telescope and vector vortex coronagraph design will achieve 10^-8 contrast without post-processing as limited by realistic optics, but not considering polarization or low-order errors. Simulated science observations of the predicted warm ring around Epsilon Eridani illustrate the performance of both missions.

  3. End-to-end gene fusions and their impact on the production of multifunctional biomass degrading enzymes.

    PubMed

    Rizk, Mazen; Antranikian, Garabed; Elleuche, Skander

    2012-11-01

    The depletion of fossil fuels, coupled with their increasing price, has made the search for alternative energy resources more urgent. One of the topics gaining interest quickly is the utilization of lignocellulose, the main component of plants. Its primary constituents, cellulose and hemicellulose, can be degraded into simple sugars by a series of enzymes present in microorganisms; these sugars are later used for bioethanol production. Thermophilic bacteria have proven to be an interesting source of the enzymes required for hydrolysis, since they can withstand the high and denaturing temperatures usually required for biomass degradation processes. However, the cost associated with the whole enzymatic process is staggering. One route to cost-effective and highly active production is the construction of multifunctional enzyme complexes harboring the function of more than one enzyme needed for the hydrolysis process. Strategies for the degradation of complex biomass range from the regulation of the enzymes involved, to cellulosomes, to proteins harboring more than one enzymatic activity. In this review, the construction of multifunctional biomass-degrading enzymes through end-to-end gene fusions is assessed, together with the impact of enzyme and linker choice on production and activity.

  4. A novel end-to-end fault detection and localization protocol for wavelength-routed WDM networks

    NASA Astrophysics Data System (ADS)

    Zeng, Hongqing; Vukovic, Alex; Huang, Changcheng

    2005-09-01

    Wavelength division multiplexing (WDM) networks have recently become prevalent in telecommunications. However, even a very short service disruption caused by a network fault can lead to heavy data loss, owing to the high data rates and the increased number and density of wavelengths. Network survivability is therefore critical and has been studied intensively; fault detection and localization is a vital part of it but has received disproportionately little attention. In this paper we describe and analyze an end-to-end lightpath fault detection scheme in the data plane, with fault notification in the control plane. The effort is focused on reducing the fault detection time. In this protocol, the source node of each lightpath keeps sending hello packets to the destination node, following exactly the path used for data traffic. The destination node raises an alarm once a certain number of consecutive hello packets are missed within a given time period. The network management unit then collects all alarms and localizes the fault based on the network topology, and sends fault notification messages via the control plane to either the source node or all upstream nodes along the lightpath. The performance evaluation shows that such a protocol achieves fast fault detection, while the overhead that hello packets add to user data remains negligible.
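The destination-side detection rule described above (alarm after a run of consecutive missed hello packets) can be sketched as a small state machine; the class name, interval, and miss threshold here are hypothetical, not taken from the paper:

```python
class HelloMonitor:
    """Raises an alarm after `n_miss` consecutive missed hello packets.

    A hello counts as missed when the gap since the last arrival exceeds
    the hello interval. Parameter values are illustrative only.
    """
    def __init__(self, hello_interval, n_miss):
        self.hello_interval = hello_interval
        self.n_miss = n_miss
        self.last_seen = None
        self.misses = 0

    def on_hello(self, now):
        """Record an arriving hello packet; resets the miss counter."""
        self.last_seen = now
        self.misses = 0

    def on_tick(self, now):
        """Called once per hello interval; returns True when the alarm fires."""
        if self.last_seen is None or now - self.last_seen > self.hello_interval:
            self.misses += 1
        else:
            self.misses = 0
        return self.misses >= self.n_miss

mon = HelloMonitor(hello_interval=1.0, n_miss=3)
mon.on_hello(0.0)
alarms = [mon.on_tick(t) for t in (1.0, 2.5, 3.5, 4.5)]
print(alarms)  # alarm fires only on the third consecutive miss
```

In the protocol, this alarm would then be forwarded to the network management unit, which correlates alarms from all lightpaths against the topology to localize the fault.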

  5. Fast Dictionary-Based Reconstruction for Diffusion Spectrum Imaging

    PubMed Central

    Bilgic, Berkin; Chatnuntawech, Itthi; Setsompop, Kawin; Cauley, Stephen F.; Yendiki, Anastasia; Wald, Lawrence L.; Adalsteinsson, Elfar

    2015-01-01

    Diffusion Spectrum Imaging (DSI) reveals detailed local diffusion properties at the expense of substantially long imaging times. It is possible to accelerate acquisition by undersampling in q-space, followed by image reconstruction that exploits prior knowledge on the diffusion probability density functions (pdfs). Previously proposed methods impose this prior in the form of sparsity under wavelet and total variation (TV) transforms, or under adaptive dictionaries that are trained on example datasets to maximize the sparsity of the representation. These compressed sensing (CS) methods require full-brain processing times on the order of hours using Matlab running on a workstation. This work presents two dictionary-based reconstruction techniques that use analytical solutions, and are two orders of magnitude faster than the previously proposed dictionary-based CS approach. The first method generates a dictionary from the training data using Principal Component Analysis (PCA), and performs the reconstruction in the PCA space. The second proposed method applies reconstruction using a pseudoinverse with Tikhonov regularization with respect to a dictionary. This dictionary can either be obtained using the K-SVD algorithm, or it can simply be the training dataset of pdfs without any training. All of the proposed methods achieve reconstruction times on the order of seconds per imaging slice, and have reconstruction quality comparable to that of the dictionary-based CS algorithm. PMID:23846466
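The Tikhonov-regularized pseudoinverse mentioned above has the closed form c = (DᵀD + λI)⁻¹Dᵀy, after which the pdf estimate is Dc; this is what makes the method analytic and fast. A minimal NumPy sketch on a random toy dictionary (dimensions and λ are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: dictionary D of example pdfs (columns) and a signal y it explains.
n_qspace, n_atoms = 64, 32
D = rng.standard_normal((n_qspace, n_atoms))
c_true = rng.standard_normal(n_atoms)
y = D @ c_true

lam = 1e-6  # Tikhonov weight; illustrative value
# Closed-form regularized pseudoinverse: c = (D^T D + lam*I)^(-1) D^T y
c_hat = np.linalg.solve(D.T @ D + lam * np.eye(n_atoms), D.T @ y)
pdf_hat = D @ c_hat

print(np.allclose(pdf_hat, y, atol=1e-4))  # True: near-exact on noiseless data
```

In practice the factor (DᵀD + λI)⁻¹Dᵀ is precomputed once per dictionary, so each voxel's reconstruction reduces to a single matrix-vector product.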

  6. End-to-end sensor simulation for spectral band selection and optimization with application to the Sentinel-2 mission.

    PubMed

    Segl, Karl; Richter, Rudolf; Küster, Theres; Kaufmann, Hermann

    2012-02-01

    An end-to-end sensor simulation is a suitable tool for predicting a sensor's performance over a range of conditions that cannot easily be measured. In this study, such a tool has been developed that enables the assessment of the optimum spectral resolution configuration of a sensor based on key applications. As a basis for the detailed design and consolidation of spectral bands for the future Sentinel-2 sensor, it employs the spectral molecular absorption and scattering properties of the materials used to identify and determine the abundances of surface and atmospheric constituents, together with their interdependence on spatial resolution and signal-to-noise ratio. The developed tools allow the computation of synthetic Sentinel-2 spectra that form the frame for the subsequent twofold analysis of bands in the atmospheric absorption and window regions. One part of the study comprises the assessment of optimal spatial and spectral resolution configurations for the bands used for atmospheric correction, optimized with regard to the retrieval of aerosols and water vapor and the detection of cirrus clouds. The second part presents the optimization of thematic bands, mainly driven by the spectral characteristics of vegetation constituents and minerals. The investigation is performed for different wavelength ranges because most remote sensing applications require specific band combinations rather than single bands. Results from the important "red-edge" and "short-wave infrared" domains are presented. The recommended optimum spectral design predominantly confirms the sensor parameters given by the European Space Agency. The system is capable of retrieving atmospheric and geobiophysical parameters with enhanced quality compared to existing multispectral sensors. 
Minor spectral changes of single bands are discussed in the context of typical remote sensing applications, supplemented by the recommendation of a few new bands for

  8. Results from Solar Reflective Band End-to-End Testing for VIIRS F1 Sensor Using T-SIRCUS

    NASA Technical Reports Server (NTRS)

    McIntire, Jeff; Moyer, David; McCarthy, James K.; DeLuccia, Frank; Xiong, Xiaoxiong; Butler, James J.; Guenther, Bruce

    2011-01-01

    Verification of the Visible Infrared Imager Radiometer Suite (VIIRS) End-to-End (E2E) sensor calibration is highly recommended before launch, to identify any anomalies and to improve our understanding of the sensor's on-orbit calibration performance. E2E testing of the Reflective Solar Bands (RSB) calibration cycle was performed pre-launch for the VIIRS Flight 1 (F1) sensor at the Ball Aerospace facility in Boulder, CO in March 2010. The VIIRS reflective band calibration cycle is very similar to that of the heritage sensor MODIS in that solar illumination, via a diffuser, is used to correct for temporal variations in the instrument responsivity. Monochromatic light from the NIST T-SIRCUS was used to illuminate both the Earth View (EV), via an integrating sphere, and the Solar Diffuser (SD) view, through a collimator. The collimator illumination was cycled through a series of angles intended to simulate the range of angles at which solar radiation will be incident on the solar attenuation screen on-orbit. Ideally, the measured instrument responsivity (defined here as the ratio of the detector response to the at-sensor radiance) should be the same whether the EV or SD view is illuminated. The ratio of the measured responsivities was determined at each collimator angle and wavelength. In addition, the Solar Diffuser Stability Monitor (SDSM), a ratioing radiometer designed to track the temporal variation in the SD BRF by direct comparison to solar radiation, was illuminated by the collimator. The measured SDSM ratio was compared to the predicted ratio. An uncertainty analysis was also performed on both the SD and SDSM calibrations.

  9. Computational simulation of flow in the end-to-end anastomosis of a rigid graft and a compliant artery.

    PubMed

    Qiu, Y; Tarbell, J M

    1996-01-01

    Implanted vascular grafts often fail because of the development of intimal hyperplasia in the anastomotic region, and compliance mismatch between the host artery and graft exacerbates the problem. This study focused on the effects of radial artery wall motion and the phase angle between pressure and flow waves (impedance phase angle [IPA]) on wall shear rate (WSR) behavior near models of end-to-end vascular graft anastomoses connecting rigid grafts and compliant arteries. A finite element model with transient flow and moving boundaries was set up to simulate oscillatory flow through a 16% undersized (mean) diameter graft model. During the simulations, different artery diameter variations over a cycle (DV) and IPAs were simulated in the physiologic range for an oscillatory flow (mean Re = 150, peak Re = 300, unsteadiness parameter alpha = 3.9). The results show that for normal physiologic conditions (DV = 6%, IPA = -45 degrees) in a 16% undersized graft, the minimum distal mean WSR is reduced by 60% compared to steady flow at the mean Re; the minimum distal WSR amplitude increases 50% when IPA changes from -5 degrees to -85 degrees, and increases 60% when DV changes from 2% to 10%. This indicates that compliance mismatch induces a lower mean WSR and a more oscillatory WSR in the distal anastomotic region, which may contribute to intimal hyperplasia. In addition, the convergent-divergent geometry of the 16% undersized graft model can significantly affect the force pattern applied to the local endothelial cell layer near the anastomosis by altering the local phase angle between the flow-induced tangential force (synchronous with WSR) and the cyclic hoop strain induced by radial artery expansion (synchronous with DV). This local phase angle is decreased by 65 degrees in the distal divergent geometry, and increased by 15 degrees in the proximal convergent geometry. PMID:8944971

  10. End-to-end gene fusions and their impact on the production of multifunctional biomass degrading enzymes

    SciTech Connect

    Rizk, Mazen; Antranikian, Garabed; Elleuche, Skander

    2012-11-09

    Highlights: • Multifunctional enzymes offer an interesting approach for biomass degradation. • Size and conformation of separate constructs play a role in the effectiveness of chimeras. • A connecting linker allows for maximal flexibility and increased thermostability. • Genes with functional similarities are the best choice for fusion candidates. -- Abstract: The depletion of fossil fuels, coupled with their increasing price, has made the search for alternative energy resources more urgent. One of the topics gaining interest quickly is the utilization of lignocellulose, the main component of plants. Its primary constituents, cellulose and hemicellulose, can be degraded into simple sugars by a series of enzymes present in microorganisms; these sugars are later used for bioethanol production. Thermophilic bacteria have proven to be an interesting source of the enzymes required for hydrolysis, since they can withstand the high and denaturing temperatures usually required for biomass degradation processes. However, the cost associated with the whole enzymatic process is staggering. One route to cost-effective and highly active production is the construction of multifunctional enzyme complexes harboring the function of more than one enzyme needed for the hydrolysis process. Strategies for the degradation of complex biomass range from the regulation of the enzymes involved, to cellulosomes, to proteins harboring more than one enzymatic activity. In this review, the construction of multifunctional biomass-degrading enzymes through end-to-end gene fusions is assessed, together with the impact of enzyme and linker choice on production and activity.

  11. WE-G-BRD-08: End-To-End Targeting Accuracy of the Gamma Knife for Trigeminal Neuralgia

    SciTech Connect

    Brezovich, I; Wu, X; Duan, J; Benhabib, S; Huang, M; Shen, S; Cardan, R; Popple, R

    2014-06-15

    Purpose: Current QA procedures verify the accuracy of individual equipment parameters, but may not include the CT and MRI localizers. This study uses an end-to-end approach to measure the overall targeting errors in individual patients previously treated for trigeminal neuralgia. Methods: The trigeminal nerve is simulated by a 3 mm long, 3.175 mm (1/8 inch) diameter MRI contrast-filled cavity embedded within a PMMA plastic capsule. The capsule is positioned within the head frame such that the cavity position matches the Gamma Knife coordinates of 10 previously treated patients. Gafchromic EBT2 film is placed at the center of the cavity in coronal and sagittal orientations. The films are marked with a pin prick to identify the cavity center. Treatments are planned for delivery with 4 mm collimators using MRI and CT scans acquired with the clinical localizer boxes and acquisition protocols. Coordinates of shots are chosen so that the cavity is centered within the 50% isodose volume. Following irradiation, the films are scanned and analyzed. Targeting errors are defined as the distance between the pin prick and the centroid of the 50% isodose line. Results: Averaged over 10 patient simulations, targeting errors along the x, y and z coordinates (patient left-to-right, posterior-anterior, head-to-foot) were, respectively, −0.060 ± 0.363, −0.350 ± 0.253, and 0.364 ± 0.191 mm when MRI was used for treatment planning. Planning according to CT exhibited generally smaller errors, namely 0.109 ± 0.167, −0.191 ± 0.144, and 0.211 ± 0.94 mm. The largest errors in MRI- and CT-planned treatments were, respectively, y = −0.761 and x = 0.428 mm. Conclusion: Unless patient motion or stronger MRI image distortion in actual treatments caused additional errors, all patients received the prescribed dose, i.e., the targeted section of the trigeminal nerve was contained within the 50% isodose surface in all cases.
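The targeting-error definition above (distance from the pin-prick mark to the centroid of the 50% isodose region) is straightforward to compute from a scanned film array; a minimal sketch on a synthetic Gaussian dose spot (all positions and pixel sizes hypothetical):

```python
import numpy as np

def targeting_error(dose, pin_prick, pixel_mm, iso_fraction=0.5):
    """Distance (mm) between a marked point (row, col) and the centroid
    of the region at or above `iso_fraction` of the maximum film dose."""
    mask = dose >= iso_fraction * dose.max()
    centroid = np.array(np.nonzero(mask)).mean(axis=1)
    return float(np.linalg.norm((centroid - np.array(pin_prick)) * pixel_mm))

# Synthetic film: Gaussian dose spot centred at (22, 20), pin prick at (20, 20).
yy, xx = np.mgrid[0:40, 0:40]
dose = np.exp(-((yy - 22) ** 2 + (xx - 20) ** 2) / (2 * 4.0 ** 2))
print(round(targeting_error(dose, (20, 20), pixel_mm=0.1), 2))  # 0.2 (mm)
```

The signed per-axis errors reported in the abstract follow the same idea, taken component-wise rather than as a Euclidean norm.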

  12. SBSS Demonstrator: A design for efficient demonstration of Space-based Space Surveillance end-to-end capabilities

    NASA Astrophysics Data System (ADS)

    Utzmann, Jens; Flohrer, Tim; Schildknecht, Thomas; Wagner, Axel; Silha, Jiri; Willemsen, Philip; Teston, Frederic

    This paper presents the capabilities of a Space-Based Space Surveillance (SBSS) demonstration mission for Space Surveillance and Tracking (SST) based on a micro-satellite platform. The results have been produced in the frame of ESA’s "Assessment Study for Space Based Space Surveillance Demonstration Mission" performed by the Airbus Defence and Space consortium. Space Surveillance and Tracking is part of Space Situational Awareness (SSA) and covers the detection, tracking and cataloguing of space debris and satellites. Derived SST services comprise a catalogue of these man-made objects, collision warning, detection and characterisation of in-orbit fragmentations, sub-catalogue debris characterisation, etc. The assessment of SBSS in an SST system architecture has shown that both an operational SBSS system and even an early, well-designed space-based demonstrator can provide substantial performance in terms of surveillance and tracking of beyond-LEO objects. In particular, the early deployment of a demonstrator, made possible by using standard equipment, could boost initial operating capability and create a self-maintained object catalogue. Furthermore, unique statistical information about small-size LEO debris (mm size) can be collected in-situ. Unlike classical technology demonstration missions, the primary goal is the demonstration and optimisation of the functional elements in a complex end-to-end chain (mission planning, observation strategies, data acquisition, processing and fusion, etc.) until the final products can be offered to the users. Past and current missions by the US (SBV, SBSS) and Canada (Sapphire, NEOSSat) also underline the advantages of space-based space surveillance. The presented SBSS system concept takes the ESA SST System Requirements (derived within the ESA SSA Preparatory Program) into account and aims at fulfilling SST core requirements in a stand-alone manner. Additionally, requirements for detection and characterisation of small-sized LEO debris are

  13. Volumetric-Modulated Arc Therapy: Effective and Efficient End-to-End Patient-Specific Quality Assurance

    SciTech Connect

    O'Daniel, Jennifer; Das, Shiva; Wu, Q. Jackie; Yin Fangfang

    2012-04-01

    Purpose: To explore an effective and efficient end-to-end patient-specific quality-assurance (QA) protocol for volumetric modulated arc radiotherapy (VMAT) and to evaluate the suitability of a stationary radiotherapy QA device (two-dimensional [2D] ion chamber array) for VMAT QA. Methods and Materials: Three methods were used to analyze 39 VMAT treatment plans for brain, spine, and prostate: ion chamber (one-dimensional absolute, n = 39), film (2D relative, coronal/sagittal, n = 8), and 2D ion chamber array (ICA, 2D absolute, coronal/sagittal, n = 39) measurements. All measurements were compared with the treatment planning system dose calculation either via gamma analysis (3%, 3- to 4-mm distance-to-agreement criteria) or absolute point dose comparison. The film and ion chamber results were similarly compared with the ICA measurements. Results: Absolute point dose measurements agreed well with treatment planning system computed doses (ion chamber: median deviation, 1.2%, range, -0.6% to 3.3%; ICA: median deviation, 0.6%, range, -1.8% to 2.9%). The relative 2D dose measurements also showed good agreement with computed doses (>93% of pixels in all films passing gamma, >90% of pixels in all ICA measurements passing gamma). The ICA relative dose results were highly similar to those of film (>90% of pixels passing gamma). The coronal and sagittal ICA measurements were statistically indistinguishable by the paired t test with a hypothesized mean difference of 0.1%. The ion chamber and ICA absolute dose measurements showed a similar trend but had disparities of 2-3% in 18% of plans. Conclusions: After validating the new VMAT implementation with ion chamber, film, and ICA, we were able to maintain an effective yet efficient patient-specific VMAT QA protocol by reducing from five (ion chamber, film, and ICA) to two measurements (ion chamber and single ICA) per plan. 
The ICA (Matrixx®, IBA Dosimetry) was validated for VMAT QA, but ion chamber measurements are
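The gamma analysis used above for the 2D comparisons combines a dose-difference criterion with a distance-to-agreement (DTA) search. A one-dimensional, brute-force sketch (the 3%/3-mm tolerances follow the abstract; the dose profiles are illustrative):

```python
import numpy as np

def gamma_pass_rate(measured, computed, dx_mm, dose_tol=0.03, dta_mm=3.0):
    """Global 1D gamma analysis: fraction of measured points with gamma <= 1.

    dose_tol is a fraction of the maximum computed dose (global norm).
    """
    x = np.arange(len(computed)) * dx_mm
    dmax = computed.max()
    passes = []
    for xi, mi in zip(x, measured):
        dose_term = (mi - computed) / (dose_tol * dmax)
        dist_term = (xi - x) / dta_mm
        gamma = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
        passes.append(gamma <= 1.0)
    return float(np.mean(passes))

x = np.arange(50) * 1.0
computed = np.exp(-((x - 25) ** 2) / 50.0)
measured = 1.02 * computed  # a uniform 2% scaling stays within 3%/3 mm
print(gamma_pass_rate(measured, computed, dx_mm=1.0))  # 1.0
```

Clinical tools interpolate the computed dose within the DTA radius rather than searching only grid points, but the pass/fail logic is the same.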

  14. Pre-Launch End-to-End Testing Plans for the SPAce Readiness Coherent Lidar Experiment (SPARCLE)

    NASA Technical Reports Server (NTRS)

    Kavaya, Michael J.

    1999-01-01

    The SPAce Readiness Coherent Lidar Experiment (SPARCLE) mission was proposed as a low cost technology demonstration mission, using a 2-micron, 100-mJ, 6-Hz, 25-cm, coherent lidar system based on demonstrated technology. SPARCLE was selected in late October 1997 to be NASA's New Millennium Program (NMP) second earth-observing (EO-2) mission. To maximize the success probability of SPARCLE, NASA/MSFC desired expert guidance in the areas of coherent laser radar (CLR) theory, CLR wind measurement, fielding of CLR systems, CLR alignment validation, and space lidar experience. This led to the formation of the NASA/MSFC Coherent Lidar Technology Advisory Team (CLTAT) in December 1997. A threefold purpose for the advisory team was identified: 1) guidance to the SPARCLE mission, 2) advice regarding the roadmap of post-SPARCLE coherent Doppler wind lidar (CDWL) space missions and the desired matching technology development plan, and 3) general coherent lidar theory, simulation, hardware, and experiment information exchange. The current membership of the CLTAT is shown. Membership does not result in any NASA or other funding at this time. We envision the business of the CLTAT to be conducted mostly by email, teleconference, and occasional meetings. The three meetings of the CLTAT to date, in Jan. 1998, July 1998, and Jan. 1999, have all been collocated with previously scheduled meetings of the Working Group on Space-Based Lidar Winds. The meetings have been very productive. Topics discussed include the SPARCLE technology validation plan including pre-launch end-to-end testing, the space-based wind mission roadmap beyond SPARCLE and its implications for the resultant technology development, the current values and proposed future advancements in lidar system efficiency, and the difference between using single-mode fiber optical mixing vs. traditional free space optical mixing. Attitude information from lidar and non-lidar sensors, and pointing knowledge algorithms will

  15. Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum.

    PubMed

    Kresnaraman, Brahmastro; Deguchi, Daisuke; Takahashi, Tomokazu; Mekada, Yoshito; Ide, Ichiro; Murase, Hiroshi

    2016-01-01

    During the night or in poorly lit areas, thermal cameras are a better choice than normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from thermal information alone is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in the thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step, while the second step processes patches in an image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations. PMID:27110781
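A CCA-based cross-spectrum mapping of this kind learns coupled subspaces from paired training vectors and then maps new source-domain vectors into the target domain. The sketch below is a minimal single-step linear version of the idea with toy random data standing in for thermal/visible image vectors; the paper's actual method applies this in two steps, to whole images and then to patches:

```python
import numpy as np

def fit_cca_mapping(X, Y, reg=1e-8):
    """Learn a CCA-based linear map from X-space (source) to Y-space (target).

    Returns a predictor for Y rows given X rows. Minimal sketch: whiten both
    sides, SVD the cross-covariance, map through the canonical coordinates.
    """
    xm, ym = X.mean(0), Y.mean(0)
    Xc, Yc = X - xm, Y - ym
    n = len(X)

    def inv_sqrt(M):
        w, V = np.linalg.eigh(M)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n
    Wx, Wy = inv_sqrt(Cxx), inv_sqrt(Cyy)
    U, s, Vt = np.linalg.svd(Wx @ Cxy @ Wy)
    k = min(X.shape[1], Y.shape[1])
    A = Wx @ U[:, :k]      # X-side canonical directions
    B = Wy @ Vt[:k].T      # Y-side canonical directions
    # x -> canonical coords -> scale by correlations -> back to Y-space
    M = np.linalg.pinv(B.T) @ np.diag(s[:k]) @ A.T
    return lambda Xnew: (Xnew - xm) @ M.T + ym

# Toy check: an exactly linear "thermal -> visible" relation is recovered.
rng = np.random.default_rng(1)
W = rng.standard_normal((8, 5))
X = rng.standard_normal((200, 8))   # stand-in for thermal feature vectors
Y = X @ W                           # stand-in for visible feature vectors
predict = fit_cca_mapping(X, Y)
print(np.allclose(predict(X), Y, atol=1e-4))  # True
```

Real image data is only approximately linearly related across spectra, which is why the paper refines the whole-image estimate with a second, patch-level CCA pass.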

  16. Reconstructing Face Image from the Thermal Infrared Spectrum to the Visible Spectrum

    PubMed Central

    Kresnaraman, Brahmastro; Deguchi, Daisuke; Takahashi, Tomokazu; Mekada, Yoshito; Ide, Ichiro; Murase, Hiroshi

    2016-01-01

    During the night or in poorly lit areas, thermal cameras are a better choice instead of normal cameras for security surveillance because they do not rely on illumination. A thermal camera is able to detect a person within its view, but identification from only thermal information is not an easy task. The purpose of this paper is to reconstruct the face image of a person from the thermal spectrum to the visible spectrum. After the reconstruction, further image processing can be employed, including identification/recognition. Concretely, we propose a two-step thermal-to-visible-spectrum reconstruction method based on Canonical Correlation Analysis (CCA). The reconstruction is done by utilizing the relationship between images in both thermal infrared and visible spectra obtained by CCA. The whole image is processed in the first step while the second step processes patches in an image. Results show that the proposed method gives satisfying results with the two-step approach and outperforms comparative methods in both quality and recognition evaluations. PMID:27110781

  18. Investigating end-to-end accuracy of image guided radiation treatment delivery using a micro-irradiator.

    PubMed

    Rankine, L J; Newton, J; Bache, S T; Das, S K; Adamovics, J; Kirsch, D G; Oldham, M

    2013-11-01

    the irradiator was verified to be within 0.5 mm (or 1.0 mm for the 5.0 mm cone) and the cone alignment was verified to be within 0.2 mm (or 0.4 mm for the 1.0 mm cone). The PRESAGE®/DMOS system proved valuable for end-to-end verification of small field IGRT capabilities.

  19. Reconstruction of the primordial power spectrum by direct inversion

    SciTech Connect

    Nicholson, Gavin; Contaldi, Carlo R.; Paykari, Paniez E-mail: c.contaldi@imperial.ac.uk

    2010-01-01

    We introduce a new method for reconstructing the primordial power spectrum, P(k), directly from observations of the Cosmic Microwave Background (CMB). We employ Singular Value Decomposition (SVD) to invert the radiation perturbation transfer function, thus reducing the degeneracy of the linear mapping from multipole l to wavenumber k. This enables the inversion to be carried out at each point along a Markov Chain Monte Carlo (MCMC) exploration of the combined P(k) and cosmological parameter space. We present the best-fit P(k) obtained with this method along with other cosmological parameters.
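
    The inversion step can be illustrated on toy numbers (the kernel and scales below are invented, not the paper's transfer function): recover P(k) from C_l = T P(k) by truncating small singular values of T, which regularizes the degenerate l-to-k mapping.

```python
# Truncated-SVD pseudo-inverse of a mock l <-> k transfer kernel.
import numpy as np

rng = np.random.default_rng(1)
n_l, n_k = 40, 60
ell = np.arange(2, 2 + n_l)[:, None].astype(float)
k = np.linspace(1e-4, 5e-3, n_k)[None, :]
# Smooth kernel mimicking the broad l <-> k response (illustrative only).
T = np.exp(-((k - ell / 14000.0) ** 2) / (2 * 5e-4 ** 2))

P_true = (k[0] / 2e-3) ** (-0.04)            # nearly scale-invariant P(k)
C = T @ P_true + 1e-6 * rng.normal(size=n_l)

U, s, Vt = np.linalg.svd(T, full_matrices=False)
keep = s > 1e-3 * s[0]                        # drop noise-dominated modes
P_rec = Vt[keep].T @ ((U[:, keep].T @ C) / s[keep])
```

    The truncation threshold plays the role of the regularization the MCMC exploration would otherwise have to marginalize over.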

  20. Reconstruction of a Broadband Spectrum of Alfvenic Fluctuations

    NASA Technical Reports Server (NTRS)

    Vinas, Adolfo F.; Fuentes, Pablo S. M.; Araneda, Jaime A.; Maneva, Yana G.

    2014-01-01

    Alfvenic fluctuations in the solar wind exhibit a high degree of correlation between velocity and magnetic field fluctuations, consistent with Alfven waves propagating away from and toward the Sun. Two remarkable properties of these fluctuations are the tendencies to have either positive or negative magnetic helicity (-1 ≤ sigma_m ≤ +1), associated with either left- or right-handed topology of the fluctuations, and to have a constant magnetic field magnitude. This paper provides, for the first time, a theoretical framework for reconstructing both the magnetic and velocity field fluctuations with a divergence-free magnetic field, with any specified power spectral index and normalized magnetic- and cross-helicity spectra, for any plasma species. The spectrum is constructed in the Fourier domain by imposing two conditions, a divergence-free magnetic field and preservation of the sense of magnetic helicity in both spaces, as well as by using Parseval's theorem for the conservation of energy between configuration and Fourier spaces. Applications to one-dimensional spatial Alfvenic propagation are presented. The theoretical construction is in agreement with typical time series and power spectra properties observed in the solar wind. The ideas presented in this spectral reconstruction provide a foundation for more realistic simulations of plasma waves, solar wind turbulence, and the propagation of energetic particles in such fluctuating fields.
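
    A minimal sketch (normalizations assumed, not the paper's full construction) of building a 1-D Alfvenic fluctuation series in Fourier space: a power-law spectrum, a prescribed normalized magnetic helicity sigma_m set by the left/right circular polarization split, and a divergence-free field (k along z, so B_z = 0 guarantees div B = 0).

```python
# Fourier-domain synthesis of divergence-free fluctuations with set helicity.
import numpy as np

rng = np.random.default_rng(2)
N, alpha, sigma_m = 1024, -5.0 / 3.0, 0.8
k = np.fft.rfftfreq(N, d=1.0)[1:]            # positive wavenumbers
amp = k ** (alpha / 2.0)                     # |B_k| ~ k^(alpha/2) -> PSD ~ k^alpha

# Split power between left/right circular polarizations so that
# sigma_m = (P_L - P_R) / (P_L + P_R), with random phases per mode.
aL = np.sqrt((1 + sigma_m) / 2.0) * amp * np.exp(1j * rng.uniform(0, 2 * np.pi, k.size))
aR = np.sqrt((1 - sigma_m) / 2.0) * amp * np.exp(1j * rng.uniform(0, 2 * np.pi, k.size))

ex = np.array([1, 1j]) / np.sqrt(2)          # left circular basis vector
ey = np.array([1, -1j]) / np.sqrt(2)         # right circular basis vector
Bk = aL[:, None] * ex + aR[:, None] * ey     # (k, {x,y}) components, B_z = 0
Bx = np.fft.irfft(np.concatenate([[0], Bk[:, 0]]), n=N)
By = np.fft.irfft(np.concatenate([[0], Bk[:, 1]]), n=N)
```

    The inverse FFT (which enforces Hermitian symmetry) returns real configuration-space fields, so Parseval's theorem carries the prescribed spectral power into the time series.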

  1. Effective reconstruction of dynamics of medium response spectrum

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Varentsova, Svetlana A.

    2008-10-01

    A new algorithm is suggested for visualizing the dynamics of a medium's response spectrum in the terahertz range from a single measured set of partially overlapping integral characteristics of the signal. The algorithm is based on the SVD method and a sliding-window method. Our analysis demonstrates many advantages of the new algorithm in comparison with the Gabor-Fourier approach, which yields the dynamics of only one spectral line per set of measurements. Chief among them is the possibility of obtaining the dynamics of many spectral components simultaneously from one set of measurements, and therefore complete information about the spectrum dynamics. This makes it possible to identify specific materials with known spectral lines and to distinguish materials with similar spectra, which is of great importance for the detection and identification of chemicals, pharmaceutical substances, and explosives. To demonstrate the efficiency of the proposed algorithm, we compare the spectrum dynamics of chocolate and soap, which possess similar spectra. Our investigation shows that their dynamics differ widely across spectral lines. The proposed algorithm can also be applied to voice identification and to the reconstruction of a laser beam profile with a great number of local maxima. The developed algorithm also allows measurement of the characteristic response time of the medium, which is important for various problems of spectroscopy.
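
    The combination of a sliding window with SVD can be sketched as follows (synthetic signal, not the authors' exact algorithm): slide a window along a time signal, stack the windowed segments into a matrix, and use a truncated SVD to denoise before estimating the per-window spectrum, so several spectral lines can be tracked simultaneously.

```python
# Sliding-window + truncated-SVD tracking of time-varying spectral lines.
import numpy as np

rng = np.random.default_rng(3)
fs, T = 1000.0, 2.0
t = np.arange(0, T, 1 / fs)
# Two "lines" whose strengths drift over time (synthetic stand-in signal).
sig = (1 + 0.5 * t) * np.sin(2 * np.pi * 50 * t) \
    + (2 - 0.5 * t) * np.sin(2 * np.pi * 120 * t) \
    + 0.5 * rng.normal(size=t.size)

win, hop = 256, 64
segs = np.stack([sig[i:i + win] for i in range(0, t.size - win, hop)])
U, s, Vt = np.linalg.svd(segs, full_matrices=False)
segs_dn = (U[:, :4] * s[:4]) @ Vt[:4]         # rank-4 denoised windows
spec = np.abs(np.fft.rfft(segs_dn * np.hanning(win), axis=1))
freqs = np.fft.rfftfreq(win, 1 / fs)          # spec[w, f]: line dynamics
```

    Each row of `spec` is one window's spectrum, so reading down a column shows how a given spectral component evolves over time.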

  2. Reconstructing the primordial power spectrum from the CMB

    NASA Astrophysics Data System (ADS)

    Gauthier, Christopher; Bucher, Martin

    2012-10-01

    We propose a straightforward and model independent methodology for characterizing the sensitivity of CMB and other experiments to wiggles, irregularities, and features in the primordial power spectrum. Assuming that the primordial cosmological perturbations are adiabatic, we present a function space generalization of the usual Fisher matrix formalism applied to a CMB experiment resembling Planck with and without ancillary data. This work is closely related to other work on recovering the inflationary potential and exploring specific models of non-minimal, or perhaps baroque, primordial power spectra. The approach adopted here, however, most directly expresses what the data is really telling us. We explore in detail the structure of the available information and quantify exactly what features can be reconstructed and at what statistical significance.
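
    A toy, binned version of the idea (all numbers invented): treat the amplitudes of P(k) bins as free parameters p_i, propagate each through a mock linear response dC_l/dp_i, and build the Fisher matrix F_ij = sum_l (dC_l/dp_i)(dC_l/dp_j) / sigma_l^2 to forecast which bins the data constrain.

```python
# Binned Fisher forecast with an invented response kernel and error model.
import numpy as np

n_l, n_bin = 100, 10
ell = np.arange(2, 2 + n_l).astype(float)
sigma = 1e-2 * (ell / 10.0)                  # mock per-multipole errors
centers = np.linspace(5, 95, n_bin)
# Each P(k) bin feeds a smooth band of multipoles (illustrative kernel).
dCdp = np.exp(-((ell[:, None] - centers[None, :]) ** 2) / (2 * 8.0 ** 2))

F = dCdp.T @ (dCdp / sigma[:, None] ** 2)    # Fisher matrix (n_bin x n_bin)
errors = np.sqrt(np.diag(np.linalg.inv(F)))  # marginal 1-sigma forecasts
```

    The function-space formalism of the paper is the continuum limit of this binning, where the structure of F itself tells you which features are reconstructible and at what significance.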

  3. Investigating end-to-end security in the fifth generation wireless capabilities and IoT extensions

    NASA Astrophysics Data System (ADS)

    Uher, J.; Harper, J.; Mennecke, R. G.; Patton, P.; Farroha, B.

    2016-05-01

    The emerging 5th generation wireless network will be architected and specified to meet the vision of allowing billions of devices and millions of human users to share spectrum to communicate and deliver services. The expansion of wireless networks from their current role to serve these diverse communities of interest introduces new paradigms that require multi-tiered approaches. The introduction of inherently low-security components, like IoT devices, necessitates that critical data be better secured to protect the networks and users. Moreover, high-speed communications meant to enable autonomous vehicles require ultra-reliable, low-latency paths. This research explores security within the proposed new architectures and the cross interconnection of highly protected assets with low-cost/low-security components forming the overarching 5th generation wireless infrastructure.

  4. End-to-End System Test and Optical Performance Evaluation for the Solar and Heliosphere Observatory (SOHO) Ultraviolet Coronagraph Spectrometer (UVCS)

    NASA Technical Reports Server (NTRS)

    Carosso, Paolo A.; Gardner, Larry D.; Jhabvala, Marzy; Nicolosi, P.

    1997-01-01

    The UVCS is one of the instruments carried by the Solar and Heliospheric Observatory (SOHO), a joint NASA/ESA Spacecraft launched in November 1995. It is designed to perform ultraviolet spectroscopy and visible light polarimetry of the extended solar corona. The primary scientific objectives of the UVCS investigation are to study the physical processes occurring in the extended solar corona, such as: the mechanism of acceleration of the solar wind, the mechanism of coronal plasma heating, the identification of solar wind sources, and the investigation of the plasma properties of the solar wind. The UVCS End-to-End test activities included a comprehensive set of system level functional and optical tests. Although performed under severe schedule constraints, the End-to-End System Test was very successful and served to fully validate the UVCS optical design. All test results showed that the primary scientific objectives of the UVCS Mission were achievable.

  5. Including 10-Gigabit-capable Passive Optical Network under End-to-End Generalized Multi-Protocol Label Switching Provisioned Quality of Service

    NASA Astrophysics Data System (ADS)

    Brewka, Lukasz; Gavler, Anders; Wessing, Henrik; Dittmann, Lars

    2012-04-01

    End-to-end quality of service provisioning is still a challenging task despite many years of research and development in this area. Considering a generalized multi-protocol label switching based core/metro network and resource reservation protocol capable home gateways, it is the access part of the network where quality of service signaling is bridged. This article proposes strategies for generalized multi-protocol label switching control over the next emerging passive optical network standard, i.e., the 10-gigabit-capable passive optical network. Node management and resource allocation approaches are discussed, and possible issues are raised. The analysis shows that treating a 10-gigabit-capable passive optical network as a generalized multi-protocol label switching controlled domain is valid and may advance end-to-end quality of service provisioning for passive optical network based customers.

  6. Modelling and simulation of the mechanical response of a Dacron graft in the pressurization test and an end-to-end anastomosis.

    PubMed

    Bustos, Claudio A; García-Herrera, Claudio M; Celentano, Diego J

    2016-08-01

    This work presents the modeling and simulation of the mechanical response of a Dacron graft in the pressurization test and its clinical application in the analysis of an end-to-end anastomosis. Both problems are studied via an anisotropic constitutive model calibrated by means of previously reported uniaxial tensile tests. First, the simulation of the pressurization test validates the experimental material characterization, which included tests carried out at different levels of axial stretching. Then, the analysis of an end-to-end anastomosis under an idealized geometry is proposed. This case consists of evaluating the mechanical performance of the graft together with the stresses and deformations in the neighborhood of the Dacron-artery junction. This research contributes important data toward understanding the functioning of the graft and the possibility of extending the analysis to complex numerical cases such as its insertion in the aortic arch. PMID:26826765

  7. Synthetic molecular machine based on reversible end-to-interior and end-to-end loop formation triggered by electrochemical stimuli.

    PubMed

    Lee, Jae Wook; Hwang, Ilha; Jeon, Woo Sung; Ko, Young Ho; Sakamoto, Shigeru; Yamaguchi, Kentaro; Kim, Kimoon

    2008-09-01

    We have designed and synthesized a novel [2]pseudorotaxane-based molecular machine in which the interconversion between end-to-interior and end-to-end loop structures is reversibly controlled by electrochemical stimuli. Cucurbit[8]uril (CB[8]) and the thread molecule 3(4+) with an electron-rich hydroxynaphthalene unit and two electron-deficient viologen units form the 1:1 complex 4(4+) with an end-to-interior loop structure, which is reversibly converted into an end-to-end structure upon reduction. Large changes in shape and size of the molecule accompany the reversible redox process. The key feature of the machine-like behavior is the reversible interconversion between an intramolecular charge-transfer complex and viologen cation radical dimer inside CB[8] triggered by electrochemical stimuli.

  9. End-to-end small bowel anastomosis by temperature controlled CO2 laser soldering and an albumin stent: a feasibility study

    NASA Astrophysics Data System (ADS)

    Simhon, David; Kopelman, Doron; Hashmonai, Moshe; Vasserman, Irena; Dror, Michael; Vasilyev, Tamar; Halpern, Marissa; Kariv, Naam; Katzir, Abraham

    2004-07-01

    Introduction: A feasibility study of small intestinal end-to-end anastomosis was performed in a rabbit model using a temperature-controlled CO2 laser system and an albumin stent. Compared with standard suturing or clipping, this method does not introduce foreign materials into the repaired wound and may therefore lead to better and faster wound healing of the anastomotic site. Methods: Transected rabbit small intestines were either laser soldered using 47% bovine serum albumin and an intraluminal albumin stent, or served as controls in which a conventional continuous two-layer end-to-end anastomosis was performed manually. The integrity of the anastomosis was investigated on the 14th postoperative day. Results: The postoperative course in both groups was uneventful. The sutured group presented signs of partial bowel obstruction. Macroscopically, no signs of intraluminal fluid leakage were observed in either group. Yet, laser-soldered intestinal anastomoses demonstrated significant superiority with respect to adhesions and narrowing of the intestinal lumen. Serial histological examinations revealed better wound-healing characteristics at the laser-soldered anastomotic site. Conclusion: Laser soldering of an intestinal end-to-end anastomosis provides a faster surgical procedure, compared to the standard suture technique, with better wound-healing results. It is expected that this technique may be adopted in the future for minimally invasive surgeries.

  10. Wind velocity profile reconstruction from intensity fluctuations of a plane wave propagating in a turbulent atmosphere.

    PubMed

    Banakh, V A; Marakasov, D A

    2007-08-01

    Reconstruction of a wind profile based on the statistics of plane-wave intensity fluctuations in a turbulent atmosphere is considered. The algorithm for wind profile retrieval from the spatiotemporal spectrum of plane-wave weak intensity fluctuations is described, and the results of end-to-end computer experiments on wind profiling based on the developed algorithm are presented. It is shown that the reconstructing algorithm allows retrieval of a wind profile from turbulent plane-wave intensity fluctuations with acceptable accuracy.

  11. Primary and secondary structure dependence of peptide flexibility assessed by fluorescence-based measurement of end-to-end collision rates.

    PubMed

    Huang, Fang; Hudgins, Robert R; Nau, Werner M

    2004-12-22

    The intrachain fluorescence quenching of the fluorophore 2,3-diazabicyclo[2.2.2]oct-2-ene (DBO) is measured in short peptide fragments, namely the two strands and the turn of the N-terminal beta-hairpin of ubiquitin. The investigated peptides adopt a random-coil conformation in aqueous solution according to CD and NMR experiments. The combination of quenchers with different quenching efficiencies, namely tryptophan and tyrosine, allows the extrapolation of the rate constants for end-to-end collision rates as well as the dissociation of the end-to-end encounter complex. The measured activation energies for fluorescence quenching demonstrate that the end-to-end collision process in peptides is partially controlled by internal friction within the backbone, while measurements in solvents of different viscosities (H2O, D2O, and 7.0 M guanidinium chloride) suggest that solvent friction is an additional important factor in determining the collision rate. The extrapolated end-to-end collision rates, which are only slightly larger than the experimental rates for the DBO/Trp probe/quencher system, provide a measure of the conformational flexibility of the peptide backbone. The chain flexibility is found to be strongly dependent on the type of secondary structure that the peptides represent. The collision rates for peptides derived from the beta-strand motifs (ca. 1 x 10(7) s(-1)) are ca. 4 times slower than that derived from the beta-turn. The results provide further support for the hypothesis that chain flexibility is an important factor in the preorganization of protein fragments during protein folding. Mutations to the beta-turn peptide show that subtle sequence changes strongly affect the flexibility of peptides as well. The protonation and charge status of the peptides, however, are shown to have no significant effect on the flexibility of the investigated peptides. 
The meaning and definition of end-to-end collision rates in the context of protein folding are critically

  12. Differentiated CW Policy and Strict Priority Policy for Location-Independent End-to-End Delay in Multi-Hop Wireless Mesh Networks

    NASA Astrophysics Data System (ADS)

    Bae, Yun Han; Kim, Kyung Jae; Park, Jin Soo; Choi, Bong Dae

    We investigate delay analysis of multi-hop wireless mesh network (WMN) where nodes have multi-channel and multiple transceivers to increase the network capacity. The functionality of the multi-channel and multiple transceivers allows the whole WMN to be decomposed into disjoint zones in such a way that i) nodes in a zone are within one-hop distance, and relay node and end nodes with different CWmins contend to access the channel based on IEEE 802.11e EDCA, ii) different channels are assigned to neighbor zones to prevent the hidden node problem, iii) relay nodes can transmit and receive the packets simultaneously by multi-channel and multiple transceivers. With this decomposition of the network, we focus on the delay at a single zone and then the end-to-end delay can be obtained as the sum of zone-delays. In order to have the location-independent end-to-end delay to the gateway regardless of source nodes' locations, we propose two packet management schemes, called the differentiated CW policy and the strict priority policy, at each relay node where relay packets with longer hop count are buffered in higher priority queues according to their experienced hop count. For the differentiated CW policy, a relay node adopts the functionality of IEEE 802.11e EDCA where a higher priority queue has a shorter minimum contention window. We model a typical zone as a one-hop IEEE 802.11e EDCA network under non-saturation condition where priority queues have different packet arrival rates and different minimum contention window sizes. First, we find the PGF (probability generating function) of the HoL-delay of packets at priority queues in a zone. Second, by modeling each queue as M/G/1 queue with the HoL-delay as a service time, we obtain the packet delay (the sum of the queueing delay and the HoL-delay) of each priority queue in a zone. Third, the average end-to-end delay of packet generated at end node in each zone is obtained by summing up the packet delays at each zone. For
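
    The final step above, modeling each priority queue as an M/G/1 queue with the HoL delay as service time and summing the per-zone delays, can be sketched numerically with the Pollaczek-Khinchine mean-value formula (all arrival rates and service moments below are invented):

```python
# End-to-end delay as a sum of per-zone M/G/1 mean sojourn times.
def mg1_delay(lam, es, es2):
    """Mean sojourn time of an M/G/1 queue: E[S] + lam*E[S^2]/(2*(1-rho))."""
    rho = lam * es
    assert rho < 1, "queue must be stable"
    return es + lam * es2 / (2 * (1 - rho))

# Per-zone arrival rate (pkt/ms) and HoL-delay moments (ms, ms^2), assumed.
zones = [(0.2, 1.0, 2.5), (0.3, 0.8, 1.5), (0.25, 0.9, 2.0)]
end_to_end = sum(mg1_delay(*z) for z in zones)   # total delay in ms
```

    The full analysis additionally derives the HoL-delay distribution (not just its moments) from the PGF of the IEEE 802.11e EDCA contention process.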

  13. Planning for Mars Sample Return: Results from the MEPAG Mars Sample Return End-to-End International Science Analysis Group (E2E-iSAG)

    NASA Astrophysics Data System (ADS)

    McLennan, S. M.; Sephton, M.; Mepag E2E-Isag

    2011-12-01

    The National Research Council 2011 Planetary Decadal Survey (2013-2022) placed beginning a Mars sample return campaign (MSR) as the top priority for large Flagship missions in the coming decade. Recent developments in NASA-ESA collaborations and Decadal Survey recommendations indicate MSR likely will be an international effort. A joint ESA-NASA 2018 rover (combining the previously proposed ExoMars and MAX-C missions), designed, in part, to collect and cache samples, would thus represent the first of a 3-mission MSR campaign. The End-to-End International Science Analysis Group (E2E-iSAG) was chartered by MEPAG in August 2010 to develop and prioritize MSR science objectives and investigate implications of these objectives for defining the highest priority sample types, landing site selection criteria (and identification of reference landing sites to support engineering planning), requirements for in situ characterization on Mars to support sample selection, and priorities/strategies for returned sample analyses to determine sample sizes and numbers that would meet the objectives. MEPAG approved the E2E-iSAG report in June 2011. Science objectives, summarized in priority order, are: (1) critically assess any evidence for past life or its chemical precursors, and place constraints on past habitability and potential for preservation of signs of life, (2) quantitatively constrain age, context and processes of accretion, early differentiation and magmatic and magnetic history, (3) reconstruct history of surface and near-surface processes involving water, (4) constrain magnitude, nature, timing, and origin of past climate change, (5) assess potential environmental hazards to future human exploration, (6) assess history and significance of surface modifying processes, (7) constrain origin and evolution of the Martian atmosphere, (8) evaluate potential critical resources for future human explorers. All returned samples also would be fully evaluated for extant life as a

  14. The application of MUSIC algorithm in spectrum reconstruction and interferogram processing

    NASA Astrophysics Data System (ADS)

    Jian, Xiaohua; Zhang, Chunmin; Zhao, Baochang; Zhu, Baohui

    2008-05-01

    Three different methods of spectrum reproduction and interferogram processing are discussed and contrasted in this paper. In particular, the nonparametric MUSIC (multiple signal classification) algorithm is applied to practical spectrum reconstruction processing for the first time. The experimental results prove that this method greatly improves the resolution of the reproduced spectrum and provides a better mathematical model for super-resolution in spectrum reconstruction. The usefulness and simplicity of the technique will lead interference imaging spectrometers into almost every field into which spectroscopy has ventured, and into some where it has not gone before.

  15. Clinical evaluation of a closed, one-stage, stapled, functional, end-to-end jejuno-ileal anastomosis in 5 horses

    PubMed Central

    Anderson, Stacy L.; Blackford, James T.; Kelmer, S. Gal

    2012-01-01

    This study describes the outcome and complications in horses that had a closed, one-stage, stapled, functional, end-to-end (COSFE) jejuno-ileal anastomosis (JIA) following resection of compromised small intestine. Medical records were reviewed to identify all horses that had a COSFE JIA performed during exploratory laparotomy and to determine post-operative complications and final outcome. All 5 horses that were identified had successful COSFE JIA with resection of various amounts of distal jejunum and proximal ileum. Post-operative ileus occurred in 1 of the 5 horses. All horses survived at least 1 year after surgery. The survival times and incidence of post-operative ileus compared favorably with published results for other types of small intestinal resection and anastomoses. A COSFE JIA is a viable surgical procedure to correct lesions of the distal jejunum and proximal ileum. PMID:23450864

  16. End-to-End Study of the Transfer of Energy from Magnetosheath Ion Precipitation to the Ionospheric Cusp and Resulting Ion Outflow to the Magnetosphere

    NASA Technical Reports Server (NTRS)

    Coffey, Victoria; Chandler, Michael; Singh, Nagendra; Avanov, Levon

    2003-01-01

    We will show results from an end-to-end study of the energy transfer from injected magnetosheath plasmas to the near-Earth magnetospheric and ionospheric plasmas and the resulting ion outflow to the magnetosphere. This study includes modeling of the evolution of the magnetosheath precipitation in the cusp using a kinetic code with a realistic magnetic field configuration. These evolved, highly non-Maxwellian distributions are used as input to a 2D PIC code to analyze the resulting wave generation. The wave analysis is used in the kinetic code as input to the cold ionospheric ions to study the transfer of energy to these ions and their outflow to the magnetosphere. Observations from the Thermal Ion Dynamics Experiment (TIDE) and other instruments on the Polar Spacecraft will be compared to the modeling.

  17. End to end assembly of CaO and ZnO nanosheets to propeller-shaped architectures by orientation attachment approaches

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Liu, Fang

    2015-06-01

    Inspired by the agitation effect of propellers, heterogeneous propeller-shaped CaO/ZnO architectures were assembled in aqueous solution. Preferred nucleation and growth of CaO and ZnO nuclei yielded hexagonal nanosheets, which were combined end to end into propeller-shaped architectures by oriented rotation and attachment reactions. When the propeller-shaped CaO/ZnO product was used as a solid base catalyst to synthesize biodiesel, a high biodiesel yield of 97.5% was achieved. The predominant exposure of active O2- on the CaO(0 0 2) and ZnO(0 0 0 2) planes of the propeller-shaped CaO/ZnO led to good catalytic activity and a high yield of biodiesel.

  18. Mixed integer nonlinear programming model of wireless pricing scheme with QoS attribute of bandwidth and end-to-end delay

    NASA Astrophysics Data System (ADS)

    Irmeilyana, Puspita, Fitri Maya; Indrawati

    2016-02-01

    The pricing scheme for wireless networks is developed by considering linearity factors, price elasticity, and price factors. A mixed-integer nonlinear programming wireless pricing model is proposed that can be solved optimally using LINGO 13.0. The solutions are expected to give some information about the connection between the acceptance factor and the price. Previous models focused on bandwidth as the only QoS attribute. The present model attempts to maximize the total price for a connection based on QoS parameters, where the QoS attributes used are the bandwidth and the end-to-end delay that affect the traffic. The maximum price is achieved when the provider determines the required increment or decrement of the price in response to changes in QoS value.
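
    A hypothetical toy version of the pricing problem (objective, coefficients, and constraint all invented): choose integer QoS levels for bandwidth and end-to-end delay to maximize a nonlinear price subject to a delay cap, solved here by exhaustive enumeration as a stand-in for the MINLP solver.

```python
# Tiny MINLP-style pricing problem solved by brute-force enumeration.
best = None
for bw in range(1, 11):                  # bandwidth level (integer)
    for dly in range(1, 11):             # delay class; higher = tighter delay
        delay_ms = 100 / dly
        if delay_ms > 60:                # QoS constraint: delay <= 60 ms
            continue
        price = 2.0 * bw ** 0.8 + 1.5 * dly ** 0.5   # nonlinear objective
        if best is None or price > best[0]:
            best = (price, bw, dly)
```

    A real solver such as LINGO handles the same structure (integer variables, nonlinear objective, QoS constraints) without enumerating the whole grid.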

  19. Demonstration of a fully-coupled end-to-end model for small pelagic fish using sardine and anchovy in the California Current

    NASA Astrophysics Data System (ADS)

    Rose, Kenneth A.; Fiechter, Jerome; Curchitser, Enrique N.; Hedstrom, Kate; Bernal, Miguel; Creekmore, Sean; Haynie, Alan; Ito, Shin-ichi; Lluch-Cota, Salvador; Megrey, Bernard A.; Edwards, Chris A.; Checkley, Dave; Koslow, Tony; McClatchie, Sam; Werner, Francisco; MacCall, Alec; Agostini, Vera

    2015-11-01

    We describe and document an end-to-end model of anchovy and sardine population dynamics in the California Current as a proof of principle that such coupled models can be developed and implemented. The end-to-end model is 3-dimensional, time-varying, and multispecies, and consists of four coupled submodels: hydrodynamics, Eulerian nutrient-phytoplankton-zooplankton (NPZ), an individual-based full life cycle anchovy and sardine submodel, and an agent-based fishing fleet submodel. A predator roughly mimicking albacore was included as individuals that consumed anchovy and sardine. All submodels were coded within the ROMS open-source community model, used the same-resolution spatial grid, and were solved simultaneously to allow for possible feedbacks among the submodels. We used a super-individual approach and solved the coupled models on a distributed memory parallel computer, both of which created challenging but resolvable bookkeeping issues. The anchovy and sardine growth, mortality, reproduction, and movement, and the fishing fleet submodel, were each calibrated using simplified grids before being inserted into the full end-to-end model. An historical simulation of 1959-2008 was performed, and the latter 45 years analyzed. Sea surface height (SSH) and sea surface temperature (SST) for the historical simulation showed strong horizontal gradients and multi-year scale temporal oscillations related to various climate indices (PDO, NPGO), and both showed responses to ENSO variability. Simulated total phytoplankton was lower during strong El Nino events and higher for the strong 1999 La Nina event. The three zooplankton groups generally corresponded to the spatial and temporal variation in simulated total phytoplankton. Simulated biomasses of anchovy and sardine were within the historical range of observed biomasses, but predicted biomasses showed much less inter-annual variation.
Anomalies of annual biomasses of anchovy and sardine showed a switch in the mid

  20. The role of environmental controls in determining sardine and anchovy population cycles in the California Current: Analysis of an end-to-end model

    NASA Astrophysics Data System (ADS)

    Fiechter, Jerome; Rose, Kenneth A.; Curchitser, Enrique N.; Hedstrom, Katherine S.

    2015-11-01

    Sardine and anchovy are two forage species of particular interest because of their low-frequency cycles in adult abundance in boundary current regions, combined with a commercially relevant contribution to the global marine food catch. While several hypotheses have been put forth to explain decadal shifts in sardine and anchovy populations, a mechanistic basis for how the physics, biogeochemistry, and biology combine to produce patterns of synchronous variability across widely separated systems has remained elusive. The present study uses a 50-year (1959-2008) simulation of a fully coupled end-to-end ecosystem model configured for sardine and anchovy in the California Current System to investigate how environmental processes control their population dynamics. The results illustrate that slightly different temperature and diet preferences can lead to significantly different responses to environmental variability. Simulated adult population fluctuations are associated with age-1 growth (via age-2 egg production) and prey availability for anchovy, while they depend primarily on age-0 survival and temperature for sardine. The analysis also hints at potential linkages to known modes of climate variability, whereby changes in adult abundance are related to ENSO for anchovy and to the PDO for sardine. The connection to the PDO and ENSO is consistent with modes of interannual and decadal variability that would alternatively favor anchovy during years of cooler temperatures and higher prey availability, and sardine during years of warmer temperatures and lower prey availability. While the end-to-end ecosystem model provides valuable insight on potential relationships between environmental conditions and sardine and anchovy population dynamics, understanding the complex interplay, and potential lags, between the full array of processes controlling their abundances in the California Current System remains an on-going challenge.

  1. Profiling wind and greenhouse gases by infrared-laser occultation: algorithm and results from end-to-end simulations in windy air

    NASA Astrophysics Data System (ADS)

    Plach, A.; Proschek, V.; Kirchengast, G.

    2015-01-01

    The new mission concept of microwave and infrared-laser occultation between low-Earth-orbit satellites (LMIO) is designed to provide accurate and long-term stable profiles of atmospheric thermodynamic variables, greenhouse gases (GHGs), and line-of-sight (l.o.s.) wind speed, with focus on the upper troposphere and lower stratosphere (UTLS). While the unique quality of GHG retrievals enabled by LMIO over the UTLS has recently been demonstrated based on end-to-end simulations, the promise of l.o.s. wind retrieval, and of joint GHG and wind retrieval, has not yet been analyzed in any realistic simulation setting. Here we describe a newly developed l.o.s. wind retrieval algorithm, which we embedded in an end-to-end simulation framework that also includes the retrieval of thermodynamic variables and GHGs, and analyze the performance of both standalone wind retrieval and joint wind and GHG retrieval. The wind algorithm utilizes LMIO laser signals placed on the inflection points at the wings of the highly symmetric C18OO absorption line near 4767 cm⁻¹ and exploits transmission differences from the wind-induced Doppler shift. Based on realistic example cases for a diversity of atmospheric conditions, ranging from tropical to high-latitude winter, we find that the retrieved l.o.s. wind profiles are of high quality over the lower stratosphere under all conditions, i.e., unbiased and accurate to within about 2 m s⁻¹ over about 15 to 35 km. The wind accuracy degrades into the upper troposphere due to the decreasing signal-to-noise ratio of the wind-induced differential transmission signals. The GHG retrieval in windy air is not vulnerable to wind speed uncertainties up to about 10 m s⁻¹, but is found to benefit in the case of higher speeds from the integrated wind retrieval, which enables correction of the wind-induced Doppler shift of GHG signals. Overall, both the l.o.s. wind and GHG retrieval results are strongly encouraging for further development and implementation of an LMIO mission.
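
    The core of the wind algorithm can be sketched numerically: a l.o.s. wind Doppler-shifts the absorption line centre by nu0·v/c, and sampling the transmission at the two inflection points of the symmetric line gives a differential signal that is, to first order, linear in wind speed. The Python sketch below uses a hypothetical Gaussian line shape and made-up line parameters, not the actual C18OO spectroscopy:

```python
import numpy as np

C = 299792458.0  # speed of light, m/s

def transmission(nu, nu0, depth=0.5, hwhm=0.01):
    """Gaussian absorption-line transmission (illustrative line shape)."""
    sigma = hwhm / np.sqrt(2 * np.log(2))
    return 1.0 - depth * np.exp(-0.5 * ((nu - nu0) / sigma) ** 2)

def differential_signal(wind, nu0=4767.0, hwhm=0.01):
    """Transmission difference between the two inflection points.

    A l.o.s. wind shifts the line centre by nu0*wind/c; because the line is
    symmetric, the difference between the two inflection-point samples is
    (to first order) proportional to the wind speed.
    """
    sigma = hwhm / np.sqrt(2 * np.log(2))
    nu_shifted = nu0 * (1 + wind / C)        # Doppler-shifted line centre
    nu_lo, nu_hi = nu0 - sigma, nu0 + sigma  # inflection points of a Gaussian
    return transmission(nu_hi, nu_shifted) - transmission(nu_lo, nu_shifted)

# Build a linear calibration from two known winds, then "retrieve" a third.
s1, s2 = differential_signal(-50.0), differential_signal(50.0)
slope = (s2 - s1) / 100.0
retrieved = differential_signal(10.0) / slope
print(round(retrieved, 1))  # close to 10 m/s in the small-shift regime
```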

  2. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    NASA Astrophysics Data System (ADS)

    Bowen, S. R.; Nyflot, M. J.; Herrmann, C.; Groh, C. M.; Meyer, J.; Wollenweber, S. D.; Stearns, C. W.; Kinahan, P. E.; Sandison, G. A.

    2015-05-01

    Effective positron emission tomography / computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by six different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses, and 2%-2 mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10-20%, treatment planning errors were 5-10%, and treatment delivery errors were 5-30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5-10% in PET/CT imaging, <5% in treatment planning, and <2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT planning, and RT
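
    The 2%-2 mm gamma criterion used for the delivery comparison can be illustrated with a minimal 1-D sketch (real analyses operate on 2-D/3-D measured dose distributions; the Gaussian profile and grid here are made up):

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing_mm, dose_tol=0.02, dist_tol_mm=2.0):
    """Simplified 1-D global gamma analysis (the 2%/2 mm criterion).

    For every reference point, search all measured points for the minimum
    combined dose-difference / distance-to-agreement metric; a point passes
    when that minimum gamma is <= 1.
    """
    pos = np.arange(len(ref)) * spacing_mm
    dmax = ref.max()
    gammas = []
    for xi, di in zip(pos, ref):
        dose_term = ((meas - di) / (dose_tol * dmax)) ** 2
        dist_term = ((pos - xi) / dist_tol_mm) ** 2
        gammas.append(np.sqrt(dose_term + dist_term).min())
    return 100.0 * np.mean(np.asarray(gammas) <= 1.0)

# Identical profiles pass everywhere; a 1 mm shift still passes 2%/2 mm.
x = np.linspace(-20, 20, 81)                      # 0.5 mm grid
profile = np.exp(-0.5 * (x / 8.0) ** 2) * 100.0   # Gaussian "dose" profile
shifted = np.exp(-0.5 * ((x - 1.0) / 8.0) ** 2) * 100.0
print(gamma_pass_rate(profile, profile, 0.5))  # 100.0
print(gamma_pass_rate(profile, shifted, 0.5))
```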

  3. Imaging and dosimetric errors in 4D PET/CT-guided radiotherapy from patient-specific respiratory patterns: a dynamic motion phantom end-to-end study

    PubMed Central

    Bowen, S R; Nyflot, M J; Hermann, C; Groh, C; Meyer, J; Wollenweber, S D; Stearns, C W; Kinahan, P E; Sandison, G A

    2015-01-01

    Effective positron emission tomography/computed tomography (PET/CT) guidance in radiotherapy of lung cancer requires estimation and mitigation of errors due to respiratory motion. An end-to-end workflow was developed to measure patient-specific motion-induced uncertainties in imaging, treatment planning, and radiation delivery with respiratory motion phantoms and dosimeters. A custom torso phantom with inserts mimicking normal lung tissue and lung lesion was filled with [18F]FDG. The lung lesion insert was driven by 6 different patient-specific respiratory patterns or kept stationary. PET/CT images were acquired under motionless ground truth, tidal breathing motion-averaged (3D), and respiratory phase-correlated (4D) conditions. Target volumes were estimated by standardized uptake value (SUV) thresholds that accurately defined the ground-truth lesion volume. Non-uniform dose-painting plans using volumetrically modulated arc therapy (VMAT) were optimized for fixed normal lung and spinal cord objectives and variable PET-based target objectives. Resulting plans were delivered to a cylindrical diode array at rest, in motion on a platform driven by the same respiratory patterns (3D), or motion-compensated by a robotic couch with an infrared camera tracking system (4D). Errors were estimated relative to the static ground truth condition for mean target-to-background (T/Bmean) ratios, target volumes, planned equivalent uniform target doses (EUD), and 2%-2mm gamma delivery passing rates. Relative to motionless ground truth conditions, PET/CT imaging errors were on the order of 10–20%, treatment planning errors were 5–10%, and treatment delivery errors were 5–30% without motion compensation. Errors from residual motion following compensation methods were reduced to 5–10% in PET/CT imaging, < 5% in treatment planning, and < 2% in treatment delivery. We have demonstrated that estimation of respiratory motion uncertainty and its propagation from PET/CT imaging to RT

  4. [Incidence of painful neuroma after end-to-end nerve suture wrapped into a collagen conduit. A prospective study of 185 cases].

    PubMed

    Thomsen, L; Schlur, C

    2013-10-01

    Three to 5% of directly and correctly sutured nerves evolve towards significant neuropathic pain. The psychological, social and economic impact of such an outcome is considerable. The purpose of this study was to evaluate the incidence of a trigger zone or a neuroma at 6 months of maximum follow-up after direct nerve suture wrapped in a type 1 collagen tube. Every patient treated for a traumatic nerve injury from November 2008 to March 2012 was included in the study. The exclusion criteria were any replantation, nerve tissue defect, and any distal nerve stump that could not technically be wrapped. The only conduit used was made of collagen type 1 (Revolnerv(®), Orthomed™). All patients were examined after one, three and six months for a clinical evaluation made by the same surgeon. The appearance of a trigger zone or a true neuroma was clinically assessed. One hundred and seventy-four patients, for a total of 197 sutured nerves, were included in the study. At the 6-month follow-up, 163 patients were evaluated for a total of 185 nerves. No patient suffered from a neuroma at this time. As the treatment of neuroma is very difficult, considering its cost and results, wrapping direct end-to-end sutures in a collagen type 1 tube seems to help prevent the appearance of a neuroma.

  5. Distributed Large Data-Object Environments: End-to-End Performance Analysis of High Speed Distributed Storage Systems in Wide Area ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnston, William; Tierney, Brian; Lee, Jason; Hoo, Gary; Thompson, Mary

    1996-01-01

    We have developed and deployed a distributed-parallel storage system (DPSS) in several high speed asynchronous transfer mode (ATM) wide area network (WAN) testbeds to support several different types of data-intensive applications. Architecturally, the DPSS is a network striped disk array, but it is fairly unique in that its implementation allows applications complete freedom to determine optimal data layout, replication and/or coding redundancy strategy, security policy, and dynamic reconfiguration. In conjunction with the DPSS, we have developed a 'top-to-bottom, end-to-end' performance monitoring and analysis methodology that has allowed us to characterize all aspects of the DPSS operating in high speed ATM networks. In particular, we have run a variety of performance monitoring experiments involving the DPSS in the MAGIC testbed, which is a large scale, high speed ATM network, and we describe our experience using the monitoring methodology to identify and correct problems that limit the performance of high speed distributed applications. Finally, the DPSS is part of an overall architecture for using high speed WANs to enable the routine, location-independent use of large data-objects. Since this is part of the motivation for a distributed storage system, we describe this architecture.
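
    The "top-to-bottom, end-to-end" monitoring idea rests on timestamping each stage of the data path so that end-to-end latency can be decomposed per component. A minimal Python sketch in that spirit (the stage names and delays are hypothetical stand-ins for DPSS disk-server and network stages, not the actual instrumentation):

```python
import time

class StageTimer:
    """Record a timestamp at each stage of a data path and report the
    per-stage latency breakdown between consecutive marks."""
    def __init__(self):
        self.marks = []

    def mark(self, stage):
        self.marks.append((stage, time.perf_counter()))

    def breakdown(self):
        # Each stage's latency is the gap since the previous mark.
        return {b[0]: b[1] - a[1]
                for a, b in zip(self.marks, self.marks[1:])}

t = StageTimer()
t.mark("request")
time.sleep(0.01)      # stand-in for a disk-server read
t.mark("disk_read")
time.sleep(0.005)     # stand-in for a network transfer
t.mark("net_transfer")
d = t.breakdown()
print(sorted(d))  # ['disk_read', 'net_transfer']
```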

  6. Land Mobile Satellite Service (LMSS) channel simulator: An end-to-end hardware simulation and study of the LMSS communications links

    NASA Technical Reports Server (NTRS)

    Salmasi, A. B. (Editor); Springett, J. C.; Sumida, J. T.; Richter, P. H.

    1984-01-01

    The design and implementation of the Land Mobile Satellite Service (LMSS) channel simulator as a facility for end-to-end hardware simulation of the LMSS communications links, primarily with respect to the mobile terminal, is described. A number of studies are reported which show the applications of the channel simulator as a facility for validation and assessment of the LMSS design requirements and capabilities, by performing quantitative measurements and qualitative audio evaluations for various link design parameters and channel impairments under simulated LMSS operating conditions. As a first application, the LMSS channel simulator was used in the evaluation of a system based on the voice processing and modulation (e.g., NBFM with 30 kHz of channel spacing and a 2 kHz rms frequency deviation for average talkers) selected for the Bell System's Advanced Mobile Phone Service (AMPS). The various details of the hardware design, the qualitative audio evaluation techniques, the signal-to-channel-impairment measurement techniques, the justification of the criteria for selecting different parameters of the voice processing and modulation methods, and the results of a number of parametric studies are further described.

  7. Why Patencies of Femoropopliteal Bypass Grafts with Distal End-to-End Anastomosis are Comparable with End-to-Side Anastomosis

    PubMed Central

    Hoedt, Marco; How, Thien; Wittens, Cees

    2015-01-01

    Objective: Despite the theoretical hemodynamic advantage of end-to-end anastomosis (ETE), femoropopliteal bypasses with distal ETE and end-to-side anastomosis (ETS) have comparable clinical patencies. We therefore studied the effects of different in vivo anastomotic configurations on hemodynamics in geometrically realistic ETE and ETS in vitro flow models to explain this phenomenon. Methods: Four ETE and two ETS models (30° and 60°) were constructed from in vivo computed tomography angiography data. Physiological flow conditions were studied with flow visualization. Results: In ETS, a flow separation and recirculation zone was apparent at the anastomotic edges, with a stagnation point shifting between them during systole. Secondary flow patterns developed with flow deceleration and reversal. Slightly out-of-axis geometry in all ETE resulted in flow separation and recirculation areas comparable to those in ETS. Vortical flow patterns were more stable in wider and longer bevelled ETE. Conclusion: Primary flow disturbances in ETE are comparable to those in ETS and are related to the typical sites where myointimal hyperplasia develops. In ETS, reducing the anastomosis angle will diminish flow disturbances. To reduce flow disturbances in ETE, the creation of a bulbous spatulation, with the resulting axial displacement of the graft relative to the recipient artery, should be prevented. PMID:25641036

  8. Ecosystem limits to food web fluxes and fisheries yields in the North Sea simulated with an end-to-end food web model

    NASA Astrophysics Data System (ADS)

    Heath, Michael R.

    2012-09-01

    Equilibrium yields from an exploited fish stock represent the surplus production remaining after accounting for losses due to predation. However, most estimates of maximum sustainable yield, upon which fisheries management targets are partly based, assume that productivity and predation rates are constant in time, or at least stationary. This means that there is no recognition of the potential for interaction between different fishing sectors. Here, an end-to-end ecosystem model is developed to explore the possible scale and mechanisms of interactions between pelagic and demersal fishing in the North Sea. The model simulates fluxes of nitrogen between detritus, inorganic nutrient and guilds of taxa spanning phytoplankton to mammals. The structure strikes a balance between graininess in space, taxonomy and demography, and the need to constrain the parameter count sufficiently to enable automatic parameter optimization. Simulated annealing is used to locate the maximum likelihood parameter set, given the model structure and a suite of observations of annual rates of production and fluxes between guilds. Simulations of the impact of fishery harvesting rates showed that equilibrium yields of pelagic and demersal fish were strongly interrelated due to a variety of top-down and bottom-up food web interactions. The results clearly show that management goals based on simultaneously achieving maximum sustainable biomass yields from all commercial fish stocks are simply unattainable. Trade-offs between, for example, pelagic and demersal fishery sectors and other properties of the ecosystem have to be considered in devising an overall harvesting strategy.
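
    The parameter-estimation step described above, locating a maximum-likelihood parameter set by simulated annealing, can be sketched generically. The toy misfit function below stands in for the model-vs-observation likelihood, and the cooling schedule and step size are illustrative choices, not those of the North Sea model:

```python
import math
import random

def simulated_annealing(neg_log_like, x0, steps=20000, t0=1.0, seed=1):
    """Minimise a negative log-likelihood by simulated annealing (toy sketch).

    Random local moves are always accepted when they improve the misfit, and
    accepted with Boltzmann probability exp(-delta/T) otherwise; the
    temperature T cools linearly so the search becomes greedy at the end.
    """
    rng = random.Random(seed)
    x, fx = list(x0), neg_log_like(x0)
    best, fbest = list(x), fx
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9            # linear cooling schedule
        cand = [xi + rng.gauss(0, 0.1) for xi in x]  # local random move
        fc = neg_log_like(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fc < fbest:
                best, fbest = list(cand), fc
    return best, fbest

# Toy "model vs observations" misfit with its optimum at (2, -1).
obs = (2.0, -1.0)
nll = lambda p: (p[0] - obs[0]) ** 2 + (p[1] - obs[1]) ** 2
params, misfit = simulated_annealing(nll, [0.0, 0.0])
print([round(p, 1) for p in params])
```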

  9. Reconstruction of off-axis lensless Fourier transform digital holograms based on angular spectrum theory

    NASA Astrophysics Data System (ADS)

    Wang, Guangjun; Wang, Huaying; Wang, Dayong; Xie, Jianjun; Zhao, Jie

    2007-12-01

    A simple holographic high-resolution imaging system without pre-magnification, based on an off-axis lensless Fourier transform configuration, has been developed. Experimental investigations are performed on a USAF resolution test target. A method based on angular spectrum theory for reconstructing the lensless Fourier hologram is given. The reconstructed results of the same hologram at different reconstructing distances are presented for what is, to our knowledge, the first time. Approximately diffraction-limited lateral resolution is achieved. The results show that the angular spectrum method has several advantages over the more commonly used Fresnel transform method. Lossless reconstruction can be achieved for holograms of any numerical aperture as long as the wave field is calculated at a special reconstructing distance, which is determined by the light wavelength and the chip size and pixel size of the CCD sensor. This is very important for reconstructing an extremely large numerical aperture hologram. Frequency-domain spectrum filtering can be applied conveniently to remove the zero-order disturbance. The reconstructed image wave field is accurate as long as the sampling theorem is not violated. The experimental results also demonstrate that, for a high quality hologram, special image processing is unnecessary to obtain a high quality image.
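
    The angular spectrum method itself is compact: the field is Fourier-decomposed into plane waves, each spectral component is multiplied by the exact free-space transfer function, and the result is inverse-transformed, with the output grid keeping the input pixel pitch (unlike the Fresnel transform). A minimal sketch with illustrative parameters, not those of the experiment:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex wave field a distance z via the angular spectrum.

    Each plane-wave component is multiplied by the exact transfer function
    exp(i*z*sqrt(k^2 - kx^2 - ky^2)); evanescent components (kz^2 < 0) are
    dropped. The output sampling grid equals the input grid.
    """
    n = field.shape[0]
    k = 2 * np.pi / wavelength
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    kz_sq = k**2 - (2 * np.pi * FX) ** 2 - (2 * np.pi * FY) ** 2
    kz = np.sqrt(np.maximum(kz_sq, 0.0))
    H = np.exp(1j * z * kz) * (kz_sq > 0)   # transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Sanity check: propagating forward then backward recovers the input field.
aperture = np.zeros((256, 256), dtype=complex)
aperture[96:160, 96:160] = 1.0              # square aperture
u = angular_spectrum_propagate(aperture, 633e-9, 10e-6, 0.01)
back = angular_spectrum_propagate(u, 633e-9, 10e-6, -0.01)
print(np.allclose(back, aperture, atol=1e-6))  # True
```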

  10. Reconstructing phylogeny from the multifractal spectrum of mitochondrial DNA

    NASA Astrophysics Data System (ADS)

    Glazier, James A.; Raghavachari, Sridhar; Berthelsen, Cheryl L.; Skolnick, Mark H.

    1995-03-01

    Conventional methods of phylogenetic reconstruction from DNA sequences require simplified models of evolutionary dynamics. We present a method based on fractal analysis to reconstruct the evolutionary history of organisms from mitochondrial DNA sequences. We map animal mtDNA into four-dimensional random walks and estimate their long range correlations using multifractal spectra. We see systematic changes in correlations in mtDNA sequences across taxonomic lines, which translate into changes in the scaling of the random walks. We use cluster analysis to group the multifractal spectra and obtain the phylogeny of the organisms. Though our method uses no a priori assumptions and is independent of gene order, it yields phylogenetic relationships broadly consistent with established results. Several recent papers have analyzed DNA using fractal analysis and have found long range correlations. However, no one has succeeded in using them to deduce biologically significant relationships.
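
    The first step of the method, mapping a DNA sequence onto a four-dimensional random walk and measuring how its fluctuations scale, can be sketched as follows. This toy version estimates only a single scaling exponent from an i.i.d. sequence; the paper's multifractal spectrum generalises this to a whole family of exponents, and the basis-vector step encoding here is an illustrative choice:

```python
import numpy as np

# One unit basis vector per base; the cumulative sum traces a 4-D walk.
STEPS = {"A": (1, 0, 0, 0), "C": (0, 1, 0, 0),
         "G": (0, 0, 1, 0), "T": (0, 0, 0, 1)}

def dna_walk(seq):
    """Map a DNA sequence onto a four-dimensional random walk."""
    steps = np.array([STEPS[b] for b in seq], dtype=float)
    steps -= steps.mean(axis=0)   # remove base-composition drift
    return np.cumsum(steps, axis=0)

def scaling_exponent(walk, scales=(4, 8, 16, 32, 64)):
    """Crude root-mean-square fluctuation scaling, F(n) ~ n^H.

    H is about 0.5 for an uncorrelated sequence; long-range correlations
    push H away from 0.5.
    """
    fluct = []
    for n in scales:
        disp = walk[n:] - walk[:-n]              # displacements over lag n
        fluct.append(np.sqrt((disp ** 2).sum(axis=1).mean()))
    H, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return H

rng = np.random.default_rng(42)
seq = rng.choice(list("ACGT"), size=20000)        # i.i.d. synthetic "mtDNA"
H = scaling_exponent(dna_walk(seq))
print(round(H, 2))  # near 0.5 for an uncorrelated sequence
```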

  11. An end-to-end examination of geometric accuracy of IGRT using a new digital accelerator equipped with onboard imaging system.

    PubMed

    Wang, Lei; Kielar, Kayla N; Mok, Ed; Hsu, Annie; Dieterich, Sonja; Xing, Lei

    2012-02-01

    Varian's new digital linear accelerator (LINAC), TrueBeam STx, is equipped with a high dose rate flattening filter free (FFF) mode (6 MV and 10 MV), a high definition multileaf collimator (2.5 mm leaf width), as well as onboard imaging capabilities. A series of end-to-end phantom tests of TrueBeam-based image guided radiation therapy (IGRT) were performed to determine the geometric accuracy of the image-guided setup and dose delivery process for all beam modalities delivered using intensity modulated radiation therapy (IMRT) and RapidArc. In these tests, an anthropomorphic phantom with a Ball Cube II insert and the analysis software FilmQA (3cognition) were used to evaluate the accuracy of TrueBeam image-guided setup and dose delivery. Laser-cut EBT2 films with 0.15 mm accuracy were embedded into the phantom. The phantom with the film inserted was first scanned with a GE Discovery-ST CT scanner, and the images were then imported into the planning system. Plans with steep dose fall-off surrounding hypothetical targets of different sizes were created using RapidArc and IMRT with FFF and WFF (with flattening filter) beams. Four RapidArc plans (6 MV and 10 MV FFF) and five IMRT plans (6 MV and 10 MV FFF; 6 MV, 10 MV and 15 MV WFF) were studied. The RapidArc plans with 6 MV FFF were planned with target diameters of 1 cm (0.52 cc), 2 cm (4.2 cc) and 3 cm (14.1 cc), and all other plans with a target diameter of 3 cm. Both onboard planar and volumetric imaging procedures were used for phantom setup and target localization. The IMRT and RapidArc plans were then delivered, and the film measurements were compared with the original treatment plans using gamma criteria of 3%/1 mm and 3%/2 mm. The shifts required to align the film-measured dose with the calculated dose distributions were attributed to targeting error. Targeting accuracy of image-guided treatment using TrueBeam was found to be within 1 mm. 
For irradiation of the 3 cm target, the gammas (3%, 1

  12. Design of a satellite end-to-end mission performance simulator for imaging spectrometers and its application to the ESA's FLEX/Sentinel-3 tandem mission

    NASA Astrophysics Data System (ADS)

    Vicent, Jorge; Sabater, Neus; Tenjo, Carolina; Acarreta, Juan R.; Manzano, María.; Rivera, Juan P.; Jurado, Pedro; Franco, Raffaella; Alonso, Luis; Moreno, Jose

    2015-09-01

    The performance analysis of a satellite mission requires specific tools that can simulate the behavior of the platform, its payload, and the acquisition of scientific data from synthetic scenes. These software tools, called End-to-End Mission Performance Simulators (E2ES), are promoted by the European Space Agency (ESA) with the goal of consolidating the instrument and mission requirements as well as optimizing the implemented data processing algorithms. Nevertheless, most E2ES developed so far are designed for a specific satellite mission and can hardly be adapted to other missions. In the frame of ESA's FLEX mission activities, an E2ES is being developed based on a generic architecture for passive optical missions. The FLEX E2ES implements a state-of-the-art synthetic scene generator that is coupled with dedicated algorithms modelling the platform and instrument characteristics. This work will describe the flexibility of the FLEX E2ES in simulating complex synthetic scenes with a variety of land cover classes, topography and cloud cover that are observed separately by each instrument (FLORIS, OLCI and SLSTR). The implemented algorithms allow modelling of the sensor behavior, i.e. the spectral/spatial resampling of the input scene, the acquisition geometry, the sensor noise and non-uniformity effects (e.g. stray light, spectral smile and radiometric noise), and the full retrieval scheme up to Level-2 products. It is expected that the design methodology implemented in the FLEX E2ES can be used as a baseline for other imaging spectrometer missions and will be further expanded towards a generic E2ES software tool.
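
    The spectral resampling step mentioned above is the simplest of the listed sensor effects to sketch: each band radiance is obtained by weighting the high-resolution scene spectrum with that band's spectral response function (SRF). The Gaussian SRF and all numbers below are illustrative, not FLORIS/OLCI/SLSTR characteristics:

```python
import numpy as np

def resample_to_bands(wl_hi, rad_hi, band_centers, fwhm):
    """Resample a high-resolution spectrum to instrument bands.

    Each band is the input radiance averaged under a Gaussian SRF -- a
    first-order sensor model; a full E2ES adds measured SRFs, spectral
    smile, stray light and radiometric noise on top of this.
    """
    sigma = fwhm / (2 * np.sqrt(2 * np.log(2)))
    out = []
    for c in band_centers:
        srf = np.exp(-0.5 * ((wl_hi - c) / sigma) ** 2)
        out.append((srf * rad_hi).sum() / srf.sum())  # SRF-weighted average
    return np.array(out)

# Sanity check: a spectrally flat scene stays flat after resampling.
wl = np.linspace(400.0, 900.0, 5001)        # nm, 0.1 nm grid
radiance = np.full_like(wl, 80.0)           # flat "radiance"
bands = np.arange(500.0, 801.0, 10.0)       # 10 nm band centres
resampled = resample_to_bands(wl, radiance, bands, fwhm=3.0)
print(np.allclose(resampled, 80.0))  # True
```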

  13. Enzymatic reaction modulated gold nanorod end-to-end self-assembly for ultrahigh sensitively colorimetric sensing of cholinesterase and organophosphate pesticides in human blood.

    PubMed

    Lu, Linlin; Xia, Yunsheng

    2015-08-18

    We present herein the first reported self-assembly modulation of gold nanorods (AuNRs) by enzymatic reaction, which is further employed for colorimetric assays of cholinesterase (ChE) and organophosphate pesticides (OPs) in human blood. ChE catalyzes its substrate (acetylthiocholine) and produces thiocholine and acetic acid. The resulting thiols then react with the tips of the AuNRs by S-Au conjunction and prevent subsequent cysteine-induced AuNR end-to-end (EE) self-assembly. Correspondingly, the AuNR surface plasmon resonance is regulated, which results in a distinctly ratiometric signal output. Under optimal conditions, the linear range is 0.042 to 8.4 μU/mL, and the detection limit is as low as 0.018 μU/mL. As ChE is incubated with OPs, the enzymatic activity is inhibited, so the cysteine-induced assembly is observed again. On the basis of this principle, OPs can be well determined over a range of 0.12 to 40 pM with a 0.039 pM detection limit. To our knowledge, the present sensitivities for ChE and OPs are at least 500 and 7000 times better than those of previous colorimetric methods, respectively. The ultrahigh sensitivity results from (1) the rational choice of anisotropic AuNRs as building blocks and reporters and (2) the specific structure of the enzymatic thiocholine. Because of the ultrahigh sensitivity, serum samples can be extremely diluted in the assay. Accordingly, various nonspecific interactions, even from glutathione/cysteine, are well avoided, and both ChE and OPs in human blood can be directly assayed without any prepurification, indicating the simplicity and practical promise of the proposed method.

  14. The Hurricane-Flood-Landslide Continuum: An Integrated, End-to-end Forecast and Warning System for Mountainous Islands in the Tropics

    NASA Astrophysics Data System (ADS)

    Golden, J.; Updike, R. G.; Verdin, J. P.; Larsen, M. C.; Negri, A. J.; McGinley, J. A.

    2004-12-01

    In the 10 days of 21-30 September 1998, Hurricane Georges left a trail of destruction in the Caribbean region and along the U.S. Gulf Coast. Subsequently, in the same year, Hurricane Mitch caused widespread destruction and loss of life in four Central American nations, and in December 1999 a tropical disturbance impacted the north coast of Venezuela, causing hundreds of deaths and several million dollars of property loss. More recently, an off-season disturbance in the Central Caribbean dumped nearly 250 mm of rainfall over Hispaniola during the 24-hr period of May 23, 2004. Resultant flash floods and debris flows in the Dominican Republic and Haiti killed at least 1400 people. In each instance, the tropical system served as the catalyst for major flooding and landslides at landfall. Our goal is to develop and transfer an end-to-end warning system for a prototype region in the Central Caribbean, specifically the islands of Puerto Rico and Hispaniola, which experience frequent tropical cyclones and other disturbances. The envisioned system would include satellite and surface-based observations to track and nowcast dangerous levels of precipitation, atmospheric and hydrological models to predict short-term runoff and streamflow changes, geological models to warn when and where landslides and debris flows are imminent, and the capability to communicate forecast guidance products via satellite to vital government offices in Puerto Rico, Haiti, and the Dominican Republic. In this paper, we shall present a preliminary proof-of-concept study of the May 21-24, 2004 floods and debris flows over Hispaniola to show that the envisaged flow of data, models and graphical products can produce the desired warning outputs. 
The multidisciplinary research and technology transfer effort will require blending the talents of hydrometeorologists, geologists, remote sensing and GIS experts, and social scientists to ensure timely delivery of tailored graphical products to both weather offices and local

  15. Assessing the value of seasonal climate forecast information through an end-to-end forecasting framework: Application to U.S. 2012 drought in central Illinois

    NASA Astrophysics Data System (ADS)

    Shafiee-Jood, Majid; Cai, Ximing; Chen, Ligang; Liang, Xin-Zhong; Kumar, Praveen

    2014-08-01

    This study proposes an end-to-end forecasting framework to incorporate operational seasonal climate forecasts to help farmers improve their decisions prior to the crop growth season, decisions that are vulnerable to unanticipated drought conditions. The framework couples a crop growth model with a decision-making model for rainfed agriculture and translates probabilistic seasonal forecasts into more user-related information that can be used to support farmers' decisions on crop type and some market choices (e.g., contracts with an ethanol refinery). The regional Climate-Weather Research and Forecasting model (CWRF) driven by two operational general circulation models (GCMs) is used to provide the seasonal forecasts of weather parameters. To better assess the developed framework, CWRF is also driven by observational reanalysis data, which theoretically can be considered the best seasonal forecast. The proposed framework is applied to the Salt Creek watershed in Illinois, which experienced an extreme drought event during the 2012 crop growth season. The results show that the forecasts cannot capture the 2012 drought condition in Salt Creek, and therefore the suggested decisions can make farmers worse off if the suggestions are adopted. Alternatively, the optimal decisions based on reanalysis-based CWRF forecasts, which can capture the 2012 drought conditions, make farmers better off by suggesting "no-contract" with ethanol refineries. This study suggests that the conventional metric used for ex ante value assessment is not capable of providing meaningful information in the case of extreme drought. It is also observed that institutional interventions (e.g., crop insurance) highly influence farmers' decisions and, thereby, the assessment of forecast value.

  16. Reconstruction of the primordial power spectrum using temperature and polarisation data from multiple experiments

    SciTech Connect

    Nicholson, Gavin; Contaldi, Carlo R. E-mail: c.contaldi@imperial.ac.uk

    2009-07-01

    We develop a method to reconstruct the primordial power spectrum, P(k), using both temperature and polarisation data from the joint analysis of a number of Cosmic Microwave Background (CMB) observations. The method is an extension of the Richardson-Lucy algorithm, first applied in this context by Shafieloo and Souradeep [1]. We show how the inclusion of polarisation measurements can decrease the uncertainty in the reconstructed power spectrum. In particular, the polarisation data can constrain oscillations in the spectrum more effectively than total intensity only measurements. We apply the estimator to a compilation of current CMB results. The reconstructed spectrum is consistent with the best-fit power spectrum although we find evidence for a 'dip' in the power on scales k ≈ 0.002 Mpc⁻¹. This feature appears to be associated with the WMAP power in the region 18 ≤ l ≤ 26 which is consistently below best-fit models. We also forecast the reconstruction for a simulated, Planck-like [2] survey including sample variance limited polarisation data.
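
    The Richardson-Lucy iteration at the heart of the method applies whenever the data are a non-negative linear mapping of the spectrum, d = K p. Below is a toy sketch in which a made-up Gaussian smoothing kernel stands in for the radiative transfer kernel that maps P(k) onto the angular power spectrum; the iteration preserves positivity and converges toward the maximum-likelihood solution:

```python
import numpy as np

def richardson_lucy(kernel, data, n_iter=200):
    """Richardson-Lucy deconvolution for d = K p with non-negative K, p.

    Iterates p <- p * K^T(d / Kp) / K^T(1), starting from a flat guess.
    """
    p = np.full(kernel.shape[1], data.mean())  # flat, positive first guess
    norm = kernel.sum(axis=0)                  # K^T applied to ones
    for _ in range(n_iter):
        model = kernel @ p
        p = p * (kernel.T @ (data / model)) / norm
    return p

# Toy example: recover a spectrum with a "dip" seen through a smoothing kernel.
k = np.arange(40)
kernel = np.exp(-0.5 * ((k[:, None] - k[None, :]) / 2.0) ** 2)
true_p = np.ones(40)
true_p[15:20] = 0.4                 # a dip, loosely analogous to the feature
data = kernel @ true_p              # noiseless "observations"
est = richardson_lucy(kernel, data)
print(round(float(est[17]), 2))     # dip is recovered at the right location
```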

  17. A novel PON based UMTS broadband wireless access network architecture with an algorithm to guarantee end to end QoS

    NASA Astrophysics Data System (ADS)

    Sana, Ajaz; Hussain, Shahab; Ali, Mohammed A.; Ahmed, Samir

    2007-09-01

    In this paper we propose a novel Passive Optical Network (PON) based broadband wireless access network architecture to provide multimedia services (video telephony, video streaming, mobile TV, mobile email, etc.) to mobile users. In conventional wireless access networks, the base stations (Node B) and Radio Network Controllers (RNC) are connected by point-to-point T1/E1 lines (the Iub interface). The T1/E1 lines are expensive and add to operating costs. Also, the resources (transceivers and T1/E1) are designed for peak-hour traffic, so most of the time the dedicated resources are idle and wasted. Furthermore, the T1/E1 lines are not capable of supporting the bandwidth (BW) required by the next generation wireless multimedia services proposed by High Speed Packet Access (HSPA, Rel.5) for the Universal Mobile Telecommunications System (UMTS) and Evolution-Data Optimized (EV-DO) for Code Division Multiple Access 2000 (CDMA2000). The proposed PON based backhaul can provide gigabit data rates, and the Iub interface can be dynamically shared by Node Bs. The BW is dynamically allocated, and unused BW from lightly loaded Node Bs is assigned to heavily loaded Node Bs. We also propose a novel algorithm to provide end-to-end Quality of Service (QoS) between the RNC and the user equipment. The algorithm provides QoS bounds in the wired domain as well as in the wireless domain, with compensation for wireless link errors. Because of the air interface, there can be times when the user equipment (UE) is unable to communicate with the Node B (usually referred to as a link error); such link errors are bursty and location dependent. In the proposed approach, the scheduler at the Node B maps QoS priorities and weights into the wireless MAC. Compensation for errored links is provided by swapping service between the active users, and the user data is divided into flows, with flows allowed to lag or lead. 
The algorithm guarantees (1) delay and throughput for error-free flows, (2) short-term fairness
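
    The lag/lead compensation idea can be sketched with a toy scheduler: every flow accrues its fair share each slot, a flow whose link is down accumulates lag, and once the link recovers the most-lagging flow is served first until the books balance. The flow names, the link-outage pattern and the equal weights below are all hypothetical, not the paper's algorithm:

```python
import random

def schedule(flows, link_up, slots, seed=0):
    """Toy compensating scheduler illustrating lag/lead service accounting.

    Each slot goes to the backlogged flow with the largest lag (service owed
    minus service received). A flow whose wireless link is down cannot be
    served, so its lag grows; it is compensated once the link recovers,
    while other flows effectively "lead" in the meantime.
    """
    served = {f: 0 for f in flows}
    lag = {f: 0.0 for f in flows}
    share = 1.0 / len(flows)
    rng = random.Random(seed)
    for t in range(slots):
        for f in flows:                 # everyone is owed its fair share
            lag[f] += share
        usable = [f for f in flows if link_up(f, t, rng)]
        if not usable:
            continue
        pick = max(usable, key=lambda f: lag[f])  # most-lagging flow first
        served[pick] += 1
        lag[pick] -= 1.0
    return served

# Flow "b" loses its link for the first 30% of slots, then catches up.
link_up = lambda f, t, rng: not (f == "b" and t < 300)
out = schedule(["a", "b"], link_up, 1000)
print(out)  # both flows end up with their fair share of 500 slots
```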

  18. RTEMP: Exploring an end-to-end, agnostic platform for multidisciplinary real-time analytics in the space physics community and beyond

    NASA Astrophysics Data System (ADS)

    Chaddock, D.; Donovan, E.; Spanswick, E.; Jackel, B. J.

    2014-12-01

    Large-scale, real-time, sensor-driven analytics are a highly effective set of tools in many research environments; however, the barrier to entry is high and the learning curve is steep. These systems need to operate efficiently from end to end, the key aspects being data transmission, acquisition, management and organization, and retrieval. When building a generic multidisciplinary platform, acquisition and data management need to be designed with scalability and flexibility as the primary focus. Additionally, in order to leverage current sensor web technologies, the integration of common sensor data standards (i.e. SensorML and SWE services) should be supported. Perhaps most importantly, researchers should be able to get started and integrate the platform into their set of research tools as easily and quickly as possible. The largest issue with current platforms is that the sensor data must be formed and described using the previously mentioned standards. As useful as these standards are for organizing data, they are cumbersome to adopt, often restrictive, and require the data to be geospatially driven. Our solution, RTEMP (Real-time Environment Monitoring Platform), is a real-time analytics platform with over ten years and an estimated two million dollars of investment. It has been developed for our continuously expanding requirements of operating and building remote sensors and supporting equipment for space physics research. A key benefit of our approach is RTEMP's ability to manage agnostic data. This allows data that flows through the system to be structured in whatever way best addresses the needs of the sensor operators and data users, enabling extensive flexibility and streamlined development and research. Here we begin with an overview of RTEMP and how it is structured. Additionally, we will showcase the ways that we are using RTEMP and how it is being adopted by researchers in an increasingly broad range of other research fields. We will lay out a

  19. SU-E-J-194: Dynamic Tumor Tracking End-To-End Testing Using a 4D Thorax Phantom and EBT3 Films

    SciTech Connect

    Su, Z; Wu, J; Li, Z; Mamalui-Hunter, M

    2014-06-01

    Purpose: To quantify the dosimetric accuracy of Vero linac dynamic tumor tracking treatment using EBT3 film embedded in a 4D thorax phantom. Methods: A dynamic thorax phantom with tissue-equivalent materials and a film insert were used in this study. The thorax phantom was scanned in 4DCT mode with a viscoil embedded in its film insert, which is composed of lung-equivalent material. Dynamic tracking planning was performed using the 50% phase CT set with 5 conformal beams at gantry angles of 330, 15, 60, 105 and 150 degrees. Each field was a 3 cm by 3 cm square centered on the viscoil, since there was no solid mass target. In total, 3 different 1–2cos4 motion profiles were used, with varied motion magnitude and cycle frequency. Before treatment plan irradiation, a 4D motion model of the target was established using a series of acquired fluoroscopic images and infrared marker motion positions. During irradiation, fluoroscopic imaging monitoring the viscoil motion was performed to verify model validity. The irradiated films were scanned and the dose maps were compared to the planned Monte Carlo dose distributions. Gamma analyses using 3%–3mm, 2%–3mm, 3%–2mm, and 2%–2mm criteria were performed and are presented. Results: For each motion pattern, a 4D motion model was built successfully and the target tracking performance was verified with fluoroscopic monitoring of the viscoil motion and its model-predicted locations. The film gamma analysis showed that the average pass rates among the 3 motion profiles were 98.14%, 96.2%, 91.3% and 85.61% for the 3%–3mm, 2%–3mm, 3%–2mm and 2%–2mm criteria, respectively. Conclusion: Target dynamic tracking was performed using patient-like breathing patterns in a 4D thorax phantom with an EBT3 film insert and a viscoil. There was excellent agreement between acquired and planned dose distributions for all three target motion patterns. This study performed end-to-end testing and verified the treatment accuracy of tumor dynamic tracking.
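The film gamma analysis quoted above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal 1-D sketch of the gamma index (illustrative only; the study's film analysis is 2-D and uses dedicated software):

```python
import numpy as np

def gamma_index(ref_dose, meas_dose, x, dose_tol=0.03, dist_tol=3.0):
    """1-D gamma: for each measured point, the minimum over the reference
    profile of sqrt((dose diff / dose criterion)^2 + (distance / DTA)^2).
    dose_tol is fractional (0.03 = 3% of max dose); dist_tol is in mm."""
    norm = dose_tol * ref_dose.max()  # global dose normalisation (3% of max)
    gammas = np.empty(meas_dose.size)
    for i in range(x.size):
        dose_term = (ref_dose - meas_dose[i]) / norm
        dist_term = (x - x[i]) / dist_tol
        gammas[i] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
    return gammas

def pass_rate(gammas):
    """Percentage of points with gamma <= 1 (the quoted pass rates)."""
    return 100.0 * float(np.mean(gammas <= 1.0))
```

A measured profile identical to the reference passes at 100%; a small spatial shift well inside the DTA criterion also passes, which is why looser criteria (3%/3mm) yield higher pass rates than tighter ones (2%/2mm).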

  20. SU-E-J-25: End-To-End (E2E) Testing On TomoHDA System Using a Real Pig Head for Intracranial Radiosurgery

    SciTech Connect

    Corradini, N; Leick, M; Bonetti, M; Negretti, L

    2015-06-15

    Purpose: To determine the MVCT imaging uncertainty of the TomoHDA system for intracranial radiosurgery treatments, and to determine the end-to-end (E2E) overall accuracy of the TomoHDA system for intracranial radiosurgery. Methods: A pig head was obtained from the butcher, cut coronally through the brain, and preserved in formaldehyde. The base of the head was fixed to a positioning plate allowing precise movement, i.e. translation and rotation, in all 6 axes. A repeatability test was performed on the pig head to determine the uncertainty in the image bone registration algorithm. Furthermore, the test studied images with MVCT slice thicknesses of 1 and 3 mm in combination with differing scan lengths. A sensitivity test was performed to determine the registration algorithm's ability to find the absolute position of known translations/rotations of the pig head. The algorithm's ability to determine absolute position was compared against that of manual operators, i.e. a radiation therapist and a radiation oncologist. Finally, E2E tests for intracranial radiosurgery were performed by measuring the delivered dose distributions within the pig head using Gafchromic films. Results: The repeatability test uncertainty was lowest for the MVCTs of 1-mm slice thickness, measuring less than 0.10 mm and 0.12 deg for all axes. In the sensitivity tests, the bone registration algorithm performed better than the human observers; a maximum difference of 0.3 mm and 0.4 deg was observed across the axes. E2E test results in absolute position difference measured 0.03 ± 0.21 mm in the x-axis and 0.28 ± 0.18 mm in the y-axis. A maximum difference of 0.32 and 0.66 mm was observed in x and y, respectively. The average peak dose difference between measured and calculated dose was 2.7 cGy or 0.4%. Conclusion: Our tests using a pig head phantom estimate the TomoHDA system to have sub-millimeter overall accuracy for intracranial radiosurgery.

  1. SU-E-T-19: A New End-To-End Test Method for ExacTrac for Radiation and Plan Isocenter Congruence

    SciTech Connect

    Lee, S; Nguyen, N; Liu, F; Huang, Y; Jung, J; Pyakuryal, A; Jang, S

    2014-06-01

    Purpose: To combine and integrate quality assurance (QA) of target localization and the radiation isocenter End-to-End (E2E) test of the BrainLAB ExacTrac system, a new QA approach was devised using an anthropomorphic head and neck phantom. This test ensures target localization as well as radiation isocenter congruence, which is one step ahead of the current ExacTrac QA procedures. Methods: The head and neck phantom typically used for the CyberKnife E2E test, containing a sphere target visible in CT-sim images, was irradiated. The CT-sim was performed with a helical scanning technique at 1 mm slice thickness. The sphere was 3 cm in diameter and was contoured as the target volume using iPlan V.4.5.2. An MLC-based conformal arc plan was generated with 7 fields, five of which included couch rotations. The prescription dose was 5 Gy with 95% coverage of the target volume. For the irradiation, two Gafchromic films were perpendicularly inserted into the cube that holds the sphere. The linac used for the irradiation was a TrueBeam STx equipped with an HD120 MLC. In order to use ExacTrac, an infrared head-array was used to correlate the orthogonal X-ray images. Results: The phantom was positioned using the orthogonal X-rays of ExacTrac. For each field, the phantom was checked again with X-rays and re-positioned if necessary. After each setup using ExacTrac, the target was irradiated. The films were analyzed to determine the deviation of the radiation isocenter in all three dimensions: superior-inferior, left-right and anterior-posterior. The total combined error was found to be 0.76 mm ± 0.05 mm, which is within sub-millimeter accuracy. Conclusion: Until now, the E2E test for ExacTrac was implemented separately to test image localization and the radiation isocenter. This new method can be used for periodic QA procedures.

  2. Quantifying residual ionospheric errors in GNSS radio occultation bending angles based on ensembles of profiles from end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Liu, C. L.; Kirchengast, G.; Zhang, K.; Norman, R.; Li, Y.; Zhang, S. C.; Fritzer, J.; Schwaerz, M.; Wu, S. Q.; Tan, Z. X.

    2015-01-01

    The radio occultation (RO) technique using signals from the Global Navigation Satellite System (GNSS), in particular from the Global Positioning System (GPS) so far, is by now widely used to observe the atmosphere for applications such as numerical weather prediction and global climate monitoring. The ionosphere is a major error source in RO measurements at stratospheric altitudes, and a linear ionospheric correction of dual-frequency RO bending angles is commonly used to remove the first-order ionospheric effect. However, the residual ionospheric error (RIE) can still be significant, so that it needs to be further mitigated for high-accuracy applications, especially above about 30 km altitude where the RIE is most relevant compared to the magnitude of the neutral atmospheric bending angle. Quantification and careful analysis for better understanding of the RIE are therefore important towards enabling benchmark-quality stratospheric RO retrievals. Here we present such an analysis of bending angle RIEs covering the stratosphere and mesosphere, using quasi-realistic end-to-end simulations for a full-day ensemble of RO events. Based on the ensemble simulations we assessed the variation of bending angle RIEs, both biases and standard deviations (SDs), with solar activity, latitudinal region, and with or without the assumption of ionospheric spherical symmetry and of co-existing observing system errors. We find that the bending angle RIE biases in the upper stratosphere and mesosphere, and in all latitudinal zones from low to high latitudes, have a clear negative tendency and a magnitude increasing with solar activity, in line with recent empirical studies based on real RO data. The maximum RIE biases are found at low latitudes during daytime, where they amount to within -0.03 to -0.05 μrad; the smallest are found at high latitudes (0 to -0.01 μrad; quiet space weather and winter conditions). Ionospheric spherical symmetry or asymmetries about the RO event location have only a minor influence on

  3. Quantifying residual ionospheric errors in GNSS radio occultation bending angles based on ensembles of profiles from end-to-end simulations

    NASA Astrophysics Data System (ADS)

    Liu, C. L.; Kirchengast, G.; Zhang, K.; Norman, R.; Li, Y.; Zhang, S. C.; Fritzer, J.; Schwaerz, M.; Wu, S. Q.; Tan, Z. X.

    2015-07-01

    The radio occultation (RO) technique using signals from the Global Navigation Satellite System (GNSS), in particular from the Global Positioning System (GPS) so far, is currently widely used to observe the atmosphere for applications such as numerical weather prediction and global climate monitoring. The ionosphere is a major error source in RO measurements at stratospheric altitudes, and a linear ionospheric correction of dual-frequency RO bending angles is commonly used to remove the first-order ionospheric effect. However, the residual ionospheric error (RIE) can still be significant so that it needs to be further mitigated for high-accuracy applications, especially above about 30 km altitude where the RIE is most relevant compared to the magnitude of the neutral atmospheric bending angle. Quantification and careful analyses for better understanding of the RIE is therefore important for enabling benchmark-quality stratospheric RO retrievals. Here we present such an analysis of bending angle RIEs covering the stratosphere and mesosphere, using quasi-realistic end-to-end simulations for a full-day ensemble of RO events. Based on the ensemble simulations we assessed the variation of bending angle RIEs, both biases and standard deviations, with solar activity, latitudinal region and with or without the assumption of ionospheric spherical symmetry and co-existing observing system errors. We find that the bending angle RIE biases in the upper stratosphere and mesosphere, and in all latitudinal zones from low to high latitudes, have a clear negative tendency and a magnitude increasing with solar activity, which is in line with recent empirical studies based on real RO data although we find smaller bias magnitudes, deserving further study in the future. The maximum RIE biases are found at low latitudes during daytime, where they amount to within -0.03 to -0.05 μrad, the smallest at high latitudes (0 to -0.01 μrad; quiet space weather and winter conditions
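The "linear ionospheric correction of dual-frequency RO bending angles" referred to in both records is the standard linear combination that cancels the first-order (∝ 1/f²) ionospheric term; the RIE studied here is what this combination leaves behind. A minimal sketch, assuming the GPS L1/L2 carrier frequencies:

```python
F1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F2 = 1227.60e6  # GPS L2 carrier frequency, Hz

def corrected_bending_angle(alpha1, alpha2, f1=F1, f2=F2):
    """Linear dual-frequency combination of bending angles that removes the
    first-order ionospheric term, which scales as 1/f**2."""
    return (f1 ** 2 * alpha1 - f2 ** 2 * alpha2) / (f1 ** 2 - f2 ** 2)
```

If each single-frequency bending angle is modelled as the neutral term plus an ionospheric term I/f², the combination returns exactly the neutral term; any higher-order ionospheric contribution survives as the RIE.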

  4. On minimally parametric primordial power spectrum reconstruction and the evidence for a red tilt

    SciTech Connect

    Verde, Licia; Peiris, Hiranya E-mail: lverde@astro.princeton.edu

    2008-07-15

    The latest cosmological data seem to indicate a significant deviation from scale invariance of the primordial power spectrum when parameterized either by a power law or by a spectral index with non-zero 'running'. This deviation, by itself, serves as a powerful tool for discriminating among theories for the origin of cosmological structures such as inflationary models. Here, we use a minimally parametric smoothing spline technique to reconstruct the shape of the primordial power spectrum. This technique is well suited to searching for smooth features in the primordial power spectrum such as deviations from scale invariance or a running spectral index, although it would recover sharp features of high statistical significance. We use the WMAP three-year results in combination with data from a suite of higher resolution cosmic microwave background experiments (including the latest ACBAR 2008 release), as well as large-scale structure data from SDSS and 2dFGRS. We employ cross-validation to assess, using the data themselves, the optimal amount of smoothness in the primordial power spectrum consistent with the data. This minimally parametric reconstruction supports the evidence for a power law primordial power spectrum with a red tilt, but not for deviations from a power law power spectrum. Smooth variations in the primordial power spectrum are not significantly degenerate with the other cosmological parameters.
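The idea of letting the data themselves choose the amount of structure can be illustrated with a toy cross-validation loop. This is our simplification, not the paper's method: polynomial degree stands in for the spline smoothing parameter, and a simple even/odd split stands in for the paper's cross-validation scheme.

```python
import numpy as np

def cv_best_degree(x, y, degrees):
    """Pick the model complexity that best predicts held-out samples."""
    train = np.arange(x.size) % 2 == 0   # even-indexed samples: fit
    test = ~train                        # odd-indexed samples: validate
    errs = [np.mean((np.polyval(np.polyfit(x[train], y[train], d),
                                x[test]) - y[test]) ** 2)
            for d in degrees]
    return degrees[int(np.argmin(errs))]
```

For noisy samples of a tilted power law in log-log space, cross-validation prefers a sloped line over a scale-invariant constant, the toy analogue of the data supporting a red tilt but no extra features.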

  5. Post-operative imaging of anterior cruciate ligament reconstruction techniques across the spectrum of skeletal maturity.

    PubMed

    Zbojniewicz, Andrew M; Meyers, Arthur B; Wall, Eric J

    2016-04-01

    Due to an increased frequency of anterior cruciate ligament (ACL) injuries in young patients and improved outcomes in athletic performance following ACL reconstruction, surgery is increasingly being performed across the spectrum of skeletal maturity. We present a review of the range of reconstruction techniques performed in skeletally immature patients (physeal sparing techniques, which may involve epiphyseal tunnels or the utilization of an iliotibial band autograft), those performed in patients nearing skeletal maturity (transphyseal techniques), and the more conventional ACL reconstruction techniques performed in skeletally mature adolescents. It is important that radiologists be aware of the range of techniques being performed throughout the spectrum of skeletal maturity in order to accurately characterize the expected post-operative appearance as well as to identify complications, including those unique to this younger population. PMID:26646675

  6. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large, complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems' handling of off-nominal mission conditions and fault tolerance. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA has also formed an M&FM team for addressing fault management early in the development lifecycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes.
Risk reduction is addressed by working with other organizations such as S

  7. Micro-ARES, an electric-field sensor for ExoMars 2016: Electric fields modelling, sensitivity evaluations and end-to-end tests.

    NASA Astrophysics Data System (ADS)

    Déprez, Grégoire; Montmessin, Franck; Witasse, Olivier; Lapauw, Laurent; Vivat, Francis; Abbaki, Sadok; Granier, Philippe; Moirin, David; Trautner, Roland; Hassen-Khodja, Rafik; d'Almeida, Éric; Chardenal, Laurent; Berthelier, Jean-Jacques; Esposito, Francesca; Debei, Stefano; Rafkin, Scott; Barth, Erika

    2014-05-01

    Earth and transposed to the Martian atmospheric parameters. Knowing the expected electric fields and simulating them, the next step in evaluating the performance of the instrument is to determine its sensitivity by modelling the instrument's response. The last step is to compare the model of the instrument, and the expected results for a given signal, with the actual outputs of the electronics board for the same input signal. To achieve this end-to-end test, we use a signal generator followed by an electrical circuit reproducing the electrode behaviour in the Martian environment, in order to inject a realistic electric signal into the processing board and finally compare the produced formatted data with the expected data.

  8. Reconstruction of absolute absorption spectrum of reduced heme a in cytochrome C oxidase from bovine heart.

    PubMed

    Dyuba, A V; Vygodina, T V; Konstantinov, A A

    2013-12-01

    This paper presents a new experimental approach for determining the individual optical characteristics of reduced heme a in bovine heart cytochrome c oxidase starting from a small selective shift of the heme a absorption spectrum induced by calcium ions. The difference spectrum induced by Ca2+ corresponds to a first derivative (differential) of the heme a(2+) absolute absorption spectrum. Such an absolute spectrum was obtained for the mixed-valence cyanide complex of cytochrome oxidase (a(2+)a3(3+)-CN) and was subsequently used as a basis spectrum for further processing and modeling. The individual absorption spectrum of the reduced heme a in the Soret region was reconstructed as the integral of the difference spectrum induced by addition of Ca2+. The spectrum of heme a(2+) in the Soret region obtained in this way is characterized by a peak with a maximum at 447 nm and half-width of 17 nm and can be decomposed into two Gaussians with maxima at 442 and 451 nm and half-widths of ~10 nm (589 cm(-1)) corresponding to the perpendicularly oriented electronic π→π* transitions B0x and B0y in the porphyrin ring. The reconstructed spectrum in the Soret band differs significantly from the "classical" absorption spectrum of heme a(2+) originally described by Vanneste (Vanneste, W. H. (1966) Biochemistry, 65, 838-848). The differences indicate that the overall γ-band of heme a(2+) in cytochrome oxidase contains, in addition to the B0x and B0y transitions, extra components that are not sensitive to calcium ions, or, alternatively, that Vanneste's spectrum of heme a(2+) contains a significant contribution from heme a3(2+). The reconstructed absorption band of heme a(2+) in the α-band, with maximum at 605 nm and half-width of 18 nm (850 cm(-1)), corresponds most likely to the individual Q0y transition of heme a, whereas the Q0x transition contributes only weakly to the spectrum.
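The central reconstruction step, recovering the absolute band by integrating the Ca2+-induced difference spectrum, can be reproduced numerically. In this toy version the band position and half-width are taken from the abstract, while the shift δ and amplitudes are our assumptions: a small red shift makes the difference spectrum approximately −δ·dA/dλ, so cumulative integration recovers the band shape.

```python
import numpy as np

lam = np.linspace(400.0, 500.0, 2001)  # wavelength grid, nm

def gaussian_band(center, fwhm):
    """Gaussian absorption band of unit amplitude on the grid `lam`."""
    sigma = fwhm / 2.355
    return np.exp(-0.5 * ((lam - center) / sigma) ** 2)

a_true = gaussian_band(447.0, 17.0)      # Soret band: peak 447 nm, FWHM 17 nm
delta = 0.5                              # assumed Ca2+-induced red shift, nm
# "plus Ca2+" minus "minus Ca2+" difference spectrum ~ -delta * dA/dlambda:
diff = gaussian_band(447.0 + delta, 17.0) - a_true

dl = lam[1] - lam[0]
a_rec = -np.cumsum(diff) * dl / delta    # integrate the difference spectrum
a_rec = (a_rec - a_rec.min()) / (a_rec.max() - a_rec.min())  # normalise
```

The reconstructed curve `a_rec` reproduces the band to within a small half-shift, which is why the approach works only for shifts much narrower than the band itself.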

  9. Reconstruction of the Galactic Cosmic ray Spectrum during the June 2015 Forbush decrease

    NASA Astrophysics Data System (ADS)

    Santiago, Alberto; Lara, Alejandro

    2016-07-01

    During the final week of June 2015, a Forbush decrease was observed by different detectors around the world. We have used data from Neutron Monitors of North-West University, South Africa, and a force-field model to reconstruct the variations of the primary spectrum of the galactic cosmic rays during this event. The validity of our method was confirmed by convolving the computed spectrum with the appropriate yield function to reproduce the observations of different Neutron Monitors reported in the Neutron Monitor database. We found very good agreement between the observed and computed data. We present the method, the results, and their possible application to different cosmic ray detectors.
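Reconstructions of this kind typically rely on the force-field approximation (Gleeson & Axford), in which a single modulation potential φ maps the local interstellar spectrum (LIS) to the spectrum observed at Earth. A sketch with a purely illustrative power-law LIS; the functional form and all numbers below are our assumptions, not the study's values:

```python
E0 = 0.938  # proton rest energy, GeV

def lis(Ek):
    """Toy local interstellar proton spectrum, a power law in total energy."""
    return 1.0e4 * (Ek + E0) ** -2.7

def modulated(Ek, phi):
    """Force-field modulated differential intensity at kinetic energy Ek [GeV]
    for modulation potential phi [GV equivalent, in GeV for protons]."""
    num = Ek * (Ek + 2.0 * E0)
    den = (Ek + phi) * (Ek + phi + 2.0 * E0)
    return lis(Ek + phi) * num / den
```

During a Forbush decrease φ effectively increases, suppressing the low-energy part of the spectrum while leaving high energies nearly untouched; convolving `modulated` with a neutron-monitor yield function then predicts the count-rate drop.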

  10. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASAs Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM

  11. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen

    2015-01-01

    integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test cases into flight software compounded with potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. 
VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW

  12. [Nonsuture microvascular anastomosis -- experimental arterial end-to-end anastomosis using plastic adhesive and a soluble PVA tube (author's transl)].

    PubMed

    Yamagata, S; Handa, H; Taki, W; Yonekawa, Y; Ikada, Y; Iwata, H

    1979-11-01

    Microvascular anastomosis is now widely applied, and many improved nonsuture methods have been developed in place of suture anastomosis for the purpose of saving time and making the reconstruction easier. We introduce a new nonsuture method of microvascular anastomosis using plastic adhesive and a soluble tube made of polyvinyl alcohol (PVA). PVA, which had been utilized as a plasma expander, is a water-soluble polymer whose solubility can be changed depending on the degree of polymerization and percent saponification. We have made two kinds of soluble PVA tubes: one with a monolayer wall and the other with a double-layered wall, the inner wall of the latter being more soluble than the outer wall. As plastic adhesives, we employed ethyl 2-cyanoacrylate, isopropyl 2-cyanoacrylate, and isobutyl 2-cyanoacrylate, which were much superior to methyl 2-cyanoacrylate. Common carotid arteries of rats in the 1.0 to 1.3 mm external diameter range were reconstructed, and re-exploration was carried out at intervals of more than 7 days after operation. The anastomotic technique was very easy, and reconstruction took about five minutes. In our last series, an approximately 98 percent patency rate was achieved. The advantage of our method is that the blood stream is regained in the small soluble tube at the anastomotic site immediately after the release of the hemostatic clamps.

  13. SU-E-T-109: Development of an End-To-End Test for the Varian TrueBeam™ with a Novel Multiple-Dosimetric-Modality H and N Phantom

    SciTech Connect

    Zakjevskii, V; Knill, C; Rakowski, J; Snyder, M

    2014-06-01

    Purpose: To develop a comprehensive end-to-end test for Varian's TrueBeam linear accelerator for head and neck IMRT using a custom phantom designed to utilize multiple dosimetry devices. Methods: The initial end-to-end test and custom H and N phantom were designed to yield maximum information in anatomical regions significant to H and N plans with respect to: i) geometric accuracy, ii) dosimetric accuracy, and iii) treatment reproducibility. The phantom was designed in collaboration with Integrated Medical Technologies. A CT image was taken with a 1 mm slice thickness. The CT was imported into Varian's Eclipse treatment planning system, where OARs and the PTV were contoured. A clinical template was used to create an eight-field static gantry angle IMRT plan. After optimization, dose was calculated using the Analytic Anisotropic Algorithm with inhomogeneity correction. Plans were delivered with a TrueBeam equipped with a high definition MLC. Preliminary end-to-end results were measured using film and ion chambers. Ion chamber dose measurements were compared to the TPS. Films were analyzed with FilmQAPro using the composite gamma index. Results: Film analysis for the initial end-to-end plan with a geometrically simple PTV showed average gamma pass rates >99% with a passing criterion of 3% / 3mm. Film analysis of a plan with a more realistic, i.e. complex, PTV yielded pass rates >99% in clinically important regions containing the PTV, spinal cord and parotid glands. Ion chamber measurements were on average within 1.21% of calculated dose for both plans. Conclusion: Trials have demonstrated that our end-to-end testing methods provide baseline values for the dosimetric and geometric accuracy of Varian's TrueBeam system.

  14. Polychromatic sparse image reconstruction and mass attenuation spectrum estimation via B-spline basis function expansion

    NASA Astrophysics Data System (ADS)

    Gu, Renliang; Dogandžić, Aleksandar

    2015-03-01

    We develop a sparse image reconstruction method for polychromatic computed tomography (CT) measurements under the blind scenario where the material of the inspected object and the incident energy spectrum are unknown. To obtain a parsimonious measurement model parameterization, we first rewrite the measurement equation using our mass-attenuation parameterization, which has the Laplace integral form. The unknown mass-attenuation spectrum is expanded into basis functions using a B-spline basis of order one. We develop a block coordinate-descent algorithm for constrained minimization of a penalized negative log-likelihood function, where constraints and penalty terms ensure nonnegativity of the spline coefficients and sparsity of the density map image in the wavelet domain. This algorithm alternates between a Nesterov's proximal-gradient step for estimating the density map image and an active-set step for estimating the incident spectrum parameters. Numerical simulations demonstrate the performance of the proposed scheme.
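A "B-spline basis of order one" is piecewise constant, so the expansion reduces to one coefficient per knot interval. A small sketch of such an expansion for a stand-in mass-attenuation curve; the curve, knot placement, and plain least-squares fit are our illustration, not the paper's penalized-likelihood algorithm:

```python
import numpy as np

def bspline1_design(x, knots):
    """Design matrix B with B[i, j] = 1 if x[i] lies in [knots[j], knots[j+1])."""
    idx = np.searchsorted(knots, x, side="right") - 1
    idx = np.clip(idx, 0, len(knots) - 2)   # fold the right endpoint into the last bin
    B = np.zeros((x.size, len(knots) - 1))
    B[np.arange(x.size), idx] = 1.0
    return B

x = np.linspace(0.0, 1.0, 200)
kappa = np.exp(-3.0 * x)                 # stand-in decaying attenuation curve
knots = np.linspace(0.0, 1.0, 21)        # 20 knot intervals
B = bspline1_design(x, knots)
coef, *_ = np.linalg.lstsq(B, kappa, rcond=None)  # interval means of kappa
approx = B @ coef                        # piecewise-constant reconstruction
```

Because the basis functions are disjoint indicators, the least-squares coefficients are simply per-interval means, and nonnegativity of the coefficients (a constraint in the paper's algorithm) follows automatically for a positive curve.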

  15. Polychromatic sparse image reconstruction and mass attenuation spectrum estimation via B-spline basis function expansion

    SciTech Connect

    Gu, Renliang E-mail: ald@iastate.edu; Dogandžić, Aleksandar E-mail: ald@iastate.edu

    2015-03-31

    We develop a sparse image reconstruction method for polychromatic computed tomography (CT) measurements under the blind scenario where the material of the inspected object and the incident energy spectrum are unknown. To obtain a parsimonious measurement model parameterization, we first rewrite the measurement equation using our mass-attenuation parameterization, which has the Laplace integral form. The unknown mass-attenuation spectrum is expanded into basis functions using a B-spline basis of order one. We develop a block coordinate-descent algorithm for constrained minimization of a penalized negative log-likelihood function, where constraints and penalty terms ensure nonnegativity of the spline coefficients and sparsity of the density map image in the wavelet domain. This algorithm alternates between a Nesterov’s proximal-gradient step for estimating the density map image and an active-set step for estimating the incident spectrum parameters. Numerical simulations demonstrate the performance of the proposed scheme.

  16. A new method of reconstructing VHE γ-ray spectra: the Template Background Spectrum

    NASA Astrophysics Data System (ADS)

    Fernandes, Milton Virgílio; Horns, Dieter; Kosack, Karl; Raue, Martin; Rowell, Gavin

    2014-08-01

    Context. Very-high-energy (VHE, E > 0.1 TeV) γ-ray emission regions with angular extents comparable to the field of view of current imaging air-Cherenkov telescopes (IACTs) require additional observations of source-free regions to estimate the background contribution to the energy spectrum. This reduces the effective observation time and degrades the sensitivity. Aims: A new method of reconstructing spectra from IACT data without the need for additional observations of source-free regions is developed. Its application is not restricted to any specific IACT or data format. Methods: On the basis of the template background method, which defines the background in air-shower parameter space, a new spectral reconstruction method from IACT data is developed and studied, the Template Background Spectrum (TBS); TBS is tested on published H.E.S.S. data and H.E.S.S. results. Results: Good agreement is found between VHE γ-ray spectra reported by the H.E.S.S. collaboration and those re-analysed with TBS. This includes analyses of point-like sources, sources in crowded regions, and very extended sources, down to sources with fluxes of a few percent of the Crab nebula flux and excess-to-background ratios around 0.1. However, the TBS background normalisation introduces new statistical and systematic errors which are accounted for, but may constitute a limiting factor for very faint extended sources. Conclusions: The TBS method enables the spectral reconstruction of data when other methods are hampered or even fail. It does not need dedicated observations of VHE γ-ray-free regions (e.g. as the On/Off background does) and circumvents known geometrical limitations to which other methods for reconstructing spectral information of VHE γ-ray emission regions (e.g. the reflected-region background) are prone; TBS would be, in specific cases, the only feasible way to reconstruct energy spectra.

  17. Red, Straight, no bends: primordial power spectrum reconstruction from CMB and large-scale structure

    NASA Astrophysics Data System (ADS)

    Ravenni, Andrea; Verde, Licia; Cuesta, Antonio J.

    2016-08-01

    We present a minimally parametric, model-independent reconstruction of the shape of the primordial power spectrum. Our smoothing spline technique is well suited to searching for smooth features such as deviations from scale invariance, and deviations from a power law such as running of the spectral index or small-scale power suppression. We use a comprehensive set of state-of-the-art cosmological data: Planck observations of the temperature and polarisation anisotropies of the cosmic microwave background, WiggleZ and Sloan Digital Sky Survey Data Release 7 galaxy power spectra, and the Canada-France-Hawaii Lensing Survey correlation function. This reconstruction strongly supports the evidence for a power-law primordial power spectrum with a red tilt, and disfavours deviations from a power law, including small-scale power suppression such as that induced by significantly massive neutrinos. This offers a powerful confirmation of the inflationary paradigm, justifying the adoption of the inflationary prior in cosmological analyses.

  18. Reconstruction of an energy wave spectrum using a non-intrusive technique

    NASA Astrophysics Data System (ADS)

    Vargas, Diana; Lugo, Adolfo; Mendoza, Edgar; Silva, Rodolfo

    2014-11-01

    In studies carried out in a wave flume, wave gauges are frequently used to measure the free-surface fluctuations directly. These gauges can, however, interfere with the measurements because the probes act as obstacles to the water. We therefore designed a non-intrusive technique using a bubble curtain. In this work we aim to reconstruct the energy wave spectrum of regular and irregular waves generated in a wave flume, assuming linear and nonlinear wave theory, by analyzing time series of the bubble velocity field obtained with the aid of PIV.

  19. Reconstruction of a nonminimal coupling theory with scale-invariant power spectrum

    SciTech Connect

    Qiu, Taotao

    2012-06-01

    A nonminimal coupling single scalar field theory, when transformed from Jordan frame to Einstein frame, can act like a minimal coupling one. Making use of this property, we investigate how a nonminimal coupling theory with scale-invariant power spectrum could be reconstructed from its minimal coupling counterpart, which can be applied in the early universe. Thanks to the coupling to gravity, the equation of state of our universe for a scale-invariant power spectrum can be relaxed, and the relation between the parameters in the action can be obtained. This approach also provides a means to address the Big-Bang puzzles and anisotropy problem in the nonminimal coupling model within Jordan frame. Due to the equivalence between the two frames, one may be able to find models that are free of the horizon, flatness, singularity as well as anisotropy problems.

  20. Reconstruction of broad features in the primordial spectrum and inflaton potential from Planck

    SciTech Connect

    Hazra, Dhiraj Kumar; Shafieloo, Arman; Smoot, George F. E-mail: arman@apctp.org

    2013-12-01

    With the recently published Cosmic Microwave Background data from Planck we address the optimized binning of the primordial power spectrum. As an important modification to the usual binning of the primordial spectrum, along with the spectral amplitude of the bins, we allow the positions of the bins to vary as well. This technique enables us to address the location of possible broad physical features in the primordial spectrum with a relatively small number of bins compared to analyses performed earlier. This approach is in fact a reconstruction method that looks for broad features in the primordial spectrum while avoiding fitting noise in the data. Performing a Markov Chain Monte Carlo analysis, we present samples of the allowed primordial spectra with broad features consistent with Planck data. To test how realistic it is to have step-like features in the primordial spectrum, we revisit an inflationary model proposed by A. A. Starobinsky which can produce features similar to those obtained from the binning of the spectrum. Using the publicly available code BINGO, we numerically calculate the local f{sub NL} for this model in equilateral and arbitrary triangular configurations of wavevectors and show that the obtained non-Gaussianity for this model is consistent with Planck results. In this paper we have also considered different spectral tilts in different bins to identify the cosmological scale at which the spectral index needs to have a red tilt; interestingly, we find that the spectral index cannot be well constrained up to k ≈ 0.01 Mpc{sup −1}.

  1. [A Method to Reconstruct Surface Reflectance Spectrum from Multispectral Image Based on Canopy Radiation Transfer Model].

    PubMed

    Zhao, Yong-guang; Ma, Ling-ling; Li, Chuan-rong; Zhu, Xiao-hua; Tang, Ling-li

    2015-07-01

    Because a multi-spectral sensor has only a small number of spectral bands, it is difficult to reconstruct a surface reflectance spectrum from the finite spectral information it acquires. Here, taking full account of the heterogeneity of pixels in remote sensing images, a method is proposed to simulate hyperspectral data from multispectral data based on a canopy radiation transfer model. This method first assumes that mixed pixels contain two types of land cover, i.e., vegetation and soil. The sensitive parameters of the Soil-Leaf-Canopy (SLC) model and a soil ratio factor were retrieved from multi-spectral data based on Look-Up Table (LUT) technology. Then, together with the soil ratio factor, all the parameters were input into the SLC model to simulate the surface reflectance spectrum from 400 to 2 400 nm. Taking a Landsat Enhanced Thematic Mapper Plus (ETM+) image as the reference image, the surface reflectance spectrum was simulated. The simulated reflectance spectrum revealed different feature information for different surface types. To test the performance of this method, the simulated reflectance spectrum was convolved with the Landsat ETM+ spectral response curves and the Moderate Resolution Imaging Spectrometer (MODIS) spectral response curves to obtain simulated Landsat ETM+ and MODIS images. Finally, the simulated Landsat ETM+ and MODIS images were compared with the observed Landsat ETM+ and MODIS images. The results generally showed high correlation coefficients (Landsat: 0.90-0.99, MODIS: 0.74-0.85) between most simulated bands and observed bands, indicating that the reflectance spectrum was well simulated and reliable. PMID:26717721

  2. Reconstruction of a broadband spectrum of Alfvénic fluctuations

    SciTech Connect

    Viñas, Adolfo F; Moya, Pablo S.; Maneva, Yana G.; Araneda, Jaime A.

    2014-05-10

    Alfvénic fluctuations in the solar wind exhibit a high degree of correlation between velocity and magnetic field fluctuations, consistent with Alfvén waves propagating away from and toward the Sun. Two remarkable properties of these fluctuations are the tendencies to have either positive or negative magnetic helicity (−1 ≤ σ{sub m} ≤ +1), associated with left- or right-handed topology of the fluctuations, and to have a constant magnetic field magnitude. This paper provides, for the first time, a theoretical framework for reconstructing both the magnetic and velocity field fluctuations with a divergence-free magnetic field, with any specified power spectral index and normalized magnetic- and cross-helicity spectra, for any plasma species. The spectrum is constructed in the Fourier domain by imposing two conditions—a divergence-free magnetic field and the preservation of the sense of magnetic helicity in both spaces—as well as using Parseval's theorem for the conservation of energy between configuration and Fourier spaces. Applications to one-dimensional spatial Alfvénic propagation are presented. The theoretical construction is in agreement with typical time series and power spectra properties observed in the solar wind. The theoretical ideas presented in this spectral reconstruction provide a foundation for more realistic simulations of plasma waves, solar wind turbulence, and the propagation of energetic particles in such fluctuating fields.
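
    The Fourier-domain construction with a specified power spectral index can be illustrated, in drastically simplified form, by synthesizing a single real fluctuation series with a prescribed power-law spectrum from random phases. This toy omits the divergence-free and helicity constraints that couple the field components and are the paper's main contribution; the function name is illustrative:

```python
import numpy as np

def powerlaw_fluctuations(n, alpha, rng=None):
    """Generate a real 1D fluctuation series whose power spectrum
    follows k^(-alpha), by assigning random phases to fixed
    power-law amplitudes in Fourier space and inverse transforming.
    The realness of the output is guaranteed by using the
    real-input FFT conventions (irfft enforces Hermitian symmetry)."""
    rng = np.random.default_rng(rng)
    k = np.fft.rfftfreq(n)                       # nonnegative wavenumbers
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (-alpha / 2.0)            # |B(k)| ~ k^(-alpha/2) so P(k) ~ k^(-alpha)
    phases = rng.uniform(0.0, 2.0 * np.pi, k.size)
    spec = amp * np.exp(1j * phases)             # zero-mean: amp[0] = 0
    return np.fft.irfft(spec, n)
```

    Because only the phases are random, the realized power spectrum of each sample follows the target power law exactly (apart from the Nyquist bin), which mirrors the paper's strategy of imposing spectral properties directly in Fourier space.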

  3. [Research on an Equal Wavelength Spectrum Reconstruction Method of Interference Imaging Spectrometer].

    PubMed

    Xie, Pei-yue; Yang, Jian-feng; Xue, Bin; Lü, Juan; He, Ying-hong; Li, Ting; Ma, Xiao-long

    2016-03-01

    The interference imaging spectrometer is one of the most important instruments on the Chang'E-1 satellite, used to analyze the material composition of the lunar surface and its distribution. At present, the spectral resolution of the level 2B scientific data obtained by existing methods is 325 cm(-1). Expressed as wavelength resolution, this varies from band to band: 7.6 nm for the first band and 29 nm for the last, which raises two problems: (1) this description of spectral resolution does not match that of the ground spectral libraries used for calibration and comparison; (2) the signal-to-noise ratio of the spectra in the shortwave bands is low because little signal enters these narrow bands. This paper discusses the relationship between wavelength resolution and the cut-off function based on the reconstruction model of the CE-1 interference imaging spectrometer. It proposes an adjustable cut-off function that changes with wavelength or wavelength resolution, selecting an appropriate Sinc function as apodization, to reconstruct any specified wavelength resolution within the band coverage. We then applied this method to CE-1 on-orbit 0B data to obtain a spectral image with 29 nm wavelength resolution. Finally, comparing the reconstruction results with the level 2 science data from the ground application system using signal-to-noise ratio, principal component analysis and unsupervised classification, we found that the signal-to-noise ratio of the shortwave bands increased about 4 times (about 2.4 times on average), the spectrum-based classification was consistent, and the data quality was greatly improved. The EWSR method thus has the advantages that: (1) while keeping the spectral information stable, it improves the signal-to-noise ratio of the shortwave-band spectrum at the cost of part of the spectral resolution; (2) it can achieve the spectral data reconstruction which can set

  5. Reconstruction of the primordial power spectrum of curvature perturbations using multiple data sets

    NASA Astrophysics Data System (ADS)

    Hunt, Paul; Sarkar, Subir

    2014-01-01

    Detailed knowledge of the primordial power spectrum of curvature perturbations is essential both in order to elucidate the physical mechanism ('inflation') which generated it, and for estimating the cosmological parameters from observations of the cosmic microwave background and large-scale structure. Hence it ought to be extracted from such data in a model-independent manner, however this is difficult because relevant cosmological observables are given by a convolution of the primordial perturbations with some smoothing kernel which depends on both the assumed world model and the matter content of the universe. Moreover the deconvolution problem is ill-conditioned so a regularisation scheme must be employed to control error propagation. We demonstrate that 'Tikhonov regularisation' can robustly reconstruct the primordial spectrum from multiple cosmological data sets, a significant advantage being that both its uncertainty and resolution are then quantified. Using Monte Carlo simulations we investigate several regularisation parameter selection methods and find that generalised cross-validation and Mallow's Cp method give optimal results. We apply our inversion procedure to data from the Wilkinson Microwave Anisotropy Probe, other ground-based small angular scale CMB experiments, and the Sloan Digital Sky Survey. The reconstructed spectrum (assuming the standard ΛCDM cosmology) is not scale-free but has an infrared cutoff at k ≲ 5 × 10⁻⁴ Mpc⁻¹ (due to the anomalously low CMB quadrupole) and several features with ~ 2σ significance at k/Mpc⁻¹ ~ 0.0013-0.0025, 0.0362-0.0402 and 0.051-0.056, reflecting the 'WMAP glitches'. To test whether these are indeed real will require more accurate data, such as from the Planck satellite and new ground-based experiments.
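
    Tikhonov regularisation with generalised cross-validation, one of the parameter-selection rules compared above, can be sketched for a generic ill-conditioned linear system using SVD filter factors. This is a textbook-style illustration under simplifying assumptions (a single data vector rather than the paper's joint multi-data-set deconvolution); the function and variable names are my own:

```python
import numpy as np

def tikhonov_gcv(A, b, lambdas):
    """Tikhonov-regularised solution of A x = b, with the
    regularisation parameter chosen by generalised cross-validation.
    Uses the SVD filter-factor form: x = V diag(f/s) U^T b with
    f_i = s_i^2 / (s_i^2 + lam). Returns (x, best_lambda)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Utb = U.T @ b
    n = b.size
    best = (np.inf, None, None)
    for lam in lambdas:
        f = s**2 / (s**2 + lam)                  # Tikhonov filter factors
        x = Vt.T @ (f * Utb / s)
        resid = b - A @ x
        gcv = n * (resid @ resid) / (n - f.sum()) ** 2
        if gcv < best[0]:
            best = (gcv, x, lam)
    return best[1], best[2]
```

    GCV balances the residual against the effective number of degrees of freedom absorbed by the fit (the sum of the filter factors), so no knowledge of the noise level is required.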

  6. Reconstruction of the primordial power spectrum of curvature perturbations using multiple data sets

    SciTech Connect

    Hunt, Paul; Sarkar, Subir E-mail: s.sarkar@physics.ox.ac.uk

    2014-01-01

    Detailed knowledge of the primordial power spectrum of curvature perturbations is essential both in order to elucidate the physical mechanism ('inflation') which generated it, and for estimating the cosmological parameters from observations of the cosmic microwave background and large-scale structure. Hence it ought to be extracted from such data in a model-independent manner, however this is difficult because relevant cosmological observables are given by a convolution of the primordial perturbations with some smoothing kernel which depends on both the assumed world model and the matter content of the universe. Moreover the deconvolution problem is ill-conditioned so a regularisation scheme must be employed to control error propagation. We demonstrate that 'Tikhonov regularisation' can robustly reconstruct the primordial spectrum from multiple cosmological data sets, a significant advantage being that both its uncertainty and resolution are then quantified. Using Monte Carlo simulations we investigate several regularisation parameter selection methods and find that generalised cross-validation and Mallow's C{sub p} method give optimal results. We apply our inversion procedure to data from the Wilkinson Microwave Anisotropy Probe, other ground-based small angular scale CMB experiments, and the Sloan Digital Sky Survey. The reconstructed spectrum (assuming the standard ΛCDM cosmology) is not scale-free but has an infrared cutoff at k∼<5 × 10{sup −4} Mpc{sup −1} (due to the anomalously low CMB quadrupole) and several features with ∼ 2σ significance at k/Mpc{sup −1} ∼ 0.0013–0.0025, 0.0362–0.0402 and 0.051–0.056, reflecting the 'WMAP glitches'. To test whether these are indeed real will require more accurate data, such as from the Planck satellite and new ground-based experiments.

  7. Demonstration of end-to-end cloud-DSL with a PON-based fronthaul supporting 5.76-Gb/s throughput with 48 eCDMA-encoded 1024-QAM discrete multi-tone signals.

    PubMed

    Fang, Liming; Zhou, Lei; Liu, Xiang; Zhang, Xiaofeng; Sui, Meng; Effenberger, Frank; Zhou, Jun

    2015-05-18

    We experimentally demonstrate an end-to-end ultra-broadband cloud-DSL network using passive optical network (PON) based fronthaul with electronic code-division-multiple-access (eCDMA) encoding and decoding. Forty-eight signals that are compliant with the very-high-bit-rate digital subscriber line 2 (VDSL2) standard are transmitted with a record throughput of 5.76 Gb/s over a hybrid link consisting of a 20-km standard single-mode fiber and a 100-m twisted pair.

  8. End-to-end military pain management

    PubMed Central

    Aldington, D. J.; McQuay, H. J.; Moore, R. A.

    2011-01-01

    The last three years have seen significant changes in the Defence Medical Services approach to trauma pain management. This article seeks to outline these changes that have occurred at every level of the casualty's journey along the chain of evacuation, from the point of injury to rehabilitation and either continued employment in the Services or to medical discharge. Particular attention is paid to the evidence for the interventions used for both acute pain and chronic pain management. Also highlighted are possible differences in pain management techniques between civilian and military casualties. PMID:21149362

  9. End-to-end image quality assessment

    NASA Astrophysics Data System (ADS)

    Raventos, Joaquin

    2012-05-01

    An innovative computerized benchmarking approach (US patent pending, Sep 2011), based on extensive application of photometry, geometrical optics, and digital media and using a randomized target, allows a standard observer to assess the image quality of video imaging systems at different daytime and low-light luminance levels. It takes into account the target's contrast and color characteristics, as well as the observer's visual acuity and dynamic response. This includes human vision as part of the "extended video imaging system" (EVIS), and allows image quality assessment by several standard observers simultaneously.

  10. Pilot End-to-End Calibration Results

    NASA Astrophysics Data System (ADS)

    Misawa, R.; Bernard, J.-Ph.; Ade, P.; Andre, Y.; de Bernardis, P.; Bautista, L.; Boulade, O.; Bousquet, F.; Bouzit, M.; Bray, N.; Brysbaert, C.; Buttice, V.; Caillat, A.; Chaigneau, M.; Charra, M.; Crane, B.; Douchin, F.; Doumayrou, E.; Dubois, J. P.; Engel, C.; Etcheto, P.; Evrard, J.; Gelot, P.; Gomes, A.; Grabarnik, S.; Griffin, M.; Hargrave, P.; Jonathan, A.; Laureijs, R.; Laurens, A.; Lepennec, Y.; Leriche, B.; Longval, Y.; Martignac, J.; Marty, C.; Marty, W.; Maestre, S.; Masi, S.; Mirc, F.; Montel, J.; Motier, L.; Mot, B.; Narbonne, J.; Nicot, J. M.; Otrio, G.; Pajot, F.; Perot, E.; Pisano, G.; Ponthieu, N.; Ristorcelli, I.; Rodriquez, L.; Roudil, G.; Saccoccio, M.; Salatino, M.; Savini, G.; Simonella, O.; Tauber, J.; Tapie, P.; Tucker, C.; Versepuech, G.

    2015-09-01

    The Polarized Instrument for Long-wavelength Observation of the Tenuous interstellar medium (PILOT) is a balloon-borne astronomy experiment designed to study the linear polarization of the far infrared emission at 240 μm (1.2 THz) and 550 μm (545 GHz), with an angular resolution of a few minutes of arc, from dust grains present in the diffuse interstellar medium, in our Galaxy and nearby galaxies. The polarisation of light is measured using a half-wave plate (HWP). We performed the instrumental tests from 2012 to 2014 and are planning a first scientific flight in September 2015 from Timmins, Ontario, Canada. This paper describes the measurement principles of PILOT, the results of the laboratory tests and its sky coverage. These include defocus tests, transmission measurements using a Fourier Transform Spectrometer at various positions of the HWP, and identification of internal straylight.

  11. A comparative study of red and blue light-emitting diodes and low-level laser in regeneration of the transected sciatic nerve after an end to end neurorrhaphy in rabbits.

    PubMed

    Takhtfooladi, Mohammad Ashrafzadeh; Sharifi, Davood

    2015-12-01

    This study aimed at evaluating the effects of red and blue light-emitting diodes (LED) and low-level laser (LLL) on the regeneration of the transected sciatic nerve after an end-to-end neurorrhaphy in rabbits. Forty healthy mature male New Zealand rabbits were randomly assigned to four experimental groups: control, LLL (680 nm), red LED (650 nm), and blue LED (450 nm). All animals underwent right sciatic nerve neurotmesis injury under general anesthesia, followed by end-to-end anastomosis. Phototherapy was initiated on the first postoperative day and lasted for 14 consecutive days, at the same time of day. On the 30th day post-surgery, the animals were euthanized and their sciatic nerves harvested for histopathological analysis. The nerves were analyzed and the following findings quantified: Schwann cells, large myelinic axons, and neurons. In the LLL group, compared to the other groups, a statistically significant increase (P < 0.05) in all analyzed parameters was observed. This finding suggests that postoperative LLL irradiation was able to accelerate and potentiate the peripheral nerve regeneration process in rabbits within 14 days of irradiation.

  12. Generic inference of inflation models by non-Gaussianity and primordial power spectrum reconstruction

    SciTech Connect

    Dorn, Sebastian; Enßlin, Torsten A.; Ramirez, Erandy; Kunze, Kerstin E.

    2014-06-01

    We present a generic inference method for inflation models from observational data using higher-order statistics of the curvature perturbation on uniform density hypersurfaces. This method is based on the calculation of the posterior for the primordial non-Gaussianity parameters f{sub NL} and g{sub NL}, which in general depend on specific parameters of inflation and reheating models, and enables discrimination among the still viable inflation models. To preserve analyticity as far as possible and to dispense with numerically expensive sampling techniques, a saddle-point approximation is introduced, whose precision is validated for a numerical toy example. The mathematical formulation is done in a generic way so that the approach remains applicable to cosmic microwave background data as well as to large-scale structure data. Additionally, we review a few currently interesting inflation models and present numerical toy examples thereof in two and three dimensions to demonstrate the efficiency of the higher-order statistics method. A second quantity of interest is the primordial power spectrum. Here, we present two Bayesian methods to infer it from observational data, the so-called critical filter and an extension thereof with a smoothness prior, both allowing for non-parametric spectrum reconstruction. These methods are able to reconstruct the spectra of the observed perturbations and the primordial curvature perturbations even in the case of non-Gaussianity and partial sky coverage. We argue that observables like T- and B-modes permit the measurement of both spectra. This also allows one to infer the level of non-Gaussianity generated since inflation.

  13. Performance of fluorescence retrieval methods and fluorescence spectrum reconstruction under various sensor spectral configurations

    NASA Astrophysics Data System (ADS)

    Li, Rong; Zhao, Feng

    2015-10-01

    Solar-induced chlorophyll fluorescence is closely related to photosynthesis and can serve as an indicator of plant status. Several methods have been proposed to retrieve the fluorescence signal (Fs) either at specific spectral bands or within the whole fluorescence emission region. In this study, we investigated the precision of the fluorescence signal obtained through these methods under various sensor spectral characteristics. Simulated datasets generated by the SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes) model with known 'true' Fs, as well as an experimental dataset, were exploited to investigate four commonly used Fs retrieval methods, namely the original Fraunhofer Line Discriminator method (FLD), the 3-band FLD (3FLD), the improved FLD (iFLD), and the Spectral Fitting Methods (SFMs). The Fluorescence Spectrum Reconstruction (FSR) method was also investigated using the simulated datasets. The sensor characteristics of spectral resolution (SR) and signal-to-noise ratio (SNR) are taken into account. According to the results, finer SR and higher SNR both lead to better accuracy. The lowest precision is obtained for the FLD method, with strong overestimation. Some improvement is made by the 3FLD method, but it still tends to overestimate. Generally, the iFLD method and the SFMs provide better accuracy. As for FSR, the shape and magnitude of the reconstructed Fs are generally consistent with the 'true' Fs distributions when fine SR is exploited. With coarser SR, however, though the R2 of the retrieved Fs may be high, large bias is likely to be obtained as well.
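
    The original FLD retrieval named above solves two linear equations, L = r·E + F, measured at a band inside and a band outside a Fraunhofer absorption line, under the assumption that the reflectance r and fluorescence F are constant across the line (the assumption responsible for FLD's bias). A minimal sketch:

```python
def fld_fluorescence(e_in, e_out, l_in, l_out):
    """Original Fraunhofer Line Discriminator (FLD) retrieval.
    e_in/e_out: solar irradiance inside/outside the absorption line;
    l_in/l_out: measured upwelling radiance at the same two bands.
    Solving L_in = r*E_in + F and L_out = r*E_out + F for F gives
    the classic two-band formula below."""
    return (e_out * l_in - e_in * l_out) / (e_out - e_in)
```

    Eliminating r between the two equations yields F = (E_out·L_in − E_in·L_out)/(E_out − E_in); the 3FLD and iFLD variants relax the constant-r and constant-F assumptions that this formula encodes.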

  14. Efficacy and safety of a NiTi CAR 27 compression ring for end-to-end anastomosis compared with conventional staplers: A real-world analysis in Chinese colorectal cancer patients

    PubMed Central

    Lu, Zhenhai; Peng, Jianhong; Li, Cong; Wang, Fulong; Jiang, Wu; Fan, Wenhua; Lin, Junzhong; Wu, Xiaojun; Wan, Desen; Pan, Zhizhong

    2016-01-01

    OBJECTIVES: This study aimed to evaluate the safety and efficacy of a new nickel-titanium shape memory alloy compression anastomosis ring, NiTi CAR 27, in constructing an anastomosis for colorectal cancer resection compared with conventional staplers. METHODS: In total, 234 consecutive patients diagnosed with colorectal cancer receiving sigmoidectomy and anterior resection for end-to-end anastomosis from May 2010 to June 2012 were retrospectively analyzed. The postoperative clinical parameters, postoperative complications and 3-year overall survival in 77 patients using a NiTi CAR 27 compression ring (CAR group) and 157 patients with conventional circular staplers (STA group) were compared. RESULTS: There were no statistically significant differences between the patients in the two groups in terms of general demographics and tumor features. A clinically apparent anastomotic leak occurred in 2 patients (2.6%) in the CAR group and in 5 patients (3.2%) in the STA group (p=0.804). These eight patients received a temporary diverting ileostomy. One patient (1.3%) in the CAR group was diagnosed with anastomotic stricture through an electronic colonoscopy at 3 months postoperatively. The incidence of postoperative intestinal obstruction was comparable between the two groups (p=0.192). With a median follow-up duration of 39.6 months, the 3-year overall survival rate was 83.1% in the CAR group and 89.0% in the STA group (p=0.152). CONCLUSIONS: NiTi CAR 27 is safe and effective for colorectal end-to-end anastomosis. Its use is equivalent to that of the conventional circular staplers. This study suggests that NiTi CAR 27 may be a beneficial alternative in colorectal anastomosis in Chinese colorectal cancer patients. PMID:27276395

  15. A noise power spectrum study of a new model-based iterative reconstruction system: Veo 3.0.

    PubMed

    Li, Guang; Liu, Xinming; Dodge, Cristina T; Jensen, Corey T; Rong, X John

    2016-01-01

    The purpose of this study was to evaluate the performance of the third generation of the model-based iterative reconstruction (MBIR) system, Veo 3.0, based on noise power spectrum (NPS) analysis with various clinical presets over a wide range of clinically applicable dose levels. A CatPhan 600 surrounded by an oval, fat-equivalent ring to mimic patient size/shape was scanned 10 times at each of six dose levels on a GE HD 750 scanner. NPS analysis was performed on images reconstructed with various Veo 3.0 preset combinations for comparison with images reconstructed using Veo 2.0, filtered back projection (FBP) and adaptive statistical iterative reconstruction (ASiR). The new Target Thickness setting resulted in higher noise in thicker axial images. The new Texture Enhancement function achieved a more isotropic noise behavior with fewer image artifacts. Veo 3.0 provides additional reconstruction options designed to allow the user a choice of balance between spatial resolution and image noise, relative to Veo 2.0. Veo 3.0 provides more user-selectable options and, in general, improved isotropic noise behavior in comparison to Veo 2.0. The overall noise reduction performance of both versions of MBIR was improved in comparison to FBP and ASiR, especially at low dose levels. PMID:27685118
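
    The ensemble NPS estimator underlying this kind of study can be sketched as follows. This is the standard textbook form (mean-detrended ROIs, averaged squared 2D DFT); the detrending order and ROI handling of the actual study may differ, and the function name is illustrative:

```python
import numpy as np

def noise_power_spectrum(rois, pixel_mm):
    """2D noise power spectrum from an ensemble of uniform-region ROIs.
    rois: array of shape (n_rois, ny, nx) containing noise-only patches.
    pixel_mm: pixel size in mm (assumed square pixels).
    Returns the NPS in units of (signal^2 * mm^2); by Parseval's theorem,
    integrating it over spatial frequency recovers the pixel variance."""
    rois = np.asarray(rois, float)
    n, ny, nx = rois.shape
    # zero-order detrending: remove each ROI's mean
    centered = rois - rois.mean(axis=(1, 2), keepdims=True)
    ft = np.fft.fft2(centered)
    return (pixel_mm ** 2 / (nx * ny)) * (np.abs(ft) ** 2).mean(axis=0)
```

    Summing the returned array times the frequency-bin area 1/(nx·Δx) × 1/(ny·Δy) gives back the ROI variance, which is a useful sanity check on the normalization.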

  16. An accurate method for energy spectrum reconstruction of Linac beams based on EPID measurements of scatter radiation

    NASA Astrophysics Data System (ADS)

    Juste, B.; Miró, R.; Verdú, G.; Santos, A.

    2014-06-01

    This work presents a methodology to reconstruct a Linac high energy photon spectrum beam. The method is based on EPID scatter images generated when the incident photon beam impinges onto a plastic block. The distribution of scatter radiation produced by this scattering object placed on the external EPID surface and centered at the beam field size was measured. The scatter distribution was also simulated, with identical geometry, for a series of monoenergetic photon beams. Monte Carlo simulations were used to predict the scattered photons for monoenergetic photon beams at 92 different locations, with 0.5 cm increments and at 8.5 cm from the centre of the scattering material. Measurements were performed with the same geometry using a 6 MeV photon beam produced by the linear accelerator. A system of linear equations was generated to combine the polyenergetic EPID measurements with the monoenergetic simulation results. Regularization techniques were applied to solve the system for the incident photon spectrum. A linear matrix system, A×S=E, was developed to describe the scattering interactions and their relationship to the primary spectrum (S). A is the monoenergetic scatter matrix determined from the Monte Carlo simulations, S is the incident photon spectrum, and E represents the scatter distribution characterized by EPID measurement. Direct matrix inversion methods produce results that are not physically consistent due to errors inherent in the system; therefore, Tikhonov regularization methods were applied to address the effects of these errors and to solve the system for a consistent bremsstrahlung spectrum.
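The Tikhonov step described above can be sketched numerically. The example below solves a small synthetic A×S=E system (the kernel, energy grid, spectrum shape and noise level are invented for illustration, not taken from the paper) and shows why regularization is needed when A is ill-conditioned:

```python
import numpy as np

def tikhonov_solve(A, E, lam):
    """Solve min ||A S - E||^2 + lam^2 ||S||^2 via the normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ E)

rng = np.random.default_rng(0)
energies = np.linspace(0.5, 6.0, 12)                       # illustrative energy bins
# A smooth (and therefore ill-conditioned) monoenergetic response kernel
A = np.exp(-0.5 * (energies[:, None] - energies[None, :]) ** 2)
S_true = np.exp(-((energies - 2.0) ** 2))                  # assumed spectrum shape
E = A @ S_true + 1e-3 * rng.standard_normal(energies.size) # noisy "measurement"

S_naive = np.linalg.solve(A, E)          # direct inversion amplifies the noise
S_reg = tikhonov_solve(A, E, lam=0.05)   # regularized estimate stays physical
```

Here `lam` trades data fidelity against smoothness; in practice it would be chosen by an L-curve or discrepancy criterion rather than fixed by hand.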

  17. Multi-step damped multichannel singular spectrum analysis for simultaneous reconstruction and denoising of 3D seismic data

    NASA Astrophysics Data System (ADS)

    Zhang, Dong; Chen, Yangkang; Huang, Weilin; Gan, Shuwei

    2016-10-01

    Multichannel singular spectrum analysis (MSSA) is an effective approach for simultaneous seismic data reconstruction and denoising. MSSA utilizes truncated singular value decomposition (TSVD) to decompose the noisy signal into a signal subspace and a noise subspace, and a weighted projection-onto-convex-sets (POCS)-like method to reconstruct the missing data in the appropriately constructed block Hankel matrix at each frequency slice. However, some residual noise remains in the signal space due to two major factors: the deficiency of traditional TSVD and the observed noisy data iteratively re-inserted during the weighted POCS-like iterations. In this paper, we first extend the recently proposed damped MSSA (DMSSA) for random noise attenuation, which is more powerful in distinguishing between signal and noise, to simultaneous reconstruction and denoising. Then, combined with DMSSA, we propose a multi-step strategy, named multi-step damped MSSA (MS-DMSSA), to efficiently reduce the noise inserted during the POCS-like iterations and thus improve the final performance of simultaneous reconstruction and denoising. Application of the MS-DMSSA approach to 3D synthetic and field seismic data demonstrates better performance compared with the conventional MSSA approach.
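The damping idea can be illustrated with a toy rank-reduction example. The sketch below compares plain truncated SVD with a damped variant in which each kept singular value is shrunk by a factor involving the first discarded one; the specific damping form, the rank, and the data are illustrative assumptions, not the authors' exact DMSSA operator:

```python
import numpy as np

def tsvd(X, K):
    """Plain truncated SVD: keep the K largest singular components."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :K] * s[:K]) @ Vt[:K]

def damped_tsvd(X, K, N=4):
    """Damped truncated SVD: shrink each kept singular value by a factor
    built from the first discarded one (N is the damping exponent)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    damp = 1.0 - (s[K] / s[:K]) ** N          # assumed damping form
    return (U[:, :K] * (s[:K] * damp)) @ Vt[:K]

rng = np.random.default_rng(1)
# An exactly rank-2 "signal" matrix plus random noise
t = np.arange(60)
signal = np.outer(np.cos(0.2 * t), np.cos(0.3 * t[:40])) \
       + 0.5 * np.outer(np.sin(0.05 * t), np.sin(0.11 * t[:40]))
noisy = signal + 0.2 * rng.standard_normal(signal.shape)

err_plain = np.linalg.norm(tsvd(noisy, 2) - signal)
err_damped = np.linalg.norm(damped_tsvd(noisy, 2) - signal)
```

Both estimates remove most of the noise energy; the damping additionally attenuates the noise that leaks into the retained singular components, which is the residual the MS-DMSSA strategy targets.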

  18. The Dosimetric Importance of Six Degree of Freedom Couch End to End Quality Assurance for SRS/SBRT Treatments when Comparing Intensity Modulated Radiation Therapy to Volumetric Modulated Arc Therapy

    NASA Astrophysics Data System (ADS)

    Ulizio, Vincent Michael

    With the advancement of technology there is an increasing ability to treat lesions with higher radiation doses per fraction, which also allows for treatments with fewer fractions. Because the patient receives a higher dose of radiation per fraction, and because of the fast dose falloff in these targets, there must be extreme accuracy in the delivery. The 6 DOF couch allows for extra rotational corrections and a more accurate set-up. The movement of the couch needs to be verified to be accurate, and because of this, end-to-end quality assurance tests for the couch have been made. Once the set-up is known to be accurate, different treatment techniques can be studied. SBRT of the spine has a very fast dose falloff near the spinal cord and was typically treated with IMRT. Treatment plans generated using this technique tend to have streaks of low-dose radiation, so VMAT is being studied to determine whether this treatment technique can reduce the low-dose radiation volume as well as improve OAR sparing. For the 6 DOF couch QA, graph paper is placed on the anterior and right lateral sides of the VisionRT OSMS Cube Phantom. Each rotational shift is then applied individually, with a 3 degree shift in the positive and negative directions for pitch and roll. A mark is drawn on the paper to record each shift. A CBCT is then taken of the Cube, known shifts are applied, and an additional CBCT is taken to return the Cube to isocenter. The original IMRT plans for SBRT of the spine are evaluated and then a plan is made utilizing VMAT. These plans are then compared for low-dose radiation, OAR sparing, and conformity. If the original IMRT plan is determined to be inferior to what is acceptable, then it will be re-planned and compared to the VMAT plan. The 6 DOF couch QA tests have proven to be accurate and reproducible. The average deviations in the 3 degree and -3 degree pitch and roll directions were 0.197, 0.068, 0.091, and 0.110 degrees

  19. Study and Implementation of the End-to-End Data Pipeline for the VIRTIS Imaging Spectrometer Onboard Venus Express: "From Science Operations Planning to Data Archiving and Higher Level Processing"

    NASA Astrophysics Data System (ADS)

    Cardesín Moinelo, Alejandro

    2010-04-01

    This PhD Thesis describes the activities performed during the Research Program undertaken for two years at the Istituto Nazionale di AstroFisica in Rome, Italy, as an active member of the VIRTIS Technical and Scientific Team, and one additional year at the European Space Astronomy Center in Madrid, Spain, as a member of the Mars Express Science Ground Segment. This document presents a study of all sections of the Science Ground Segment of the Venus Express mission, from the planning of the scientific operations to the generation, calibration and archiving of the science data, including the production of valuable high level products. We present and discuss the end-to-end diagram of the ground segment from the technical and scientific point of view, in order to describe the overall flow of information: from the original scientific requests of the principal investigator and interdisciplinary teams, up to the spacecraft, and down again for the analysis of the measurements and interpretation of the scientific results. These scientific results lead to new and more elaborate scientific requests, which are used as feedback to the planning cycle, closing the circle. Special attention is given to describing the implementation and development of the data pipeline for the VIRTIS instrument onboard Venus Express. During the research program, both the raw data generation pipeline and the data calibration pipeline were developed and automated in order to produce the final raw and calibrated data products from the input telemetry of the instrument. The final raw and calibrated products presented in this work are currently being used by the VIRTIS Science team for data analysis and are distributed to the whole scientific community via the Planetary Science Archive. More than 20,000 raw data files and 10,000 calibrated products have already been generated after almost 4 years of mission. In the final part of the Thesis, we will also present some high level data

  20. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    SciTech Connect

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM) from the buried nuclear device to the seismic sensor. The goal of this work is to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of a UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNEs in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak shear

  1. Reconstruction of the energy spectrum of electrons accelerated in the April 15, 2002 solar flare based on IRIS X-ray spectrometer measurements

    NASA Astrophysics Data System (ADS)

    Motorina, G. G.; Kudryavtsev, I. V.; Lazutkov, V. P.; Savchenko, M. I.; Skorodumov, D. V.; Charikov, Yu. E.

    2016-04-01

    We reconstruct the energy distribution of electrons accelerated in the April 15, 2002 solar flare on the basis of the data from the IRIS X-ray spectrometer onboard the CORONAS-F satellite. We obtain the solution to the integral equations describing the transformation of the spectrum of X-ray photons during the recording and reconstruction of the spectrum of accelerated electrons in the bremsstrahlung source using the random search method and the Tikhonov regularization method. In this event, we detected a singularity in the electron spectrum associated with the existence of a local minimum in the energy range 40-60 keV, which cannot be detected by a direct method.

  2. Pancreatectomy with vein reconstruction: technique matters

    PubMed Central

    Dua, Monica M; Tran, Thuy B; Klausner, Jill; Hwa, Kim J; Poultsides, George A; Norton, Jeffrey A; Visser, Brendan C

    2015-01-01

    Background A variety of techniques have been described for portal vein (PV) and/or superior mesenteric vein (SMV) resection/reconstruction during a pancreatectomy. The ideal strategy remains unclear. Methods Patients who underwent PV/SMV resection/reconstruction during a pancreatectomy from 2005 to 2014 were identified. Medical records and imaging were retrospectively reviewed for operative details and outcomes, with particular emphasis on patency. Results Ninety patients underwent vein resection/reconstruction with one of five techniques: (i) longitudinal venorrhaphy (LV, n = 17); (ii) transverse venorrhaphy (TV, n = 9); (iii) primary end-to-end (n = 28); (iv) patch venoplasty (PV, n = 17); and (v) interposition graft (IG, n = 19). With a median follow-up of 316 days, thrombosis was observed in 16/90 (18%). The rate of thrombosis varied according to technique. All patients with primary end-to-end or TV remained patent. LV, PV and IG were all associated with significant rates of thrombosis (P = 0.001 versus no thrombosis). Comparing thrombosed to patent, there were no differences with respect to pancreatectomy type, pre-operative knowledge of vein involvement and neoadjuvant therapy. Prophylactic aspirin was used in 69% of the total cohort (66% of patent, 81% of thrombosed) and showed no protective benefit. Conclusions Primary end-to-end and TV had better patency than the alternatives after PV/SMV resection and should be the preferred techniques for short (<3 cm) reconstructions. PMID:26223388

  3. End-to-End Performance Management for Large Distributed Storage

    SciTech Connect

    Almadena Chtchelkanova

    2012-03-18

    Storage systems for large distributed clusters of computer servers are themselves large and distributed. Their complexity and scale make it hard to ensure that applications using them get good, predictable performance. At the same time, shared access to the system from multiple applications, users, and internal system activities leads to a need for predictable performance. This research investigates mechanisms for improving storage system performance in large distributed storage systems by integrating the performance aspects of the path that I/O operations take through the system, from the application interface on the compute server, through the network, to the storage servers. The research focuses on five parts of the I/O path in a distributed storage system: I/O scheduling at the storage server, storage server cache management, client-to-server network flow control, client-to-server connection management, and client cache management.

  4. End-to-end experiment management in HPC

    SciTech Connect

    Bent, John M; Kroiss, Ryan R; Torrez, Alfred; Wingate, Meghan

    2010-01-01

    Experiment management in any domain is challenging. There is a perpetual feedback loop cycling through planning, execution, measurement, and analysis. The lifetime of a particular experiment can be limited to a single cycle although many require myriad more cycles before definite results can be obtained. Within each cycle, a large number of subexperiments may be executed in order to measure the effects of one or more independent variables. Experiment management in high performance computing (HPC) follows this general pattern but also has three unique characteristics. One, computational science applications running on large supercomputers must deal with frequent platform failures which can interrupt, perturb, or terminate running experiments. Two, these applications typically run in parallel using MPI as their communication medium. Three, there is typically a scheduling system (e.g. Condor, Moab, SGE, etc.) acting as a gate-keeper for the HPC resources. In this paper, we introduce LANL Experiment Management (LEM), an experiment management framework simplifying all four phases of experiment management. LEM simplifies experiment planning by allowing the user to describe their experimental goals without having to fully construct the individual parameters for each task. To simplify execution, LEM dispatches the subexperiments itself, thereby freeing the user from remembering the often arcane methods for interacting with the various scheduling systems. LEM provides transducers for experiments that automatically measure and record important information about each subexperiment; these transducers can easily be extended to collect additional measurements specific to each experiment. Finally, experiment analysis is simplified by providing a general database visualization framework that allows users to quickly and easily interact with their measured data.

  5. On Estimating End-to-End Network Path Properties

    NASA Technical Reports Server (NTRS)

    Allman, Mark; Paxson, Vern

    1999-01-01

    The more information about current network conditions available to a transport protocol, the more efficiently it can use the network to transfer its data. In networks such as the Internet, the transport protocol must often form its own estimates of network properties based on measurements performed by the connection endpoints. We consider two basic transport estimation problems: determining the setting of the retransmission timer (RTO) for a reliable protocol, and estimating the bandwidth available to a connection as it begins. We look at both of these problems in the context of TCP, using a large TCP measurement set [Pax97b] for trace-driven simulations. For RTO estimation, we evaluate a number of different algorithms, finding that the performance of the estimators is dominated by their minimum values, and to a lesser extent, the timer granularity, while being virtually unaffected by how often round-trip time measurements are made or the settings of the parameters in the exponentially-weighted moving average estimators commonly used. For bandwidth estimation, we explore techniques previously sketched in the literature [Hoe96, AD98] and find that in practice they perform less well than anticipated. We then develop a receiver-side algorithm that performs significantly better.
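For context, the baseline RTO estimator that such studies evaluate variants of is the classic exponentially-weighted moving average scheme (Jacobson's algorithm, later codified in RFC 6298). A minimal sketch follows; the constants are the conventional defaults and the closure-based API is invented for illustration:

```python
# Classic EWMA retransmission-timeout estimator (after Jacobson / RFC 6298).
# The paper's finding that estimator performance is dominated by the minimum
# RTO shows up here as the floor applied in the final max().

def make_rto_estimator(alpha=1/8, beta=1/4, min_rto=1.0, granularity=0.0):
    srtt = None
    rttvar = None
    def update(rtt_sample):
        nonlocal srtt, rttvar
        if srtt is None:                  # first measurement initializes state
            srtt = rtt_sample
            rttvar = rtt_sample / 2
        else:
            rttvar = (1 - beta) * rttvar + beta * abs(srtt - rtt_sample)
            srtt = (1 - alpha) * srtt + alpha * rtt_sample
        rto = srtt + max(granularity, 4 * rttvar)
        return max(min_rto, rto)
    return update

update = make_rto_estimator(min_rto=1.0)
for rtt in [0.10, 0.12, 0.11, 0.30, 0.10]:
    rto = update(rtt)
# With RTTs this small, the 1 s minimum dominates every returned RTO,
# which is exactly the behavior the paper highlights.
```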

  6. Going End to End to Deliver High-Speed Data

    NASA Technical Reports Server (NTRS)

    2005-01-01

    By the end of the 1990s, the optical fiber "backbone" of the telecommunication and data-communication networks had evolved from megabits-per-second transmission rates to gigabits-per-second transmission rates. Despite this boom in bandwidth, however, users at the end nodes were still not being reached on a consistent basis. (An end node is any device that does not behave like a router or a managed hub or switch. Examples of end node objects are computers, printers, serial interface processor phones, and unmanaged hubs and switches.) The primary reason that prevents bandwidth from reaching the end nodes is the complex local network topology that exists between the optical backbone and the end nodes. This complex network topology consists of several layers of routing and switch equipment which introduce potential congestion points and network latency. By breaking down the complex network topology, a true optical connection can be achieved. Access Optical Networks, Inc., is making this connection a reality with guidance from NASA's nondestructive evaluation experts.

  7. Kepler Mission: End-to-End System Demonstration

    NASA Technical Reports Server (NTRS)

    Borucki, William; Koch, D.; Dunham, E.; Jenkins, J.; Witteborn, F.; Updike, T.; DeVincenzi, Donald L. (Technical Monitor)

    2000-01-01

    A test facility has been constructed to demonstrate the capability of differential ensemble photometry to detect transits of Earth-size planets orbiting solar-like stars. The main objective is to determine the effects of various noise sources on the capability of a CCD photometer to maintain a system relative precision of 1 x 10^-5 for mv = 12 stars in the presence of system-induced noise sources. The facility includes a simulated star field, fast optics to simulate the telescope, a thinned back-illuminated CCD similar to those to be used on the spacecraft and computers to perform the onboard control, data processing and extraction. The test structure is thermally and mechanically isolated so that each source of noise can be introduced in a controlled fashion and evaluated for its contribution to the total noise budget. The effects of pointing errors or a changing thermal environment are imposed by piezo-electric devices. Transits are injected by heating small wires crossing apertures in the star plate. Signals as small as those from terrestrial-size transits of solar-like stars are introduced to demonstrate that such planets can be detected under realistic noise conditions. Examples of imposing several noise sources and the resulting detectabilities are presented. These show that a CCD photometer using a differential ensemble photometry approach can readily detect signals associated with Earth-size transits.
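The ensemble idea can be sketched in a few lines: dividing a target star's flux by the mean of many comparison stars cancels the common-mode systematics, so a transit-scale dip becomes measurable. All fluxes, depths and noise levels below are synthetic illustrations, not Kepler testbed values:

```python
import numpy as np

rng = np.random.default_rng(5)
n_stars, n_cad = 50, 500
common = 1.0 + 0.01 * rng.standard_normal(n_cad)     # shared systematic trend
flux = common[None, :] * (1.0 + 1e-4 * rng.standard_normal((n_stars, n_cad)))

depth, in_transit = 2e-4, slice(200, 240)            # injected transit on star 0
flux[0, in_transit] *= 1.0 - depth

# Differential ensemble photometry: normalize the target by the ensemble mean
# of the comparison stars, cancelling the common-mode noise.
ensemble = flux[1:].mean(axis=0)
detrended = flux[0] / ensemble

out = np.r_[detrended[:200], detrended[240:]]
dip = detrended[in_transit].mean() - out.mean()      # ~ -depth after detrending
```

Before detrending the 1% common-mode variations swamp the 200 ppm transit; after dividing by the ensemble only the ~100 ppm per-star scatter remains and the dip stands out.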

  8. Indian Remote Sensing Satellite - End-to-end data systems

    NASA Astrophysics Data System (ADS)

    Jayaraman, V.; Rajangam, R. K.

    The first satellite (IRS-1A) of the Indian Remote Sensing Satellite program, which was established for the purpose of management of the renewable and nonrenewable resources in India, will be launched by the end of 1987 and placed in polar sun-synchronous orbit at an altitude of 904 km. The IRS-1A payload consists of a set of CCD cameras designed to produce imagery in the visible and near-IR bands; the data will be transmitted to ground both in X band and S band. The spacecraft control; the reception, recording, and processing of data; and data-product dissemination and archiving will be achieved by the ground control system. This paper describes the overall IRS-1A system, with particular attention given to data systems, tracing the process of data flow from acquisition to distribution. Diagrams illustrating the make-up of the IRS-1A mission and the procedures of data acquisition and processing are presented together with the encoding and decoding algorithms.

  9. SU-F-18C-02: Evaluations of the Noise Power Spectrum of a CT Iterative Reconstruction Technique for Radiation Therapy

    SciTech Connect

    Dolly, S; Chen, H; Anastasio, M; Mutic, S; Li, H

    2014-06-15

    Purpose: To quantitatively assess the noise power spectrum (NPS) of the new, commercially released CT iterative reconstruction technique, iDose4 from Philips, to compare it with filtered back-projection techniques (FBP), and to provide clinical practice suggestions for radiation therapy. Methods: A uniform phantom was CT imaged with 120kVp tube potential over a range of mAs (250-3333). The image sets were reconstructed using two reconstruction algorithms (FBP and iDose4 with noise reduction levels 1, 3, and 6) and three reconstruction filters (standard B, smooth A, and sharp C), after which NPS variations were analyzed and compared on region of interest (ROI) sizes (16×16 to 128×128 pixels), ROI radii (0–65 mm), reconstruction algorithms, reconstruction filters, and mAs. Results: The NPS magnitude and shape depended considerably on ROI size and location for both reconstruction algorithms. Regional noise variance became more stationary as ROI size decreased, minimizing NPS artifacts. The optimal 32×32-pixel ROI size balanced the trade-off between stationary noise and adequate sampling. NPS artifacts were greatest at the center of reconstruction space and decreased with increasing ROI distance from the center. The optimal ROI position was located near the phantom's radial midpoint (∼40mm). For sharper filters, the NPS magnitude and the maximum magnitude frequency increased. Higher dose scans yielded lower NPS magnitudes for both reconstruction algorithms and all filters. Compared to FBP, the iDose4 algorithm reduced the NPS magnitude while preferentially reducing noise at mid-range spatial frequencies, altering noise texture. This reduction was more significant with increasing iDose4 noise reduction level. Conclusion: Compared to pixel standard deviation, NPS has greater clinical potential for task-based image quality assessment, describing both the magnitude and spatial frequency characteristics of image noise. While iDose4
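The NPS itself is computed from an ensemble of mean-subtracted ROIs. A minimal sketch of the standard estimator follows; white-noise arrays stand in for ROIs extracted from uniform-phantom scans, and the pixel size and noise level are invented:

```python
import numpy as np

def nps_2d(rois, pixel_mm=0.5):
    """Ensemble noise-power-spectrum estimate:
    NPS = (dx*dy / (Nx*Ny)) * < |DFT2(roi - mean)|^2 > over the ROI ensemble."""
    rois = np.asarray(rois, dtype=float)
    n_roi, ny, nx = rois.shape
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
    ps = np.abs(np.fft.fft2(detrended)) ** 2      # fft2 acts on the last two axes
    return (pixel_mm**2 / (nx * ny)) * ps.mean(axis=0)

rng = np.random.default_rng(2)
sigma = 10.0                                       # HU, assumed noise level
rois = rng.normal(0.0, sigma, size=(200, 32, 32))  # 200 ROIs of 32x32 pixels
nps = nps_2d(rois, pixel_mm=0.5)

# Parseval sanity check: integrating the NPS over frequency recovers the
# pixel variance (frequency bin width is 1/(N*dx) in each direction).
var_from_nps = nps.sum() / (32 * 0.5 * 32 * 0.5)
```

For white noise the NPS is flat; for real CT reconstructions its shape reveals the filter- and algorithm-dependent noise texture discussed in the abstract.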

  10. Reconstruction of Rayleigh-Lamb dispersion spectrum based on noise obtained from an air-jet forcing.

    PubMed

    Larose, Eric; Roux, Philippe; Campillo, Michel

    2007-12-01

    The time-domain cross correlation of incoherent and random noise recorded by a series of passive sensors contains the impulse response of the medium between these sensors. By using noise generated by a can of compressed air sprayed on the surface of a plexiglass plate, we are able to reconstruct not only the time of flight but also the whole waveforms between the sensors. From the reconstruction of the direct A(0) and S(0) waves, we derive the dispersion curves of the flexural waves, thus estimating the mechanical properties of the material without a conventional electromechanical source. The dense array of receivers employed here allows a precise frequency-wavenumber study of flexural waves, along with a thorough evaluation of the rate of convergence of the correlation with respect to the record length, the frequency, and the distance between the receivers. The reconstruction of the actual amplitude and attenuation of the impulse response is also addressed in this paper.
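The core identity, that the cross correlation of noise records at two sensors peaks at the travel time between them, can be illustrated in one dimension. The sampling rate, delay and record length below are invented for the sketch:

```python
import numpy as np

fs = 1000.0                         # Hz, assumed sampling rate
delay = 25                          # propagation delay between sensors, in samples
rng = np.random.default_rng(3)
noise = rng.standard_normal(20000)  # broadband ambient excitation

rec_a = noise[delay:]               # sensor A, nearer the excitation
rec_b = noise[:-delay]              # sensor B sees the same field `delay` samples later

# The time-domain cross correlation of the two records peaks at the travel time.
corr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-len(rec_a) + 1, len(rec_a))
travel_time = lags[np.argmax(corr)] / fs    # seconds
```

With a single white source this recovers the time of flight only; reconstructing the full waveform and its dispersion, as in the paper, requires averaging correlations over a diffuse distribution of sources.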

  11. Cathodoluminescence Spectrum Imaging Software

    2011-04-07

    The software developed for spectrum imaging is applied to the analysis of the spectrum series generated by our cathodoluminescence instrumentation. This software provides advanced processing capabilities such as: reconstruction of photon intensity (resolved in energy) and photon energy maps, extraction of the spectrum from selected areas, quantitative imaging mode, pixel-to-pixel correlation spectrum line scans, ASCII output, filling routines, drift correction, etc.

  12. Spectrum reconstruction using relative-deviation-based kernel regression in temporally and spatially modulated Fourier transform imaging spectrometer.

    PubMed

    Huang, Fengzhen; Yuan, Yan; Li, Jingzhen; Cao, Jun

    2015-08-01

    During the temporally and spatially modulated Fourier transform imaging spectrometer push-broom scanning process, the motion state of the spectrometer platform can vary. Thus, the target interferogram obtained from the image sequence deviates from the ideal interferogram obtained under high platform stability, and the recovered target spectrum will not reflect the true target characteristics. We adopted target tracking to acquire the target position in the image sequence, the proposed relative-deviation-based kernel regression to determine the target intensities, and the nonuniform fast Fourier transform algorithm to recover the spectrum. We tested our algorithm on simulated and experimentally obtained aerial images and, from comparison with accurate spectrograms, demonstrate the effectiveness of the proposed method.

  13. Reconstruction of the radionuclide spectrum of liquid radioactive waste released into the Techa river in 1949-1951.

    PubMed

    Mokrov, Yuri G

    2003-04-01

    Most of the liquid radioactive waste released by the Mayak Production Association (PA) radiochemical plant into the Techa river was discharged in 1949-1951, but information exists on only a single radiochemical analysis, of a water sample taken on 24 and 25 September 1951. These data are here used to assess the spectrum of radionuclides that were released between 1949 and 1951. For this purpose, details of the radiochemical methods of radionuclide extraction and radiometric measurements of beta-activity used at Mayak PA in the 1950s have been taken into account. It is concluded that the data from the radiochemical measurements agree with the theoretical composition of fission products in uranium after exposure times in the reactor (120 days) and subsequent hold times (35 days) that were typical of the procedures at that time. The results of the analysis are at variance with assumptions that underlie the current Techa river dosimetry system. They confirm the conclusion that the external doses to the Techa river residents in the critical period up to 1952 were predominantly due to short-lived fission products.

  14. Channeled spectropolarimetry using iterative reconstruction

    NASA Astrophysics Data System (ADS)

    Lee, Dennis J.; LaCasse, Charles F.; Craven, Julia M.

    2016-05-01

    Channeled spectropolarimeters (CSP) measure the polarization state of light as a function of wavelength. Conventional Fourier reconstruction suffers from noise, assumes the channels are band-limited, and requires uniformly spaced samples. To address these problems, we propose an iterative reconstruction algorithm. We develop a mathematical model of CSP measurements and minimize a cost function based on this model. We simulate a measured spectrum using example Stokes parameters, from which we compare conventional Fourier reconstruction and iterative reconstruction. Importantly, our iterative approach can reconstruct signals that contain more bandwidth, an advancement over Fourier reconstruction. Our results also show that iterative reconstruction mitigates noise effects, processes non-uniformly spaced samples without interpolation, and more faithfully recovers the ground truth Stokes parameters. This work offers a significant improvement to Fourier reconstruction for channeled spectropolarimetry.
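A model-based iterative reconstruction of this kind can be sketched with a Landweber iteration on a generic linear measurement model y = Hx, which, like the authors' approach, handles non-uniformly spaced samples without interpolation. The forward model, sizes and noise below are illustrative assumptions, not the paper's CSP model:

```python
import numpy as np

def landweber(H, y, n_iter=500):
    """Iterative least-squares solver: x <- x + tau * H^T (y - H x),
    with step size set from the spectral norm of H."""
    tau = 1.0 / np.linalg.norm(H, 2) ** 2
    x = np.zeros(H.shape[1])
    for _ in range(n_iter):
        x = x + tau * (H.T @ (y - H @ x))
    return x

rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0, 1, 80))        # non-uniform sample positions
k = np.arange(6)                          # low-order cosine coefficients to recover
H = np.cos(2 * np.pi * t[:, None] * k[None, :])   # measurement operator
x_true = np.array([1.0, 0.5, 0.0, 0.25, 0.0, 0.1])
y = H @ x_true + 0.01 * rng.standard_normal(t.size)

x_hat = landweber(H, y)                   # converges to the least-squares solution
```

A conventional FFT-based reconstruction would require resampling `y` onto a uniform grid; the model-based iteration uses the measured positions directly, and the cost function can be extended with regularizers or bandwidth constraints as the paper describes.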

  15. The Kepler End-to-End Data Pipeline: From Photons to Far Away Worlds

    NASA Technical Reports Server (NTRS)

    Cooke, Brian; Thompson, Richard; Standley, Shaun

    2012-01-01

    The Kepler mission is described in overview and the Kepler technique for discovering exoplanets is discussed. The design and implementation of the Kepler spacecraft, tracing the data path from photons entering the telescope aperture through raw observation data transmitted to the ground operations team, is described. The technical challenges of operating a large aperture photometer with an unprecedented 95 million pixel detector are addressed, as well as the onboard technique for processing and reducing the large volume of data produced by the Kepler photometer. The technique and challenge of day-to-day mission operations that result in a very high percentage of time on target is discussed. This includes the day-to-day process for monitoring and managing the health of the spacecraft, the annual process for maintaining sun on the solar arrays while still keeping the telescope pointed at the fixed science target, the process for safely but rapidly returning to science operations after a spacecraft-initiated safing event, and the long-term anomaly resolution process. The ground data processing pipeline, from the point that science data is received on the ground to the presentation of preliminary planetary candidates and supporting data to the science team for further evaluation, is discussed. Ground management, control, exchange and storage of Kepler's large and growing data set is discussed, as well as the process and techniques for removing noise sources and applying calibrations to intermediate data products.

  16. An end-to-end workflow for engineering of biological networks from high-level specifications.

    PubMed

    Beal, Jacob; Weiss, Ron; Densmore, Douglas; Adler, Aaron; Appleton, Evan; Babb, Jonathan; Bhatia, Swapnil; Davidsohn, Noah; Haddock, Traci; Loyall, Joseph; Schantz, Richard; Vasilev, Viktor; Yaman, Fusun

    2012-08-17

    We present a workflow for the design and production of biological networks from high-level program specifications. The workflow is based on a sequence of intermediate models that incrementally translate high-level specifications into DNA samples that implement them. We identify algorithms for translating between adjacent models and implement them as a set of software tools, organized into a four-stage toolchain: Specification, Compilation, Part Assignment, and Assembly. The specification stage begins with a Boolean logic computation specified in the Proto programming language. The compilation stage uses a library of network motifs and cellular platforms, also specified in Proto, to transform the program into an optimized Abstract Genetic Regulatory Network (AGRN) that implements the programmed behavior. The part assignment stage assigns DNA parts to the AGRN, drawing the parts from a database for the target cellular platform, to create a DNA sequence implementing the AGRN. Finally, the assembly stage computes an optimized assembly plan to create the DNA sequence from available part samples, yielding a protocol for producing a sample of engineered plasmids with robotics assistance. Our workflow is the first to automate the production of biological networks from a high-level program specification. Furthermore, the workflow's modular design allows the same program to be realized on different cellular platforms simply by swapping workflow configurations. We validated our workflow by specifying a small-molecule sensor-reporter program and verifying the resulting plasmids in both HEK 293 mammalian cells and in E. coli bacterial cells. PMID:23651286

  17. Moments of the End-to-End Vector of a Chain Molecule, Its Persistence and Distribution

    PubMed Central

    Flory, Paul J.

    1973-01-01

    The persistence vector a is defined as the configurational average of the chain vector r connecting the ends of the molecule and expressed in a reference frame fixed with respect to the first two skeletal bonds. Moments of second and higher orders in the components of r may readily be calculated for real chains in the rotational isomeric state approximation, and from them the corresponding moments of the vector ρ = r - a measured from the terminus of a. Development of the density distribution of ρ about a is proposed as an alternative to the customary treatment of the density distribution of r about r = 0 on the assumption that this latter distribution should be (approximately) symmetric. Past difficulties in the analysis of cyclization equilibria involving rings of moderate size, such as occur in single strands of polynucleotide chains, conceivably may be overcome by adoption of this alternative. PMID:16592094
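The moments discussed above can be illustrated numerically for a much simpler model than the rotational isomeric state chains treated in the paper: a freely-jointed chain of N unit bonds, for which the configurational average of the end-to-end vector r is zero and the second moment satisfies <r²> = N. The Monte Carlo sketch below estimates that moment; it is an illustration of the quantities involved, not the paper's method.

```python
import math
import random

# Monte Carlo moments of the end-to-end vector for a freely-jointed
# chain: N bonds of unit length with independent random directions.
# For this model <r> = 0 and <r^2> = N; the rotational isomeric state
# chains in the paper have a nonzero persistence vector instead.

def random_unit_vector(rng):
    # Uniform direction on the sphere via the standard (z, phi) construction.
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    s = math.sqrt(1.0 - z * z)
    return (s * math.cos(phi), s * math.sin(phi), z)

def end_to_end(n_bonds, rng):
    x = y = z = 0.0
    for _ in range(n_bonds):
        dx, dy, dz = random_unit_vector(rng)
        x += dx; y += dy; z += dz
    return (x, y, z)

rng = random.Random(42)
N, samples = 20, 4000
r2_sum = 0.0
for _ in range(samples):
    x, y, z = end_to_end(N, rng)
    r2_sum += x * x + y * y + z * z
mean_r2 = r2_sum / samples   # should be close to N
```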

  18. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    SciTech Connect

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-01-01

    The advance of the scientific discovery process is accomplished by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on the resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second is based on the Python programming language. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  19. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    DOE PAGESBeta

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-01-01

    The advance of the scientific discovery process is accomplished by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components into these workflows, and to execute different components on the resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second is based on the Python programming language. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  20. Real-time software-based end-to-end wireless visual communications simulation platform

    NASA Astrophysics Data System (ADS)

    Chen, Ting-Chung; Chang, Li-Fung; Wong, Andria H.; Sun, Ming-Ting; Hsing, T. Russell

    1995-04-01

    Wireless channel impairments pose many challenges to real-time visual communications. In this paper, we describe a real-time, software-based wireless visual communications simulation platform which can be used for performance evaluation in real time. This simulation platform consists of two personal computers serving as hosts. The major components of each PC host are a real-time programmable video codec, a wireless channel simulator, and a network interface for data transport between the two hosts. These three components are interfaced in real time to show the interaction of various wireless channels and video coding algorithms. The programmable features of these components allow users to evaluate the performance of user-controlled wireless channel effects without physically carrying out such experiments, which are limited in scope, time-consuming, and costly. Using this simulation platform as a testbed, we have experimented with several wireless channel effects including Rayleigh fading, antenna diversity, channel filtering, symbol timing, modulation, and packet loss.
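Of the channel effects listed above, Rayleigh fading is the simplest to sketch: the fading envelope is the magnitude of a zero-mean complex Gaussian channel tap. The snippet below generates such an envelope, normalized to unit average power; it illustrates one impairment only and is not the authors' simulator.

```python
import math
import random

# Minimal Rayleigh-fading envelope generator: the magnitude of a
# zero-mean complex Gaussian channel tap, normalized so the average
# power E[h^2] is 1. Only a sketch of one impairment from the platform
# described above, not the authors' channel simulator.

def rayleigh_envelope(n, rng):
    env = []
    for _ in range(n):
        i = rng.gauss(0.0, 1.0 / math.sqrt(2.0))
        q = rng.gauss(0.0, 1.0 / math.sqrt(2.0))
        env.append(math.hypot(i, q))
    return env

rng = random.Random(7)
h = rayleigh_envelope(20000, rng)
mean_power = sum(v * v for v in h) / len(h)   # expect ~1.0
# Fraction of samples in a deep fade (instantaneous power < 0.1);
# analytically this is 1 - exp(-0.1) ~ 0.095 for unit mean power.
deep_fade_fraction = sum(1 for v in h if v * v < 0.1) / len(h)
```

Multiplying a transmitted symbol stream by such an envelope (plus additive noise) is the usual starting point for the kind of error-burst behavior a platform like this must reproduce.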

  1. Emergence of Laplace therapeutics: declaring an end to end-stage heart failure.

    PubMed

    Mehra, Mandeep R; Uber, Patricia A

    2002-01-01

    A large number of chronic heart failure patients escape from the benefits of neurohormonal blockade only to transit into a discouragingly miserable state of what the physician often refers to as end-stage heart failure. Conceptually, the designation of end-stage as a description of a clinical scenario implies pessimism concerning recourse to a therapeutic avenue. A variety of surgical therapeutic techniques that take advantage of the law of Laplace, designed to effectively restore the cardiac shape from a spherical, mechanically inefficient pump to a more elliptical, structurally sound organ are now being employed. Additionally, the field of mechanical device implantation is surging ahead at a rapid pace. The weight of evidence regarding mechanical unloading using assist devices suggests that hemodynamic restoration is accompanied by regression of cellular hypertrophy, normalization of the neuroendocrine axis, improved expression of contractile proteins, enhanced cellular respiratory control, and decreases in markers of apoptosis and cellular stress. Thus, these lines of data point toward discarding the notion of end-stage heart failure. We are at a new crossroad in our quest to tackle chronic heart failure. It is our contention that the use of antiremodeling strategies, including device approaches, will soon signal the end of end-stage heart failure.

  2. Assessing the Performance Limits of Internal Coronagraphs Through End-to-End Modeling

    NASA Technical Reports Server (NTRS)

    Krist, John E.; Belikov, Ruslan; Pueyo, Laurent; Mawet, Dimitri P.; Moody, Dwight; Trauger, John T.; Shaklan, Stuart B.

    2013-01-01

    As part of the NASA ROSES Technology Demonstrations for Exoplanet Missions (TDEM) program, we conducted a numerical modeling study of three internal coronagraphs (PIAA, vector vortex, hybrid bandlimited) to understand their behaviors in realistically-aberrated systems with wavefront control (deformable mirrors). This investigation consisted of two milestones: (1) develop wavefront propagation codes appropriate for each coronagraph that are accurate to 1% or better (compared to a reference algorithm) but are also time and memory efficient, and (2) use these codes to determine the wavefront control limits of each architecture. We discuss here how the milestones were met and identify some of the behaviors particular to each coronagraph. The codes developed in this study are being made available for community use. We discuss here results for the HBLC and VVC systems, with PIAA having been discussed in a previous proceeding.

  3. SciBox, an end-to-end automated science planning and commanding system

    NASA Astrophysics Data System (ADS)

    Choo, Teck H.; Murchie, Scott L.; Bedini, Peter D.; Steele, R. Josh; Skura, Joseph P.; Nguyen, Lillian; Nair, Hari; Lucks, Michael; Berman, Alice F.; McGovern, James A.; Turner, F. Scott

    2014-01-01

    SciBox is a new technology for planning and commanding science operations for Earth-orbital and planetary space missions. It has been incrementally developed since 2001 and demonstrated on several spaceflight projects. The technology has matured to the point that it is now being used to plan and command all orbital science operations for the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) mission to Mercury. SciBox encompasses the derivation of observing sequences from science objectives, the scheduling of those sequences, the generation of spacecraft and instrument commands, and the validation of those commands prior to uploading to the spacecraft. Although the process is automated, science and observing requirements are incorporated at each step by a series of rules and parameters to optimize observing opportunities, which are tested and validated through simulation and review. Except for limited special operations and tests, there is no manual scheduling of observations or construction of command sequences. SciBox reduces the lead time for operations planning by shortening the time-consuming coordination process, reduces cost by automating the labor-intensive processes of human-in-the-loop adjudication of observing priorities, reduces operations risk by systematically checking constraints, and maximizes science return by fully evaluating the trade space of observing opportunities to meet MESSENGER science priorities within spacecraft recorder, downlink, scheduling, and orbital-geometry constraints.
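The core SciBox idea above, choosing observations by rules and priorities while systematically checking resource constraints, can be caricatured in a few lines. Everything here (field names, the single recorder-capacity rule, the numbers) is invented for illustration; the real system evaluates a far richer trade space of downlink, scheduling, and orbital-geometry constraints.

```python
# Toy version of rule-driven scheduling: pick observing opportunities
# by priority while checking a resource constraint, loosely analogous
# to SciBox's recorder and downlink limits. All rules and numbers are
# invented for illustration.

def schedule(opportunities, recorder_capacity):
    """opportunities: list of (priority, data_volume, name); higher priority wins."""
    used = 0.0
    plan = []
    for priority, volume, name in sorted(opportunities, reverse=True):
        if used + volume <= recorder_capacity:   # recorder-capacity rule
            used += volume
            plan.append(name)
    return plan, used

opps = [(3, 40.0, "limb_scan"), (5, 70.0, "surface_map"),
        (4, 50.0, "mag_survey"), (1, 10.0, "cal_frame")]
plan, used = schedule(opps, recorder_capacity=120.0)
```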

  4. End-to-End Network QoS via Scheduling of Flexible Resource Reservation Requests

    SciTech Connect

    Sharma, S.; Katramatos, D.; Yu, D.

    2011-11-14

    Modern data-intensive applications move vast amounts of data between multiple locations around the world. To enable predictable and reliable data transfer, next-generation networks allow such applications to reserve network resources for exclusive use. In this paper, we solve an important problem (called SMR3) to accommodate multiple and concurrent network reservation requests between a pair of end-sites. Given the varying availability of bandwidth within the network, our goal is to accommodate as many reservation requests as possible while minimizing the total time needed to complete the data transfers. We first prove that SMR3 is an NP-hard problem. We then solve it by developing a polynomial-time heuristic called RRA. The RRA algorithm hinges on an efficient mechanism that accommodates a large number of requests by minimizing bandwidth wastage. Finally, via numerical results, we show that RRA constructs schedules that accommodate a significantly larger number of requests than other, seemingly efficient, heuristics.
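To make the problem shape concrete, here is a hypothetical greedy admission sketch over a time-slotted link. This is NOT the paper's RRA algorithm; it only illustrates the trade-off RRA navigates, admitting requests under per-slot capacity while favoring those that waste less bandwidth-time.

```python
# Hypothetical greedy admission sketch for bandwidth reservations on a
# time-slotted link. This is NOT the paper's RRA algorithm, only an
# illustration of the problem shape: admit requests within the per-slot
# capacity, favoring requests with a small bandwidth-time footprint.

def admit_requests(requests, capacity, n_slots):
    """requests: list of (bandwidth, start_slot, end_slot) tuples."""
    free = [capacity] * n_slots
    admitted = []
    # Favor small bandwidth-time footprints first.
    for bw, start, end in sorted(requests, key=lambda r: r[0] * (r[2] - r[1])):
        if all(free[s] >= bw for s in range(start, end)):
            for s in range(start, end):
                free[s] -= bw
            admitted.append((bw, start, end))
    return admitted

reqs = [(8, 0, 4), (3, 0, 2), (5, 2, 4), (4, 1, 3)]
accepted = admit_requests(reqs, capacity=10, n_slots=4)
```

On this toy input the big (8, 0, 4) request is rejected while three smaller ones fit, which is precisely the kind of outcome a wastage-minimizing heuristic aims for.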

  5. End-to-end wireless TCP with noncongestion packet loss detection and handling

    NASA Astrophysics Data System (ADS)

    Lee, Jae-Joon; Liu, Fang; Kuo, C.-C. Jay

    2003-07-01

    Traditional TCP performance degrades over lossy links: the TCP sender assumes that packet loss is caused by congestion in the network path and thus reduces the sending rate by cutting the congestion window multiplicatively. A mechanism to overcome this limitation is investigated in this research. Our scheme identifies the network path condition to determine whether or not congestion has occurred, and responds accordingly. The basic idea for separating congestion-caused from non-congestion-caused losses is to compare the estimated current available bandwidth with the average available bandwidth. To minimize the effect of temporary measurement fluctuations, we estimate the available bandwidth with a higher weight on stable measurements and a lower weight on unstable fluctuations. In our scheme, packet loss due to congestion invokes the TCP NewReno procedure. In cases of random loss unrelated to congestion, the multiplicative decrease of the sending rate is avoided to achieve higher throughput. In addition, each duplicate acknowledgement after a fast retransmission increases the congestion window to fully recover the sending rate. Extensive simulation results show that our differentiation algorithm achieves high accuracy. Accordingly, a TCP connection over a lossy link with the proposed scheme provides higher throughput than TCP NewReno.
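The comparison-against-an-average idea can be sketched with a plain exponentially weighted moving average. The weights and threshold below are invented, and the paper additionally weights stable measurements more heavily, so treat this as a hedged illustration of the classification step only.

```python
# Hedged sketch of the loss-differentiation idea: keep a smoothed
# estimate of available bandwidth and, when a loss occurs, call it
# congestion only if the current measurement has dropped well below
# the long-term average. Weights and threshold here are invented;
# the paper also weights stable measurements more heavily.

class LossClassifier:
    def __init__(self, alpha=0.9, threshold=0.8):
        self.alpha = alpha          # EWMA weight on history
        self.threshold = threshold  # fraction of average that signals congestion
        self.avg_bw = None
        self.current = None

    def observe(self, measured_bw):
        if self.avg_bw is None:
            self.avg_bw = measured_bw
        else:
            self.avg_bw = self.alpha * self.avg_bw + (1 - self.alpha) * measured_bw
        self.current = measured_bw

    def classify_loss(self):
        """Return 'congestion' or 'random' for a loss seen now."""
        if self.current < self.threshold * self.avg_bw:
            return "congestion"
        return "random"

clf = LossClassifier()
for bw in [10.0, 10.2, 9.8, 10.1]:
    clf.observe(bw)
verdict_random = clf.classify_loss()      # bandwidth steady: treat loss as random
clf.observe(4.0)                          # sharp drop in available bandwidth
verdict_congestion = clf.classify_loss()  # now treated as congestion
```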

  6. Experiments with Memory-to-Memory Coupling for End-to-End fusion Simulation Workflows

    SciTech Connect

    Docan, Ciprian; Zhang, Fan; Parashar, Manish; Cummings, Julian; Podhorszki, Norbert; Klasky, Scott A

    2010-01-01

    Scientific applications are striving to accurately simulate multiple interacting physical processes that comprise complex phenomena being modeled. Efficient and scalable parallel implementations of these coupled simulations present challenging interaction and coordination requirements, especially when the coupled physical processes are computationally heterogeneous and progress at different speeds. In this paper, we present the design, implementation and evaluation of a memory-to-memory coupling framework for coupled scientific simulations on high-performance parallel computing platforms. The framework is driven by the coupling requirements of the Center for Plasma Edge Simulation, and it provides simple coupling abstractions as well as efficient asynchronous (RDMA-based) memory-to-memory data transport mechanisms that complement existing parallel programming systems and data sharing frameworks. The framework enables flexible coupling behaviors that are asynchronous in time and space, and it supports dynamic coupling between heterogeneous simulation processes without enforcing any synchronization constraints. We evaluate the performance and scalability of the coupling framework using a specific coupling scenario, on the Jaguar Cray XT5 system at Oak Ridge National Laboratory.
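A toy in-process stand-in conveys the coupling abstraction the abstract describes: producers `put` named, versioned data into a shared space and consumers `get` it, with no synchronization between the two beyond the version tag. The real framework does this across nodes with asynchronous RDMA transport; every name below is invented for illustration.

```python
# Toy in-process stand-in for a memory-to-memory coupling abstraction:
# versioned put/get into a shared space, decoupling producer and
# consumer in time. The real framework moves data across nodes over
# RDMA; all names here are hypothetical.

class CouplingSpace:
    def __init__(self):
        self._store = {}

    def put(self, name, version, data):
        self._store[(name, version)] = list(data)

    def get(self, name, version):
        """Return the data, or None if that version was never published."""
        return self._store.get((name, version))

    def latest(self, name):
        versions = [v for (n, v) in self._store if n == name]
        return max(versions) if versions else None

space = CouplingSpace()
space.put("edge_density", 1, [0.1, 0.2])   # written by one simulation code...
space.put("edge_density", 2, [0.3, 0.4])
# ...read later, at its own pace, by a coupled code:
snapshot = space.get("edge_density", space.latest("edge_density"))
```

The design point is that neither side blocks on the other: a slow consumer simply reads an older (or the latest) version, which is the asynchronous-in-time behavior the framework provides.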

  7. End-to-end information system concept for the Mars Telecommunications Orbiter

    NASA Technical Reports Server (NTRS)

    Breidenthal, Julian C.; Edwards, Charles D.; Greenberg, Edward; Kazz, Greg J.; Noreen, Gary K.

    2006-01-01

    The Mars Telecommunications Orbiter (MTO) was intended to provide high-performance deep space relay links to landers, orbiters, sample-return missions, and approaching spacecraft in the vicinity of Mars, to demonstrate interplanetary laser communications, to demonstrate autonomous navigation, and to carry out its own science investigations.

  8. End-to-end information system concept for the Mars Telecommunications Orbiter

    NASA Technical Reports Server (NTRS)

    Breidenthal, Julian C.; Edwards, Charles D.; Greenberg, Edward; Kazz, Greg J.; Noreen, Gary K.

    2006-01-01

    The Mars Telecommunications Orbiter (MTO) was intended to provide high-performance deep space relay links to landers, orbiters, sample-return missions, and approaching spacecraft in the vicinity of Mars, to demonstrate interplanetary laser communications, to demonstrate autonomous navigation, and to carry out its own science investigations.

  9. End-to-end Encryption for SMS Messages in the Health Care Domain.

    PubMed

    Hassinen, Marko; Laitinen, Pertti

    2005-01-01

    The health care domain has a high level of expectation on the security and privacy of patient information. The security, privacy, and confidentiality issues are consistent across the domain. Technical development and the increasing use of mobile phones have led us to a situation in which SMS messages are used in electronic interactions between health care professionals and patients. We show that it is possible to send, receive, and store text messages securely with a mobile phone, with no additional hardware required. More importantly, we show that it is possible to obtain reliable user authentication in systems using text message communication. The Java programming language is used to realize our goals. This paper describes the general application structure, while details of the technical implementation and encryption methods are described in the referenced articles. We also propose some crucial areas where the implementation of encrypted SMS can remedy a previous lack of security.
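The shape of such a scheme, encrypt the short message, then authenticate the ciphertext so tampering is detected, can be sketched with Python's standard library (the paper's implementation is in Java on the phone, and its encryption methods are in the referenced articles). The keystream construction below is for illustration only and is NOT a vetted cipher; a real deployment should use a standard authenticated-encryption scheme.

```python
import hashlib
import hmac

# Toy end-to-end protection for a short text message. Illustrative
# only: the keystream below (SHA-256 in a counter construction) is
# NOT a vetted cipher; real systems should use a standard AEAD.

def _keystream(key, length):
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def protect(key, message):
    data = message.encode("utf-8")
    cipher = bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))
    tag = hmac.new(key, cipher, hashlib.sha256).digest()   # authenticate ciphertext
    return cipher + tag

def unprotect(key, blob):
    cipher, tag = blob[:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, cipher, hashlib.sha256).digest()):
        raise ValueError("message tampered or wrong key")
    data = bytes(a ^ b for a, b in zip(cipher, _keystream(key, len(cipher))))
    return data.decode("utf-8")

key = b"shared-secret-between-doctor-and-patient"   # hypothetical pre-shared key
blob = protect(key, "Lab results ready, please call the clinic.")
recovered = unprotect(key, blob)
```

The MAC-over-ciphertext check is what gives the receiver the tamper detection and (with a per-user key) the user authentication property the abstract emphasizes.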

  10. Privacy in Pharmacogenetics: An End-to-End Case Study of Personalized Warfarin Dosing

    PubMed Central

    Fredrikson, Matthew; Lantz, Eric; Jha, Somesh; Lin, Simon; Page, David; Ristenpart, Thomas

    2014-01-01

    We initiate the study of privacy in pharmacogenetics, wherein machine learning models are used to guide medical treatments based on a patient’s genotype and background. Performing an in-depth case study on privacy in personalized warfarin dosing, we show that suggested models carry privacy risks, in particular because attackers can perform what we call model inversion: an attacker, given the model and some demographic information about a patient, can predict the patient’s genetic markers. As differential privacy (DP) is an oft-proposed solution for medical settings such as this, we evaluate its effectiveness for building private versions of pharmacogenetic models. We show that DP mechanisms prevent our model inversion attacks when the privacy budget is carefully selected. We go on to analyze the impact on utility by performing simulated clinical trials with DP dosing models. We find that for privacy budgets effective at preventing attacks, patients would be exposed to increased risk of stroke, bleeding events, and mortality. We conclude that current DP mechanisms do not simultaneously improve genomic privacy while retaining desirable clinical efficacy, highlighting the need for new mechanisms that should be evaluated in situ using the general methodology introduced by our work. PMID:27077138
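The differential-privacy primitive the paper evaluates can be shown in miniature with the Laplace mechanism: release a statistic plus noise scaled to sensitivity/epsilon. The paper's actual experiments apply DP to learned pharmacogenetic dosing models, not to a single count; this sketch only illustrates the noise-versus-accuracy trade-off that drives its utility findings.

```python
import math
import random

# Minimal Laplace-mechanism sketch: release a count with noise scaled
# to sensitivity/epsilon. Smaller epsilon (stronger privacy) means a
# larger noise scale, which is the privacy/utility tension the paper
# measures for dosing models.

def laplace_noise(scale, rng):
    # Inverse-CDF sampling of the Laplace distribution.
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count, epsilon, rng, sensitivity=1.0):
    return true_count + laplace_noise(sensitivity / epsilon, rng)

rng = random.Random(0)
true_count = 120   # hypothetical: patients carrying a given genetic marker
releases = [private_count(true_count, epsilon=1.0, rng=rng) for _ in range(5000)]
mean_release = sum(releases) / len(releases)   # mechanism is unbiased: near 120
```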

  11. End-to-end observatory software modeling using domain specific languages

    NASA Astrophysics Data System (ADS)

    Filgueira, José M.; Bec, Matthieu; Liu, Ning; Peng, Chien; Soto, José

    2014-07-01

    The Giant Magellan Telescope (GMT) is a 25-meter extremely large telescope that is being built by an international consortium of universities and research institutions. Its software and control system is being developed using a set of Domain Specific Languages (DSL) that supports a model-driven development methodology integrated with an Agile management process. This approach promotes the use of standardized models that capture the component architecture of the system, facilitate the construction of technical specifications in a uniform way, ease communication between developers and domain experts, and provide a framework to ensure the successful integration of the software subsystems developed by the GMT partner institutions.

  12. End-to-end imaging information rate advantages of various alternative communication systems

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1982-01-01

    The efficiency of various deep space communication systems which are required to transmit both imaging and a typically error-sensitive class of data called general science and engineering (gse) data is compared. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an advanced imaging communication system (AICS) which exhibits the rather significant advantages of sophisticated data compression coupled with powerful yet practical channel coding. For example, under certain conditions the improved AICS efficiency could provide as much as two orders of magnitude increase in imaging information rate compared to a single-channel, uncoded, uncompressed system, while maintaining the same gse data rate in both systems. Additional details describing AICS compression and coding concepts, as well as efforts to apply them, are provided in support of the system analysis.

  13. From End to End: tRNA Editing at 5'- and 3'-Terminal Positions

    PubMed Central

    Betat, Heike; Long, Yicheng; Jackman, Jane E.; Mörl, Mario

    2014-01-01

    During maturation, tRNA molecules undergo a series of individual processing steps, ranging from exo- and endonucleolytic trimming reactions at their 5'- and 3'-ends, specific base modifications and intron removal to the addition of the conserved 3'-terminal CCA sequence. Especially in mitochondria, this plethora of processing steps is completed by various editing events, where base identities at internal positions are changed and/or nucleotides at 5'- and 3'-ends are replaced or incorporated. In this review, we will focus predominantly on the latter reactions, where a growing number of cases indicate that these editing events represent a rather frequent and widespread phenomenon. While the mechanistic basis for 5'- and 3'-end editing differs dramatically, both reactions represent an absolute requirement for generating a functional tRNA. Current in vivo and in vitro model systems support a scenario in which these highly specific maturation reactions might have evolved out of ancient promiscuous RNA polymerization or quality control systems. PMID:25535083

  14. End-to-End Flow Control Using PI Controller for Servo Control over Networks

    NASA Astrophysics Data System (ADS)

    Yashiro, Daisuke; Kubo, Ryogo; Yakoh, Takahiro; Ohnishi, Kouhei

    This paper presents a novel flow control method using a PI controller for servo control over networks. UDP is known to be effective for motion control systems over networks, such as bilateral teleoperation. However, UDP has no mechanism for congestion avoidance. Congestion, which causes large communication delay, jitter, and packet loss, deteriorates the performance and stability of control systems over networks. To avoid congestion, a novel flow control method, which adjusts the packet-sending period in real time, is proposed. The validity of the proposed method is shown by simulation and experimental results.
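The control idea, use a PI controller to adjust the packet-sending period so a bottleneck queue settles at a small target, can be sketched against a trivial fluid queue model. The plant model and gains below are invented for illustration (chosen to be critically damped for this linearized plant); the paper derives its controller for networked bilateral servo control, not for this toy queue.

```python
# Sketch of the idea: a PI controller adjusts the packet-sending period
# so a bottleneck queue settles at a small target, avoiding congestion.
# The fluid-queue plant and gains below are invented for illustration;
# they are not the paper's model or tuning.

def simulate(steps=2000, dt=0.01):
    capacity = 100.0    # bottleneck service rate, packets/s
    target_q = 20.0     # desired queue length, packets
    base_rate = 150.0   # open-loop sending-rate guess, packets/s
    kp, ki = 2.0, 1.0   # critically damped for this linearized plant
    q, integ = 0.0, 0.0
    rate = base_rate
    period = 1.0 / rate
    for _ in range(steps):
        error = q - target_q
        integ += error * dt
        rate = base_rate - kp * error - ki * integ   # PI law on queue error
        rate = min(max(rate, 1.0), 500.0)
        period = 1.0 / rate         # the knob an actual sender would use
        q = max(0.0, q + (rate - capacity) * dt)    # fluid queue update
    return q, rate, period

final_q, final_rate, final_period = simulate()
```

At equilibrium the integral term absorbs the rate mismatch, so the sending rate converges to the bottleneck capacity (period 1/100 s) and the queue holds at its target, which is the congestion-avoiding steady state the method seeks.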

  15. Science and Applications Space Platform (SASP) End-to-End Data System Study

    NASA Technical Reports Server (NTRS)

    Crawford, P. R.; Kasulka, L. H.

    1981-01-01

    The capability of present technology and the Tracking and Data Relay Satellite System (TDRSS) to accommodate Science and Applications Space Platform (SASP) payload users' requirements, maximum service to the user through optimization of the SASP Onboard Command and Data Management System, and the ability and availability of new technology to accommodate the evolution of SASP payloads were assessed. Key technology items identified to accommodate payloads on a SASP were onboard storage devices, multiplexers, and onboard data processors. The primary driver is the limited access to TDRSS single-access channels due to sharing with all low Earth orbit spacecraft plus the shuttle. Advantages of onboard data processing include long-term storage of processed data until TDRSS is accessible, thus reducing the loss of data, eliminating large data processing tasks at the ground stations, and providing more timely access to the data.

  16. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  17. Telerobotics and orbital laboratories: an end-to-end analysis and demonstration.

    PubMed

    Konkel, C R; Miller, C F

    1989-10-01

    A preliminary analysis of the United States Laboratory (USL) module of the International Space Station has been completed. A major conclusion was that one of the most limited resources within the USL will be crew time. A laboratory robot would alleviate these constraints, improve safety, and reduce operational costs. A laboratory experiment manipulator system (LEMS) is proposed, made up of an on-board mobile manipulator and a computer-assisted operator control station. The on-board manipulator concept was tested with an Intelledex 660 industrial robot. Operator joystick command capability and delayed video feedback were added to simulate a Space Station Teleoperation system. The implementation of a unique predictive display was chosen for further evaluation because of its promise as a partial solution to the classical problem of robot remote control in the presence of time delay. The incorporation of various correction factors to calibrate the robot predictor model, including geometric distortion and spherical aberration caused by the video optics, is described.

  18. Assessing Natural Product-Drug Interactions: An End-to-End Safety Framework.

    PubMed

    Roe, Amy L; Paine, Mary F; Gurley, Bill J; Brouwer, Kenneth R; Jordan, Scott; Griffiths, James C

    2016-04-01

    The use of natural products (NPs), including herbal medicines and other dietary supplements, by North Americans continues to increase across all age groups. This population has access to conventional medications, with significant polypharmacy observed in older adults. Thus, the safety of interactions between multi-ingredient NPs and drugs is a topic of paramount importance. Considerations such as history of safe use, literature data from animal toxicity and human clinical studies, and NP constituent characterization would provide guidance on whether to assess NP-drug interactions experimentally. The literature is replete with reports of various NP extracts and constituents as potent inhibitors of drug-metabolizing enzymes and transporters. However, without standard methods for NP characterization or in vitro testing, extrapolating these reports to clinically relevant NP-drug interactions is difficult. This lack of a clear definition of risk precludes clinicians and consumers from making informed decisions about the safety of taking NPs with conventional medications. A framework is needed that describes an integrated, robust approach for assessing NP-drug interactions and for translating the data into formulation alterations, dose adjustment, labelling, and/or post-marketing surveillance strategies. A session was held at the 41st Annual Summer Meeting of the Toxicology Forum in Colorado Springs, CO, to highlight the challenges and critical components that should be included in such a framework.

  19. An End-to-End Architecture for Science Goal Driven Observing

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Koratkar, Anuradha; Memarsadeghi, Nargess; Wolf, Karl; Obenschain, Arthur F. (Technical Monitor)

    2002-01-01

    New observatories will have greater on-board storage capacity and on-board processing capabilities. The new bottleneck will be download capacity. The cost of downlink time and limitations of bandwidth will end the era where all exposure data is downloaded and all data processing is performed on the ground. In addition, observing campaigns involving inherently variable targets will need scheduling flexibility to focus observing time and data download on exposures that are scientifically interesting. The ability to quickly recognize and react to such events by re-prioritizing the observing schedule will be an essential characteristic for maximizing scientific returns. It will also be a step towards increasing spacecraft autonomy, a major goal of NASA's strategic plan. The science goal monitoring (SGM) system is a proof-of-concept effort to address these challenges. We are developing an interactive distributed system that will use on-board processing and storage combined with event-driven interfaces with ground-based processing and operations, to enable fast re-prioritization of observing schedules, and to minimize time spent on non-optimized observations. SGM is initially aimed towards time-tagged observing modes used frequently in spectroscopic studies of varying targets. In particular, the SGM is collaborating with the proposed MIDEX-class mission Kronos team. The variable targets that Kronos seeks to study make an adaptive system such as SGM particularly valuable for achieving mission goals. However, the architecture and interfaces will also be designed for easy adaptability to other observing platforms, including ground-based systems and to work with different scheduling and pipeline processing systems. This talk will focus on our strategy for developing SGM and the technical challenges that we have encountered. We will discuss the SGM architecture as it applies to the Kronos mission and explain how it is scalable to other missions.

  20. Impact of advanced onboard processing concepts on end-to-end data system

    NASA Technical Reports Server (NTRS)

    Sos, J. Y.

    1978-01-01

    An investigation is conducted of the impact of advanced onboard data handling concepts on the total system in general and on ground processing operations, such as those being performed in the central data processing facility of the NASA Goddard Space Flight Center. In one of these concepts, known as the instrument telemetry packet (ITP) system, telemetry data from a single instrument is encoded into a packet, along with other ancillary data, and transmitted in this form to the ground. Another concept deals with onboard temporal registration of image data from such sensors as the thematic mapper, to be carried onboard the Landsat-D spacecraft in 1981. It is found that the implementation of the considered concepts will result in substantial simplification of the ground processing element of the system. With the projected tenfold increase in the data volume expected in the next decade, the introduction of ITP should keep the cost of the ground data processing function within reasonable bounds and significantly contribute to a more timely delivery of data/information to the end user.
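The instrument telemetry packet (ITP) concept above, each instrument's data traveling in a self-describing packet together with ancillary data, can be sketched with a simple binary layout. The field widths and fields below (instrument id, timestamp, payload length) are invented for illustration, not a NASA format.

```python
import struct

# Hypothetical instrument-telemetry-packet layout sketching the ITP
# idea: instrument data plus ancillary data in one self-describing
# packet. Field widths are invented, not a NASA format.

HEADER = struct.Struct(">HIH")   # instrument id, timestamp, payload length

def encode_packet(instrument_id, timestamp, payload):
    return HEADER.pack(instrument_id, timestamp, len(payload)) + payload

def decode_packet(packet):
    instrument_id, timestamp, length = HEADER.unpack_from(packet)
    payload = packet[HEADER.size:HEADER.size + length]
    return instrument_id, timestamp, payload

pkt = encode_packet(7, 123456, b"\x01\x02\x03")
iid, ts, data = decode_packet(pkt)
```

Because each packet identifies its own instrument and time, a ground facility can route and process packets independently, which is the simplification of ground processing the abstract credits to ITP.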

  1. An end-to-end workflow for engineering of biological networks from high-level specifications.

    PubMed

    Beal, Jacob; Weiss, Ron; Densmore, Douglas; Adler, Aaron; Appleton, Evan; Babb, Jonathan; Bhatia, Swapnil; Davidsohn, Noah; Haddock, Traci; Loyall, Joseph; Schantz, Richard; Vasilev, Viktor; Yaman, Fusun

    2012-08-17

    We present a workflow for the design and production of biological networks from high-level program specifications. The workflow is based on a sequence of intermediate models that incrementally translate high-level specifications into DNA samples that implement them. We identify algorithms for translating between adjacent models and implement them as a set of software tools, organized into a four-stage toolchain: Specification, Compilation, Part Assignment, and Assembly. The specification stage begins with a Boolean logic computation specified in the Proto programming language. The compilation stage uses a library of network motifs and cellular platforms, also specified in Proto, to transform the program into an optimized Abstract Genetic Regulatory Network (AGRN) that implements the programmed behavior. The part assignment stage assigns DNA parts to the AGRN, drawing the parts from a database for the target cellular platform, to create a DNA sequence implementing the AGRN. Finally, the assembly stage computes an optimized assembly plan to create the DNA sequence from available part samples, yielding a protocol for producing a sample of engineered plasmids with robotics assistance. Our workflow is the first to automate the production of biological networks from a high-level program specification. Furthermore, the workflow's modular design allows the same program to be realized on different cellular platforms simply by swapping workflow configurations. We validated our workflow by specifying a small-molecule sensor-reporter program and verifying the resulting plasmids in both HEK 293 mammalian cells and in E. coli bacterial cells.

  2. End-to-End Data Movement Using MPI-IO Over Routed Terabits Infrastructures

    SciTech Connect

    Vallee, Geoffroy R; Atchley, Scott; Kim, Youngjae; Shipman, Galen M

    2013-01-01

    Scientific discovery is nowadays driven by large-scale simulations running on massively parallel high-performance computing (HPC) systems. These applications each generate a large amount of data, which then needs to be post-processed, for example for data mining or visualization. Unfortunately, the computing platform used for post-processing might be different from the one on which the data is initially generated, introducing the challenge of moving large amounts of data between computing platforms. This is especially challenging when these two platforms are geographically separated, since the data needs to be moved between computing facilities. This is even more critical when scientists tightly couple their domain-specific applications with a post-processing application. This paper presents a solution for data transfer between MPI applications using a dedicated wide area network (WAN) terabit infrastructure. The proposed solution is based on parallel access to data files and on the Message Passing Interface (MPI) over the Common Communication Infrastructure (CCI) for data transfer over a routed infrastructure. In the context of this research, the Energy Sciences Network (ESnet) of the U.S. Department of Energy (DOE) is targeted for the transfer of data between DOE national laboratories.
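    The parallel-access side of such a transfer rests on each MPI rank owning a disjoint slice of the shared file. The helper below sketches that partitioning in plain Python, without mpi4py; the function name and layout are illustrative assumptions, not the paper's implementation:

```python
# Sketch of the rank-based file partitioning that underlies parallel
# (MPI-IO-style) access to a shared data file: each "rank" computes its
# own contiguous byte range, so ranks can read disjoint slices
# concurrently. The helper name and layout are illustrative assumptions.

def byte_range(file_size, n_ranks, rank):
    """Return the (offset, length) slice of the file owned by `rank`."""
    base, extra = divmod(file_size, n_ranks)
    # The first `extra` ranks take one extra byte each.
    offset = rank * base + min(rank, extra)
    length = base + (1 if rank < extra else 0)
    return offset, length

# A 10-byte file split across 4 ranks: sizes 3, 3, 2, 2 with no gaps.
slices = [byte_range(10, 4, r) for r in range(4)]
```

    In an actual MPI code, each rank would pass its `offset` to a collective read (e.g. `MPI_File_read_at_all`) before handing its slice to the transfer layer.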

  3. Data compression: The end-to-end information systems perspective for NASA space science missions

    NASA Technical Reports Server (NTRS)

    Tai, Wallace

    1991-01-01

    The unique characteristics of compressed data have important implications for the design of space science data systems, science applications, and data compression techniques. The sequential nature of, or data dependence between, the sample values within a block of compressed data introduces an error multiplication or propagation factor which compounds the effects of communication errors. The data communication characteristics of the onboard data acquisition, storage, and telecommunication channels may influence the size of the compressed blocks and the frequency of included re-initialization points. The organization of the compressed data is continually changing, depending on the entropy of the input data. This also results in a variable output rate from the instrument, which may require buffering to interface with the spacecraft data system. On the ground, there exist key tradeoff issues associated with the distribution and management of the science data products when data compression techniques are applied in order to alleviate the constraints imposed by ground communication bandwidth and data storage capacity.
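    A toy delta coder makes the error-multiplication point concrete: a single corrupted value damages every sample decoded after it, up to the next re-initialization point. This is an illustration of the general mechanism, not a technique from the report:

```python
# Toy illustration of why re-initialization points matter: with delta
# coding, one corrupted value propagates through the rest of its block on
# decode, but the next restart point (an absolute sample) stops it.

def delta_encode(samples, block):
    out = []
    for i, s in enumerate(samples):
        if i % block == 0:
            out.append(s)                    # restart point: absolute value
        else:
            out.append(s - samples[i - 1])   # delta from previous sample
    return out

def delta_decode(coded, block):
    out = []
    for i, c in enumerate(coded):
        out.append(c if i % block == 0 else out[-1] + c)
    return out

samples = list(range(8))
coded = delta_encode(samples, block=4)
coded[1] += 5                                # one corrupted delta
decoded = delta_decode(coded, block=4)
# Samples 1-3 decode wrongly (the error propagates); samples 4-7 are
# correct again because index 4 is a restart point.
```

    Shorter blocks (more frequent restart points) bound the damage from a single channel error at the cost of compression ratio, which is exactly the tradeoff the abstract describes.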

  4. An end-to-end assessment of extreme weather impacts on food security

    NASA Astrophysics Data System (ADS)

    Chavez, Erik; Conway, Gordon; Ghil, Michael; Sadler, Marc

    2015-11-01

    Both governments and the private sector urgently require better estimates of the likely incidence of extreme weather events, their impacts on food crop production and the potential consequent social and economic losses. Current assessments of climate change impacts on agriculture mostly focus on average crop yield vulnerability to climate and adaptation scenarios. Also, although new-generation climate models have improved and there has been an exponential increase in available data, the uncertainties in their projections over years and decades, and at regional and local scale, have not decreased. We need to understand and quantify the non-stationary, annual and decadal climate impacts using simple and communicable risk metrics that will help public and private stakeholders manage the hazards to food security. Here we present an 'end-to-end' methodological construct based on weather indices and machine learning that integrates current understanding of the various interacting systems of climate, crops and the economy to determine short- to long-term risk estimates of crop production loss, in different climate and adaptation scenarios. For provinces north and south of the Yangtze River in China, we have found that risk profiles for crop yields that translate climate into economic variability follow marked regional patterns, shaped by drivers of continental-scale climate. We conclude that to be cost-effective, region-specific policies have to be tailored to optimally combine different categories of risk management instruments.

  5. Independent SCPS-TP development for fault-tolerant, end-to-end communication architectures

    NASA Astrophysics Data System (ADS)

    Edwards, E.; Lamorie, J.; Younghusband, D.; Brunet, C.; Hartman, L.

    2002-07-01

    A fully networked architecture provides for the distribution of computing elements, of all mission components, through the spacecraft. Each node is individually addressable through the network, and behaves as an independent entity. This level of communication also supports individualized Command and Data Handling (C&DH), as well as one-to-one transactions between spacecraft nodes and individual ground segment users. To be effective, fault-tolerance must be applied at the network data transport level, as well as the supporting layers below it. If the network provides fail-safe characteristics independent of the mission applications being executed, then developers need not build in their own systems to ensure network reliability. The Space Communications Protocol Standards (SCPS) were developed to provide robust communications in a space environment, while retaining compatibility with Internet data transport at the ground segment. Although SCPS is a standard of the Consultative Committee for Space Data Systems (CCSDS), the adoption of SCPS was initially delayed by US export regulations that prevented the distribution of reference code. This paper describes the development and test of a fully independent implementation of the SCPS Transport Protocol, SCPS-TP, which has been derived directly from the CCSDS specification. The performance of the protocol is described for a set of geostationary satellite tests, and these results will be compared with those derived from network simulation and laboratory emulation. The work is placed in the context of a comprehensive, fault-tolerant network that potentially surpasses the fail-safe performance of a traditional spacecraft control system under similar circumstances.

  6. Implementation of an End-to-End Simulator for the BepiColombo Rotation Experiment

    NASA Astrophysics Data System (ADS)

    Palli, A.; Bevilacqua, A.; Genova, A.; Gherardi, A.; Iess, L.; Meriggiola, R.; Tortora, P.

    2012-09-01

    Fundamental information on the interior of Mercury can be inferred from its rotational state, in terms of obliquity and amplitude of physical libration in longitude. For this reason a dedicated Rotation Experiment will be performed by the ESA mission BepiColombo. A system-level experiment simulator has been developed in order to optimize the observation strategy and is here presented. In particular, this abstract will focus on the estimation process, the optimization algorithms and the selection of optimal pattern matching strategies.

  7. Sensory Recovery Outcome after Digital Nerve Repair in Relation to Different Reconstructive Techniques: Meta-Analysis and Systematic Review

    PubMed Central

    Wolf, Petra; Harder, Yves; Kern, Yasmin; Paprottka, Philipp M.; Machens, Hans-Günther; Lohmeyer, Jörn A.

    2013-01-01

    Good clinical outcome after digital nerve repair is highly relevant for proper hand function and has a significant socioeconomic impact. However, the level of evidence for competing surgical techniques is low. The aim is to summarize and compare the outcomes of digital nerve repair with different methods (end-to-end and end-to-side coaptations, nerve grafts, artificial conduit, vein, muscle, and muscle-in-vein reconstructions, and replantations) to provide an aid for choosing an individual technique of nerve reconstruction and to create reference values of standard repair for nonrandomized clinical studies. A total of 87 publications including 2,997 nerve repairs were suitable for a precise evaluation. For digital nerve repairs, there was practically no technique superior to another. Only end-to-side coaptation had an inferior two-point discrimination in comparison to end-to-end coaptation or nerve grafting. Furthermore, this meta-analysis showed that youth was associated with an improved sensory recovery outcome in patients who underwent digital replantation. For end-to-end coaptations, recent publications had significantly better sensory recovery outcomes than older ones. Given minor differences in outcome, the main criteria in choosing an adequate surgical technique should be gap length and donor site morbidity caused by graft material harvesting. Our clinical experience was used to provide a decision tree for digital nerve repair. PMID:23984064

  8. Multi time-step wavefront reconstruction for tomographic adaptive-optics systems.

    PubMed

    Ono, Yoshito H; Akiyama, Masayuki; Oya, Shin; Lardiére, Olivier; Andersen, David R; Correia, Carlos; Jackson, Kate; Bradley, Colin

    2016-04-01

    In tomographic adaptive-optics (AO) systems, errors due to tomographic wavefront reconstruction limit the performance and angular size of the scientific field of view (FoV) in which AO correction is effective. We propose a multi time-step tomographic wavefront reconstruction method to reduce the tomographic error by using measurements from both the current and previous time steps simultaneously. We further outline a method for feeding the reconstructor with the wind speed and direction of each turbulence layer. An end-to-end numerical simulation, assuming a multi-object AO (MOAO) system on a 30 m aperture telescope, shows that the multi time-step reconstruction increases the Strehl ratio (SR) over a scientific FoV of 10 arc min in diameter by a factor of 1.5-1.8 when compared to the classical tomographic reconstructor, depending on the guide star asterism and with perfect knowledge of wind speeds and directions. We also evaluate the multi time-step reconstruction method and the wind estimation method on the RAVEN demonstrator under laboratory conditions. The wind speeds and directions at multiple atmospheric layers are measured successfully in the laboratory experiment by our wind estimation method with errors below 2 m/s. With these wind estimates, the multi time-step reconstructor increases the SR value by a factor of 1.2-1.5, which is consistent with a prediction from the end-to-end numerical simulation. PMID:27140785
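    The benefit of reusing a previous time step can be sketched numerically under frozen-flow (Taylor's hypothesis): shift the earlier phase screen by wind × dt and average it with the current noisy measurement. This is a minimal illustration, not the paper's reconstructor; the grid size, shift and noise level are arbitrary, and `np.roll` wraps periodically:

```python
import numpy as np

# Frozen-flow sketch: a turbulent layer at time t-dt, shifted by wind*dt,
# predicts the layer at time t, so a previous measurement can be reused as
# an extra, re-registered measurement of the current wavefront. Averaging
# two independent noisy measurements halves the noise variance.

rng = np.random.default_rng(0)
layer = rng.standard_normal((16, 16))        # phase screen at time t-dt
shift_px = 3                                 # wind speed * dt, in pixels

current = np.roll(layer, shift_px, axis=1)   # frozen-flow evolution

noisy_now = current + 0.1 * rng.standard_normal(current.shape)
noisy_prev = layer + 0.1 * rng.standard_normal(layer.shape)

# Re-register the previous measurement and average with the current one.
combined = 0.5 * (noisy_now + np.roll(noisy_prev, shift_px, axis=1))

err_single = np.mean((noisy_now - current) ** 2)   # one-time-step error
err_multi = np.mean((combined - current) ** 2)     # multi-time-step error
```

    The comparison `err_multi < err_single` is the toy analogue of the SR gain reported above; in the real method the wind shift must itself be estimated, which is why the wind-estimation accuracy matters.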

  9. Popliteal pseudoaneurysm after arthroscopic posterior cruciate ligament reconstruction.

    PubMed

    van Dorp, Karin B; Breugem, Stefan J M; Driessen, Marcel J M

    2014-09-01

    This report presents the case of a 30-year-old motocross (BMX) cyclist with a third-degree posterior cruciate ligament rupture. The technique used for reconstruction was the transtibial single-bundle autologous hamstring technique. Unfortunately, the procedure was complicated by a popliteal pseudoaneurysm, which was located in line with the tibial canal. The pseudoaneurysm was treated with an end-to-end anastomosis and the patient recovered without further complaints. In this case, the popliteal artery was damaged most probably by the edge of the reamer or the guide wire during removal. Vascular complications can be limb- and life-threatening. This case report aims to increase the awareness of this serious complication with a review of the literature. PMID:25229050

  10. Zellweger Spectrum

    MedlinePlus

    ... the Zellweger spectrum result from defects in the assembly of a cellular structure called the peroxisome, and ... Zellweger spectrum are caused by defects in the assembly of the peroxisome. There are at least 12 ...

  11. Primordial power spectrum from Planck

    SciTech Connect

    Hazra, Dhiraj Kumar; Shafieloo, Arman; Souradeep, Tarun E-mail: arman@apctp.org

    2014-11-01

    Using a modified Richardson-Lucy algorithm, we reconstruct the primordial power spectrum (PPS) from Planck Cosmic Microwave Background (CMB) temperature anisotropy data. In our analysis we use different combinations of angular power spectra from Planck to reconstruct the shape of the primordial power spectrum and locate possible features. Performing an extensive error analysis, we find that the dip near ℓ ∼ 750–850 represents the most prominent feature in the data. The feature near ℓ ∼ 1800–2000 is detectable with high confidence only in the 217 GHz spectrum and is apparently a consequence of a small systematic, as described in the revised Planck 2013 papers. Fixing the background cosmological parameters and the foreground nuisance parameters to their best fit baseline values, we report that the best fit power-law primordial power spectrum is consistent with the reconstructed form of the PPS at the 2σ C.L. of the estimated errors (apart from the local features mentioned above). As a consistency test, we find that the reconstructed primordial power spectrum from Planck temperature data can also substantially improve the fit to the WMAP-9 angular power spectrum data (with respect to the power-law form of the PPS), allowing an overall amplitude shift of ∼ 2.5%. In this context, the low-ℓ and 100 GHz spectra from Planck, which have proper overlap in multipole range with the WMAP data, are found to be completely consistent with WMAP-9 (allowing an amplitude shift). As another important result of our analysis, we report evidence of gravitational lensing through the reconstruction analysis. Finally, we present two smooth forms of the PPS containing only the important features. These smooth forms of the PPS can provide significant improvements in fitting the data (with respect to the power-law PPS) and can be helpful in giving hints for inflationary model building.
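    For readers unfamiliar with the method, the textbook Richardson-Lucy update (the paper uses a modified variant tailored to CMB angular power spectra) looks like the sketch below; the toy kernel and signal are invented for illustration only:

```python
import numpy as np

# Textbook 1-D Richardson-Lucy iteration for d = K @ f:
#   f <- f * (K^T @ (d / (K @ f))),  with K column-normalized.
# The update preserves positivity and total flux, and for noiseless data
# it converges toward the true underlying signal.

def richardson_lucy(d, K, n_iter=500):
    f = np.full(K.shape[1], d.mean())        # flat, positive first guess
    for _ in range(n_iter):
        f = f * (K.T @ (d / (K @ f)))
    return f

# Toy smoothing kernel (tridiagonal) with unit column sums.
n = 20
K = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i, i + 1):
        if 0 <= j < n:
            K[i, j] = 1.0
K /= K.sum(axis=0)

truth = np.exp(-0.5 * ((np.arange(n) - 10) / 2.0) ** 2) + 0.1
data = K @ truth                             # "observed" blurred spectrum
recovered = richardson_lucy(data, K)
```

    In the paper's setting, `K` plays the role of the radiative transfer kernel mapping the PPS to the observed angular power spectrum, and the iteration deconvolves that mapping while keeping the reconstructed spectrum positive.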

  12. Iterative reconstruction methods in atmospheric tomography: FEWHA, Kaczmarz and Gradient-based algorithm

    NASA Astrophysics Data System (ADS)

    Ramlau, R.; Saxenhuber, D.; Yudytskiy, M.

    2014-07-01

    The problem of atmospheric tomography arises in ground-based telescope imaging with adaptive optics (AO), where one aims to compensate in real time for the rapidly changing optical distortions in the atmosphere. Many of these systems depend on a sufficient reconstruction of the turbulence profiles in order to obtain a good correction. Due to steadily growing telescope sizes, there is a strong increase in the computational load for atmospheric reconstruction with current methods, first and foremost the matrix-vector multiplication (MVM). In this paper we present and compare three novel iterative reconstruction methods. The first iterative approach is the Finite Element-Wavelet Hybrid Algorithm (FEWHA), which combines wavelet-based techniques and conjugate gradient schemes to efficiently and accurately tackle the problem of atmospheric reconstruction. The method is extremely fast, highly flexible and yields superior quality. Another novel iterative reconstruction algorithm is the three-step approach, which decouples the problem into the reconstruction of the incoming wavefronts, the reconstruction of the turbulent layers (atmospheric tomography) and the computation of the best mirror correction (fitting step). For the atmospheric tomography problem within the three-step approach, the Kaczmarz algorithm and the Gradient-based method have been developed. We present a detailed comparison of our reconstructors, both in terms of quality and speed performance, in the context of a Multi-Object Adaptive Optics (MOAO) system for the E-ELT setting on OCTOPUS, the ESO end-to-end simulation tool.
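    The Kaczmarz algorithm mentioned above can be stated in a few lines: each step projects the current iterate onto the hyperplane defined by one row of the system. This is the textbook row-projection iteration on a toy 2×2 system, not the paper's tomography operator:

```python
import numpy as np

# Textbook Kaczmarz iteration for a consistent system A x = b. One sweep
# cycles through all rows; each row update projects x onto the hyperplane
# a_i . x = b_i. For consistent systems the iterates converge to a solution.

def kaczmarz(A, b, sweeps=50):
    x = np.zeros(A.shape[1])
    for _ in range(sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x += (b[i] - a @ x) / (a @ a) * a
    return x

# Toy system: 3x + y = 9, x + 2y = 8, with solution x = 2, y = 3.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = kaczmarz(A, b)
```

    Its appeal in tomography is that it only ever touches one row (one measurement) at a time, so the huge system matrix never needs to be inverted or even formed densely.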

  14. Optimal reconstruction for closed-loop ground-layer adaptive optics with elongated spots.

    PubMed

    Béchet, Clémentine; Tallon, Michel; Tallon-Bosc, Isabelle; Thiébaut, Éric; Le Louarn, Miska; Clare, Richard M

    2010-11-01

    The design of the laser-guide-star-based adaptive optics (AO) systems for the Extremely Large Telescopes requires careful study of the issue of elongated spots produced on Shack-Hartmann wavefront sensors. The importance of a correct modeling of the nonuniformity and correlations of the noise induced by this elongation has already been demonstrated for wavefront reconstruction. We report here on the first (to our knowledge) end-to-end simulations of closed-loop ground-layer AO with laser guide stars with such an improved noise model. The results are compared with the level of performance predicted by a classical noise model for the reconstruction. The performance is studied in terms of ensquared energy and confirms that, thanks to the improved noise model, central or side launching of the lasers does not affect the performance with respect to the laser guide stars' flux. These two launching schemes also perform similarly whatever the atmospheric turbulence strength. PMID:21045872

  15. Spectrum Recombination.

    ERIC Educational Resources Information Center

    Greenslade, Thomas B., Jr.

    1984-01-01

    Describes several methods of executing lecture demonstrations involving the recombination of the spectrum. Groups the techniques into two general classes: bringing selected portions of the spectrum together using lenses or mirrors and blurring the colors by rapid movement or foreshortening. (JM)

  16. Bayesian reconstruction of projection reconstruction NMR (PR-NMR).

    PubMed

    Yoon, Ji Won

    2014-11-01

    Projection reconstruction nuclear magnetic resonance (PR-NMR) is a technique for generating multidimensional NMR spectra. A small number of projections from lower-dimensional NMR spectra are used to reconstruct the multidimensional NMR spectra. In our previous work, it was shown that multidimensional NMR spectra are efficiently reconstructed using peak-by-peak based reversible jump Markov chain Monte Carlo (RJMCMC) algorithm. We propose an extended and generalized RJMCMC algorithm replacing a simple linear model with a linear mixed model to reconstruct close NMR spectra into true spectra. This statistical method generates samples in a Bayesian scheme. Our proposed algorithm is tested on a set of six projections derived from the three-dimensional 700 MHz HNCO spectrum of a protein HasA. PMID:25218584

  17. Project Reconstruct.

    ERIC Educational Resources Information Center

    Helisek, Harriet; Pratt, Donald

    1994-01-01

    Presents a project in which students monitor their use of trash, input and analyze information via a database and computerized graphs, and "reconstruct" extinct or endangered animals from recyclable materials. The activity was done with second-grade students over a period of three to four weeks. (PR)

  18. Vaginal reconstruction

    SciTech Connect

    Lesavoy, M.A.

    1985-05-01

    Vaginal reconstruction can be an uncomplicated and straightforward procedure when attention to detail is maintained. The Abbe-McIndoe procedure of lining the neovaginal canal with split-thickness skin grafts has become standard. The use of the inflatable Heyer-Schulte vaginal stent provides comfort to the patient and ease to the surgeon in maintaining approximation of the skin graft. For large vaginal and perineal defects, myocutaneous flaps such as the gracilis island have been extremely useful for correction of radiation-damaged tissue of the perineum or for the reconstruction of large ablative defects. Minimal morbidity and scarring ensue because the donor site can be closed primarily. With all vaginal reconstruction, a compliant patient is a necessity. The patient must wear a vaginal obturator for a minimum of 3 to 6 months postoperatively and is encouraged to use intercourse as an excellent obturator. In general, vaginal reconstruction can be an extremely gratifying procedure for both the functional and emotional well-being of patients.

  19. Unsupervised malaria parasite detection based on phase spectrum.

    PubMed

    Fang, Yuming; Xiong, Wei; Lin, Weisi; Chen, Zhenzhong

    2011-01-01

    In this paper, we propose a novel method for malaria parasite detection based on phase spectrum. The method first obtains the amplitude spectrum and phase spectrum for blood smear images through Quaternion Fourier Transform (QFT). Then it gets the reconstructed image based on Inverse Quaternion Fourier transform (IQFT) on a constant amplitude spectrum and the original phase spectrum. The malaria parasite areas can be detected easily from the reconstructed blood smear images. Extensive experiments have demonstrated the effectiveness of this novel method. PMID:22256196
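    The reconstruction step can be sketched with an ordinary 2-D FFT standing in for the Quaternion Fourier Transform (a deliberate simplification that handles one channel instead of a color quaternion): keep the phase spectrum, replace the amplitude spectrum with a constant, and invert. The image and blob below are synthetic:

```python
import numpy as np

# Phase-spectrum reconstruction sketch. Constant amplitude + original
# phase emphasizes small, localized anomalies (here a synthetic
# "parasite-like" blob) over smooth background, which is what makes the
# anomalous regions easy to detect in the reconstructed image.

def phase_only(image):
    spectrum = np.fft.fft2(image)
    phase = np.angle(spectrum)
    # Unit amplitude everywhere, original phase:
    return np.real(np.fft.ifft2(np.exp(1j * phase)))

rng = np.random.default_rng(1)
img = np.zeros((32, 32))
img[12:16, 20:24] = rng.uniform(0.5, 1.5, (4, 4))   # small synthetic blob

saliency = np.abs(phase_only(img))

# Reconstruction energy concentrates in the blob region.
inside = saliency[12:16, 20:24].mean()
mask = np.ones_like(saliency, dtype=bool)
mask[12:16, 20:24] = False
outside = saliency[mask].mean()
```

    In the paper's pipeline the same keep-phase/flatten-amplitude operation is done with the QFT so all color channels of the blood smear image are transformed jointly.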

  20. Successful Reconstruction of Nerve Defects Using Distraction Neurogenesis with a New Experimental Device

    PubMed Central

    Yousef, Mohamed Abdelhamid Ali; Dionigi, Paolo; Marconi, Stefania; Calligaro, Alberto; Cornaglia, Antonia Icaro; Alfonsi, Enrico; Auricchio, Ferdinando

    2015-01-01

    Introduction: Repair of peripheral nerve injuries is an intensive area of challenge and research in modern reconstructive microsurgery. Intensive research is being carried out to develop effective alternatives to the standard nerve autografting, avoiding its drawbacks. The aim of the study was to evaluate the effectiveness of a newly designed mechanical device for the reconstruction of the sciatic nerve in rats in comparison to nerve autografting and to assess the pain during the period of distraction neurogenesis. Methods: Fourteen Sprague Dawley rats were used and randomly assigned into 2 groups with 7 rats in each group; group A (Nerve Autografting group) in which a 10-mm segment of the sciatic nerve was resected and rotated 180 degrees, then primary end-to-end neurorrhaphy was performed in the reverse direction; group B (Nerve Lengthening group) in which the mechanical device was inserted after surgical resection of 10 mm of the sciatic nerve, then secondary end-to-end neurorrhaphy was performed after completing the nerve lengthening. Thirteen weeks later, assessment of the functional sciatic nerve recovery using static sciatic index (SSI) was performed. Furthermore, fourteen weeks after the nerve resection, assessment of the nerve regeneration with electrophysiological study and histological analysis were performed. Also, gastrocnemius wet weight was measured. For pain assessment in group B, Rat Grimace Scale (RGS) score was used. Results: Significantly better functional recovery rate (using the SSI) was reported in the nerve lengthening group in comparison to autografting group. Also, a statistically significant higher nerve conduction velocity was detected in the nerve lengthening group. On histological analysis of the distal nerve section at 3 mm distal to the nerve repair site, significant myelin sheath thickness was detected in the nerve lengthening group. Discussion: Distraction neurogenesis with the new experimental device is a reliable therapeutic

  1. Reconstruction of portal vein and superior mesenteric vein after extensive resection for pancreatic cancer

    PubMed Central

    Kim, Suh Min; Park, Daedo; Min, Sang-Il; Jang, Jin-Young; Kim, Sun-Whe; Ha, Jongwon; Kim, Sang Joon

    2013-01-01

    Purpose: Tumor invasion of the portal vein (PV) or superior mesenteric vein (SMV) can be encountered during surgery for pancreatic cancer. Venous reconstruction is required, but the optimal surgical methods and conduits remain controversial. Methods: From January 2007 to July 2012, 16 venous reconstructions were performed during surgery for pancreatic cancer in 14 patients. We analyzed the methods, conduits, graft patency, and patient survival. Results: The involved veins were 14 SMVs and 2 PVs. The operative methods included resection and end-to-end anastomosis in 7 patients, wedge resection with venoplasty in 2 patients, bovine patch repair in 3 patients, and interposition graft with bovine patch in 1 patient. In one patient with a failed interposition graft with great saphenous vein (GSV), the SMV was reconstructed with a prosthetic interposition graft, which was revised with a spiral graft of GSV. Vascular morbidity occurred in 4 cases: occlusion of an interposition graft with GSV or polytetrafluoroethylene, and segmental thrombosis and stenosis of the SMV after end-to-end anastomosis. Patency was maintained in patients with bovine patch angioplasty and spiral vein grafts. With a mean follow-up of 9.8 months, the 6- and 12-month death-censored graft survival rates were both 81.3%. Conclusion: Many of the involved vein segments were repaired primarily. When tension-free anastomosis is impossible, spiral grafts with GSV or bovine patch grafts are good options to overcome the size mismatch between autologous vein graft and portomesenteric veins. Further follow-up of these patients is needed to demonstrate long-term patency. PMID:23741692

  2. An integrated healthcare information system for end-to-end standardized exchange and homogeneous management of digital ECG formats.

    PubMed

    Trigo, Jesús Daniel; Martínez, Ignacio; Alesanco, Alvaro; Kollmann, Alexander; Escayola, Javier; Hayn, Dieter; Schreier, Günter; García, José

    2012-07-01

    This paper investigates the application of the enterprise information system (EIS) paradigm to standardized cardiovascular condition monitoring. There are many specifications in cardiology, particularly in the ECG standardization arena. The existence of ECG formats, however, does not guarantee the implementation of homogeneous, standardized solutions for ECG management. In fact, hospital management services need to cope with various ECG formats and, moreover, several different visualization applications. This heterogeneity hampers the normalization of integrated, standardized healthcare information systems, hence the need for finding an appropriate combination of ECG formats and a suitable EIS-based software architecture that enables standardized exchange and homogeneous management of ECG formats. Determining such a combination is one objective of this paper. The second aim is to design and develop the integrated healthcare information system that satisfies the requirements posed by the previous determination. The ECG formats selected include ISO/IEEE11073, Standard Communications Protocol for Computer-Assisted Electrocardiography, and an ECG ontology. The EIS-enabling techniques and technologies selected include web services, simple object access protocol, extensible markup language, or business process execution language. Such a selection ensures the standardized exchange of ECGs within, or across, healthcare information systems while providing modularity and accessibility.

  3. End-To-End Solution for Integrated Workload and Data Management using GlideinWMS and Globus Online

    NASA Astrophysics Data System (ADS)

    Mhashilkar, Parag; Miller, Zachary; Kettimuthu, Rajkumar; Garzoglio, Gabriele; Holzman, Burt; Weiss, Cathrin; Duan, Xi; Lacinski, Lukasz

    2012-12-01

    Grid computing has enabled scientific communities to effectively share computing resources distributed over many independent sites. Several such communities, or Virtual Organizations (VO), in the Open Science Grid and the European Grid Infrastructure use the GlideinWMS system to run complex application work-flows. GlideinWMS is a pilot-based workload management system (WMS) that creates an on-demand, dynamically-sized overlay Condor batch system on Grid resources. While the WMS addresses the management of compute resources, data management in the Grid is still the responsibility of the VO. In general, large VOs have resources to develop complex custom solutions, while small VOs would rather push this responsibility to the infrastructure. The latter requires a tight integration of the WMS and the data management layers, an approach still not common in modern Grids. In this paper we describe a solution developed to address this shortcoming in the context of the Center for Enabling Distributed Peta-scale Science (CEDPS) by integrating GlideinWMS with Globus Online (GO). Globus Online is a fast, reliable file transfer service that makes it easy for any user to move data. The solution eliminates the need for the users to provide custom data transfer solutions in the application by making this functionality part of the GlideinWMS infrastructure. To achieve this, GlideinWMS uses the file transfer plug-in architecture of Condor. The paper describes the system architecture and how this solution can be extended to support data transfer services other than Globus Online when used with Condor or GlideinWMS.

  4. Portable air quality sensor unit for participatory monitoring: an end-to-end VESNA-AQ based prototype

    NASA Astrophysics Data System (ADS)

    Vucnik, Matevz; Robinson, Johanna; Smolnikar, Miha; Kocman, David; Horvat, Milena; Mohorcic, Mihael

    2015-04-01

    Key words: portable air quality sensor, CITI-SENSE, participatory monitoring, VESNA-AQ The emergence of low-cost easy to use portable air quality sensors units is opening new possibilities for individuals to assess their exposure to air pollutants at specific place and time, and share this information through the Internet connection. Such portable sensors units are being used in an ongoing citizen science project called CITI-SENSE, which enables citizens to measure and share the data. The project aims through creating citizens observatories' to empower citizens to contribute to and participate in environmental governance, enabling them to support and influence community and societal priorities as well as associated decision making. An air quality measurement system based on VESNA sensor platform was primarily designed within the project for the use as portable sensor unit in selected pilot cities (Belgrade, Ljubljana and Vienna) for monitoring outdoor exposure to pollutants. However, functionally the same unit with different set of sensors could be used for example as an indoor platform. The version designed for the pilot studies was equipped with the following sensors: NO2, O3, CO, temperature, relative humidity, pressure and accelerometer. The personal sensor unit is battery powered and housed in a plastic box. The VESNA-based air quality (AQ) monitoring system comprises the VESNA-AQ portable sensor unit, a smartphone app and the remote server. Personal sensor unit supports wireless connection to an Android smartphone via built-in Wi-Fi. The smartphone in turn serves also as the communication gateway towards the remote server using any of available data connections. Besides the gateway functionality the role of smartphone is to enrich data coming from the personal sensor unit with the GPS location, timestamps and user defined context. 
This, together with the accelerometer, enables users to better estimate their exposure in relation to physical activities, time and location. The end user can monitor the measured parameters through a smartphone application. The smartphone app implements a custom-developed protocol, LCSP (Lightweight Client Server Protocol), which is used to send requests to the VESNA-AQ unit and to exchange information. When data is obtained from the VESNA-AQ unit, the mobile application visualizes it. The app also has an option to forward the data to the remote server as a custom JSON structure over an HTTP POST request. The server stores the data in its database and in parallel translates the data to WFS and forwards it to the main CITI-SENSE platform over WFS-T, in a common XML format via HTTP POST. From there the data can be accessed through the Internet and visualised in different forms and web applications developed by the CITI-SENSE project. In the course of the project, the collected data will be made publicly available, enabling citizens to participate in environmental governance. Acknowledgements: CITI-SENSE is a Collaborative Project partly funded by the EU FP7-ENV-2012 under grant agreement no 308524 (www.citi-sense.eu).
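The enrichment step described above (raw readings plus GPS location, timestamp and user context, serialized to JSON for the HTTP POST) can be sketched as follows. The project's actual JSON schema is custom and not given in the abstract, so all field names here are hypothetical:

```python
import json
from datetime import datetime, timezone

def build_payload(readings, lat, lon, context=""):
    """Enrich raw VESNA-AQ-style readings with GPS location, a UTC
    timestamp and user-defined context. Field names are illustrative
    only; the real CITI-SENSE schema may differ."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "location": {"lat": lat, "lon": lon},
        "context": context,
        "measurements": readings,
    }

payload = build_payload(
    {"NO2": 21.4, "O3": 58.0, "CO": 0.3,
     "temperature": 18.2, "humidity": 61.0, "pressure": 1013.2},
    lat=46.0569, lon=14.5058, context="commuting")
body = json.dumps(payload)  # the app would POST this body to the server
```

In the real system the smartphone app would send `body` in an HTTP POST request to the remote server, which stores it and re-publishes it over WFS-T.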

  5. Towards a Software Framework to Support Deployment of Low Cost End-to-End Hydroclimatological Sensor Network

    NASA Astrophysics Data System (ADS)

    Celicourt, P.; Piasecki, M.

    2015-12-01

Deployment of environmental sensor assemblies based on cheap platforms such as the Raspberry Pi and Arduino has gained much attention over the past few years. While these platforms are attractive because they can be controlled from several programming languages, the configuration task can become quite complex owing to the need to learn several different proprietary data formats and protocols, which constitutes a bottleneck for the expansion of sensor networks. In response to this rising complexity, the Institute of Electrical and Electronics Engineers (IEEE) has sponsored the development of the IEEE 1451 standard in an attempt to introduce a common standard. The most innovative concept of the standard is the Transducer Electronic Data Sheet (TEDS), which enables transducers to self-identify, self-describe, self-calibrate, exhibit plug-and-play functionality, etc. We used Python to develop an IEEE 1451.0 platform-independent graphical user interface that generates and provides sufficient information about almost any sensor and sensor platform for sensor programming, automatic calibration of sensor data, and incorporation of back-end data-management demands in TEDS for automatic standards-based data storage, search and discovery. These features are paramount to making data management much less onerous in large-scale sensor networks. Along with the TEDS Creator, we developed a tool named HydroUnits for three specific purposes: encoding of physical units in the TEDS, dimensional analysis, and on-the-fly conversion of time series, allowing users to retrieve data in a desired equivalent unit while accommodating unforeseen and user-defined units. In addition, our back-end data management comprises the Python/Django equivalent of the CUAHSI Observations Data Model (ODM), namely DjangODM, which will be hosted by a MongoDB database server that offers more convenience for our application.
We are also developing a data loader that will be paired with the data autoloading capability of Django and a TEDS processing script to populate the database with the incoming data. The Python WaterOneFlow Web Services developed by the Texas Water Development Board will be used to publish the data. The software suite is being tested on the Raspberry Pi as the end node and a laptop PC as the base station in a wireless setting.
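The on-the-fly unit conversion that HydroUnits performs can be sketched with a simple factor table mapping each unit to a base unit per dimension. This is a minimal illustration, not the actual HydroUnits API; the unit symbols and factors shown are assumptions:

```python
# Each unit maps to a base unit of its dimension (metres, m3/s).
FACTORS = {
    "m": 1.0, "cm": 0.01, "ft": 0.3048,           # length -> metres
    "m3/s": 1.0, "L/s": 0.001, "cfs": 0.0283168,  # discharge -> m3/s
}

def convert(values, src, dst):
    """Convert a time series between two equivalent units by
    rescaling through the shared base unit."""
    if src not in FACTORS or dst not in FACTORS:
        raise KeyError("unknown unit")
    scale = FACTORS[src] / FACTORS[dst]
    return [v * scale for v in values]

stage_ft = [1.0, 2.5, 3.2]                 # gauge heights in feet
stage_m = convert(stage_ft, "ft", "m")     # retrieved in metres
```

A user-defined unit would simply add one entry to the table, which is how "unforeseen" units can be accommodated without code changes.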

  6. The NOAO Data Products Program: Developing an End-to-End Data Management System in Support of the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Smith, R. C.; Boroson, T.; Seaman, R.

    2007-10-01

The NOAO Data Products Program (DPP) is responsible for the development and operation of the data management system for NOAO and affiliated observatories, and for the scientific support of users accessing our data holdings and using our tools and services. At the core of this mission is the capture of data from instruments at these observatories and the delivery of that content both to the Principal Investigators (PIs) who proposed the observations and, after an appropriate proprietary period, to users worldwide who are interested in using the data for their own (often very different) scientific projects. However, delivery of raw and/or reduced images to users only scratches the surface of the extensive potential that the international Virtual Observatory (VO) initiative has to offer. By designing the whole NOAO/DPP program around not only VO standards, but more importantly around VO principles, the program becomes not an exercise in data management and NOAO user support, but rather a VO-centric program which serves the growing world-wide VO community. It is this more global aspect that drives NOAO/DPP planning, as well as more specifically the design, development, and operations of the various components of our system. In the following sections we discuss these components and how they work together to form our VO-centric program.

  7. Building the tree of life from scratch: an end-to-end work flow for phylogenomic studies

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Whole genome sequences are rich sources of information about organisms that are superbly useful for addressing a wide variety of evolutionary questions. Recent progress in genomics has enabled the de novo decoding of the genome of virtually any organism, greatly expanding its potential for understan...

  8. Quasi-real-time end-to-end simulations of ELT-scale adaptive optics systems on GPUs

    NASA Astrophysics Data System (ADS)

    Gratadour, Damien

    2011-09-01

Our team has started the development of a code dedicated to GPUs for the simulation of AO systems at the E-ELT scale. It uses the CUDA toolkit and an original binding to Yorick (an open-source interpreted language) to provide the user with a comprehensive interface. In this paper we present the first performance analysis of our simulation code, showing its ability to provide Shack-Hartmann (SH) images and measurements at the kHz scale for a VLT-sized AO system, and in quasi-real time (up to 70 Hz) for ELT-sized systems, on a single top-end GPU. The simulation code includes multi-layer atmospheric turbulence generation, ray tracing through these layers, image formation at the focal plane of every sub-aperture of a SH sensor using either natural or laser guide stars, and centroiding on these images using various algorithms. Turbulence is generated on the fly, giving the ability to simulate hours of observations without the need to load extremely large phase screens into global memory. Because of its performance, this code additionally provides the unique ability to test real-time controllers for future AO systems under nominal conditions.
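The abstract mentions centroiding on sub-aperture images "using various algorithms"; the simplest and most common of these is the center-of-gravity (CoG) estimator, sketched here in plain Python on a single sub-aperture image (a 2-D list of pixel intensities). This is a generic illustration, not the paper's GPU implementation:

```python
def cog_centroid(img):
    """Return the (x, y) intensity-weighted centroid of a spot,
    in pixel units, for one Shack-Hartmann sub-aperture image."""
    total = sum(sum(row) for row in img)
    x = sum(i * v for row in img for i, v in enumerate(row)) / total
    y = sum(j * v for j, row in enumerate(img) for v in row) / total
    return x, y

# A spot concentrated at pixel (x=2, y=1) of a 4x4 sub-aperture:
spot = [[0, 0, 0, 0],
        [0, 0, 4, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 0]]
cx, cy = cog_centroid(spot)  # -> (2.0, 1.0)
```

In a real wavefront sensor the centroid displacement from a reference position gives the local wavefront slope; a GPU code evaluates this for thousands of sub-apertures in parallel per frame.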

  9. Parameterizations of truncated food web models from the perspective of an end-to-end model approach

    NASA Astrophysics Data System (ADS)

    Fennel, Wolfgang

    2009-02-01

Modeling of marine ecosystems is broadly divided into two branches: biogeochemical processes and fish production. Biogeochemical models see fish only implicitly, through mortality rates, while fish-production models see the lower food web basically as prescribed food, e.g., copepod biomass. The skill assessment of ecological models, which are usually truncated biogeochemical models, also involves the question of how the effects of the missing higher food web are parameterized. This paper contributes to the goal of bridging biogeochemical and fish-production models by employing a recently developed coupled NPZDF model [Fennel, W., 2007. Towards bridging biogeochemical and fish production models. Journal of Marine Systems, doi:10.1016/j.jmarsys.2007.06.008]. Here we study parameterizations of truncated NPZD models from the viewpoint of the complete model. The effects of the higher food web on the cycling of the state variables in a truncated NPZD model cannot be unambiguously imitated. For example, one can mimic the effects of fishery by export fluxes of one of the state variables. It is shown that the mass fluxes between the lower and upper parts of the full model food web are significantly smaller than the fluxes within the NPZD model. However, over longer time scales, relatively small changes can accumulate and eventually become important.
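The kind of closure being discussed can be made concrete with a generic zooplankton equation; the notation below is ours, not Fennel's, and is only a standard sketch of how truncated models absorb the missing fish:

```latex
\frac{dZ}{dt} = \beta\, G(P)\, Z \;-\; m_Z Z \;-\; m_{Z2} Z^2 ,
```

where $G(P)$ is grazing on phytoplankton with assimilation efficiency $\beta$, the linear term $m_Z Z$ represents natural mortality, and the quadratic closure $m_{Z2} Z^2$ implicitly stands in for predation by the unresolved higher food web. In a full NPZDF model this last loss would instead appear as an explicit mass flux into the fish state variables, which is why the truncated parameterization cannot be made unambiguous.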

  10. End-to-end conformational communication through a synthetic purinergic receptor by ligand-induced helicity switching

    NASA Astrophysics Data System (ADS)

    Brown, Robert A.; Diemer, Vincent; Webb, Simon J.; Clayden, Jonathan

    2013-10-01

    The long-range communication of information, exemplified by signal transduction through membrane-bound receptors, is a central biochemical function. Reversible binding of a messenger ligand induces a local conformational change that is relayed through the receptor, inducing a chemical effect typically several nanometres from the binding site. We report a synthetic receptor mimic that transmits structural information from a boron-based ligand binding site to a spectroscopic reporter located more than 2 nm away. Reversible binding of a diol ligand to the N-terminal binding site induces a screw-sense preference in a helical oligo(aminoisobutyric acid) foldamer, which is relayed to a reporter group at the remote C-terminus, communicating information about the structure and stereochemistry of the ligand. The reversible nature of boronate esterification was exploited to switch the receptor sequentially between left- and right-handed helices, while the exquisite conformational sensitivity of the helical relay allowed the reporter to differentiate even between purine and pyrimidine nucleosides as ligands.

  11. End-to-end 9-D polarized bunch transport in eRHIC energy-recovery recirculator, some aspects

    SciTech Connect

    Meot, F.; Meot, F.; Brooks, S.; Ptitsyn, V.; Trbojevic, D.; Tsoupas, N.

    2015-05-03

This paper is a brief overview of some of the numerous beam and spin dynamics investigations undertaken in the framework of the design of the FFAG-based electron energy-recovery re-circulator ring of the eRHIC electron-ion collider project.

  12. Scalability Analysis and Use of Compression at the Goddard DAAC and End-to-End MODIS Transfers

    NASA Technical Reports Server (NTRS)

    Menasce, Daniel A.

    1998-01-01

The goal of this task is to analyze the performance of single and multiple FTP transfers between SCFs and the Goddard DAAC. We developed an analytic model to compute the performance of FTP sessions as a function of various key parameters, implemented the model as a program called FTP Analyzer, and carried out validations with real data obtained by running single and multiple FTP transfers between GSFC and the Miami SCF. The input parameters to the model include the mix of FTP sessions (scenario) and, for each FTP session, the file size. The network parameters include the round-trip time, the packet loss rate, the limiting bandwidth of the network connecting the SCF to a DAAC, TCP's basic timeout, TCP's Maximum Segment Size, and TCP's Maximum Receiver Window Size. The modeling approach consisted of modeling TCP's overall throughput, computing TCP's delay per FTP transfer, and then solving a queuing network model that includes the FTP clients and servers.
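The throughput-modeling step can be illustrated with the well-known steady-state TCP approximations that use the same inputs the abstract lists (MSS, RTT, loss rate, receiver window, link bandwidth). This is a textbook sketch, combining the loss-limited Mathis estimate with the window and bandwidth limits; it is not the actual FTP Analyzer model:

```python
import math

def tcp_throughput(mss, rtt, loss, rwnd, link_bw):
    """Steady-state TCP throughput estimate in bytes/s.

    The achievable rate is the minimum of three limits:
      - receiver-window limit:  rwnd / RTT
      - link bandwidth:         link_bw
      - loss limit (Mathis):    MSS / (RTT * sqrt(p)), for p > 0
    """
    limits = [rwnd / rtt, link_bw]
    if loss > 0:
        limits.append(mss / (rtt * math.sqrt(loss)))
    return min(limits)

# 1460-byte MSS, 80 ms RTT, 1% loss, 64 KiB window, 10 Mb/s link:
rate = tcp_throughput(1460, 0.08, 0.01, 65536, 10e6 / 8)
```

With these illustrative numbers the loss term dominates, so the transfer is loss-limited rather than window- or bandwidth-limited; with zero loss the same call returns the window limit.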

  13. SPAN: A Network Providing Integrated, End-to-End, Sensor-to-Database Solutions for Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Benzel, T.; Cho, Y. H.; Deschon, A.; Gullapalli, S.; Silva, F.

    2009-12-01

In recent years, advances in sensor network technology have shown great promise to revolutionize environmental data collection. Still, widespread adoption of these systems by domain experts has been lacking, and they have remained the purview of the engineers who design them. While many data-logging options currently exist for basic data collection in the field, scientists are often required to visit the deployment sites to retrieve their data and manually import it into spreadsheets. Some advanced commercial software systems do allow scientists to collect data remotely, but most of these systems only allow point-to-point access and require proprietary hardware. Furthermore, these commercial solutions preclude the use of sensors from other manufacturers or integration with Internet-based database repositories and compute engines. Therefore, scientists often must download and manually reformat their data before uploading it to the repositories if they wish to share it. We present an open-source, low-cost, extensible, turnkey solution called the Sensor Processing and Acquisition Network (SPAN), which provides a robust and flexible sensor network service. At the deployment site, SPAN leverages low-power generic embedded processors to integrate a variety of commercially available sensor hardware into the network of environmental observation systems. By bringing intelligence close to the sensed phenomena, we can remotely control configuration and re-use, establish rules to trigger sensor activity, manage power requirements, and control the two-way flow of sensed data as well as control information to the sensors. Key features of our design include (1) adoption of a hardware-agnostic architecture: our solutions are compatible with several programmable platforms, sensor systems, communication devices and protocols; (2) information standardization: our system supports several popular communication protocols and data formats; and (3) extensible data support: our system works with several existing data storage systems, data models and web-based services as needed by the domain experts; examples include standard MySQL databases, Sensorbase (from UCLA), and SPAN Cloud, a system built using Google's App Engine that allows scientists to use Google's cloud-computing cyber-infrastructure. We provide a simple yet flexible data access control mechanism that allows groups of researchers to share their data in SPAN Cloud. In this talk, we will describe the SPAN architecture, its components, our development plans, our vision for the future, and results from current deployments that continue to drive the design of our system.

  14. End-to-End Design, Development and Testing of GOES-R Level 1 and 2 Algorithms

    NASA Astrophysics Data System (ADS)

    Zaccheo, T.; Copeland, A.; Steinfelt, E.; Van Rompay, P.; Werbos, A.

    2012-12-01

GOES-R is the next generation of the National Oceanic and Atmospheric Administration's (NOAA) Geostationary Operational Environmental Satellite (GOES) System, and it represents a new technological era in operational geostationary environmental satellite systems. GOES-R will provide advanced products, based on government-supplied algorithms, which describe the state of the atmosphere, land, and oceans over the Western Hemisphere. The Harris GOES-R Core Ground Segment (GS) Team will provide the ground processing software and infrastructure needed to produce and distribute these data products. As part of this effort, new or updated Level 1b and Level 2+ algorithms will be deployed in the GOES-R Product Generation (PG) Element. In this work, we describe the general approach currently being employed to migrate these Level 1b (L1b) and Level 2+ (L2+) GOES-R PG algorithms from government-provided scientific descriptions to their implementation as integrated software, and provide an overview of how Product Generation software works with the other elements of the Ground Segment to produce Level 1/Level 2+ end-products. In general, GOES-R L1b algorithms ingest reformatted raw sensor data and ancillary information to produce geo-located GOES-R L1b data, and GOES-R L2+ algorithms ingest L1b data and other ancillary/auxiliary/intermediate information to produce L2+ products such as aerosol optical depth, rainfall rate, derived motion winds, and snow cover. In this presentation we provide an overview of the algorithm development life cycle, the common Product Generation software architecture, and the common test strategies used to verify/validate the scientific implementation. This work will highlight the Software Integration and Test phase of the software life cycle and the suite of automated test/analysis tools developed to ensure that the implemented algorithms meet the desired reproducibility.
As part of this discussion we will summarize the results of our algorithm testing to date, and provide illustrated examples from our ongoing algorithm implementation.

  15. METERON end-to-end Network for Robotic Experiments: Objectives and first operations at B.USOC.

    NASA Astrophysics Data System (ADS)

    This, N.; Michel, A.; Litefti, K.; Muller, C.; Moreau, D.

    2012-09-01

METERON is an international collaboration between ESA, NASA (University of Colorado), Roskosmos and DLR. It intends to use the ISS as a test bed to simulate an orbiter around another heavenly body (for example Mars), operating under directives from Mission Control on Earth. Astronauts on the orbiter will project their human initiative and instinct, in real time, onto the surface of the heavenly body (simulated by an analog site on Earth) through robotic device(s) to perform science or engineering tasks. This type of real-time control is not possible directly from Earth because of the one-way light-time delay in communications. METERON operations have been managed by B.USOC, as Facility Reference Centre, since December 2011.

  16. End-to-end remote sensing at the Science and Technology Laboratory of John C. Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Kelly, Patrick; Rickman, Douglas; Smith, Eric

    1991-01-01

The Science and Technology Laboratory (STL) of Stennis Space Center (SSC) has been developing expertise in remote sensing for more than a decade. Capabilities at SSC/STL span all major areas of the field. STL includes the Sensor Development Laboratory (SDL), an Image Processing Center, a Learjet 23 flight platform, and on-staff scientific investigators.

  17. Update on ORNL TRANSFORM Tool: Simulating Multi-Module Advanced Reactor with End-to-End I&C

    SciTech Connect

    Hale, Richard Edward; Fugate, David L.; Cetiner, Sacit M.; Qualls, A. L.

    2015-05-01

    The Small Modular Reactor (SMR) Dynamic System Modeling Tool project is in the fourth year of development. The project is designed to support collaborative modeling and study of various advanced SMR (non-light water cooled reactor) concepts, including the use of multiple coupled reactors at a single site. The focus of this report is the development of a steam generator and drum system model that includes the complex dynamics of typical steam drum systems, the development of instrumentation and controls for the steam generator with drum system model, and the development of multi-reactor module models that reflect the full power reactor innovative small module design concept. The objective of the project is to provide a common simulation environment and baseline modeling resources to facilitate rapid development of dynamic advanced reactor models; ensure consistency among research products within the Instrumentation, Controls, and Human-Machine Interface technical area; and leverage cross-cutting capabilities while minimizing duplication of effort. The combined simulation environment and suite of models are identified as the TRANSFORM tool. The critical elements of this effort include (1) defining a standardized, common simulation environment that can be applied throughout the Advanced Reactors Technology program; (2) developing a library of baseline component modules that can be assembled into full plant models using available geometry, design, and thermal-hydraulic data; (3) defining modeling conventions for interconnecting component models; and (4) establishing user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  18. SU-E-T-268: Proton Radiosurgery End-To-End Testing Using Lucy 3D QA Phantom

    SciTech Connect

    Choi, D; Gordon, I; Ghebremedhin, A; Wroe, A; Schulte, R; Bush, D; Slater, J; Patyal, B

    2014-06-01

Purpose: To check the overall accuracy of proton radiosurgery treatment delivery using ready-made circular collimator inserts and fixed-thickness compensating boluses. Methods: A Lucy 3D QA phantom (Standard Imaging Inc., WI, USA) loaded with Gafchromic film was irradiated with laterally scattered and longitudinally spread-out 126.8 MeV proton beams. The tests followed every step in the proton radiosurgery treatment delivery process: CT scan (GE Lightspeed VCT), target contouring, treatment planning (Odyssey 5.0, Optivus, CA), portal calibration, target localization using a robotic couch with image guidance, and dose delivery at planned gantry angles. A 2 cm diameter collimator insert in a 4 cm diameter radiosurgery cone and a 1.2 cm thick compensating flat bolus were used for all beams. Film dosimetry (RIT114 v5.0, Radiological Imaging Technology, CO, USA) was used to evaluate the accuracy of target localization and to compare relative dose distributions with those calculated by the treatment planning system. Results: The localization accuracy was estimated by analyzing the Gafchromic films irradiated at gantry angles of 0, 90 and 270 degrees. We observed a 0.5 mm shift in the lateral direction (patient left), a ±0.9 mm shift in the AP direction and a ±1.0 mm shift in the vertical direction (gantry dependent). The isodose overlays showed good agreement (<2 mm for the 50% isodose lines) between measured and calculated doses. Conclusion: Localization accuracy depends on gantry sag, CT resolution and distortion, DRRs from the treatment planning computer, localization accuracy of the image guidance system, and fabrication of the ready-made aperture and cone housing. The total deviation from the isocenter was 1.4 mm. Dose distribution uncertainty comes from distal-end error due to bolus and CT density, in addition to localization error. The planned dose distribution matched the measured values well (>90% passing with 2%/2 mm criteria). Our test showed the robustness of our proton radiosurgery treatment delivery system using ready-made collimator inserts and fixed-thickness compensating boluses.

  19. Pancreatectomy for metastasis to the pancreas from colorectal cancer and reconstruction of superior mesenteric vein: a case report

    PubMed Central

    2011-01-01

    Introduction Tumors of the pancreatic head can infiltrate the superior mesenteric vein. In such cases, the deep veins of the lower limbs can serve as suitable autologous conduits for superior mesenteric vein reconstruction after its resection. Few data exist, however, describing the technique and the immediate patency of such reconstruction. Case report We present the case of a 70-year-old Caucasian man with a metachronous metastasis of colon cancer and infiltration of the uncinate pancreatic process, on the anterior surface of which the tumor was located. En bloc resection of the tumor was performed with resection of the superior mesenteric vein and reconstruction. A 10 cm segment of the superficial femoral vein was harvested for the reconstruction. The superficial femoral vein segment was inter-positioned in an end-to-end fashion. The post-operative conduit patency was documented ultrasonographically immediately post-operatively and after a six-month period. The vein donor limb presented subtle signs of post-operative venous hypertension with edema, which was managed with compression stockings and led to significant improvement after six months. Conclusion In cases of exploratory laparotomies with high clinical suspicion of pancreatic involvement, the potential need for vascular reconstruction dictates the preparation for leg vein harvest in advance. The superficial femoral vein provides a suitable conduit for the reconstruction of the superior mesenteric vein. This report supports the uncomplicated nature of this technique, since few data exist about this type of reconstruction. PMID:21880120

  20. Enhanced compressive wideband frequency spectrum sensing for dynamic spectrum access

    NASA Astrophysics Data System (ADS)

    Liu, Yipeng; Wan, Qun

    2012-12-01

Wideband spectrum sensing detects unused spectrum holes for dynamic spectrum access (DSA). The required sampling rate, too high for conventional converters, is the main challenge. Compressive sensing (CS) can reconstruct a sparse signal with high probability from far fewer randomized samples than Nyquist sampling requires. Since surveys show that the monitored signal is sparse in the frequency domain, CS can relieve the sampling burden. Random samples can be obtained with an analog-to-information converter. Signal recovery can then be formulated as an L0-norm minimization combined with a linear measurement-fitting constraint. In DSA, the static spectrum allocation of primary radios means the bounds between different types of primary radios are known in advance. To incorporate this a priori information, we divide the whole spectrum into sections according to the spectrum allocation policy. In the new optimization model, the minimization of the L2 norm of each section encourages clustered distribution locally, while the L0 norm of the L2 norms is minimized to give a sparse distribution globally. Because the L2/L0 optimization is not convex, an iteratively re-weighted L2/L1 optimization is proposed to approximate it. Simulations demonstrate that the proposed method outperforms others in accuracy, denoising ability, etc.
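The section-wise model described above can be written out explicitly; the notation below is ours, as a sketch of the stated L2/L0 objective and its re-weighted surrogate:

```latex
\min_{\mathbf{s}}\;
\big\| \big( \|\mathbf{s}_1\|_2, \dots, \|\mathbf{s}_G\|_2 \big) \big\|_0
\quad \text{s.t.} \quad
\|\mathbf{y} - \boldsymbol{\Phi}\mathbf{F}^{-1}\mathbf{s}\|_2 \le \epsilon ,
```

where $\mathbf{s}_g$ holds the spectrum coefficients of section $g$ (sections follow the allocation policy), $\mathbf{y}$ are the random measurements and $\boldsymbol{\Phi}\mathbf{F}^{-1}$ maps the frequency-domain signal to the measurements. Since this L2/L0 objective is non-convex, iteration $k$ of a re-weighted L2/L1 scheme instead minimizes

```latex
\sum_{g=1}^{G} w_g^{(k)} \|\mathbf{s}_g\|_2 ,
\qquad
w_g^{(k)} = \Big( \|\mathbf{s}_g^{(k-1)}\|_2 + \delta \Big)^{-1} ,
```

with a small $\delta > 0$ for numerical stability, so that sections that were nearly empty in the previous iterate are penalized more heavily.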

  1. Innovations in diabetic foot reconstruction using supermicrosurgery.

    PubMed

    Suh, Hyun Suk; Oh, Tae Suk; Hong, Joon Pio

    2016-01-01

The treatment of diabetic foot ulceration is complex, with multiple factors involved, and it may often lead to limb amputation. Hence, a multidisciplinary approach is warranted to cover the spectrum of treatment for the diabetic foot, but in complex wounds surgical treatment is inevitable. Surgery may involve the decision to preserve the limb by reconstruction or to amputate it. Reconstruction involves preserving the limb with secure coverage. Local flaps are usually able to provide sufficient coverage for small or moderate-sized wounds, but for larger wounds soft-tissue coverage requires flaps located distant from the wound. Reconstruction with a distant flap usually involves microsurgery, and innovative methods such as supermicrosurgery have now given complex wounds a better chance of being reconstructed and limbs salvaged. This article reviews the microsurgery involved in reconstruction and introduces the new method of supermicrosurgery.

  2. Fission Spectrum

    DOE R&D Accomplishments Database

    Bloch, F.; Staub, H.

    1943-08-18

Measurements of the spectrum of the fission neutrons of 25 are described, in which the energy of the neutrons is determined from the ionization produced by individual hydrogen recoils. The slow neutrons producing fission are obtained by slowing down the fast neutrons from the Be-D reaction of the Stanford cyclotron. In order to distinguish between fission neutrons and the remaining fast cyclotron neutrons, both the cyclotron current and the pulse amplifier are modulated. A hollow neutron container, in which slow neutrons have a lifetime of about 2 milliseconds, avoids the use of large distances. This method results in much higher intensities than the usual modulation arrangement. The results show a continuous distribution of neutrons with a rather wide maximum at about 0.8 MV, falling off to half of its maximum value at 2.0 MV. The total number of neutrons is determined by comparison with the number of fission fragments. The result seems to indicate that only about 30% of the neutrons have energies below 0.8 MV. Various tests are described which were performed in order to rule out modification of the spectrum by inelastic scattering. Decl. May 4, 1951

  3. Integration of real-time 3D capture, reconstruction, and light-field display

    NASA Astrophysics Data System (ADS)

    Zhang, Zhaoxing; Geng, Zheng; Li, Tuotuo; Pei, Renjing; Liu, Yongchun; Zhang, Xiao

    2015-03-01

Effective integration of 3D acquisition, reconstruction (modeling) and display technologies into a seamless system provides an augmented experience of visualizing and analyzing real objects and scenes with realistic 3D sensation. Applications can be found in medical imaging, gaming, virtual or augmented reality and hybrid simulations. Although 3D acquisition, reconstruction, and display technologies have gained significant momentum in recent years, there seems to be a lack of attention to synergistically combining these components into an "end-to-end" 3D visualization system. We designed, built and tested an integrated 3D visualization system that is able to capture 3D light-field images in real time, perform 3D reconstruction to build a 3D model of the objects, and display the 3D model on a large autostereoscopic screen. In this article, we present our system architecture and component designs, hardware/software implementations, and experimental results. We elaborate on our recent progress on sparse-camera-array light-field 3D acquisition, real-time dense 3D reconstruction, and autostereoscopic multi-view 3D display. A prototype is finally presented with test results to illustrate the effectiveness of our proposed integrated 3D visualization system.

  4. CMB temperature lensing power reconstruction

    SciTech Connect

    Hanson, Duncan; Efstathiou, George; Challinor, Anthony; Bielewicz, Pawel

    2011-02-15

We study the reconstruction of the lensing potential power spectrum from CMB temperature data, with an eye to the Planck experiment. We work with the optimal quadratic estimator of Okamoto and Hu, which we characterize thoroughly in an application to the reconstruction of the lensing power spectrum. We find that at multipoles L < 250, our current understanding of this estimator is biased at the 15% level by beyond-gradient terms in the Taylor expansion of lensing effects. We present the full lensed trispectrum to fourth order in the lensing potential to explain this effect. We show that the low-L bias, as well as a previously known bias at high L, is relevant to the determination of cosmology and must be corrected for in order to avoid significant parameter errors. We also investigate the covariance of the reconstructed power, finding broad correlations of ≈0.1%. Finally, we discuss several small improvements which may be made to the optimal estimator to mitigate these problems.
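The "beyond-gradient terms" referred to here are the higher orders of the standard Taylor expansion of the lensed temperature field, written here in generic notation:

```latex
\tilde{T}(\hat{\mathbf{n}})
= T(\hat{\mathbf{n}} + \nabla\phi)
= T(\hat{\mathbf{n}})
+ \nabla_a\phi\,\nabla^a T
+ \tfrac{1}{2}\,\nabla_a\phi\,\nabla_b\phi\,\nabla^a\nabla^b T
+ \dots
```

The quadratic estimator is built from the first-order (gradient) term alone; the quadratic and higher terms contribute to the lensed trispectrum at fourth order in $\phi$, which is the origin of the low-L bias characterized in the abstract.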

  5. The stability of spectrum reproduction by LEDs

    NASA Astrophysics Data System (ADS)

    Yang, Hua; Li, Jing; Yao, Ran; Lu, Pengzhi; Pei, Yanrong

    2015-09-01

The spectral power distribution, together with the color consistency and constancy, of natural light is studied and simulated before white-light LED systems are fabricated to reproduce natural light. Models with 3, 4, 6 and more primary LEDs, based on both real measured spectra and theoretical spectra, are analyzed. The sensitivity of the reproduced spectral power to the wavelength and color characteristics of the individual primary LEDs is analyzed. This research simplifies the approach to visible-spectrum reconstruction, which is an efficient tool for the design and realization of LED-based luminaires.
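The core fitting step in multi-primary spectrum reproduction can be sketched as a least-squares problem: choose drive weights so the summed primary spectra approximate a target spectrum on a wavelength grid. The two-primary toy below keeps the normal equations a hand-solvable 2x2 system; real designs use 3-6+ primaries, finer grids, and additional color-quality constraints, and all spectra here are invented:

```python
def fit_two_primaries(p1, p2, target):
    """Least-squares weights (w1, w2) minimizing
    sum_k (w1*p1[k] + w2*p2[k] - target[k])^2,
    solved via the 2x2 normal equations."""
    a = sum(x * x for x in p1)
    b = sum(x * y for x, y in zip(p1, p2))
    d = sum(y * y for y in p2)
    r1 = sum(x * t for x, t in zip(p1, target))
    r2 = sum(y * t for y, t in zip(p2, target))
    det = a * d - b * b
    return (d * r1 - b * r2) / det, (a * r2 - b * r1) / det

blue   = [1.0, 0.2, 0.0, 0.0]   # relative power per wavelength bin
amber  = [0.0, 0.1, 0.8, 1.0]
target = [0.5, 0.15, 0.4, 0.5]  # equal mix of the two primaries
w_blue, w_amber = fit_two_primaries(blue, amber, target)
```

Since the target here is an exact 50/50 mix of the two primaries, the fit recovers weights of 0.5 each; with a non-reachable target the same equations give the closest achievable spectrum.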

  6. Sky reconstruction for the Tianlai cylinder array

    NASA Astrophysics Data System (ADS)

    Zhang, Jiao; Zuo, Shi-Fan; Ansari, Reza; Chen, Xuelei; Li, Yi-Chao; Wu, Feng-Quan; Campagne, Jean-Eric; Magneville, Christophe

    2016-10-01

    We apply our sky map reconstruction method for transit-type interferometers to the Tianlai cylinder array. The method is based on spherical harmonic decomposition and can be applied to cylindrical arrays as well as dish arrays; it allows us to compute the instrument response, synthesized beam, transfer function and noise power spectrum. We consider cylinder arrays with feed spacing larger than half a wavelength and, as expected, find that arrays with regular spacing have grating lobes which produce spurious images in the reconstructed maps. We show that this problem can be overcome by using arrays with a different feed spacing on each cylinder. We present the reconstructed maps and study the performance in terms of noise power spectrum, transfer function and beams for both regular and irregular feed-spacing configurations.

  7. Transsplenic portal vein reconstruction-transjugular intrahepatic portosystemic shunt in a patient with portal and splenic vein thrombosis.

    PubMed

    Salsamendi, Jason T; Gortes, Francisco J; Shnayder, Michelle; Doshi, Mehul H; Fan, Ji; Narayanan, Govindarajan

    2016-09-01

    Portal vein thrombosis (PVT) is a potential complication of cirrhosis and can worsen outcomes after liver transplant (LT). Portal vein reconstruction-transjugular intrahepatic portosystemic shunt (PVR-TIPS) can restore flow through the portal vein (PV) and facilitate LT by avoiding complex vascular conduits. We present a case of transsplenic PVR-TIPS in the setting of complete PVT and splenic vein (SV) thrombosis. The patient had a 3-year history of PVT complicated by abdominal pain, ascites, and paraesophageal varices. A SV tributary provided access to the main SV and was punctured percutaneously under ultrasound scan guidance. PV access, PV and SV venoplasty, and TIPS placement were successfully performed without complex techniques. The patient underwent LT with successful end-to-end anastomosis of the PVs. Our case suggests transsplenic PVR-TIPS to be a safe and effective alternative to conventional PVR-TIPS in patients with PVT and SV thrombosis.

  9. Neuromagnetic source reconstruction

    SciTech Connect

    Lewis, P.S.; Mosher, J.C.; Leahy, R.M.

    1994-12-31

    In neuromagnetic source reconstruction, a functional map of neural activity is constructed from noninvasive magnetoencephalographic (MEG) measurements. The overall reconstruction problem is under-determined, so some form of source modeling must be applied. We review the two main classes of reconstruction techniques: parametric current dipole models and nonparametric distributed source reconstructions. Current dipole reconstructions use a physically plausible source model, but are limited to cases in which the neural currents are expected to be highly sparse and localized. Distributed source reconstructions can be applied to a wider variety of cases, but must incorporate an implicit source model in order to arrive at a single reconstruction. We examine distributed source reconstruction in a Bayesian framework to highlight the implicit nonphysical Gaussian assumptions of minimum-norm-based reconstruction algorithms. We conclude with a brief discussion of alternative non-Gaussian approaches.
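
    As a minimal illustration of the minimum-norm idea discussed in this abstract, the sketch below solves a toy under-determined MEG-style inverse problem with an L2 (Gaussian) prior on the sources; the lead-field matrix, dimensions, and regularization value are hypothetical, chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy lead field: 32 MEG sensors, 200 distributed source locations
# (hypothetical dimensions chosen only for illustration).
n_sensors, n_sources = 32, 200
L = rng.standard_normal((n_sensors, n_sources))

# Sparse "true" source activity and noisy sensor measurements.
j_true = np.zeros(n_sources)
j_true[[20, 21, 22]] = 1.0
b = L @ j_true + 0.01 * rng.standard_normal(n_sensors)

# Minimum-norm estimate: the under-determined inverse problem is
# regularized by an implicit Gaussian (L2) prior on the sources,
#   j_hat = L^T (L L^T + lam * I)^{-1} b
lam = 1e-2
j_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), b)

# The estimate reproduces the measurements well but smears activity
# over many sources -- the nonphysical consequence of the Gaussian prior
# that the review highlights.
print(np.linalg.norm(L @ j_hat - b) / np.linalg.norm(b))
```

    Note the estimate fits the data while distributing current broadly, which is exactly the behavior a Bayesian reading of the Gaussian prior predicts.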

  10. Autism Spectrum Disorder

    MedlinePlus

    Condensed from the NINDS Autism Spectrum Disorder Information Page (also available en Español); additional resources from MedlinePlus.

  11. Nipple and areola reconstruction.

    PubMed

    Hutcheson, H A; Bostwick, J

    1989-01-01

    Nipple-areola reconstruction is an integral part of breast reconstruction. Optimum results are usually obtained when nipple-areola reconstruction is staged after the breast mound has attained its final shape and is well vascularized. The use of intradermal tattoo allows the use of a variety of nonpigmented donor sites. Women report that reconstruction of the nipple-areola enhances their overall satisfaction with breast reconstruction. The knowledgeable and skilled nurse is a valuable member of the professional team during this final phase of breast reconstruction. PMID:2479039

  12. Breast Reconstruction after Mastectomy

    PubMed Central

    Schmauss, Daniel; Machens, Hans-Günther; Harder, Yves

    2016-01-01

    Breast cancer is the leading cause of cancer death in women worldwide. Its surgical approach has become less and less mutilating in the last decades. However, the overall number of breast reconstructions has significantly increased lately. Nowadays, breast reconstruction should be individualized at its best, first of all taking into consideration not only the oncological aspects of the tumor, neo-/adjuvant treatment, and genetic predisposition, but also its timing (immediate versus delayed breast reconstruction), as well as the patient’s condition and wish. This article gives an overview over the various possibilities of breast reconstruction, including implant- and expander-based reconstruction, flap-based reconstruction (vascularized autologous tissue), the combination of implant and flap, reconstruction using non-vascularized autologous fat, as well as refinement surgery after breast reconstruction. PMID:26835456

  13. Reoperative midface reconstruction.

    PubMed

    Acero, Julio; García, Eloy

    2011-02-01

    Reoperative reconstruction of the midface is a challenging issue because of the complexity of this region and the severity of the aesthetic and functional sequela related to the absence or failure of a primary reconstruction. The different situations that can lead to the indication of a reoperative reconstructive procedure after previous oncologic ablative procedures in the midface are reviewed. Surgical techniques, anatomic problems, and limitations affecting the reoperative reconstruction in this region of the head and neck are discussed.

  14. [Breast reconstruction after mastectomy].

    PubMed

    Ho Quoc, C; Delay, E

    2013-02-01

    Mutilating surgery for breast cancer causes deep somatic and psychological sequelae. Breast reconstruction can mitigate these effects and help patients rebuild their lives. The purpose of this paper is to focus on breast reconstruction techniques and on the factors involved in breast reconstruction. The methods of breast reconstruction are presented: objectives, indications, different techniques, operative risks, and long-term monitoring. Many different techniques now allow breast reconstruction in most patients. Clinical cases are also presented in order to illustrate the results to be expected from a breast reconstruction. Breast reconstruction provides many benefits for patients in terms of rehabilitation, wellness, and quality of life. In our view, breast reconstruction should be considered more as an opportunity and a positive choice (one the patient can decide to make) than as an obligation (one the patient must endure). The consultation with the surgeon who will perform the reconstruction is an important step for giving all the necessary information, and it is important that the patient be able to speak with the surgeon again before undergoing reconstruction if she has any doubt. The quality of the information given by physicians is essential to the success of the psychological intervention. This article was written in a simple and understandable way to help gynecologists give the best information to their patients. A copy of this article may also be given to patients, providing them with written support and facilitating the future consultation with the surgeon who will perform the reconstruction.

  15. Performances of JEM-EUSO: angular reconstruction. The JEM-EUSO Collaboration

    NASA Astrophysics Data System (ADS)

    Adams, J. H.; Ahmad, S.; Albert, J.-N.; Allard, D.; Anchordoqui, L.; Andreev, V.; Anzalone, A.; Arai, Y.; Asano, K.; Ave Pernas, M.; Baragatti, P.; Barrillon, P.; Batsch, T.; Bayer, J.; Bechini, R.; Belenguer, T.; Bellotti, R.; Belov, K.; Berlind, A. A.; Bertaina, M.; Biermann, P. L.; Biktemerova, S.; Blaksley, C.; Blanc, N.; Błȩcki, J.; Blin-Bondil, S.; Blümer, J.; Bobik, P.; Bogomilov, M.; Bonamente, M.; Briggs, M. S.; Briz, S.; Bruno, A.; Cafagna, F.; Campana, D.; Capdevielle, J.-N.; Caruso, R.; Casolino, M.; Cassardo, C.; Castellinic, G.; Catalano, C.; Catalano, G.; Cellino, A.; Chikawa, M.; Christl, M. J.; Cline, D.; Connaughton, V.; Conti, L.; Cordero, G.; Crawford, H. J.; Cremonini, R.; Csorna, S.; Dagoret-Campagne, S.; de Castro, A. J.; De Donato, C.; de la Taille, C.; De Santis, C.; del Peral, L.; Dell'Oro, A.; De Simone, N.; Di Martino, M.; Distratis, G.; Dulucq, F.; Dupieux, M.; Ebersoldt, A.; Ebisuzaki, T.; Engel, R.; Falk, S.; Fang, K.; Fenu, F.; Fernández-Gómez, I.; Ferrarese, S.; Finco, D.; Flamini, M.; Fornaro, C.; Franceschi, A.; Fujimoto, J.; Fukushima, M.; Galeotti, P.; Garipov, G.; Geary, J.; Gelmini, G.; Giraudo, G.; Gonchar, M.; González Alvarado, C.; Gorodetzky, P.; Guarino, F.; Guzmán, A.; Hachisu, Y.; Harlov, B.; Haungs, A.; Hernández Carretero, J.; Higashide, K.; Ikeda, D.; Ikeda, H.; Inoue, N.; Inoue, S.; Insolia, A.; Isgrò, F.; Itow, Y.; Joven, E.; Judd, E. G.; Jung, A.; Kajino, F.; Kajino, T.; Kaneko, I.; Karadzhov, Y.; Karczmarczyk, J.; Karus, M.; Katahira, K.; Kawai, K.; Kawasaki, Y.; Keilhauer, B.; Khrenov, B. A.; Kim, J.-S.; Kim, S.-W.; Kim, S.-W.; Kleifges, M.; Klimov, P. A.; Kolev, D.; Kreykenbohm, I.; Kudela, K.; Kurihara, Y.; Kusenko, A.; Kuznetsov, E.; Lacombe, M.; Lachaud, C.; Lee, J.; Licandro, J.; Lim, H.; López, F.; Maccarone, M. 
C.; Mannheim, K.; Maravilla, D.; Marcelli, L.; Marini, A.; Martinez, O.; Masciantonio, G.; Mase, K.; Matev, R.; Medina-Tanco, G.; Mernik, T.; Miyamoto, H.; Miyazaki, Y.; Mizumoto, Y.; Modestino, G.; Monaco, A.; Monnier-Ragaigne, D.; Morales de los Ríos, J. A.; Moretto, C.; Morozenko, V. S.; Mot, B.; Murakami, T.; Murakami, M. Nagano; Nagata, M.; Nagataki, S.; Nakamura, T.; Napolitano, T.; Naumov, D.; Nava, R.; Neronov, A.; Nomoto, K.; Nonaka, T.; Ogawa, T.; Ogio, S.; Ohmori, H.; Olinto, A. V.; Orleański, P.; Osteria, G.; Panasyuk, M. I.; Parizot, E.; Park, I. H.; Park, H. W.; Pastircak, B.; Patzak, T.; Paul, T.; Pennypacker, C.; Perez Cano, S.; Peter, T.; Picozza, P.; Pierog, T.; Piotrowski, L. W.; Piraino, S.; Plebaniak, Z.; Pollini, A.; Prat, P.; Prévôt, G.; Prieto, H.; Putis, M.; Reardon, P.; Reyes, M.; Ricci, M.; Rodríguez, I.; Rodríguez Frías, M. D.; Ronga, F.; Roth, M.; Rothkaehl, H.; Roudil, G.; Rusinov, I.; Rybczyński, M.; Sabau, M. D.; Sáez-Cano, G.; Sagawa, H.; Saito, A.; Sakaki, N.; Sakata, M.; Salazar, H.; Sánchez, S.; Santangelo, A.; Santiago Crúz, L.; Sanz Palomino, M.; Saprykin, O.; Sarazin, F.; Sato, H.; Sato, M.; Schanz, T.; Schieler, H.; Scotti, V.; Segreto, A.; Selmane, S.; Semikoz, D.; Serra, M.; Sharakin, S.; Shibata, T.; Shimizu, H. M.; Shinozaki, K.; Shirahama, T.; Siemieniec-Oziȩbło, G.; Silva López, H. H.; Sledd, J.; Słomińska, K.; Sobey, A.; Sugiyama, T.; Supanitsky, D.; Suzuki, M.; Szabelska, B.; Szabelski, J.; Tajima, F.; Tajima, N.; Tajima, T.; Takahashi, Y.; Takami, H.; Takeda, M.; Takizawa, Y.; Tenzer, C.; Tibolla, O.; Tkachev, L.; Tokuno, H.; Tomida, T.; Tone, N.; Toscano, S.; Trillaud, F.; Tsenov, R.; Tsunesada, Y.; Tsuno, K.; Tymieniecka, T.; Uchihori, Y.; Unger, M.; Vaduvescu, O.; Valdés-Galicia, J. F.; Vallania, P.; Valore, L.; Vankova, G.; Vigorito, C.; Villaseñor, L.; von Ballmoos, P.; Wada, S.; Watanabe, J.; Watanabe, S.; Watts, J.; Weber, M.; Weiler, T. 
J.; Wibig, T.; Wiencke, L.; Wille, M.; Wilms, J.; Włodarczyk, Z.; Yamamoto, T.; Yamamoto, Y.; Yang, J.; Yano, H.; Yashin, I. V.; Yonetoku, D.; Yoshida, K.; Yoshida, S.; Young, R.; Zotov, M. Yu.; Zuccaro Marchi, A.

    2015-11-01

    Mounted on the International Space Station (ISS), the Extreme Universe Space Observatory on board the Japanese Experiment Module (JEM-EUSO) relies on the well-established fluorescence technique to observe Extensive Air Showers (EAS) developing in the Earth's atmosphere. Focusing on the detection of Ultra High Energy Cosmic Rays (UHECR) in the decade around 10^20 eV, JEM-EUSO will face new challenges by applying this technique from space. The EUSO Simulation and Analysis Framework (ESAF) has been developed in this context to provide a full end-to-end simulation framework and to assess the overall performance of the detector. Within ESAF, angular reconstruction can be separated into two conceptually different steps. The first step is pattern recognition, or filtering, of the signal to separate it from the background. The second step is to perform different types of fitting in order to find the geometrical parameters that best describe the previously selected signal. In this paper, we discuss some of the techniques we have implemented in ESAF to perform the geometrical reconstruction of EAS seen by JEM-EUSO. We also conduct thorough tests to assess the performance of these techniques in conditions relevant to the scope of the JEM-EUSO mission. We conclude by showing the expected angular resolution over the energy range that JEM-EUSO is expected to observe.

  16. Wide-field wavefront sensing in solar adaptive optics : modeling and effects on reconstruction

    NASA Astrophysics Data System (ADS)

    Béchet, Clémentine; Tallon, Michel; Montilla, Icíar; Langlois, Maud

    2013-12-01

    The planned 4-meter diameter of the European Solar Telescope (EST) is aimed at providing high spatial resolution and a large photon-collecting area, in order to understand in particular the mechanisms of magnetic coupling in the chromosphere and the photosphere. To reach its goals in the visible and the near-infrared, EST is designed with both a conventional and a multi-conjugate adaptive optics (AO) system of complexity similar to those envisaged for the Extremely Large Telescopes. In addition, the AO on EST has to face a particularity of solar AO: wavefront sensing on extended sources, with measurement fields of about 10'' in size. Reviewing recent literature together with an independent analysis, we investigate the impact of extended-field sensing in AO for large solar telescopes. Sensing modeling and its effect on reconstruction performance are analyzed through simulations performed with the Fractal Iterative Method for tomography (FRiM-3D), showing the difficulty of correcting high-altitude turbulence. We introduce a new approximate direct model of extended-source sensing which greatly improves the quality of end-to-end simulations for EST AO. Next, we try to improve the conventional solar AO correction by using this new model in the reconstruction. Our simulations do not show significant benefits from using such a tomographic model in this conventional AO configuration under typical atmospheric conditions.

  17. [MAX-DOAS Tomography Reconstruction for Gas Plume].

    PubMed

    Wei, Min-hong; Tong, Min-ming; Li, Su-wen; Xiao, Jian-yu

    2015-08-01

    To achieve precise two-dimensional reconstruction of the spatial distribution of a smoke plume, passive MAX-DOAS tomography is established, in which the spatial distribution of the exhaust plume is measured by multiple passive multi-axis differential optical absorption spectroscopy systems. First, the multi-axis differential absorption spectrum system and its mechanism for inverting gas concentration are introduced. Then, an algebraic iterative algorithm is adopted to extract the trace-gas concentration in reconstruction simulations with different models and different scanning optical paths, and the reconstruction program is designed. The numerical simulation results are then compared. Finally, a platform for the multi-axis differential absorption optical tomography system is set up and a field campaign is carried out. The numerical simulation results show that MAX-DOAS tomography can accurately reconstruct the two-dimensional spatial distribution of the plume model; the reconstruction error with four light sources is about a third of that with two light sources, while the reconstruction time is about a quarter of that with two light sources, and the reconstruction error of the twin-peaks model is greater than that of the single-peak model. Field test results show that the integral of the reconstructed image is consistent with the measured projection data of the multi-axis differential absorption spectra, and the reconstructed spatial distribution of the plume is in line with the actual situation. These studies show that the numerical simulations and the field test results are consistent. PMID:26672304
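
    The algebraic iterative reconstruction this abstract refers to can be sketched as a Kaczmarz/ART loop on a toy linear system, where each row of the matrix stands for one line-of-sight integral through the unknown concentration field; the dimensions and random system matrix below are illustrative assumptions, not the instrument geometry.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy tomography system: each row of A integrates the unknown 2-D
# concentration field (flattened to a vector) along one line of sight.
# Dimensions are hypothetical, for illustration only.
n_rays, n_pixels = 60, 16 * 16
A = rng.random((n_rays, n_pixels))
x_true = rng.random(n_pixels)
p = A @ x_true                      # measured column densities

# ART (Kaczmarz): sweep over rays, projecting the current estimate onto
# the hyperplane defined by each measurement in turn.
x = np.zeros(n_pixels)
for sweep in range(400):
    for i in range(n_rays):
        a = A[i]
        x += (p[i] - a @ x) / (a @ a) * a

# Mismatch between reprojected estimate and the measurements.
print(np.linalg.norm(A @ x - p) / np.linalg.norm(p))
```

    More rays (analogous to more light sources in the paper) constrain more hyperplanes and so reduce the reconstruction error, which is consistent with the four-source versus two-source comparison reported above.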

  19. Reconstruction of missing data using iterative harmonic expansion

    NASA Astrophysics Data System (ADS)

    Nishizawa, Atsushi J.; Inoue, Kaiki Taro

    2016-10-01

    In the cosmic microwave background or galaxy density maps, missing fluctuations in masked regions can be reconstructed from fluctuations in the surrounding unmasked regions if the original fluctuations are sufficiently smooth. One reconstruction method involves applying a harmonic expansion iteratively to fluctuations in the unmasked region. In this paper, we discuss how well this reconstruction method can recover the original fluctuations depending on the prior of fluctuations and property of the masked region. The reconstruction method is formulated with an asymptotic expansion in terms of the size of mask for a fixed iteration number. The reconstruction accuracy depends on the mask size, the spectrum of the underlying density fluctuations, the scales of the fluctuations to be reconstructed and the number of iterations. For Gaussian fluctuations with the Harrison-Zel'dovich spectrum, the reconstruction method provides more accurate restoration than naive methods based on brute-force matrix inversion or the singular value decomposition. We also demonstrate that an isotropic non-Gaussian prior does not change the results but an anisotropic non-Gaussian prior can yield a higher reconstruction accuracy compared to the Gaussian prior case.
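
    A one-dimensional analogue of the iterative harmonic-expansion method can be sketched as the classic "truncate in harmonic space, re-impose the unmasked data" loop (Papoulis-Gerchberg style); the band limit, mask size, and iteration count below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D analogue: a smooth (band-limited) signal with a masked gap is
# restored by iterating "expand in harmonics, truncate, restore the
# known samples". Sizes and band limit are illustrative.
n, lmax = 256, 8
coeff = np.zeros(n, dtype=complex)
coeff[1:lmax] = rng.standard_normal(lmax - 1) + 1j * rng.standard_normal(lmax - 1)
signal = np.fft.ifft(coeff).real

mask = np.ones(n, dtype=bool)
mask[100:112] = False               # masked region to reconstruct

x = np.where(mask, signal, 0.0)
for it in range(500):
    c = np.fft.fft(x)
    c[lmax:n - lmax + 1] = 0.0      # keep only low harmonics (the smoothness prior)
    x = np.fft.ifft(c).real
    x[mask] = signal[mask]          # re-impose the unmasked data

# Worst-case error inside the gap, relative to the signal amplitude.
err = np.abs(x[~mask] - signal[~mask]).max() / np.abs(signal).max()
print(err)
```

    As the paper notes, the achievable accuracy depends on the mask size relative to the correlation length of the fluctuations: widening the gap or raising the band limit degrades the recovery.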

  20. Wideband digital spectrum analyzer

    NASA Technical Reports Server (NTRS)

    Morris, G. A., Jr.; Wilck, H. C.

    1979-01-01

    Modular spectrum analyzer consisting of RF receiver, fast Fourier transform spectrum analyzer, and data processor samples stochastic signals in 220 channels. Construction reduces design and fabrication costs of assembled unit.

  1. Autism Spectrum Disorder (ASD)

    MedlinePlus

    Autism spectrum disorder (ASD) is a group of developmental disabilities that can cause significant social, communication and behavioral challenges.

  2. Simplified Digital Spectrum Analyzer

    NASA Technical Reports Server (NTRS)

    Cole, Steven W.

    1992-01-01

    Spectrum analyzer computes approximate cross-correlations between noisy input signal and reference signal of known frequency, yielding measure of amplitude of sinusoidal component of input. Complexity and power consumed less than other digital spectrum analyzers. Performs no multiplications, and because processes data on each frequency independently, focuses on narrow spectral range without processing data on rest of spectrum.
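
    A multiplication-free single-frequency correlation of the kind this abstract describes can be sketched by correlating the input with the sign of the reference, so each sample is only added or subtracted; the sample rate, test tone, and scale factor below are illustrative assumptions, not the instrument's actual design.

```python
import math

# Sketch of a multiplier-free amplitude estimate at one frequency:
# correlating with the *sign* of quadrature references turns every
# multiply into an add or subtract. Parameters are illustrative.
def amplitude_estimate(samples, freq, rate):
    acc_i = acc_q = 0.0
    for n, x in enumerate(samples):
        phase = 2 * math.pi * freq * n / rate
        acc_i += x if math.cos(phase) >= 0 else -x   # add/subtract only
        acc_q += x if math.sin(phase) >= 0 else -x
    # Correlating a sinusoid of amplitude A with a unit square wave
    # yields (2/pi)*A per sample, hence the pi/2 correction factor.
    scale = math.pi / (2 * len(samples))
    return scale * math.hypot(acc_i, acc_q)

rate, freq, amp = 1000.0, 50.0, 2.0
samples = [amp * math.sin(2 * math.pi * freq * n / rate) for n in range(1000)]
print(amplitude_estimate(samples, freq, rate))
```

    Because each frequency is processed independently, a narrow spectral range can be examined without computing the rest of the spectrum, matching the design goal stated above.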

  3. Head and neck reconstruction

    PubMed Central

    Yadav, Prabha

    2013-01-01

    Whatever is excisable is reconstructable! "You excise, we will reconstruct" are the confident words of reconstructive surgeons today. Reconstruction with multiple flaps has become routine. The radial artery forearm flap (FRAF), anterolateral thigh flap (ALT) and free fibula osteocutaneous flap (FFOCF) are the three most popular free flaps, which can reconstruct any defect with excellent aesthetics and performance. The radial artery flap provides thin, pliable, innervated skin; the ALT, a large amount of skin and bulk; and the FFOCF, 22 to 25 centimetres of strong bone and a reliable skin paddle. Free flap survival has risen to 98% in most renowned institutes, and free flaps are an established mainstay in the management of defects. PMID:24501464

  4. Flexor pulley reconstruction.

    PubMed

    Dy, Christopher J; Daluiski, Aaron

    2013-05-01

    Flexor pulley reconstruction is a challenging surgery. Injuries often occur after traumatic lacerations or forceful extension applied to an acutely flexed finger. Surgical treatment is reserved for patients with multiple closed pulley ruptures, persistent pain or dysfunction after attempted nonoperative management of a single pulley rupture, or during concurrent or staged flexor tendon repair or reconstruction. If the pulley cannot be repaired primarily, pulley reconstruction can be performed using graft woven into the remnant pulley rim or by looping graft around the phalanx. Regardless of the reconstructive technique, the surgeon should emulate the length, tension, and glide of the native pulley. PMID:23660059

  5. Attractor reconstruction for non-linear systems: a methodological note

    USGS Publications Warehouse

    Nichols, J.M.; Nichols, J.D.

    2001-01-01

    Attractor reconstruction is an important step in the process of making predictions for non-linear time-series and in the computation of certain invariant quantities used to characterize the dynamics of such series. The utility of computed predictions and invariant quantities is dependent on the accuracy of attractor reconstruction, which in turn is determined by the methods used in the reconstruction process. This paper suggests methods by which the delay and embedding dimension may be selected for a typical delay coordinate reconstruction. A comparison is drawn between the use of the autocorrelation function and mutual information in quantifying the delay. In addition, a false nearest neighbor (FNN) approach is used in minimizing the number of delay vectors needed. Results highlight the need for an accurate reconstruction in the computation of the Lyapunov spectrum and in prediction algorithms.
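
    The delay-coordinate reconstruction and autocorrelation-based delay selection described in this note can be sketched as follows; the sine test series is a toy stand-in (a real application would use chaotic data), and taking the first zero crossing of the autocorrelation is one of the delay heuristics the paper compares (the other being mutual information).

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Delay-coordinate reconstruction: rows are the vectors
    [x(t), x(t + tau), ..., x(t + (dim-1)*tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def first_acf_zero(x):
    """Common heuristic for the delay: first zero crossing of the
    autocorrelation function."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1 :]
    acf /= acf[0]
    return int(np.argmax(acf < 0))

# Toy series: a sine wave, whose ACF first crosses zero at a quarter period.
t = np.linspace(0, 40 * np.pi, 4000)
x = np.sin(t)
tau = first_acf_zero(x)
attractor = delay_embed(x, dim=2, tau=tau)
print(tau, attractor.shape)
```

    For the sine wave the quarter-period delay unfolds the trajectory into a circle; for chaotic data the same machinery, combined with a false-nearest-neighbor test for the embedding dimension, yields the reconstruction whose accuracy the paper shows to matter for Lyapunov-spectrum estimates and prediction.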

  6. Beam hardening correction for sparse-view CT reconstruction

    NASA Astrophysics Data System (ADS)

    Liu, Wenlei; Rong, Junyan; Gao, Peng; Liao, Qimei; Lu, HongBing

    2015-03-01

    Beam hardening, which is caused by spectrum polychromatism of the X-ray beam, may result in various artifacts in the reconstructed image and degrade image quality. The artifacts would be further aggravated for the sparse-view reconstruction due to insufficient sampling data. Considering the advantages of the total-variation (TV) minimization in CT reconstruction with sparse-view data, in this paper, we propose a beam hardening correction method for sparse-view CT reconstruction based on Brabant's modeling. In this correction model for beam hardening, the attenuation coefficient of each voxel at the effective energy is modeled and estimated linearly, and can be applied in an iterative framework, such as simultaneous algebraic reconstruction technique (SART). By integrating the correction model into the forward projector of the algebraic reconstruction technique (ART), the TV minimization can recover images when only a limited number of projections are available. The proposed method does not need prior information about the beam spectrum. Preliminary validation using Monte Carlo simulations indicates that the proposed method can provide better reconstructed images from sparse-view projection data, with effective suppression of artifacts caused by beam hardening. With appropriate modeling of other degrading effects such as photon scattering, the proposed framework may provide a new way for low-dose CT imaging.
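
    As a sketch of the iterative framework into which such a correction model can be integrated, the loop below runs plain SART on a toy consistent system; the beam-hardening correction itself (the modified forward projector of the paper) is omitted, and the matrix, sizes, and relaxation factor are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy sparse-view system: few views (rows) relative to unknowns (pixels).
n_views, n_pixels = 40, 100
A = rng.random((n_views, n_pixels))
x_true = rng.random(n_pixels)
p = A @ x_true                      # simulated projections

# SART: simultaneous update, normalized by row and column sums of A.
row_sum = A.sum(axis=1)
col_sum = A.sum(axis=0)
x = np.zeros(n_pixels)
lam = 1.0                           # relaxation factor, 0 < lam < 2
for k in range(5000):
    x += lam * (A.T @ ((p - A @ x) / row_sum)) / col_sum

print(np.linalg.norm(A @ x - p) / np.linalg.norm(p))
```

    In the paper's scheme the forward projection `A @ x` would be replaced by the beam-hardening-corrected projector, and a TV-minimization step would regularize the severely under-sampled sparse-view case.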

  7. Education for Reconstruction.

    ERIC Educational Resources Information Center

    Phillips, David; And Others

    This report describes the main questions that various international agencies must address in order to reconstruct education in countries that have experienced crisis. "Crisis" is defined as war, natural disaster, and extreme political and economic upheaval. Many of the problems of educational reconstruction with which the Allies contended in…

  8. Posterolateral knee reconstruction.

    PubMed

    Djian, P

    2015-02-01

    Injury to the cruciate ligaments of the knee commonly occurs in association with posterolateral instability, which can cause severe functional disability including varus, posterior translation, and external rotational instability. Failure to diagnose and treat an injury of the posterolateral corner in a patient who has a tear of the cruciate ligament can also result in the failure of the reconstructed cruciate ligament. There seems to be a consensus of opinion that injury to the posterolateral corner, whether isolated or combined, is best treated by reconstructing the posterolateral corner along with the coexisting cruciate ligament injury, if combined. Commonly proposed methods of reconstructing the posterolateral corner have focused on the reconstruction of the popliteus, the popliteofibular ligament, and the lateral collateral ligament. The aim of this conference is to describe the posterolateral corner reconstruction technique and to provide an algorithm of treatment. PMID:25596981

  9. Accelerating Spectrum Sharing Technologies

    SciTech Connect

    Juan D. Deaton; Lynda L. Brighton; Rangam Subramanian; Hussein Moradi; Jose Loera

    2013-09-01

    Spectrum sharing potentially holds the promise of solving the emerging spectrum crisis. However, technology innovators face the conundrum of developing spectrum sharing technologies without the ability to experiment and test with real incumbent systems. Interference with operational incumbents can prevent critical services, and the cost of deploying and operating an incumbent system can be prohibitive. Thus, the lack of incumbent systems and frequency authorization for technology incubation and demonstration has stymied spectrum sharing research. To this end, industry, academia, and regulators all require a test facility for validating hypotheses and demonstrating functionality without affecting operational incumbent systems. This article proposes a four-phase program supported by our spectrum accountability architecture. We propose that our comprehensive experimentation and testing approach for technology incubation and demonstration will accelerate the development of spectrum sharing technologies.

  10. Keyhole Flap Nipple Reconstruction.

    PubMed

    Chen, Joseph I; Cash, Camille G; Iman, Al-Haj; Spiegel, Aldona J; Cronin, Ernest D

    2016-05-01

    Nipple-areola reconstruction is often one of the final but most challenging aspects of breast reconstruction. However, it is an integral and important component of breast reconstruction because it transforms the mound into a breast. We performed 133 nipple-areola reconstructions during a period of 4 years. Of these reconstructions, 76 of 133 nipple-areola complexes were reconstructed using the keyhole flap technique. The tissues used for the keyhole dermoadipose flap technique included transverse rectus abdominis myocutaneous flaps (60/76), latissimus dorsi flaps (15/76), and mastectomy skin flaps after tissue expanders (1/76). The average patient follow-up was 17 months. The design of the flap is based on a keyhole configuration. The base of the flap determines the width of the future nipple, whereas the length of the flap determines the projection. We try to match the projection of the contralateral nipple if present. The keyhole flap is simple to construct yet reliable. It provides good symmetry and projection and avoids the creation of new scars. The areola is then tattooed approximately 3 months after the nipple reconstruction. PMID:27579228

  11. Reconstruction of the Mars Science Laboratory Parachute Performance and Comparison to the Descent Simulation

    NASA Technical Reports Server (NTRS)

    Cruz, Juan R.; Way, David W.; Shidner, Jeremy D.; Davis, Jody L.; Adams, Douglas S.; Kipp, Devin M.

    2013-01-01

    The Mars Science Laboratory used a single mortar-deployed disk-gap-band parachute of 21.35 m nominal diameter to assist in the landing of the Curiosity rover on the surface of Mars. The parachute system's performance on Mars has been reconstructed using data from the on-board inertial measurement unit, atmospheric models, and terrestrial measurements of the parachute system. In addition, the parachute performance results were compared against the end-to-end entry, descent, and landing (EDL) simulation created to design, develop, and operate the EDL system. Mortar performance was nominal. The time from mortar fire to suspension lines stretch (deployment) was 1.135 s, and the time from suspension lines stretch to first peak force (inflation) was 0.635 s. These times were slightly shorter than those used in the simulation. The reconstructed aerodynamic portion of the first peak force was 153.8 kN; the median value for this parameter from an 8,000-trial Monte Carlo simulation yielded a value of 175.4 kN, 14% higher than the reconstructed value. Aeroshell dynamics during the parachute phase of EDL were evaluated by examining the aeroshell rotation rate and rotational acceleration. The peak values of these parameters were 69.4 deg/s and 625 deg/s^2, respectively, which were well within the acceptable range. The EDL simulation was successful in predicting the aeroshell dynamics within reasonable bounds. The average total parachute force coefficient for Mach numbers below 0.6 was 0.624, which is close to the pre-flight model nominal drag coefficient of 0.615.

  12. Quantum Spread Spectrum Communication

    SciTech Connect

    Humble, Travis S

    2011-01-01

    We show that communication of single-photon quantum states in a multi-user environment is improved by using spread spectrum communication techniques. We describe a framework for spreading, transmitting, despreading, and detecting single-photon spectral states that mimics conventional spread spectrum techniques. We show in the cases of inadvertent detection, unintentional interference, and multi-user management, that quantum spread spectrum communications may minimize receiver errors by managing quantum channel access.
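
    For readers unfamiliar with the classical technique being mimicked, a minimal direct-sequence spread-spectrum sketch is given below. This is a classical analogue only, not the authors' quantum implementation; the chip codes, bit values, and noise level are assumptions chosen for illustration:

```python
import numpy as np

# Classical direct-sequence spread spectrum: spread each data bit over a
# chip code, share the channel with a second user on an orthogonal code,
# then despread by correlating against the intended code.
rng = np.random.default_rng(0)

data = np.array([1, -1, 1, 1, -1])                 # bipolar data bits
pn = np.array([1, -1, 1, -1, 1, -1, 1, -1])        # intended user's chip code
other_pn = np.array([1, 1, -1, -1, 1, 1, -1, -1])  # second user's code (orthogonal)
chips = pn.size

# Spread: each bit multiplies the chip code (bandwidth expansion).
tx = np.repeat(data, chips) * np.tile(pn, data.size)

# Channel: add the second user's signal plus receiver noise.
other_bits = np.array([-1, 1, 1, -1, 1])
interference = np.repeat(other_bits, chips) * np.tile(other_pn, data.size)
rx = tx + interference + 0.1 * rng.standard_normal(tx.size)

# Despread: correlate against the intended code; the orthogonal user's
# contribution integrates to zero over each bit period.
correlated = (rx * np.tile(pn, data.size)).reshape(data.size, chips)
decoded = np.sign(correlated.sum(axis=1)).astype(int)
print(decoded)  # recovers `data` despite the co-channel user
```

Multi-user access here rests on code orthogonality, which is the classical counterpart of the channel-access management the abstract describes for spectral single-photon states.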

  13. The energy spectrum of ultra high energy cosmic rays

    NASA Astrophysics Data System (ADS)

    Abuzayyad, Tareq Ziad

    2000-11-01

    The Energy Spectrum of Ultra High Energy Cosmic Rays is measured by the first of two High Resolution Fly's Eye detectors in the monocular mode. The data set collected in the period of May 1997 to June 1999 was used for the measurement. A new reconstruction procedure (profile constrained geometry fit) was developed to analyze the data. This procedure gives reasonably good energy resolution, but poor Xmax resolution. Resolution and systematics are discussed in the thesis. The spectrum measurement results are consistent with previous measurements in normalization and general shape. The spectrum appears to continue beyond the Greisen-Zatsepin-Kuz'min cutoff.

  14. Role of the illumination spatial-frequency spectrum for ptychography

    NASA Astrophysics Data System (ADS)

    Guizar-Sicairos, Manuel; Holler, Mirko; Diaz, Ana; Vila-Comamala, Joan; Bunk, Oliver; Menzel, Andreas

    2012-09-01

    We demonstrate how the spatial-frequency spectrum of the x-ray illumination affects the reconstruction signal-to-noise ratio and resolution of ptychographic imaging. The spatial-frequency spectrum of a focused x-ray probe is enhanced by partially clipping the beam with an aperture near its focus. This approach presents a simple way of enhancing the illumination spectrum without demanding extra efforts in optics fabrication, and we experimentally demonstrate that it provides an improvement in image quality and resolution.

  15. Autism spectrum disorder

    MedlinePlus

    Autism; Autistic disorder; Asperger syndrome; Childhood disintegrative disorder; Pervasive developmental disorder ... to better diagnosis and newer definitions of ASD. Autism spectrum disorder now includes syndromes that used to ...

  16. Ionospheric wave spectrum measurements

    NASA Technical Reports Server (NTRS)

    Harker, K. J.; Ilic, D. B.; Crawford, F. W.

    1979-01-01

    The local spectrum S(k, omega) of either potential or electron-density fluctuations can be used to determine macroscopic plasma characteristics such as the local density and temperature, transport coefficients, and drift current. This local spectrum can be determined by measuring the cross-power spectrum. The paper examines the practicality of using the cross-power spectrum analyzer on the Space Shuttle to measure ionospheric parameters. Particular attention is given to investigating the integration time required to measure the cross-power spectral density to a desired accuracy.
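
    A minimal sketch of estimating a cross-power spectrum by segment-averaged FFTs; the signal and noise parameters are assumed for illustration, and the flight analyzer's actual design is not reproduced:

```python
import numpy as np

# Segment-averaged cross-power spectrum estimate between two probe signals.
rng = np.random.default_rng(1)
fs = 1000.0                        # sample rate, Hz (assumed)
n, seg = 4096, 512
t = np.arange(n) / fs

# Both probes see the same 100 Hz wave, with a relative phase, plus noise.
x = np.sin(2 * np.pi * 100 * t) + 0.3 * rng.standard_normal(n)
y = np.sin(2 * np.pi * 100 * t - 0.7) + 0.3 * rng.standard_normal(n)

# Averaging cross-spectra over segments suppresses the noise floor, which
# is why the available integration time sets the attainable accuracy.
Sxy = np.zeros(seg // 2 + 1, dtype=complex)
for i in range(n // seg):
    X = np.fft.rfft(x[i * seg:(i + 1) * seg])
    Y = np.fft.rfft(y[i * seg:(i + 1) * seg])
    Sxy += np.conj(X) * Y
Sxy /= n // seg

freqs = np.fft.rfftfreq(seg, 1 / fs)
peak_freq = freqs[np.argmax(np.abs(Sxy))]
print(peak_freq)  # close to 100 Hz
```

Doubling the record length doubles the number of averaged segments, halving the variance of the estimate, which is the integration-time trade-off the abstract refers to.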

  17. Nonlinear Simulation of the Tooth Enamel Spectrum for EPR Dosimetry

    NASA Astrophysics Data System (ADS)

    Kirillov, V. A.; Dubovsky, S. V.

    2016-07-01

    Software was developed in which initial EPR spectra of tooth enamel are deconvolved by nonlinear simulation: line shapes and signal amplitudes in the model initial spectrum are calculated, the regression coefficient is evaluated, and the individual spectra are summed. Software validation demonstrated that doses calculated with it agreed excellently with the applied radiation doses and with the doses reconstructed by the method of additive doses.
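
    A minimal sketch of the deconvolution idea, assuming known component line shapes and fitting only their amplitudes by least squares (the paper's software also fits the shapes nonlinearly; the component models and parameters below are assumptions):

```python
import numpy as np

# Decompose a measured EPR spectrum into a dosimetric line plus a
# background line with known shapes; the fitted amplitude of the
# dosimetric line is proportional to dose.
B = np.linspace(-10, 10, 401)          # magnetic field axis (arb. units)

def gaussian_deriv(B, center, width):
    """First-derivative Gaussian, the usual EPR line shape."""
    u = (B - center) / width
    return -u * np.exp(-0.5 * u**2)

components = np.column_stack([
    gaussian_deriv(B, 0.0, 1.0),       # radiation-induced (dosimetric) line
    gaussian_deriv(B, 2.5, 3.0),       # native background line
])

true_amps = np.array([3.0, 1.5])
rng = np.random.default_rng(2)
spectrum = components @ true_amps + 0.05 * rng.standard_normal(B.size)

# Amplitudes by linear least squares against the model components.
amps, *_ = np.linalg.lstsq(components, spectrum, rcond=None)
print(amps)  # close to [3.0, 1.5]
```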

  18. Acromioclavicular Joint Reconstruction.

    PubMed

    Scillia, Anthony J; Cain, E Lyle

    2015-12-01

    Our technique for acromioclavicular joint reconstruction provides a variation on coracoclavicular ligament reconstruction to also include acromioclavicular ligament reconstruction. An oblique acromial tunnel is drilled, and the medial limb of the gracilis graft, after being crossed and passed beneath the coracoid and through the clavicle, is passed through this acromial tunnel and sutured to the trapezoid graft limb after appropriate tensioning. Tenodesis screws are not placed in the bone tunnels to avoid graft fraying, and initial forces on the graft are offloaded with braided absorbable sutures passed around the clavicle. PMID:27284528

  19. Advances in Tracheal Reconstruction

    PubMed Central

    Salna, Michael; Waddell, Thomas K.; Hofer, Stefan O.

    2014-01-01

    Summary: A recent revival of global interest for reconstruction of long-segment tracheal defects, which represents one of the most interesting and complex problems in head and neck and thoracic reconstructive surgery, has been witnessed. The trachea functions as a conduit for air, and its subunits including the epithelial layer, hyaline cartilage, and segmental blood supply make it particularly challenging to reconstruct. A myriad of attempts at replacing the trachea have been described. These along with the anatomy, indications, and approaches including microsurgical tracheal reconstruction will be reviewed. Novel techniques such as tissue-engineering approaches will also be discussed. Multiple attempts at replacing the trachea with synthetic scaffolds have been met with failure. The main lesson learned from such failures is that the trachea must not be treated as a “simple tube.” Understanding the anatomy, developmental biology, physiology, and diseases affecting the trachea are required for solving this problem. PMID:25426361

  20. Overview of Image Reconstruction

    SciTech Connect

    Marr, R. B.

    1980-04-01

    Image reconstruction (or computerized tomography, etc.) is any process whereby a function, f, on R^n is estimated from empirical data pertaining to its integrals, ∫f(x) dx, for some collection of hyperplanes of dimension k < n. The paper begins with background information on how image reconstruction problems have arisen in practice, and describes some of the application areas of past or current interest; these include radioastronomy, optics, radiology and nuclear medicine, electron microscopy, acoustical imaging, geophysical tomography, nondestructive testing, and NMR zeugmatography. Then the various reconstruction algorithms are discussed in five classes: summation, or simple back-projection; convolution, or filtered back-projection; Fourier and other functional transforms; orthogonal function series expansion; and iterative methods. Certain more technical mathematical aspects of image reconstruction are considered from the standpoint of uniqueness, consistency, and stability of solution. The paper concludes by presenting certain open problems. 73 references. (RWR)
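
    The first algorithm class above, summation or simple back-projection, can be sketched for parallel-beam projections of a point object; the geometry and discretization here are illustrative only:

```python
import numpy as np

# Simple back-projection: forward-project a point phantom at many angles,
# then smear each projection back along its rays and average. The result
# localizes the point but with the characteristic 1/r blur of the
# summation method (no filtering applied).
n = 64
phantom = np.zeros((n, n))
phantom[40, 24] = 1.0                   # a single bright point

angles = np.deg2rad(np.arange(0, 180, 2))
ys, xs = np.mgrid[0:n, 0:n]
xc, yc = xs - n / 2, ys - n / 2

recon = np.zeros_like(phantom)
for theta in angles:
    # Detector coordinate of each pixel: t = x cos(theta) + y sin(theta).
    t = (xc * np.cos(theta) + yc * np.sin(theta) + n / 2).astype(int).clip(0, n - 1)
    # Forward projection: bin pixel values by detector coordinate.
    proj = np.bincount(t.ravel(), weights=phantom.ravel(), minlength=n)
    # Back-projection: smear each projection value back along its ray.
    recon += proj[t]
recon /= angles.size

peak = np.unravel_index(np.argmax(recon), recon.shape)
print(peak)  # the point's location, surrounded by the expected blur
```

The convolution (filtered back-projection) class differs only in applying a ramp filter to each projection before the smearing step, which removes the 1/r blur.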

  1. Breast Reconstruction After Mastectomy

    MedlinePlus

    ... Women who have autologous tissue reconstruction may need physical therapy to help them make up for weakness experienced ... 127(1):15–22. [PubMed Abstract] Monteiro M. Physical therapy implications following the TRAM procedure. Physical Therapy. 1997; ...

  2. Reconstruction of Mandibular Defects

    PubMed Central

    Chim, Harvey; Salgado, Christopher J.; Mardini, Samir; Chen, Hung-Chi

    2010-01-01

    Defects requiring reconstruction in the mandible are commonly encountered and may result from resection of benign or malignant lesions, trauma, or osteoradionecrosis. Mandibular defects can be classified according to location and extent, as well as involvement of mucosa, skin, and tongue. Vascularized bone flaps, in general, provide the best functional and aesthetic outcome, with the fibula flap remaining the gold standard for mandible reconstruction. In this review, we discuss classification and approach to reconstruction of mandibular defects. We also elaborate upon four commonly used free osteocutaneous flaps, inclusive of fibula, iliac crest, scapula, and radial forearm. Finally, we discuss indications and use of osseointegrated implants as well as recent advances in mandibular reconstruction. PMID:22550439

  3. Breast Reconstruction and Prosthesis

    MedlinePlus

    ... feel of the breast after a mastectomy. A plastic surgeon can do it at the same time ... want breast reconstruction. • Have you talked with your plastic surgeon about your options? You may not be ...

  4. Personalized Multilayer Daily Life Profiling Through Context Enabled Activity Classification and Motion Reconstruction: An Integrated System Approach.

    PubMed

    Xu, James Y; Wang, Yan; Barrett, Mick; Dobkin, Bruce; Pottie, Greg J; Kaiser, William J

    2016-01-01

    Profiling the daily activity of a physically disabled person in the community would enable healthcare professionals to monitor the type, quantity, and quality of their patients' compliance with recommendations for exercise, fitness, and practice of skilled movements, as well as enable feedback about performance in real-world situations. Based on our early research in in-community activity profiling, we present in this paper an end-to-end system capable of reporting a patient's daily activity at multiple levels of granularity: 1) at the highest level, information on the location categories a patient is able to visit; 2) within each location category, information on the activities a patient is able to perform; and 3) at the lowest level, motion trajectory, visualization, and metrics computation of each activity. Our methodology is built upon a physical activity prescription model coupled with MEMS inertial sensors and mobile device kits that can be sent to a patient at home. A novel context-guided activity-monitoring concept with categorical location context is used to achieve enhanced classification accuracy and throughput. The methodology is then seamlessly integrated with motion reconstruction and metrics computation to provide comprehensive layered reporting of a patient's daily life. We also present an implementation of the methodology featuring a novel location context detection algorithm using WiFi augmented GPS and overlays, with motion reconstruction and visualization algorithms for practical in-community deployment. Finally, we use a series of experimental field evaluations to confirm the accuracy of the system. PMID:25546868

  6. Efficient reconstruction method for ground layer adaptive optics with mixed natural and laser guide stars.

    PubMed

    Wagner, Roland; Helin, Tapio; Obereder, Andreas; Ramlau, Ronny

    2016-02-20

    The imaging quality of modern ground-based telescopes such as the planned European Extremely Large Telescope is affected by atmospheric turbulence. In consequence, they heavily depend on stable and high-performance adaptive optics (AO) systems. Using measurements of incoming light from guide stars, an AO system compensates for the effects of turbulence by adjusting so-called deformable mirror(s) (DMs) in real time. In this paper, we introduce a novel reconstruction method for ground layer adaptive optics. In the literature, a common approach to this problem is to use Bayesian inference in order to model the specific noise structure appearing due to spot elongation. This approach leads to large coupled systems with high computational effort. Recently, fast solvers of linear order, i.e., with computational complexity O(n), where n is the number of DM actuators, have emerged. However, the quality of such methods typically degrades in low flux conditions. Our key contribution is to achieve the high quality of the standard Bayesian approach while at the same time maintaining the linear order speed of the recent solvers. Our method is based on performing a separate preprocessing step before applying the cumulative reconstructor (CuReD). The efficiency and performance of the new reconstructor are demonstrated using the OCTOPUS, the official end-to-end simulation environment of the ESO for extremely large telescopes. For more specific simulations we also use the MOST toolbox. PMID:26906596

  7. Free-energy landscape of mechanically unfolded model proteins: Extended Jarzynski versus inherent structure reconstruction

    NASA Astrophysics Data System (ADS)

    Luccioli, Stefano; Imparato, Alberto; Torcini, Alessandro

    2008-09-01

    The equilibrium free-energy landscape of off-lattice model heteropolymers as a function of an internal coordinate, namely the end-to-end distance, is reconstructed from out-of-equilibrium steered molecular dynamics data. This task is accomplished via two independent methods: By employing an extended version of the Jarzynski equality and the inherent structure formalism. A comparison of the free energies estimated with these two schemes with equilibrium results obtained via the umbrella sampling technique reveals a good quantitative agreement among all the approaches in a range of temperatures around the “folding transition” for the two examined sequences. In particular, for the sequence with good foldability properties, the mechanically induced structural transitions can be related to thermodynamical aspects of folding. Moreover, for the same sequence the knowledge of the landscape profile allows for a good estimation of the lifetimes of the native configuration for temperatures ranging from the folding to the collapse temperature. For the random sequence, mechanical and thermal unfolding appear to follow different paths along the landscape.
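
    The extended Jarzynski route above rests on the equality exp(-ΔF/kT) = ⟨exp(-W/kT)⟩ over repeated nonequilibrium pulling experiments. A sketch with synthetic Gaussian work values (not the paper's heteropolymer trajectories), chosen because the Gaussian case has a closed form to check against:

```python
import numpy as np

# Jarzynski estimator on synthetic Gaussian work data. For Gaussian work,
# dF = <W> - var(W)/(2 kT) holds exactly, giving a known answer.
rng = np.random.default_rng(3)
kT = 1.0
mean_W, sigma_W = 5.0, 1.0
W = rng.normal(mean_W, sigma_W, size=200_000)   # work from repeated pulls

dF_jarzynski = -kT * np.log(np.mean(np.exp(-W / kT)))
dF_exact = mean_W - sigma_W**2 / (2 * kT)
print(dF_jarzynski, dF_exact)  # both close to 4.5
```

In practice the exponential average is dominated by rare low-work trajectories, so convergence degrades rapidly once the work fluctuations exceed a few kT; the paper's comparison against inherent-structure and umbrella-sampling estimates probes exactly this regime.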

  8. Performance comparison of wavefront reconstruction and control algorithms for Extremely Large Telescopes.

    PubMed

    Montilla, I; Béchet, C; Le Louarn, M; Reyes, M; Tallon, M

    2010-11-01

    Extremely Large Telescopes (ELTs) are very challenging with respect to their adaptive optics (AO) requirements. Their diameters and the specifications required by the astronomical science for which they are being designed imply a huge increment in the number of degrees of freedom in the deformable mirrors. Faster algorithms are needed to implement the real-time reconstruction and control in AO at the required speed. We present the results of a study of the AO correction performance of three different algorithms applied to the case of a 42-m ELT: one considered as a reference, the matrix-vector multiply (MVM) algorithm; and two considered fast, the fractal iterative method (FrIM) and the Fourier transform reconstructor (FTR). The MVM and the FrIM both provide a maximum a posteriori estimation, while the FTR provides a least-squares one. The algorithms are tested on the European Southern Observatory (ESO) end-to-end simulator, OCTOPUS. The performance is compared using a natural guide star single-conjugate adaptive optics configuration. The results demonstrate that the methods have similar performance in a large variety of simulated conditions. However, with respect to system misregistrations, the fast algorithms demonstrate an interesting robustness. PMID:21045895
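
    A sketch of the reference MVM approach named above: slopes s from the wavefront sensor relate to mirror/phase modes a through an interaction matrix (s = A a), and the reconstructor is a precomputed regularized pseudo-inverse applied as one matrix-vector product per frame. Dimensions and matrices below are toy assumptions, not an ELT-scale system:

```python
import numpy as np

# Matrix-vector multiply (MVM) wavefront reconstructor, Tikhonov-regularized.
rng = np.random.default_rng(4)
n_slopes, n_modes = 120, 30

A = rng.standard_normal((n_slopes, n_modes))       # interaction matrix
alpha = 1e-3                                       # regularization weight
# Precompute the reconstructor once (the expensive step).
R = np.linalg.solve(A.T @ A + alpha * np.eye(n_modes), A.T)

a_true = rng.standard_normal(n_modes)              # "true" wavefront modes
s = A @ a_true + 0.01 * rng.standard_normal(n_slopes)  # noisy sensor slopes

a_hat = R @ s                                      # one MVM per frame
print(np.max(np.abs(a_hat - a_true)))              # small reconstruction error
```

The per-frame cost of the MVM scales as O(n_slopes × n_modes), which is what motivates the faster FrIM and FTR alternatives at ELT scale.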

  10. EXSdetect: an end-to-end software for extended source detection in X-ray images: application to Swift-XRT data

    NASA Astrophysics Data System (ADS)

    Liu, T.; Tozzi, P.; Tundo, E.; Moretti, A.; Wang, J.-X.; Rosati, P.; Guglielmetti, F.

    2013-01-01

    Aims: We present a stand-alone software package (named EXSdetect) for the detection of extended sources in X-ray images. Our goal is to provide a flexible tool capable of detecting extended sources down to the lowest flux levels attainable within instrumental limitations, while maintaining robust photometry, high completeness, and low contamination, regardless of source morphology. EXSdetect was developed mainly to exploit the ever-increasing wealth of archival X-ray data, but is also ideally suited to explore the scientific capabilities of future X-ray facilities, with a strong focus on investigations of distant groups and clusters of galaxies. Methods: EXSdetect combines a fast Voronoi tessellation code with a friends-of-friends algorithm and an automated deblending procedure. The values of key parameters are matched to fundamental telescope properties such as angular resolution and instrumental background. In addition, the software is designed to permit extensive tests of its performance via simulations of a wide range of observational scenarios. Results: We applied EXSdetect to simulated data fields modeled to realistically represent the Swift X-ray Cluster Survey (SXCS), which is based on archival data obtained by the X-ray telescope onboard the Swift satellite. We achieve more than 90% completeness for extended sources comprising at least 80 photons in the 0.5-2 keV band, a limit that corresponds to 10^-14 erg cm^-2 s^-1 for the deepest SXCS fields. This detection limit is comparable to the one attained by the most sensitive cluster surveys conducted with much larger X-ray telescopes. While evaluating the performance of EXSdetect, we also explored the impact of improved angular resolution and discuss the ideal properties of the next generation of X-ray survey missions. The Python code EXSdetect is available on the SXCS website http://adlibitum.oats.inaf.it/sxcs
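
    The friends-of-friends step can be sketched as follows: photon positions closer than a linking length join the same group. This is a naive O(N²) version with assumed toy data; EXSdetect couples it with Voronoi tessellation and deblending, which are not shown:

```python
import numpy as np

# Friends-of-friends grouping: connected components of the graph linking
# any two points closer than `linking_length`.
def friends_of_friends(points, linking_length):
    n = len(points)
    labels = np.full(n, -1)
    current = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        stack = [i]
        labels[i] = current
        while stack:                      # flood-fill one group
            j = stack.pop()
            d = np.linalg.norm(points - points[j], axis=1)
            for k in np.where((d < linking_length) & (labels == -1))[0]:
                labels[k] = current
                stack.append(k)
        current += 1
    return labels

# Two well-separated photon clumps (hypothetical detector coordinates).
rng = np.random.default_rng(5)
clump_a = rng.normal([0, 0], 0.5, size=(40, 2))
clump_b = rng.normal([10, 10], 0.5, size=(40, 2))
photons = np.vstack([clump_a, clump_b])

labels = friends_of_friends(photons, linking_length=2.0)
print(len(set(labels)))  # 2 distinct groups
```

Matching the linking length to the instrument's angular resolution and background level corresponds to the parameter tuning the abstract describes.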

  11. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management †

    PubMed Central

    Marie, Pierrick; Desprats, Thierry; Chabridon, Sophie; Sibilla, Michelle; Taconet, Chantal

    2015-01-01

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At a time when the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms together offers new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario. PMID:26087372

  12. End-to-End System Test of the Relative Precision and Stability of the Photometric Method for Detecting Earth-Size Extrasolar Planets

    NASA Technical Reports Server (NTRS)

    Dunham, Edward W.

    2000-01-01

    We developed the CCD camera system for the laboratory test demonstration and designed the optical system for this test. The camera system was delivered to Ames in April, 1999 with continuing support mostly in the software area as the test progressed. The camera system has been operating successfully since delivery. The optical system performed well during the test. The laboratory demonstration activity is now nearly complete and is considered to be successful by the Technical Advisory Group, which met on 8 February, 2000 at the SETI Institute. A final report for the Technical Advisory Group and NASA Headquarters will be produced in the next few months. This report will be a comprehensive report on all facets of the test including those covered under this grant. A copy will be forwarded, if desired, when it is complete.

  13. 'End to end' planktonic trophic web and its implications for the mussel farms in the Mar Piccolo of Taranto (Ionian Sea, Italy).

    PubMed

    Karuza, Ana; Caroppo, Carmela; Monti, Marina; Camatti, Elisa; Di Poi, Elena; Stabili, Loredana; Auriemma, Rocco; Pansera, Marco; Cibic, Tamara; Del Negro, Paola

    2016-07-01

    The Mar Piccolo is a semi-enclosed basin subject to different natural and anthropogenic stressors. In order to better understand plankton dynamics and preferential carbon pathways within the planktonic trophic web, an integrated approach was adopted for the first time by examining all trophic levels (virioplankton, the heterotrophic and phototrophic fractions of pico-, nano- and microplankton, as well as mesozooplankton). Plankton abundance and biomass were investigated during four surveys in the period 2013-2014. Besides unveiling the dynamics of different plankton groups in the Mar Piccolo, the study revealed that a high portion of the plankton carbon (C) pool was constituted by small-sized (<2 μm) planktonic fractions. The prevalence of small-sized species within micro- and mesozooplankton communities was observed as well. The succession of planktonic communities was clearly driven by the seasonality, i.e. by the nutrient availability and physical features of the water column. Our hypothesis is that besides the 'bottom-up' control and the grazing pressure, inferred from the C pools of different plankton groups, the presence of mussel farms in the Mar Piccolo exerts a profound impact on plankton communities, not only due to the important sequestration of the plankton biomass but also by strongly influencing its structure.

  14. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management.

    PubMed

    Marie, Pierrick; Desprats, Thierry; Chabridon, Sophie; Sibilla, Michelle; Taconet, Chantal

    2015-06-16

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At a time when the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms together offers new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario.

  15. Robot-assisted segmental resection of tubal pregnancy followed by end-to-end reanastomosis for preserving tubal patency and fertility

    PubMed Central

    Park, Joo Hyun; Cho, SiHyun; Choi, Young Sik; Seo, Seok Kyo; Lee, Byung Seok

    2016-01-01

    Abstract The objective of this study was to evaluate whether robotic tubal reanastomosis after segmental resection of tubal pregnancy is a feasible means of preserving tubal integrity and natural fertility in those with compromised contralateral tubal condition. The study was performed at a university medical center in a retrospective manner where da Vinci robotic system-guided segmental resection of tubal ectopic mass followed by reanastomosis was performed to salvage tubal patency and fertility in those with a single viable fallopian tube. Of the 17 patients with tubal pregnancies that were selected, 14 patients with successful tubal segmental resection and reanastomosis were followed up. The reproducibility of anastomosis success and cumulative pregnancy rates of up to 24 months were analyzed. Patient mean age was 28.88 ± 4.74 years, mean amenorrheic period was 7.01 ± 1.57 weeks and mean human chorionic gonadotropin (hCG) level was 9289.00 ± 7510.00 mIU/mL. The overall intraoperative cancellation rate due to unfavorable positioning or size of the tubal mass was 17.65% (3/17), which was converted to either salpingectomy or milking of ectopic mass. Of the 14 attempted, anastomosis for all 14 cases was successful, with 1 anastomotic leakage. One patient wishing to postpone pregnancy and 2 patients where patency of the contralateral tube was confirmed during the operation, were excluded from the pregnancy outcome analysis. Cumulative pregnancy rate was 63.64% (7/11), with 3 (27.27%) ongoing pregnancies, 3 (27.27%) livebirths, and 1 missed abortion at 24 months. During the follow-up, hysterosalpingography (HSG) was performed at 6 months for those who consented, and all 10 fallopian tubes tested were patent. No subsequent tubal pregnancies occurred in the reanastomosed tube for up to a period of 24 months.
For patients with absent or defective contralateral tubal function, da Vinci-guided reanastomosis after segmental resection of tubal pregnancy is feasible for salvaging tubal patency and fertility. PMID:27741101

  16. Orthogonal labeling of M13 minor capsid proteins with DNA to self-assemble end-to-end multi-phage structures

    PubMed Central

    Hess, Gaelen T.; Guimaraes, Carla P.; Spooner, Eric; Ploegh, Hidde L.; Belcher, Angela M.

    2014-01-01

    M13 bacteriophage has been used as a scaffold to organize materials for various applications. Building more complex multi-phage devices requires precise control of interactions between the M13 capsid proteins. Towards this end, we engineered a loop structure onto the pIII capsid protein of M13 bacteriophage to enable sortase-mediated labeling reactions for C-terminal display. Combining this with N-terminal sortase-mediated labeling, we thus created a phage scaffold that can be labeled orthogonally on three capsid proteins: the body and both ends. We show that covalent attachment of different DNA oligonucleotides at the ends of the new phage structure enables formation of multi-phage particles oriented in a specific order. These have potential as nanoscale scaffolds for multi-material devices. PMID:23713956

  17. Pinnacle3 modeling and end-to-end dosimetric testing of a Versa HD linear accelerator with the Agility head and flattening filter-free modes.

    PubMed

    Saenz, Daniel L; Narayanasamy, Ganesh; Cruz, Wilbert; Papanikolaou, Nikos; Stathakis, Sotirios

    2016-01-01

    The Elekta Versa HD incorporates a variety of upgrades to the line of Elekta linear accelerators, primarily the Agility head and flattening filter-free (FFF) photon beam delivery. The dosimetric output of this head, completely distinct from that of its predecessors, combined with the FFF beams, requires a new investigation of modeling in treatment planning systems. A model was created in Pinnacle3 v9.8 with the commissioned beam data. A phantom consisting of several plastic water and Styrofoam slabs was scanned and imported into Pinnacle3, where beams of different field sizes, source-to-surface distances (SSDs), wedges, and gantry angles were devised. Beams included all of the available photon energies (6, 10, 18, 6 FFF, and 10 FFF MV), as well as the four electron energies commissioned for clinical use (6, 9, 12, and 15 MeV). The plans were verified at calculation points by measurement with a calibrated ionization chamber. Homogeneous and heterogeneous point-dose measurements agreed within 2% relative to maximum dose for all photon and electron beams. AP photon open field measurements along the central axis at 100 cm SSD passed within 1%. In addition, IMRT testing was also performed with three standard plans (step and shoot IMRT, as well as a small- and large-field VMAT plan). The IMRT plans were delivered on the Delta4 IMRT QA phantom, for which the gamma passing rate was > 99.5% for all plans with a 3% dose deviation, 3 mm distance-to-agreement, and 10% dose threshold. The IMRT QA results for the first 23 patients yielded gamma passing rates of 97.4% ± 2.3%. Such testing ensures confidence in the ability of Pinnacle3 to model photon and electron beams with the Agility head. PMID:26894352
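
    The gamma criterion quoted above (3% dose deviation, 3 mm distance-to-agreement, 10% dose threshold) can be sketched in 1-D as follows. The profiles and sampling are hypothetical, and clinical QA systems evaluate this in 2-D/3-D with interpolation:

```python
import numpy as np

# 1-D gamma index: each measured point passes if some calculated point
# lies within the combined dose-difference / distance-to-agreement ellipse.
def gamma_pass_rate(x_mm, measured, calculated, dd=0.03, dta_mm=3.0, threshold=0.10):
    d_max = calculated.max()
    passed = total = 0
    for xm, dm in zip(x_mm, measured):
        if dm < threshold * d_max:
            continue                       # skip the low-dose region
        # Minimum combined metric over all calculated points.
        g2 = ((dm - calculated) / (dd * d_max)) ** 2 + ((xm - x_mm) / dta_mm) ** 2
        total += 1
        passed += np.sqrt(g2.min()) <= 1.0
    return 100.0 * passed / total

x = np.arange(0.0, 100.0, 1.0)                      # detector positions, mm
calc = np.exp(-(((x - 50.0) / 20.0) ** 2))          # calculated profile
meas = 1.005 * np.exp(-(((x - 50.3) / 20.0) ** 2))  # slight shift and scaling
rate = gamma_pass_rate(x, meas, calc)
print(rate)  # 100.0 for this mild disagreement
```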

  18. SU-E-T-508: End to End Testing of a Prototype Eclipse Module for Planning Modulated Arc Therapy On the Siemens Platform

    SciTech Connect

    Huang, L; Sarkar, V; Spiessens, S; Rassiah-Szegedi, P; Huang, Y; Salter, B; Zhao, H; Szegedi, M

    2014-06-01

    Purpose: The latest clinical implementation of the Siemens Artiste linac allows for delivery of modulated arcs (mARC) using full-field flattening filter free (FFF) photon beams. The maximum doserate of 2000 MU/min is well suited for high dose treatments such as SBRT. We tested and report on the performance of a prototype Eclipse TPS module supporting mARC capability on the Artiste platform. Method: our spine SBRT patients originally treated with 12/13 field static-gantry IMRT (SGIMRT) were chosen for this study. These plans were designed to satisfy RTOG0631 guidelines with a prescription of 16Gy in a single fraction. The cases were re-planned as mARC plans in the prototype Eclipse module using the 7MV FFF beam and required to satisfy RTOG0631 requirements. All plans were transferred from Eclipse, delivered on a Siemens Artiste linac and dose-validated using the Delta4 system. Results: All treatment plans were straightforwardly developed, in timely fashion, without challenge or inefficiency using the prototype module. Due to the limited number of segments in a single arc, mARC plans required 2-3 full arcs to yield plan quality comparable to SGIMRT plans containing over 250 total segments. The average (3%/3mm) gamma pass-rate for all arcs was 98.5±1.1%, thus demonstrating both excellent dose prediction by the AAA dose algorithm and excellent delivery fidelity. Mean delivery times for the mARC plans(10.5±1.7min) were 50-70% lower than the SGIMRT plans(26±2min), with both delivered at 2000 MU/min. Conclusion: A prototype Eclipse module capable of planning for Burst Mode modulated arc delivery on the Artiste platform has been tested and found to perform efficiently and accurately for treatment plan development and delivered-dose prediction. Further investigation of more treatment sites is being carried out and data will be presented.

  19. Resolution of end-to-end distance distributions of flexible molecules using quenching-induced variations of the Forster distance for fluorescence energy transfer.

    PubMed

    Gryczynski, I; Wiczk, W; Johnson, M L; Cheung, H C; Wang, C K; Lakowicz, J R

    1988-10-01

    We describe a new method to recover the distribution of donor-to-acceptor (D-A) distances in flexible molecules using steady-state measurements of the efficiency of fluorescence energy transfer. The method depends upon changes in the Forster distance (Ro) induced by collisional quenching of the donor emission. The Ro-dependent transfer efficiencies are analyzed using nonlinear least squares to recover the mean D-A distance and the width of the distribution. The method was developed and tested using three synthetic D-A pairs, in which the chromophores were separated by alkyl chains of varying lengths. As an example application we also recovered the distribution of distances from the single tryptophan residue in troponin I (trp 158) to acceptor-labeled cysteine 133. The half-width of the distribution increases from 12 A in the native state to 53 A when unfolded by guanidine hydrochloride. For both TnI and the three model compounds the distance distributions recovered from the steady-state transfer efficiencies were in excellent agreement with the distributions recovered using the more sophisticated frequency-domain method (Lakowicz, J.R., M.L. Johnson, W. Wiczk, A. Bhat, and R.F. Steiner. 1987. Chem. Phys. Lett. 138:587-593). The method was found to be reliable and should be generally useful for studies of conformational distributions of macromolecules. PMID:3224143
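
    The quantity being fit is the Förster transfer efficiency E(r) = R0^6 / (R0^6 + r^6), averaged over the donor-acceptor (D-A) distance distribution. As a minimal numerical sketch (the Gaussian distribution shape and the parameter values are illustrative assumptions, not the authors' analysis code), the steady-state efficiency for a distributed distance can be computed as:

```python
import numpy as np

def fret_efficiency(r, r0):
    """Förster transfer efficiency for a donor-acceptor distance r (Å)."""
    return r0**6 / (r0**6 + r**6)

def mean_efficiency(r_mean, hw, r0, n=2000):
    """Average efficiency over a Gaussian D-A distance distribution.

    hw is the full width at half maximum of the distribution (Å);
    r_mean is the mean D-A distance and r0 the Förster distance.
    """
    sigma = hw / 2.3548  # convert FWHM to standard deviation
    r = np.linspace(max(1e-3, r_mean - 5 * sigma), r_mean + 5 * sigma, n)
    p = np.exp(-0.5 * ((r - r_mean) / sigma) ** 2)
    dr = r[1] - r[0]
    p /= p.sum() * dr  # normalize the distribution on the grid
    return float(np.sum(p * fret_efficiency(r, r0)) * dr)

# Distribution-averaged efficiency for an illustrative r_mean, width and R0
print(mean_efficiency(r_mean=25.0, hw=12.0, r0=30.0))
```

    Fitting observed efficiencies at several quencher-shifted R0 values against this model (e.g. by nonlinear least squares over r_mean and hw) is the essence of the method described above.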

  20. From Ambient Sensing to IoT-based Context Computing: An Open Framework for End to End QoC Management.

    PubMed

    Marie, Pierrick; Desprats, Thierry; Chabridon, Sophie; Sibilla, Michelle; Taconet, Chantal

    2015-01-01

    Quality of Context (QoC) awareness is recognized as a key point for the success of context-aware computing. At the time where the combination of the Internet of Things, Cloud Computing, and Ambient Intelligence paradigms offer together new opportunities for managing richer context data, the next generation of Distributed Context Managers (DCM) is facing new challenges concerning QoC management. This paper presents our model-driven QoCIM framework. QoCIM is the acronym for Quality of Context Information Model. We show how it can help application developers to manage the whole QoC life-cycle by providing genericity, openness and uniformity. Its usages are illustrated, both at design time and at runtime, in the case of an urban pollution context- and QoC-aware scenario. PMID:26087372

  1. End-to-end crosstalk within the hepatitis C virus genome mediates the conformational switch of the 3′X-tail region

    PubMed Central

    Romero-López, Cristina; Barroso-delJesus, Alicia; García-Sacristán, Ana; Briones, Carlos; Berzal-Herranz, Alfredo

    2014-01-01

    The hepatitis C virus (HCV) RNA genome contains multiple structurally conserved domains that make long-distance RNA–RNA contacts important in the establishment of viral infection. Microarray antisense oligonucleotide assays, improved dimethyl sulfate probing methods and 2′-acylation chemistry (selective 2′-hydroxyl acylation and primer extension, SHAPE) showed the folding of the genomic RNA 3′ end to be regulated by the internal ribosome entry site (IRES) element via direct RNA–RNA interactions. The essential cis-acting replicating element (CRE) and the 3′X-tail region adopted different 3D conformations in the presence and absence of the genomic RNA 5′ terminus. Further, the structural transition in the 3′X-tail from the replication-competent conformer (consisting of three stem-loops) to the dimerizable form (with two stem-loops) was found to depend on the presence of both the IRES and the CRE elements. Complex interplay between the IRES, the CRE and the 3′X-tail region would therefore appear to occur. The preservation of this RNA–RNA interacting network, and the maintenance of the proper balance between different contacts, may play a crucial role in the switch between different steps of the HCV cycle. PMID:24049069

  2. New Tools to Discover the Physical Links From CME Eruptions to Radiation Effects in Deep Space: a First in Heliospheric End-to-End Coupling

    NASA Astrophysics Data System (ADS)

    Gorby, M. J.; Schwadron, N. A.; Linker, J. A.; Spence, H. E.; Townsend, L. W.; Cucinotta, F. A.

    2012-12-01

    We've taken fundamental new steps in physics based coupling; combining MHD simulation results with our fully 3D Lagrangian code has allowed us to attain flux and dosage rates out to 1AU. The Earth-Moon-Mars Radiation Environment Module (EMMREM) is a collection of tools based on the output of the Energetic Particle Radiation Environment Model (EPREM), which solves the focused transport equation to determine energetic particles fluxes [1]. We feed resulting flux from EPREM into the Baryon Transport (BRYNTRYN) code developed at NASA to calculate dose rates and accumulated dosages. Recently we have coupled EPREM to Magnetohydrodynamics Around a Sphere (MAS) developed at Predictive Science, Inc. [2]. The MAS / EPREM couplings allow us to accurately model the physics of evolving CMEs and their impact on the acceleration of SEPs. We detail physical regimes associated with strong and weak scattering of energetic particles near shocks, both by background magnetic field flux and self-excited waves. Results from both weak and severe SEP events will be presented, along with a comparison of the results with CRaTER and GOES data. Validation of the coupling and the implications for predicting dose rates at 1AU will also be discussed. This critical step in the evolution of code coupling enables us to explore, discover, and ultimately predict connections between SEP events and their effects on the space environment through the inner heliosphere. Thus, we present fundamental new modeling capabilities that provide critical insights into the physical causes and behavior of extreme solar events. [1] Schwadron, N. A. and A. L. Townsend, et al. (2010) Space Weather Journal, Vol. 8, S00E02. [2] Linker, J. A. and Z. Mikić, et al. (1999) J. Geophys. Res., 104(A5), 9808-9830.

  3. SU-E-J-55: End-To-End Effectiveness Analysis of 3D Surface Image Guided Voluntary Breath-Holding Radiotherapy for Left Breast

    SciTech Connect

    Lin, M; Feigenberg, S

    2015-06-15

    Purpose: To evaluate the effectiveness of using 3D surface imaging to guide breath-holding (BH) left-side breast treatment. Methods: Two 3D surface image guided BH procedures were implemented and evaluated: normal-BH, taking BH at a comfortable level, and deep-inspiration breath-holding (DIBH). A total of 20 patients (10 normal-BH and 10 DIBH) were recruited. Patients received a BH evaluation using a commercial 3D surface tracking system (VisionRT, London, UK) to quantify the reproducibility of BH positions prior to CT scan. Tangential 3D/IMRT plans were generated. Patients were initially set up under free-breathing (FB) conditions using the FB surface obtained from the untagged CT to ensure a correct patient position. Patients were then guided to reach the planned BH position using the BH surface obtained from the BH CT. Action levels were set at each phase of the treatment process, based on the information provided by the 3D surface tracking system, for proper interventions (eliminate/re-setup/re-coaching). We reviewed the frequency of interventions to evaluate its effectiveness. FB-CBCT and port films were utilized to evaluate the accuracy of 3D-surface-guided setups. Results: 25% of BH candidates with BH positioning uncertainty > 2 mm were eliminated prior to CT scan. For >90% of fractions, based on the setup deltas from the 3D surface tracking system, adjustments of patient setup were needed after the initial laser-based setup. 3D-surface-guided setup accuracy was comparable to that of CBCT. For the BH guidance, the frequency of interventions (re-coaching/re-setup) was 40% (normal-BH) / 91% (DIBH) of treatments for the first 5 fractions and then dropped to 16% (normal-BH) / 46% (DIBH). The necessity of re-setup was highly patient-specific for normal-BH but highly random among patients for DIBH. Overall, a −0.8±2.4 mm accuracy of the anterior pericardial shadow position was achieved. Conclusion: 3D surface imaging provides effective intervention to the treatment process and ensures favorable day-to-day setup accuracy. DIBH setup appears to be more uncertain, and this is the patient group that would benefit most from the extra information provided by 3D surface setup.

  4. Fission Spectrum Related Uncertainties

    SciTech Connect

    G. Aliberti; I. Kodeli; G. Palmiotti; M. Salvatores

    2007-10-01

    The paper presents a preliminary uncertainty analysis related to potential uncertainties on the fission spectrum data. Consistent results are shown for a reference fast reactor design configuration and for experimental thermal configurations. However the results obtained indicate the need for further analysis, in particular in terms of fission spectrum uncertainty data assessment.

  5. Radiation detector spectrum simulator

    DOEpatents

    Wolf, Michael A.; Crowell, John M.

    1987-01-01

    A small battery operated nuclear spectrum simulator having a noise source generates pulses with a Gaussian distribution of amplitudes. A switched dc bias circuit cooperating therewith generates several nominal amplitudes of such pulses and a spectral distribution of pulses that closely simulates the spectrum produced by a radiation source such as Americium 241.

  6. Radiation detector spectrum simulator

    DOEpatents

    Wolf, M.A.; Crowell, J.M.

    1985-04-09

    A small battery operated nuclear spectrum simulator having a noise source generates pulses with a Gaussian distribution of amplitudes. A switched dc bias circuit cooperating therewith generates several nominal amplitudes of such pulses and a spectral distribution of pulses that closely simulates the spectrum produced by a radiation source such as Americium 241.

  7. Fetal Alcohol Spectrum Disorder

    ERIC Educational Resources Information Center

    Caley, Linda M.; Kramer, Charlotte; Robinson, Luther K.

    2005-01-01

    Fetal alcohol spectrum disorder (FASD) is a serious and widespread problem in this country. Positioned within the community with links to children, families, and healthcare systems, school nurses are a critical element in the prevention and treatment of those affected by fetal alcohol spectrum disorder. Although most school nurses are familiar…

  8. The CMBR spectrum

    SciTech Connect

    Stebbins, A.

    1997-05-01

    Here we give an introduction to the observed spectrum of the Cosmic Microwave Background Radiation (CMBR) and discuss what can be learned about it. Particular attention will be given to how Compton scattering can distort the spectrum of the CMBR. An incomplete bibliography of relevant papers is also provided.

  9. Reconstruction of dynamical pulse trains via time-resolved multiheterodyne detection.

    PubMed

    Butler, T; Tykalewicz, B; Goulding, D; Kelleher, B; Huyet, G; Hegarty, S P

    2013-12-01

    A multiheterodyne technique is presented which can accurately measure the complex spectrum and temporally reconstruct certain dynamic pulse trains. This technique is applied to periodic pulses formed in a LiNbO₃ Mach-Zehnder modulator. The spectral amplitude and phase of 20 GHz 66% return-to-zero (RZ) pulses and 10 GHz 50% RZ pulses are measured and compared to independent measurements from a high-resolution optical spectrum analyser. The temporal pulse shape and phase are reconstructed and compared to high-speed sampling oscilloscope measurements. This technique is applied to sections of a large single acquisition, allowing the reconstruction of frequency- and amplitude-modulated pulse trains. PMID:24514462

  10. Primordial power spectrum: a complete analysis with the WMAP nine-year data

    SciTech Connect

    Hazra, Dhiraj Kumar; Shafieloo, Arman; Souradeep, Tarun E-mail: arman@apctp.org

    2013-07-01

    We have further improved the error-sensitive Richardson-Lucy deconvolution algorithm, making it directly applicable to the unbinned measured angular power spectrum of Cosmic Microwave Background observations to reconstruct the form of the primordial power spectrum. This improvement makes the application of the method significantly more straightforward by removing some intermediate stages of analysis, allowing a reconstruction of the primordial spectrum with higher efficiency and precision and with lower computational expense. Applying the modified algorithm, we fit the WMAP 9-year data using the optimized reconstructed form of the primordial spectrum, with an improvement of more than 300 in χ²_eff with respect to the best-fit power law. This is clearly beyond the reach of other alternative approaches and reflects the efficiency of the proposed method in the reconstruction process, allowing us to look for any possible feature in the primordial spectrum projected in the CMB data. Though the proposed method allows us to examine various possibilities for the form of the primordial spectrum, all fitting the data well, proper error analysis is needed to test the consistency of theoretical models since, along with possible physical artefacts, most of the features in the reconstructed spectrum might arise from fitting noise in the CMB data. The reconstructed error band for the form of the primordial spectrum, derived from many bootstrapped realizations of the WMAP 9-year data, shows consistency of the power-law form of the primordial spectrum with the WMAP 9 data at all wave numbers. Including WMAP polarization data in the analysis did not improve our results much due to its low quality, but we expect Planck data to allow a full analysis of CMB observations in both temperature and polarization, separately and in combination.
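
    The core of the method is the multiplicative Richardson-Lucy update, which keeps the reconstructed spectrum non-negative at every iteration. A minimal sketch on a toy deblurring problem (the Gaussian kernel G, grid size and iteration count are illustrative assumptions, not the radiative-transport kernel of the CMB analysis):

```python
import numpy as np

def richardson_lucy(d, G, n_iter=500):
    """Richardson-Lucy deconvolution: recover p from d ≈ G @ p.

    d : observed non-negative data vector, shape (m,)
    G : non-negative transfer kernel, shape (m, n)
    Returns the reconstructed non-negative vector p, shape (n,).
    """
    norm = G.sum(axis=0)                    # column sums of the kernel
    safe_norm = np.where(norm > 0, norm, 1.0)
    p = np.full(G.shape[1], d.mean())       # flat, positive initial guess
    for _ in range(n_iter):
        model = G @ p
        ratio = np.where(model > 0, d / model, 0.0)
        p *= (G.T @ ratio) / safe_norm      # multiplicative RL update
    return p

# Toy check: smear a two-peak spectrum with a broad kernel, then recover it
n = 64
x = np.arange(n)
G = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 3.0) ** 2)
truth = np.zeros(n); truth[20] = 1.0; truth[45] = 0.5
d = G @ truth
p = richardson_lucy(d, G)
print(int(np.argmax(p)))  # dominant peak recovered near index 20
```

    The multiplicative form is what makes the scheme attractive for power-spectrum reconstruction: positivity is automatic and no explicit inversion of the kernel is required.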

  11. Solar total and spectral irradiance reconstruction over last 9000 years

    NASA Astrophysics Data System (ADS)

    Wu, Chi-Ju; Usoskin, Ilya; Krivova, Natalie; Solanki, Sami K.

    2016-07-01

    Although the mechanisms of solar influence on Earth's climate system are not yet fully understood, solar total and spectral irradiance are considered to be among the main determinants. Solar total irradiance is the total flux of solar radiative energy entering Earth's climate system, whereas the spectral irradiance describes how this energy is distributed over the spectrum. Solar irradiance in the UV band is of special importance since it governs chemical processes in the middle and upper atmosphere. On timescales of the 11-year solar cycle and shorter, solar irradiance is measured by space-based instruments, while models are needed to reconstruct solar irradiance on longer timescales. The SATIRE-M model (Spectral And Total Irradiance Reconstruction over millennia) is employed in this study to reconstruct solar irradiance from decadal radionuclide data, such as 14C and 10Be stored in tree rings and ice cores, respectively. A reconstruction over the last 9000 years will be presented.

  12. Image reconstruction algorithms with wavelet filtering for optoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Gawali, S.; Leggio, L.; Broadway, C.; González, P.; Sánchez, M.; Rodríguez, S.; Lamela, H.

    2016-03-01

    Optoacoustic imaging (OAI) is a hybrid biomedical imaging modality based on the generation and detection of ultrasound by illuminating the target tissue with laser light. Typically, laser light in the visible or near-infrared spectrum is used as the excitation source. OAI is based on the implementation of image reconstruction algorithms using the spatial distribution of optical absorption in tissues. In this work, we apply a time-domain back-projection (BP) reconstruction algorithm and wavelet filtering for point and line detection, respectively. A comparative study between point detection and integrated line detection has been carried out by evaluating their effects on the reconstructed image. Our results demonstrate that the proposed back-projection algorithm, when combined with wavelet filtering, efficiently reconstructs high-resolution images of absorbing spheres embedded in a non-absorbing medium.

  13. Coracoclavicular Ligament Reconstruction

    PubMed Central

    Li, Qi; Hsueh, Pei-ling; Chen, Yun-feng

    2014-01-01

    Abstract Operative intervention is recommended for complete acromioclavicular (AC) joint dislocation to restore AC stability, but the best operative technique is still controversial. Twelve fresh-frozen male cadaveric shoulders (average age, 62.8 ± 7.8 years) were divided equally into endobutton and modified Weaver-Dunn groups. Each potted scapula and clavicle was fixed in a custom-made jig to allow translation and load-to-failure testing using a Zwick BZ2.5/TS1S material testing machine (Zwick/Roell Co, Germany). A systematic review of 21 studies evaluating reconstructive methods for coracoclavicular or AC joints using a cadaveric model was also performed. From our biomechanical study, after ligament reconstruction, the triple endobutton technique demonstrated superior, anterior, and posterior displacements similar to those of the intact state (P > 0.05). In the modified Weaver-Dunn reconstruction group, however, there was significantly greater anterior (P < 0.001) and posterior (P = 0.003) translation after ligament reconstruction. In addition, there was no significant difference after reconstruction between the failure load of the triple endobutton group and that of the intact state (686.88 vs 684.9 N, P > 0.05), whereas the failure load after the modified Weaver-Dunn reconstruction was decreased compared with the intact state (171.64 vs 640.86 N, P < 0.001). From our systematic review of 21 studies, which involved comparison of the modified Weaver-Dunn technique with other methods, the majority showed that the modified Weaver-Dunn procedure had significantly (P < 0.05) greater laxity than other methods including the endobutton technique. The triple endobutton reconstruction proved superior to the modified Weaver-Dunn technique in restoration of AC joint stability and strength. Triple endobutton reconstruction of the coracoclavicular ligament is superior to the modified Weaver-Dunn reconstruction in controlling both superior and

  14. Reconstruction of bremsstrahlung spectra from attenuation data using generalized simulated annealing.

    PubMed

    Menin, O H; Martinez, A S; Costa, A M

    2016-05-01

    A generalized simulated annealing algorithm, combined with a suitable smoothing regularization function, is used to solve the inverse problem of X-ray spectrum reconstruction from attenuation data. The approach is to set the initial acceptance and visitation temperatures and to standardize the terms of the objective function so as to automate the algorithm to accommodate different spectral ranges. Experiments with both numerical and measured attenuation data are presented. Results show that the algorithm reconstructs spectrum shapes accurately. It should be noted that in this algorithm the regularization function was formulated to guarantee a smooth spectrum; thus, the presented technique does not apply to X-ray spectra where characteristic radiation is present. PMID:26943902
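
    SciPy's dual_annealing routine implements generalized simulated annealing (a Tsallis visiting distribution with an optional local search), so the approach can be sketched on a toy attenuation model. The forward matrix, bin count, bounds and regularization weight below are illustrative assumptions rather than the authors' parameterization:

```python
import numpy as np
from scipy.optimize import dual_annealing

# Hypothetical forward model: transmission measured through increasing
# absorber thicknesses t_j; s holds the unknown spectrum weights per bin:
#   y_j = sum_i s_i * exp(-mu_i * t_j)
n_bins, n_meas = 6, 12
mu = np.linspace(0.2, 1.2, n_bins)        # attenuation coefficient per energy bin
t = np.linspace(0.0, 5.0, n_meas)         # absorber thicknesses
A = np.exp(-mu[None, :] * t[:, None])     # (n_meas, n_bins) forward matrix

s_true = np.array([0.05, 0.3, 0.4, 0.2, 0.04, 0.01])
y = A @ s_true

lam = 1e-3  # weight of the smoothing regularization term

def objective(s):
    data_term = np.sum((A @ s - y) ** 2)
    smooth_term = np.sum(np.diff(s, 2) ** 2)  # discrete second derivative
    return data_term + lam * smooth_term

res = dual_annealing(objective, bounds=[(0.0, 1.0)] * n_bins,
                     maxiter=200, seed=1)
print(np.round(res.x, 2))
```

    The smoothing term plays the role of the paper's regularization function: it biases the annealer toward smooth spectra, which is why the technique is unsuited to spectra containing sharp characteristic-radiation lines.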

  16. Augmented Likelihood Image Reconstruction.

    PubMed

    Stille, Maik; Kleine, Matthias; Hägele, Julian; Barkhausen, Jörg; Buzug, Thorsten M

    2016-01-01

    The presence of high-density objects remains an open problem in medical CT imaging. Data of projections passing through objects of high density, such as metal implants, are dominated by noise and are highly affected by beam hardening and scatter. Reconstructed images become less diagnostically conclusive because of pronounced artifacts that manifest as dark and bright streaks. A new reconstruction algorithm is proposed with the aim of reducing these artifacts by incorporating information about the shape and known attenuation coefficients of a metal implant. Image reconstruction is considered as a variational optimization problem. The aforementioned prior knowledge is introduced in terms of equality constraints. An augmented Lagrangian approach is adapted in order to minimize the associated log-likelihood function for transmission CT. During iterations, temporarily appearing artifacts are reduced with a bilateral filter and new projection values are calculated, which are used later on for the reconstruction. A detailed evaluation in cooperation with radiologists is performed on software and hardware phantoms, as well as on clinically relevant patient data of subjects with various metal implants. Results show that the proposed reconstruction algorithm is able to outperform contemporary metal artifact reduction methods such as normalized metal artifact reduction.
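
    The augmented Lagrangian scheme (minimize the data term plus a quadratic penalty on the equality constraints, then update the multipliers) can be illustrated on a small equality-constrained least-squares problem. This is a generic sketch of the optimization machinery, not the paper's transmission-CT log-likelihood or its implant constraints:

```python
import numpy as np

def augmented_lagrangian_ls(A, b, C, d, rho=10.0, n_outer=100):
    """Solve min ||A x - b||^2  subject to  C x = d.

    Each outer iteration minimizes the augmented Lagrangian
        ||A x - b||^2 + lam^T (C x - d) + (rho/2) ||C x - d||^2
    exactly (it is quadratic in x), then performs the dual update on lam.
    """
    lam = np.zeros(C.shape[0])
    H = 2 * A.T @ A + rho * C.T @ C           # constant Hessian
    x = np.zeros(A.shape[1])
    for _ in range(n_outer):
        g = 2 * A.T @ b - C.T @ lam + rho * C.T @ d
        x = np.linalg.solve(H, g)             # exact inner minimization
        lam += rho * (C @ x - d)              # multiplier (dual) update
    return x

# Toy problem: fit 3 unknowns to 5 equations while pinning x0 + x1 = 1
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 3)); b = rng.normal(size=5)
C = np.array([[1.0, 1.0, 0.0]]); d = np.array([1.0])
x = augmented_lagrangian_ls(A, b, C, d)
print(x, C @ x)  # constraint residual shrinks toward zero
```

    The penalty weight rho trades off how aggressively each inner solve enforces the constraint against the conditioning of the inner problem; the multiplier update is what lets the constraint be satisfied exactly in the limit.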

  17. Reconstruction in Warfare Injuries.

    PubMed

    Langer, V

    2010-10-01

    Traumatic injuries, especially in the combat setting, stress the surgical team, which may be sited in a remote forward area, battling against a paucity of time, resources and infrastructure. The lone surgeon may be faced with the arduous challenge of saving life. There is seldom thought given to reconstruction in this high-pressure situation. If the patient survives, morbidity for want of reconstruction can be severe and quality of life can suffer significantly. Reconstruction after 3 to 5 days is fraught with complications and usually compromises outcome in the post-operative phase. The reconstructive surgeon should be involved early in the management, as he can provide coverage for large soft tissue defects after aggressive debridement with panache. If the patient is haemodynamically stable, he should be transferred urgently, preferably by air, to a higher centre with multi-specialty care, especially one equipped with an orthopaedic and trauma reconstructive surgeon. It has been proved beyond doubt that healing improves significantly and there is a marked decrease in morbidity if coverage of wounds is provided early, before colonized wounds become infected. PMID:27365741

  18. Current reconstructive management of subglottic stenosis of the larynx with reference to sixty consecutively treated cases.

    PubMed

    Couraud, L; Hafez, A; Velly, J F; Gironnet, I

    1985-10-01

    Sixty patients with subglottic stenosis of acquired and nonneoplastic origin were surgically managed by multiple open procedures. Follow-up ranged from 1 to 10 years. Fifty-seven patients had stable and excellent or good results, 2 of them after further surgery, 1 patient had to live with a retained tracheostomy indefinitely and the remaining 2 patients died. While the whole spectrum of surgical modalities employed in this series may not be recommended with total conviction, the authors express their satisfaction with single resection and end-to-end anastomosis which yields invariably good and rapidly obtainable results (22 cases with complete success). Nevertheless, laryngeal enlargement seems to be essential in the case of upper glottic lesions (19 operations provided 19 successes) while primary resection with moulding plasties may be applicable to complex and extended stenoses (19 operations: 16 successful results and 3 failures). With regard to the choice of operation, the authors emphasize the importance of careful preoperative assessment of the lesions which should assure adequate selection of therapeutic methods according to the degree of associated involvement of the trachea, glottis or supraglottic area. Conservative measures including dilatation, electro-coagulation and laser-beam surgery are considered as palliative only, however, they may be useful either in the course of the patient's preparation or in order to achieve more successful postoperative results. PMID:2416077

  20. Fetal Alcohol Spectrum Disorders

    MedlinePlus

    ... alcohol can cause a group of conditions called fetal alcohol spectrum disorders (FASDs). Effects can include physical and behavioral problems such ... alcohol syndrome is the most serious type of FASD. People with fetal alcohol syndrome have facial abnormalities, ...

  1. IRIS Spectrum Line Plot

    NASA Video Gallery

    This video shows a line plot of the spectrum. The spectra here are shown for various locations on the Sun. The changes in the movie are caused by differing physical conditions in the locations. Cre...

  2. Quantum Spread Spectrum Communication

    SciTech Connect

    Humble, Travis S

    2010-01-01

    We demonstrate that spectral teleportation can coherently dilate the spectral probability amplitude of a single photon. In preserving the encoded quantum information, this variant of teleportation subsequently enables a form of quantum spread spectrum communication.

  3. Spectrum (pl: spectra)

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    In general terms, the distribution of intensity of electromagnetic radiation with wavelength. Thus when we examine the spectrum of a star we are looking at a map of this brightness distribution. In the context of visible light, the visible spectrum is the band of colors produced when white light is passed through a glass prism, which has the effect of spreading out light according to wavelength. Fr...

  4. Lateral Abdominal Wall Reconstruction

    PubMed Central

    Baumann, Donald P.; Butler, Charles E.

    2012-01-01

    Lateral abdominal wall (LAW) defects can manifest as flank hernias, myofascial laxity/bulges, or full-thickness defects. These defects are quite different from anterior abdominal wall defects, and their complexity and limited surgical options make repairing the LAW a challenge for the reconstructive surgeon. LAW reconstruction requires an understanding of the anatomy, physiologic forces, and the impact of deinnervation injury to design and perform successful reconstructions of hernia, bulge, and full-thickness defects. Reconstructive strategies must be tailored to address the inguinal ligament, retroperitoneum, chest wall, and diaphragm. Operative technique must focus on stabilization of the LAW to nonyielding points of fixation at the anatomic borders of the LAW, far beyond the musculofascial borders of the defect itself. Thus, hernias, bulges, and full-thickness defects are approached in a similar fashion. Mesh reinforcement is uniformly required in lateral abdominal wall reconstruction. Inlay mesh placement with overlying myofascial coverage is preferred as a first-line option, as is the case in anterior abdominal wall reconstruction. However, interposition bridging repairs are often performed when the surrounding myofascial tissue precludes a dual-layered closure. The decision to place bioprosthetic or prosthetic mesh depends on surgeon preference, patient comorbidities, and clinical factors of the repair. Regardless of mesh type, the overlying soft tissue must provide stable cutaneous coverage and obliteration of dead space. In cases where the fasciocutaneous flaps surrounding the defect are inadequate for closure, regional pedicled flaps or free flaps are recruited to achieve stable soft tissue coverage. PMID:23372458

  5. Adaptive iterative reconstruction

    NASA Astrophysics Data System (ADS)

    Bruder, H.; Raupach, R.; Sunnegardh, J.; Sedlmair, M.; Stierstorfer, K.; Flohr, T.

    2011-03-01

    It is well known that, in CT reconstruction, Maximum A Posteriori (MAP) reconstruction based on a Poisson noise model can be well approximated by Penalized Weighted Least Squares (PWLS) minimization based on a data-dependent Gaussian noise model. We study minimization of the PWLS objective function using the Gradient Descent (GD) method, and show that if an exact inverse of the forward projector exists, the PWLS GD update equation can be translated into an update equation which operates entirely in the image domain. In the case of non-linear regularization and an arbitrary noise model, this means that a non-linear image filter must exist which solves the optimization problem. In the general case of non-linear regularization and an arbitrary noise model, the analytical computation is not trivial and might lead to image filters which are computationally very expensive. We introduce a new iteration scheme in image space, based on a regularization filter with an anisotropic noise model. Basically, this approximates the statistical data weighting and regularization in PWLS reconstruction. If needed, e.g. for compensation of the non-exactness of the backprojector, the image-based regularization loop can be preceded by a raw-data-based loop without regularization and statistical data weighting. We call this combined iterative reconstruction scheme Adaptive Iterative Reconstruction (AIR). It will be shown that in terms of low-contrast visibility, sharpness-to-noise and contrast-to-noise ratio, PWLS and AIR reconstruction are similar to a high degree of accuracy. In clinical images, the noise texture of AIR is also superior to the more artificial texture of PWLS.
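
    The PWLS gradient descent update referred to above can be sketched in a generic linear setting (the system matrix, statistical weights, step size and first-difference roughness penalty are illustrative assumptions, not a CT forward projector):

```python
import numpy as np

def pwls_gd(A, y, w, beta=0.1, alpha=1e-3, n_iter=500):
    """Penalized weighted least squares via plain gradient descent.

    Minimizes  (A x - y)^T W (A x - y) + beta * ||D x||^2,
    where W = diag(w) holds data-dependent (noise) weights and D is a
    first-difference roughness penalty. The gradient step is
        x <- x - alpha * (2 A^T W (A x - y) + 2 beta D^T D x).
    """
    n = A.shape[1]
    D = np.eye(n)[1:] - np.eye(n)[:-1]     # (n-1, n) finite differences
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = 2 * A.T @ (w * (A @ x - y)) + 2 * beta * D.T @ (D @ x)
        x -= alpha * grad
    return x

# Toy weighted system with a smooth ground truth
rng = np.random.default_rng(0)
A = rng.normal(size=(40, 10))
x_true = np.linspace(0.0, 1.0, 10)
y = A @ x_true + 0.01 * rng.normal(size=40)
w = np.full(40, 1.0)                       # uniform statistical weights here
x = pwls_gd(A, y, w)
print(np.round(x, 2))
```

    In the paper's image-domain variant, this explicit gradient step is replaced by an equivalent image filter, which is what makes the scheme computationally attractive.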

  6. Anatomic Posterolateral Corner Reconstruction.

    PubMed

    Serra Cruz, Raphael; Mitchell, Justin J; Dean, Chase S; Chahla, Jorge; Moatshe, Gilbert; LaPrade, Robert F

    2016-06-01

    Posterolateral corner injuries represent a complex injury pattern, with damage to important coronal and rotatory stabilizers of the knee. These lesions commonly occur in association with other ligament injuries, making decisions regarding treatment challenging. Grade III posterolateral corner injuries result in significant instability and have poor outcomes when treated nonoperatively. As a result, reconstruction is advocated. A thorough knowledge of the anatomy is essential for surgical treatment of this pathology. The following technical note provides a diagnostic approach, postoperative management, and details of a technique for anatomic reconstruction of the 3 main static stabilizers of the posterolateral corner of the knee. PMID:27656379

  7. Comparison of spread spectrum and pulse signal excitation for split spectrum techniques composite imaging

    NASA Astrophysics Data System (ADS)

    Svilainis, L.; Kitov, S.; Rodríguez, A.; Vergara, L.; Dumbrava, V.; Chaziachmetovas, A.

    2012-12-01

    Ultrasonic imaging of composites was investigated. Glass and carbon fiber reinforced plastics produced by resin transfer molding (RTM) and prepreg forming were analyzed. In some of the samples, air bubbles were trapped during the RTM process, and interlayer gaps were present in the prepreg samples. One of the most promising techniques for such cases is Split Spectrum Processing (SSP). On the other hand, such signals require specific processing to reliably reconstruct the temporal position of the defect reflection. Correlation processing can be used for signal compression, or Wiener filtering can be applied for spectral content equalisation. Pulse signals are simple to generate but lack the possibility to alter the signal's spectrum shape. Spread spectrum signals offer a powerful tool for increasing signal energy across the frequency band and enhancing resolution. A CW (continuous wave) burst has high energy but lacks the bandwidth needed for SSP. The aim of the investigation was to compare the performance of the above signals for composite imaging, when various Split Spectrum Processing techniques are used with preceding Wiener processing for spectral content compensation. The resulting composite signals and images obtained are presented. Structural noise removal performance was evaluated using Receiver Operating Characteristics (ROC).
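    The split-spectrum idea can be sketched as follows: slice the echo's spectrum into overlapping sub-bands, take each sub-band's analytic envelope, and recombine with the minimization operator, so that frequency-dependent grain noise is suppressed while the defect echo, present in every band, survives. All parameters here (sampling rate, filter bank, noise level) are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from scipy.signal import hilbert

fs = 100e6                                    # sample rate, Hz (assumed)
t = np.arange(2048) / fs
f0 = 5e6                                      # transducer centre frequency
echo = np.exp(-((t - 8e-6) / 0.2e-6) ** 2) * np.cos(2 * np.pi * f0 * t)
rng = np.random.default_rng(1)
x = echo + 0.3 * rng.normal(size=t.size)      # stand-in structural noise

X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
envelopes = []
for fc in np.linspace(4e6, 6e6, 6):           # sub-band centre frequencies
    H = np.exp(-0.5 * ((freqs - fc) / 0.8e6) ** 2)   # Gaussian band-pass
    sub = np.fft.irfft(X * H, n=t.size)
    envelopes.append(np.abs(hilbert(sub)))    # analytic envelope per band

ssp_min = np.min(np.array(envelopes), axis=0) # SSP minimization operator
```

The echo (centred at sample 800) should dominate the minimized output, since independent noise rarely exceeds the threshold in every band at once.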

  8. Reconstruction of the trachea

    PubMed Central

    Grillo, Hermes C.

    1973-01-01

    Grillo, H. C. (1973).Thorax, 28, 667-679. Reconstruction of the trachea. Experience in 100 consecutive cases. Anatomic mobilization of the trachea permits resection of one-half or more with primary anastomosis. An anterior approach by a cervical or cervicomediastinal route utilizes cervical flexion to devolve the larynx and tracheal mobilization with preservation of the lateral blood supply. The transthoracic route is employed for lower tracheal lesions. Over 100 tracheal resections have been done using these methods of direct reconstruction. Eighty-four patients suffered from benign strictures, 79 resulting from intubation injuries. Eleven primary tracheal tumours and five secondary tumours are included. The majority of lesions following intubation occurred at the level of the cuff. It was possible to repair 78 of the 84 stenotic lesions through a cervical or cervicomediastinal approach. Seventy-three of the 84 patients with inflammatory lesions obtained an excellent or good functional and anatomic result. Nine of 11 patients with primary neoplasms who underwent reconstruction are alive and without known disease. There were five early postoperative deaths in these 100 consecutive patients who underwent tracheal reconstruction. PMID:4362789

  9. Breast reconstruction - natural tissue

    MedlinePlus

    ... muscle flap; TRAM; Latissimus muscle flap with a breast implant; DIEP flap; DIEAP flap; Gluteal free flap; ... If you are having breast reconstruction at the same time as mastectomy, the surgeon may do either of the following: Skin-sparing mastectomy. This means ...

  10. Reconstructing Community History

    ERIC Educational Resources Information Center

    Shields, Amy

    2004-01-01

    History is alive and well in Lebanon, Missouri. Students in this small town in the southwest region of the state went above and beyond the community's expectations on this special project. This article describes this historical journey which began when students in a summer mural class reconstructed a mural that was originally created by a…

  11. Micro acoustic spectrum analyzer

    DOEpatents

    Schubert, W. Kent; Butler, Michael A.; Adkins, Douglas R.; Anderson, Larry F.

    2004-11-23

    A micro acoustic spectrum analyzer for determining the frequency components of a fluctuating sound signal comprises a microphone to pick up the fluctuating sound signal and produce an alternating current electrical signal; at least one microfabricated resonator, each having a different resonant frequency, that vibrates in response to the alternating current electrical signal; and at least one detector to detect the vibration of the microfabricated resonators. The micro acoustic spectrum analyzer can further comprise a mixer to mix a reference signal with the alternating current electrical signal from the microphone to shift the frequency spectrum to a range that is better matched to the resonant frequencies of the microfabricated resonators. The micro acoustic spectrum analyzer can be designed specifically for portability, size, cost, accuracy, speed, power requirements, and use in a harsh environment. The micro acoustic spectrum analyzer is particularly suited for applications where size, accessibility, and power requirements are limited, such as the monitoring of industrial equipment and processes, detection of security intrusions, or evaluation of military threats.

  12. Preparing for Breast Reconstruction Surgery

    MedlinePlus

    ... after breast reconstruction surgery Preparing for breast reconstruction surgery Your surgeon can help you know what to ... The plan for follow-up Costs Understanding your surgery costs Health insurance policies often cover most or ...

  13. Broad spectrum solar cell

    DOEpatents

    Walukiewicz, Wladyslaw; Yu, Kin Man; Wu, Junqiao; Schaff, William J.

    2007-05-15

    An alloy having a large band gap range is used in a multijunction solar cell to enhance utilization of the solar energy spectrum. In one embodiment, the alloy is In(1-x)Ga(x)N, having an energy bandgap range of approximately 0.7 eV to 3.4 eV, providing a good match to the solar energy spectrum. Multiple junctions having different bandgaps are stacked to form a solar cell. Each junction may have different bandgaps (realized by varying the alloy composition), and therefore be responsive to different parts of the spectrum. The junctions are stacked in such a manner that some bands of light pass through upper junctions to lower junctions that are responsive to such bands.

  14. NREL Spectrum of Innovation

    ScienceCinema

    None

    2016-07-12

    There are many voices calling for a future of abundant clean energy. The choices are difficult and the challenges daunting. How will we get there? The National Renewable Energy Laboratory integrates the entire spectrum of innovation including fundamental science, market relevant research, systems integration, testing and validation, commercialization and deployment. The innovation process at NREL is interdependent and iterative. Many scientific breakthroughs begin in our own laboratories, but new ideas and technologies come to NREL at any point along the innovation spectrum to be validated and refined for commercial use.

  15. Reconstruction of signals with unknown spectra in information field theory with parameter uncertainty

    SciTech Connect

    Ensslin, Torsten A.; Frommert, Mona

    2011-05-15

    The optimal reconstruction of cosmic metric perturbations and other signals requires knowledge of their power spectra and other parameters. If these are not known a priori, they have to be measured simultaneously from the same data used for the signal reconstruction. We formulate the general problem of signal inference in the presence of unknown parameters within the framework of information field theory. To solve this, we develop a generic parameter-uncertainty renormalized estimation (PURE) technique. As a concrete application, we address the problem of reconstructing Gaussian signals with unknown power spectrum with five different approaches: (i) separate maximum-a-posteriori power-spectrum measurement and subsequent reconstruction, (ii) maximum-a-posteriori reconstruction with marginalized power spectrum, (iii) maximizing the joint posterior of signal and spectrum, (iv) guessing the spectrum from the variance in the Wiener-filter map, and (v) renormalization flow analysis of the field-theoretical problem providing the PURE filter. In all cases, the reconstruction can be described or approximated as Wiener-filter operations with assumed signal spectra derived from the data according to the same recipe, but with differing coefficients. All of these filters, except the renormalized one, exhibit a perception threshold in the case of a Jeffreys prior for the unknown spectrum. Data modes with variance below this threshold do not affect the signal reconstruction at all. Filter (iv) seems to be similar to the so-called Karhunen-Loève and Feldman-Kaiser-Peacock estimators for galaxy power spectra used in cosmology, which therefore should also exhibit a marginal perception threshold if correctly implemented. We present statistical performance tests and show that the PURE filter is superior to the others, especially if the post-Wiener-filter corrections are included or when an additional scale-independent spectral smoothness prior can be adopted.
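    The flavor of approaches such as (iv), where the assumed signal spectrum is guessed from the data themselves before Wiener filtering, can be sketched in one dimension; the signal model, spectra, and Fourier conventions below are illustrative assumptions, not the paper's formalism:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1024
k = np.fft.rfftfreq(n, d=1.0)
k[0] = k[1]                               # avoid division by zero at k = 0
P_signal = 1.0 / (1.0 + (k / 0.02) ** 2)  # true (unknown) power spectrum
sigma_n = 0.1

# draw a Gaussian random signal with spectrum P_signal, add white noise
s_k = np.sqrt(P_signal * n) * (rng.normal(size=k.size)
                               + 1j * rng.normal(size=k.size)) / np.sqrt(2)
s = np.fft.irfft(s_k, n=n)
d = s + sigma_n * rng.normal(size=n)

d_k = np.fft.rfft(d)
P_noise = sigma_n ** 2 * n                # flat noise spectrum (this convention)
P_hat = np.maximum(np.abs(d_k) ** 2 - P_noise, 0.0)  # spectrum guessed from data
wiener = P_hat / (P_hat + P_noise)        # Wiener filter with assumed spectrum
s_hat = np.fft.irfft(wiener * d_k, n=n)
```

Even this crude per-mode spectrum guess suppresses noise-dominated modes and should beat the raw data in mean squared error.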

  16. Unilateral Multiple Facial Nerve Branch Reconstruction Using “End-to-side Loop Graft” Supercharged by Hypoglossal Nerve

    PubMed Central

    Sasaki, Ryo; Takeuchi, Yuichi; Watanabe, Yorikatsu; Niimi, Yosuke; Sakurai, Hiroyuki; Miyata, Mariko; Yamato, Masayuki

    2014-01-01

    Background: Extensive facial nerve defects between the facial nerve trunk and its branches can be clinically reconstructed by incorporating double innervation into an end-to-side loop graft technique. This study developed a new animal model to evaluate the technique’s ability to promote nerve regeneration. Methods: Rats were divided into the intact, nonsupercharge, and supercharge groups. Artificially created facial nerve defects were reconstructed with a nerve graft, which was end-to-end sutured from the proximal facial nerve stump to the mandibular branch (nonsupercharge group), or with a graft whose other end was end-to-side sutured to the hypoglossal nerve (supercharge group). Outcomes were evaluated after 30 weeks. Results: Axonal diameter was significantly larger in the supercharge group than in the nonsupercharge group for the buccal (3.78 ± 1.68 vs 3.16 ± 1.22; P < 0.0001) and marginal mandibular branches (3.97 ± 2.31 vs 3.46 ± 1.57; P < 0.0001), but the diameter was significantly larger in the intact group for all branches except the temporal branch. In the supercharge group, compound muscle action potential amplitude was significantly higher than in the nonsupercharge group (4.18 ± 1.49 mV vs 1.87 ± 0.37 mV; P < 0.0001) and similar to that in the intact group (4.11 ± 0.68 mV). Retrograde labeling showed that the mimetic muscles were double-innervated by facial and hypoglossal nerve nuclei in the supercharge group. Conclusions: Multiple facial nerve branch reconstruction with an end-to-side loop graft was able to achieve axonal distribution. Additionally, axonal supercharge from the hypoglossal nerve significantly improved outcomes. PMID:25426357

  17. Spin-polarized gapped Dirac spectrum of unsupported silicene

    NASA Astrophysics Data System (ADS)

    Podsiadły-Paszkowska, A.; Krawiec, M.

    2016-06-01

    We study effects of the spin-orbit interaction and the atomic reconstruction of silicene on its electronic spectrum. As an example we consider unsupported silicene pulled off from a Pb(111) substrate. Using first-principles density functional theory, we show that the inversion-symmetry-breaking arrangement of atoms and the spin-orbit interaction generate a spin-polarized electronic spectrum with an energy gap in the Dirac cone. These findings are particularly interesting in view of the quantum anomalous and quantum valley Hall effects and should be observable in weakly interacting silicene-substrate systems.

  18. Compressed wideband spectrum sensing based on discrete cosine transform.

    PubMed

    Wang, Yulin; Zhang, Gengxin

    2014-01-01

    Discrete cosine transform (DCT) is a special type of transform which is widely used for compression of speech and images. However, its use for spectrum sensing has not yet received widespread attention. This paper aims to alleviate the sampling requirements of wideband spectrum sensing by utilizing the compressive sampling (CS) principle and exploiting the unique sparsity structure in the DCT domain. Compared with the discrete Fourier transform (DFT), wideband communication signals have a much sparser representation and easier implementation in the DCT domain. Simulation results show that the proposed DCT-CSS scheme outperforms the conventional DFT-CSS scheme in terms of the MSE of the reconstructed signal, detection probability, and computational complexity.
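    The premise that some signals are markedly sparser under the DCT than under the DFT is easy to check numerically; the test signal below (a sum of DCT-II basis cosines) is an illustrative assumption:

```python
import numpy as np
from scipy.fft import dct, fft

n = 256
m = np.arange(n)
x = np.zeros(n)
for k0, a in [(17, 1.0), (43, 0.6), (91, 0.3)]:
    # DCT-II basis cosines: exactly sparse in DCT, leaky in DFT
    x += a * np.cos(np.pi * (2 * m + 1) * k0 / (2 * n))

def top_energy_fraction(coeffs, K=6):
    p = np.sort(np.abs(coeffs) ** 2)[::-1]
    return p[:K].sum() / p.sum()

frac_dct = top_energy_fraction(dct(x, type=2, norm='ortho'))
frac_dft = top_energy_fraction(fft(x))
```

Here the six largest DCT coefficients capture essentially all of the energy, while DFT leakage spreads the same signal across many bins.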

  19. Compressed Wideband Spectrum Sensing Based on Discrete Cosine Transform

    PubMed Central

    Wang, Yulin; Zhang, Gengxin

    2014-01-01

    Discrete cosine transform (DCT) is a special type of transform which is widely used for compression of speech and images. However, its use for spectrum sensing has not yet received widespread attention. This paper aims to alleviate the sampling requirements of wideband spectrum sensing by utilizing the compressive sampling (CS) principle and exploiting the unique sparsity structure in the DCT domain. Compared with the discrete Fourier transform (DFT), wideband communication signals have a much sparser representation and easier implementation in the DCT domain. Simulation results show that the proposed DCT-CSS scheme outperforms the conventional DFT-CSS scheme in terms of the MSE of the reconstructed signal, detection probability, and computational complexity. PMID:24526894

  20. Stochastic reconstruction of sandstones

    PubMed

    Manwart; Torquato; Hilfer

    2000-07-01

    A simulated annealing algorithm is employed to generate stochastic models of a Berea sandstone and a Fontainebleau sandstone, each with a prescribed two-point probability function, lineal-path function, and pore-size distribution function. We find that the temperature decrease of the annealing has to be rather quick to yield isotropic and percolating configurations. A comparison of simple morphological quantities indicates good agreement between the reconstructions and the original sandstones. Also, the mean survival time of a random walker in the pore space is reproduced with good accuracy. However, a more detailed investigation by means of local porosity theory shows that there may be significant differences in the geometrical connectivity between the reconstructed and the experimental samples.
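    A miniature one-dimensional analogue of the annealing reconstruction can be sketched as follows: swap pixels of a binary medium so that its two-point probability function S2(r) approaches a target, with the quick cooling the abstract recommends. The sizes, schedule, and 1-D setting are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n, rmax = 200, 20

def s2(img):
    # two-point probability S2(r) = P(img[i] = 1 and img[i+r] = 1), periodic
    return np.array([np.mean(img * np.roll(img, r)) for r in range(rmax)])

raw = (rng.random(n) < 0.4).astype(float)
# smooth the reference so it has short-range correlations to match
target_img = (np.convolve(raw, np.ones(5), mode='same') >= 2).astype(float)
S2_target = s2(target_img)

img = rng.permutation(target_img)        # same porosity, random arrangement
energy = float(np.sum((s2(img) - S2_target) ** 2))
e0 = energy
T = 1e-3
for _ in range(4000):
    T *= 0.999                           # quick exponential cooling
    i, j = rng.integers(0, n, size=2)
    if img[i] == img[j]:
        continue                         # swapping equal pixels changes nothing
    trial = img.copy()
    trial[i], trial[j] = trial[j], trial[i]
    e_new = float(np.sum((s2(trial) - S2_target) ** 2))
    if e_new < energy or rng.random() < np.exp(-(e_new - energy) / T):
        img, energy = trial, e_new       # Metropolis acceptance
```

Pixel swaps conserve porosity exactly, so only the correlation structure is annealed toward the target.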

  1. Reconstruction of images from radiofrequency electron paramagnetic resonance spectra.

    PubMed

    Smith, C M; Stevens, A D

    1994-12-01

    This paper discusses methods for obtaining image reconstructions from electron paramagnetic resonance (EPR) spectra which constitute object projections. An automatic baselining technique is described which treats each spectrum consistently, rotating the non-horizontal baselines caused by stray magnetic effects onto the horizontal axis. The convolved backprojection method is described for both two- and three-dimensional reconstruction, and the effect of cut-off frequency on the reconstruction is illustrated. A slower, indirect, iterative method, which does a non-linear fit to the projection data, is shown to give a far smoother reconstructed image when the method of maximum entropy is used to determine the value of the final residual sum of squares. Although this requires more computing time than the convolved backprojection method, it is more flexible and overcomes the problem of numerical instability encountered in deconvolution. Images from phantom samples in vitro are discussed. The spectral data for these have been accumulated quickly and have a low signal-to-noise ratio. The results show that as few as 16 spectra can still be processed to give an image. Artifacts in the image due to a small number of projections using the convolved backprojection reconstruction method can be removed by applying a threshold, i.e. only plotting contours higher than a given value. These artifacts are not present in an image which has been reconstructed by the maximum entropy technique. At present these techniques are being applied directly to in vivo studies.

  2. Improved Diffusion Imaging through SNR-Enhancing Joint Reconstruction

    PubMed Central

    Haldar, Justin P.; Wedeen, Van J.; Nezamzadeh, Marzieh; Dai, Guangping; Weiner, Michael W.; Schuff, Norbert; Liang, Zhi-Pei

    2012-01-01

    Quantitative diffusion imaging is a powerful technique for the characterization of complex tissue microarchitecture. However, long acquisition times and limited signal-to-noise ratio (SNR) represent significant hurdles for many in vivo applications. This paper presents a new approach to reduce noise while largely maintaining resolution in diffusion weighted images, using a statistical reconstruction method that takes advantage of the high level of structural correlation observed in typical datasets. Compared to existing denoising methods, the proposed method performs reconstruction directly from the measured complex k-space data, allowing for Gaussian noise modeling and theoretical characterizations of the resolution and SNR of the reconstructed images. In addition, the proposed method is compatible with many different models of the diffusion signal (e.g., diffusion tensor modeling, q-space modeling, etc.). The joint reconstruction method can provide significant improvements in SNR relative to conventional reconstruction techniques, with a relatively minor corresponding loss in image resolution. Results are shown in the context of diffusion spectrum imaging tractography and diffusion tensor imaging, illustrating the potential of this SNR-enhancing joint reconstruction approach for a range of different diffusion imaging experiments. PMID:22392528

  3. LOFAR sparse image reconstruction

    NASA Astrophysics Data System (ADS)

    Garsden, H.; Girard, J. N.; Starck, J. L.; Corbel, S.; Tasse, C.; Woiselle, A.; McKean, J. P.; van Amesfoort, A. S.; Anderson, J.; Avruch, I. M.; Beck, R.; Bentum, M. J.; Best, P.; Breitling, F.; Broderick, J.; Brüggen, M.; Butcher, H. R.; Ciardi, B.; de Gasperin, F.; de Geus, E.; de Vos, M.; Duscha, S.; Eislöffel, J.; Engels, D.; Falcke, H.; Fallows, R. A.; Fender, R.; Ferrari, C.; Frieswijk, W.; Garrett, M. A.; Grießmeier, J.; Gunst, A. W.; Hassall, T. E.; Heald, G.; Hoeft, M.; Hörandel, J.; van der Horst, A.; Juette, E.; Karastergiou, A.; Kondratiev, V. I.; Kramer, M.; Kuniyoshi, M.; Kuper, G.; Mann, G.; Markoff, S.; McFadden, R.; McKay-Bukowski, D.; Mulcahy, D. D.; Munk, H.; Norden, M. J.; Orru, E.; Paas, H.; Pandey-Pommier, M.; Pandey, V. N.; Pietka, G.; Pizzo, R.; Polatidis, A. G.; Renting, A.; Röttgering, H.; Rowlinson, A.; Schwarz, D.; Sluman, J.; Smirnov, O.; Stappers, B. W.; Steinmetz, M.; Stewart, A.; Swinbank, J.; Tagger, M.; Tang, Y.; Tasse, C.; Thoudam, S.; Toribio, C.; Vermeulen, R.; Vocks, C.; van Weeren, R. J.; Wijnholds, S. J.; Wise, M. W.; Wucknitz, O.; Yatawatta, S.; Zarka, P.; Zensus, A.

    2015-03-01

    Context. The LOw Frequency ARray (LOFAR) radio telescope is a giant digital phased array interferometer with multiple antennas distributed in Europe. It provides discrete sets of Fourier components of the sky brightness. Recovering the original brightness distribution with aperture synthesis forms an inverse problem that can be solved by various deconvolution and minimization methods. Aims: Recent papers have established a clear link between the discrete nature of radio interferometry measurement and the "compressed sensing" (CS) theory, which supports sparse reconstruction methods to form an image from the measured visibilities. Empowered by proximal theory, CS offers a sound framework for efficient global minimization and sparse data representation using fast algorithms. Combined with instrumental direction-dependent effects (DDE) in the scope of a real instrument, we developed and validated a new method based on this framework. Methods: We implemented a sparse reconstruction method in the standard LOFAR imaging tool and compared the photometric and resolution performance of this new imager with that of CLEAN-based methods (CLEAN and MS-CLEAN) with simulated and real LOFAR data. Results: We show that i) sparse reconstruction performs as well as CLEAN in recovering the flux of point sources; ii) performs much better on extended objects (the root mean square error is reduced by a factor of up to 10); and iii) provides a solution with an effective angular resolution 2-3 times better than the CLEAN images. Conclusions: Sparse recovery gives a correct photometry on high dynamic and wide-field images and improved realistic structures of extended sources (of simulated and real LOFAR datasets). This sparse reconstruction method is compatible with modern interferometric imagers that handle DDE corrections (A- and W-projections) required for current and future instruments such as LOFAR and SKA.
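    The sparse-recovery step at the heart of such compressed-sensing imagers can be illustrated with ISTA, a basic proximal algorithm for the l1-regularized least-squares problem; the random sensing matrix below merely stands in for the interferometric measurement operator, and all sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
m_meas, n_pix, k_sparse = 80, 200, 5
A = rng.normal(size=(m_meas, n_pix)) / np.sqrt(m_meas)  # toy sensing operator
x_true = np.zeros(n_pix)
support = rng.choice(n_pix, size=k_sparse, replace=False)
x_true[support] = 3.0 * rng.normal(size=k_sparse)       # sparse "sky"
y = A @ x_true                                          # noiseless measurements

lam = 0.01                                # l1 weight (assumed)
L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
x = np.zeros(n_pix)
for _ in range(3000):
    z = x - A.T @ (A @ x - y) / L                           # gradient step
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
```

With far fewer measurements than pixels, the l1 prior still recovers the sparse scene, which is the core argument for CS-style imaging.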

  4. Kinky tomographic reconstruction

    SciTech Connect

    Hanson, K.M.; Cunningham, G.S.; Bilisoly, R.L.

    1996-05-01

    We address the issue of how to make decisions about the degree of smoothness demanded of a flexible contour used to model the boundary of a 2D object. We demonstrate the use of a Bayesian approach to set the strength of the smoothness prior for a tomographic reconstruction problem. The Akaike Information Criterion is used to determine whether to allow a kink in the contour.
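    The model-selection step can be illustrated with a one-dimensional toy problem: decide via the Akaike Information Criterion whether noisy data warrant a kinked (two-segment) fit over a single straight line. The synthetic data and the AIC form for Gaussian residuals are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100
t = np.linspace(0.0, 1.0, n)
y = np.where(t < 0.5, t, 1.0 - t) + 0.02 * rng.normal(size=n)  # kink at t = 0.5

def aic(rss, n_pts, k_params):
    # AIC for least squares with Gaussian residuals (up to a constant)
    return 2 * k_params + n_pts * np.log(rss / n_pts)

# model 1: one straight line (2 parameters)
c1 = np.polyfit(t, y, 1)
rss1 = float(np.sum((np.polyval(c1, t) - y) ** 2))

# model 2: two lines joined at a kink (5 parameters: two slopes, two
# intercepts, and the kink position, chosen on a grid)
best_rss2 = np.inf
for split in range(10, n - 10):
    cL = np.polyfit(t[:split], y[:split], 1)
    cR = np.polyfit(t[split:], y[split:], 1)
    rss = (np.sum((np.polyval(cL, t[:split]) - y[:split]) ** 2)
           + np.sum((np.polyval(cR, t[split:]) - y[split:]) ** 2))
    best_rss2 = min(best_rss2, float(rss))

aic_line = aic(rss1, n, 2)
aic_kink = aic(best_rss2, n, 5)
```

Because the data really do contain a kink, the fit improvement outweighs the parameter penalty and AIC selects the kinked contour.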

  5. Sinclair ZX Spectrum.

    ERIC Educational Resources Information Center

    Rodwell, Peter

    1982-01-01

    Describes and evaluates the hardware, software, peripheral devices, performance capabilities, and programing capacity of the Sinclair ZX Spectrum microcomputer. The computer's display system, its version of the BASIC programing language, its graphics capabilities, and the unique features of its data entry keyboard are discussed. (JL)

  6. Charging for Spectrum Use.

    ERIC Educational Resources Information Center

    Geller, Henry; Lampert, Donna

    This paper, the third in a series exploring future options for public policy in the communications and information arenas, argues that the communications spectrum--e.g., public mobile service, private radio, and domestic satellites--is a valuable but limited resource that should benefit all Americans. After a background discussion, it is…

  7. Stellar Spectrum Synthesizer

    ERIC Educational Resources Information Center

    Landegren, G. F.

    1975-01-01

    Describes a device which employs two diffraction gratings and three or four simple lenses to produce arbitrary absorption or emission spectra that may be doppler shifted and spectroscopically examined by students some distance away. It may be regarded as a sort of artificial star whose spectrum may be analyzed as an undergraduate laboratory…

  8. Enhancement of low-quality reconstructed digital hologram images based on frequency extrapolation of large objects under the diffraction limit

    NASA Astrophysics Data System (ADS)

    Liu, Ning; Li, Weiliang; Zhao, Dongxue

    2016-06-01

    During the reconstruction of a digital hologram, the reconstructed image is usually degraded by speckle noise, which makes it hard to observe the original object pattern. In this paper, a new reconstructed image enhancement method is proposed, which first reduces the speckle noise using an adaptive Gaussian filter, then calculates the high frequencies that belong to the object pattern based on a frequency extrapolation strategy. The proposed frequency extrapolation first calculates the frequency spectrum of the Fourier-filtered image, which is originally reconstructed from the +1 order of the hologram, and then gives the initial parameters for an iterative solution. The analytic iteration is implemented by continuous gradient threshold convergence to estimate the image's horizontal and vertical gradient information. The predicted spectrum is acquired through the analytical iteration of the original spectrum and gradient spectrum analysis. Finally, the reconstructed spectrum of the restoration image is acquired from the synthetic correction of the original spectrum using the predicted gradient spectrum. We conducted our experiment very close to the diffraction limit and used low-quality equipment to prove the feasibility of our method. Detailed analysis and figure demonstrations are presented in the paper.

  9. Validation of measurement-guided 3D VMAT dose reconstruction on a heterogeneous anthropomorphic phantom.

    PubMed

    Opp, Daniel; Nelms, Benjamin E; Zhang, Geoffrey; Stevens, Craig; Feygelman, Vladimir

    2013-01-01

    3DVH software (Sun Nuclear Corp., Melbourne, FL) is capable of generating a volumetric patient VMAT dose by applying a volumetric perturbation algorithm based on comparing measurement-guided dose reconstruction and TPS-calculated dose to a cylindrical phantom. The primary purpose of this paper is to validate this dose reconstruction on an anthropomorphic heterogeneous thoracic phantom by direct comparison to independent measurements. The dosimetric insert to the phantom is novel, and thus the secondary goal is to demonstrate how it can be used for the hidden target end-to-end testing of VMAT treatments in lung. A dosimetric insert contains a 4 cm diameter unit-density spherical target located inside the right lung (0.21 g/cm(3) density). It has 26 slots arranged in two orthogonal directions, milled to hold optically stimulated luminescent dosimeters (OSLDs). Dose profiles in three cardinal orthogonal directions were obtained for five VMAT plans with varying degrees of modulation. After appropriate OSLD corrections were applied, 3DVH measurement-guided VMAT dose reconstruction agreed 100% with the measurements in the unit density target sphere at 3%/3 mm level (composite analysis) for all profile points for the four less-modulated VMAT plans, and for 96% of the points in the highly modulated C-shape plan (from TG-119). For this latter plan, while 3DVH shows acceptable agreement with independent measurements in the unit density target, in the lung disagreement with experiment is relatively high for both the TPS calculation and 3DVH reconstruction. For the four plans excluding the C-shape, 3%/3 mm overall composite analysis passing rates for 3DVH against independent measurement ranged from 93% to 100%. The C-shape plan was deliberately chosen as a stress test of the algorithm. The dosimetric spatial alignment hidden target test demonstrated the average distance to agreement between the measured and TPS profiles in the steep dose gradient area at the edge of the 2 cm

  10. [EMD Time-Frequency Analysis of Raman Spectrum and NIR].

    PubMed

    Zhao, Xiao-yu; Fang, Yi-ming; Tan, Feng; Tong, Liang; Zhai, Zhe

    2016-02-01

    This paper analyzes Raman and near-infrared (NIR) spectra with time-frequency methods. Empirical mode decomposition (EMD) decomposes a spectrum into intrinsic mode functions (IMFs); energy-proportion calculations reveal that Raman spectral energy is uniformly distributed across the components, while the low-order IMFs of the NIR spectrum carry only a small share of the effective spectral information. Both real spectra and numerical experiments show that EMD treats the Raman spectrum as an amplitude-modulated signal with a high-frequency absorption property, and treats the NIR spectrum as a frequency-modulated signal for which high-frequency narrow-band demodulation is best realized in the first-order IMF. The Hilbert transform of the first-order IMF reveals that modal aliasing occurs when EMD is applied to the Raman spectrum. Further time-frequency analysis of a corn leaf's NIR spectrum shows that, after EMD, cutting off the low-energy first- and second-order components and reconstructing the spectral signal from the remaining IMFs yields a root-mean-square error of 1.0011 and a correlation coefficient of 0.9813, both indicating high reconstruction accuracy. The decomposition trend term indicates that absorbance increases with decreasing wavelength in the NIR band, and the Hilbert transform of the characteristic modal component shows that 657 cm⁻¹ is a frequency specific to the corn leaf stress spectrum, which can be used as a characteristic frequency for identification. PMID:27209743
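    The Hilbert-transform step of such an EMD analysis can be isolated in a short sketch: form the analytic signal of a narrow-band component and read off its instantaneous frequency. The synthetic chirp and sampling rate are illustrative assumptions:

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
f_inst_true = 50.0 + 20.0 * t                   # linear chirp, 50 -> 70 Hz
phase = 2 * np.pi * (50.0 * t + 10.0 * t ** 2)  # integral of f_inst_true
component = np.cos(phase)                       # stand-in for a first-order IMF

analytic = hilbert(component)                   # analytic signal
inst_phase = np.unwrap(np.angle(analytic))
f_inst_est = np.diff(inst_phase) * fs / (2 * np.pi)  # instantaneous frequency
```

Away from the record edges (where the FFT-based Hilbert transform rings), the estimated instantaneous frequency tracks the true chirp closely.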

  12. Measuring the activity of a {sup 51}Cr neutrino source based on the gamma-radiation spectrum

    SciTech Connect

    Gorbachev, V. V. Gavrin, V. N.; Ibragimova, T. V.; Kalikhov, A. V.; Malyshkin, Yu. M.; Shikhin, A. A.

    2015-12-15

    A technique for measuring the activity of intense β sources from their continuous gamma-radiation (internal bremsstrahlung) spectra is developed. A method for reconstructing the spectrum recorded by a germanium semiconductor detector is described. A method for the absolute measurement of the internal bremsstrahlung spectrum of {sup 51}Cr is presented.
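    Reconstructing an emitted spectrum from a detector-recorded one can be illustrated with a toy unfolding: model the detector as a known Gaussian-smearing response matrix and invert by regularized least squares. The response width, binning, test spectrum, and Tikhonov term are illustrative assumptions, not the authors' method:

```python
import numpy as np

n = 80
E = np.arange(n, dtype=float)                  # energy bins (arbitrary units)
sigma = 1.5                                    # detector resolution (assumed)
R = np.exp(-0.5 * ((E[:, None] - E[None, :]) / sigma) ** 2)
R /= R.sum(axis=0, keepdims=True)              # column-normalized response

true = np.exp(-E / 25.0) * (E / 10.0)          # assumed smooth test spectrum
measured = R @ true                            # what the detector records

lam = 1e-8                                     # tiny Tikhonov term for stability
recon = np.linalg.solve(R.T @ R + lam * np.eye(n), R.T @ measured)
```

For this noiseless, well-conditioned toy the unfolded spectrum matches the input closely; real unfolding must additionally contend with counting noise.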

  13. Microwave assisted reconstruction of optical interferograms for distributed fiber optic sensing.

    PubMed

    Huang, Jie; Hua, Lei; Lan, Xinwei; Wei, Tao; Xiao, Hai

    2013-07-29

    This paper reports a distributed fiber optic sensing technique based on microwave-assisted separation and reconstruction of optical interferograms in the spectral domain. The approach involves sending a microwave-modulated optical signal through cascaded fiber optic interferometers. The microwave signal is used to resolve the position and reflectivity of each sensor along the optical fiber. By sweeping the optical wavelength and detecting the modulation signal, the optical spectrum of each sensor can be reconstructed. Three cascaded fiber optic extrinsic Fabry-Perot interferometric sensors were used to prove the concept. Their microwave-reconstructed interferograms matched well with those recorded individually using an optical spectrum analyzer. The application to distributed strain measurement has also been demonstrated. PMID:23938685

  14. Spread spectrum image steganography.

    PubMed

    Marvel, L M; Boncelet, C R; Retter, C T

    1999-01-01

    In this paper, we present a new method of digital steganography, entitled spread spectrum image steganography (SSIS). Steganography, which means "covered writing" in Greek, is the science of communicating in a hidden manner. Following a discussion of steganographic communication theory and review of existing techniques, the new method, SSIS, is introduced. This system hides and recovers a message of substantial length within digital imagery while maintaining the original image size and dynamic range. The hidden message can be recovered using appropriate keys without any knowledge of the original image. Image restoration, error-control coding, and techniques similar to spread spectrum are described, and the performance of the system is illustrated. A message embedded by this method can be in the form of text, imagery, or any other digital signal. Applications for such a data-hiding scheme include in-band captioning, covert communication, image tamperproofing, authentication, embedded control, and revision tracking.
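    The spread-spectrum embed/decode cycle that SSIS builds on can be sketched as follows. This is a minimal illustration, not the authors' system: real SSIS recovers the message without the original image by estimating the cover through image restoration and correcting residual errors with error-control coding, whereas this sketch correlates against the known cover for clarity. All names and parameter values are hypothetical.

```python
import numpy as np

def embed(cover, bits, key, strength=2.0):
    """Add a key-seeded pseudo-noise pattern per message bit:
    +PN chips for a 1, -PN chips for a 0."""
    rng = np.random.default_rng(key)
    chips = cover.size // len(bits)
    pn = rng.choice([-1.0, 1.0], size=chips * len(bits))
    signs = np.repeat([1.0 if b else -1.0 for b in bits], chips)
    stego = cover.astype(float).ravel().copy()
    stego[:chips * len(bits)] += strength * signs * pn
    return stego.reshape(cover.shape)

def extract(stego, cover_estimate, nbits, key):
    """Correlate the stego residual with the keyed PN sequence;
    the sign of each bit's correlation sum is the decision."""
    rng = np.random.default_rng(key)
    chips = stego.size // nbits
    pn = rng.choice([-1.0, 1.0], size=chips * nbits)
    resid = (stego - cover_estimate).ravel()[:chips * nbits]
    corr = (resid * pn).reshape(nbits, chips).sum(axis=1)
    return [1 if c > 0 else 0 for c in corr]

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, size=(64, 64)).astype(float)
message = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(cover, message, key=42)
recovered = extract(stego, cover, len(message), key=42)
```

    Without the key (the PN seed), the embedded pattern is statistically indistinguishable from noise, which is the property the paper exploits for covert communication.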

  15. X-ray spectrum estimation from transmission measurements by an exponential of a polynomial model

    NASA Astrophysics Data System (ADS)

    Perkhounkov, Boris; Stec, Jessika; Sidky, Emil Y.; Pan, Xiaochuan

    2016-04-01

    There has been much recent research effort directed toward spectral computed tomography (CT). An important step in realizing spectral CT is determining the spectral response of the scanning system so that the relation between material thicknesses and X-ray transmission intensity is known. We propose a few-parameter spectrum model that can accurately model the X-ray transmission curves and has a form which is amenable to simultaneous spectral CT image reconstruction and CT system spectrum calibration. While the goal is to eventually realize the simultaneous image reconstruction/spectrum estimation algorithm, in this work we investigate the effectiveness of the model on spectrum estimation from simulated transmission measurements through known thicknesses of known materials. The simulated transmission measurements employ a typical X-ray spectrum used for CT and contain noise due to the randomness in detecting finite numbers of photons. The proposed model writes the X-ray spectrum as the exponential of a polynomial (EP) expansion. The model parameters are obtained by use of a standard software implementation of the Nelder-Mead simplex algorithm. The performance of the model is measured by the relative error between the predicted and simulated transmission curves. The estimated spectrum is also compared with the model X-ray spectrum. For reference, we also employ a polynomial (P) spectrum model and show its performance relative to the proposed EP model.
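    A minimal version of the described estimation (fit an exponential-of-polynomial spectrum to transmission curves with Nelder-Mead) might look like the following, using `scipy.optimize.minimize`. The energy grid, toy attenuation coefficient, and Gaussian "true" spectrum are stand-in assumptions, not the paper's data; noise is omitted for brevity.

```python
import numpy as np
from scipy.optimize import minimize

E = np.linspace(20.0, 120.0, 101)          # photon energies, keV
mu = 0.3 * (30.0 / E) ** 3 + 0.02          # toy attenuation coefficient, 1/cm (assumed)
thicknesses = np.linspace(0.0, 20.0, 30)   # cm

def spectrum_ep(coeffs):
    """Exponential-of-polynomial (EP) model: s(E) = exp(poly(E/100)).
    The clip guards against overflow while the simplex explores."""
    p = np.clip(np.polyval(coeffs, E / 100.0), -50.0, 50.0)
    return np.exp(p)

def transmission(s):
    """Normalized transmitted intensity versus material thickness."""
    w = s / s.sum()
    return np.array([(w * np.exp(-mu * t)).sum() for t in thicknesses])

# Simulated noiseless calibration data from a "true" Gaussian-shaped spectrum
true_s = np.exp(-((E - 60.0) / 25.0) ** 2)
data = transmission(true_s)

def loss(coeffs):
    return np.sum((np.log(transmission(spectrum_ep(coeffs))) - np.log(data)) ** 2)

base = loss(np.zeros(4))   # misfit of a flat-spectrum starting guess
res = minimize(loss, x0=np.zeros(4), method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-10, "fatol": 1e-14})
```

    In this toy setup the true log-spectrum is exactly quadratic in E/100, so the 4-coefficient EP model can in principle drive the transmission mismatch to zero; in the paper the fit quality is judged by the relative error of the predicted transmission curves.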

  16. Sensors across the Spectrum

    NASA Astrophysics Data System (ADS)

    Neese, Christopher F.; De Lucia, Frank C.; Medvedev, Ivan R.

    2011-06-01

    A resurgence of interest in spectroscopic sensors has been fueled by increases in performance made possible by technological advancements and applications in medicine, environmental monitoring, and national security. Often this research is technology driven, without enough consideration of the spectroscopic signatures available to be probed. We will compare several current spectroscopic sensors across the electromagnetic spectrum, with an eye towards the fundamental spectroscopic considerations important at each wavelength.

  17. The marine diversity spectrum.

    PubMed

    Reuman, Daniel C; Gislason, Henrik; Barnes, Carolyn; Mélin, Frédéric; Jennings, Simon

    2014-07-01

    Distributions of species body sizes within a taxonomic group, for example, mammals, are widely studied and important because they help illuminate the evolutionary processes that produced these distributions. Distributions of the sizes of species within an assemblage delineated by geography instead of taxonomy (all the species in a region regardless of clade) are much less studied but are equally important and will illuminate a different set of ecological and evolutionary processes. We develop and test a mechanistic model of how diversity varies with body mass in marine ecosystems. The model predicts the form of the 'diversity spectrum', which quantifies the distribution of species' asymptotic body masses, is a species analogue of the classic size spectrum of individuals, and which we have found to be a new and widely applicable description of diversity patterns. The marine diversity spectrum is predicted to be approximately linear across an asymptotic mass range spanning seven orders of magnitude. Slope -0.5 is predicted for the global marine diversity spectrum for all combined pelagic zones of continental shelf seas, and slopes for large regions are predicted to lie between -0.5 and -0.1. Slopes of -0.5 and -0.1 represent markedly different communities: a slope of -0.5 depicts a 10-fold reduction in diversity for every 100-fold increase in asymptotic mass; a slope of -0.1 depicts a 1.6-fold reduction. Steeper slopes are predicted for larger or colder regions, meaning fewer large species per small species for such regions. Predictions were largely validated by a global empirical analysis. Results explain for the first time a new and widespread phenomenon of biodiversity. Results have implications for estimating numbers of species of small asymptotic mass, where taxonomic inventories are far from complete. Results show that the relationship between diversity and body mass can be explained from the dependence of predation behaviour, dispersal, and life history on
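    The quoted fold-reductions follow directly from the power-law slope: on log-log axes a slope b means diversity scales as mass**b, so a 100-fold increase in asymptotic mass multiplies diversity by 100**b.

```python
# Diversity change per 100-fold increase in asymptotic mass, for slope b
steep = 100 ** -0.5    # 0.1   -> a 10-fold reduction in diversity
shallow = 100 ** -0.1  # ~0.63 -> about a 1.6-fold reduction
```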

  19. UV spectrum of Enceladus

    NASA Astrophysics Data System (ADS)

    Zastrow, Mark; Clarke, John T.; Hendrix, Amanda R.; Noll, Keith S.

    2012-07-01

    We present a far ultraviolet (FUV) spectrum of Saturn’s moon Enceladus from the Cosmic Origins Spectrograph (COS) on the Hubble Space Telescope (HST). We have placed upper limits on emission from C, N, and O lines in Enceladus’ atmosphere and column densities for the C lines assuming solar resonance scattering. We find these upper limits to be relatively low, on the order of tens to thousands of Rayleighs, with C column densities on the order of 10⁸-10¹⁵ cm⁻², depending on the assumed source size. We also present a segment of a reflectance spectrum in the FUV from ∼1900-2130 Å. This region was sensitive to the different ice mixtures in the model spectra reported by Hendrix et al. (Hendrix, A.R., Hansen, C.J., Holsclaw, G.M. [2010]. Icarus, 206, 608). We find the spectrum brightens quickly longward of ∼1900 Å, constraining the absorption band observed by Hendrix et al. from ∼170 to 190 nm. We find our data are consistent with the suggestion of Hendrix et al. of the presence of ammonia ice (or ammonia hydrate) to darken that region, and also possibly tholins to darken the mid-UV, as reported by Verbiscer et al. (Verbiscer, A.J., French, R.G., McGhee, C.A. [2005]. Icarus, 173, 66).

  20. Evolutionary tree reconstruction

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter; Kanefsky, Bob

    1990-01-01

    It is described how Minimum Description Length (MDL) can be applied to the problem of DNA and protein evolutionary tree reconstruction. If there is a set of mutations that transform a common ancestor into a set of the known sequences, and this description is shorter than the information to encode the known sequences directly, then strong evidence for an evolutionary relationship has been found. A heuristic algorithm is described that searches for the simplest tree (smallest MDL) that finds close to optimal trees on the test data. Various ways of extending the MDL theory to more complex evolutionary relationships are discussed.
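    The MDL comparison at the heart of the method can be illustrated with a toy encoding-cost calculation: if encoding an ancestor plus per-sequence mutation lists costs fewer bits than encoding the sequences directly, common ancestry is supported. The cost model here (2 bits per base, fixed-width mutation positions) is a deliberately crude assumption for illustration.

```python
import math

def direct_cost(seqs):
    """Bits to encode each DNA sequence independently (2 bits per base)."""
    return sum(2 * len(s) for s in seqs)

def mutations(a, b):
    """Positions where two equal-length sequences differ."""
    return [i for i, (x, y) in enumerate(zip(a, b)) if x != y]

def tree_cost(ancestor, seqs):
    """Bits for the ancestor plus, per sequence, a mutation count,
    mutation positions, and the substituted bases."""
    n = len(ancestor)
    pos_bits = math.ceil(math.log2(n))
    cost = 2 * n
    for s in seqs:
        muts = mutations(ancestor, s)
        cost += pos_bits + pos_bits * len(muts) + 2 * len(muts)
    return cost

ancestor = "ACGTACGTACGTACGTACGT"
# Toy descendants differing from the ancestor by a few point mutations
seqs = ["ACGTACGTACGAACGTACGT",
        "ACGTTCGTACGTACGTACGT",
        "ACGAACGTACGTACGTCCGT"]
saving = direct_cost(seqs) - tree_cost(ancestor, seqs)
```

    A positive `saving` is exactly the MDL-style evidence for an evolutionary relationship; the heuristic search in the paper looks for the tree that maximizes it.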

  1. Reconstructing the Antikythera Mechanism

    NASA Astrophysics Data System (ADS)

    Freeth, Tony

    The Antikythera Mechanism is a geared astronomical calculating machine from ancient Greece. The extraordinary nature of this device has become even more apparent in recent years as a result of research under the aegis of the Antikythera Mechanism Research Project (AMRP) - an international collaboration of scientists, historians, museum staff, engineers, and imaging specialists. Though many questions still remain, we may now be close to reconstructing the complete machine. As a technological artifact, it is unique in the ancient world. Its brilliant design conception means that it is a landmark in the history of science and technology.

  2. Photometric Lunar Surface Reconstruction

    NASA Technical Reports Server (NTRS)

    Nefian, Ara V.; Alexandrov, Oleg; Morattlo, Zachary; Kim, Taemin; Beyer, Ross A.

    2013-01-01

    Accurate photometric reconstruction of the Lunar surface is important in the context of upcoming NASA robotic missions to the Moon and in giving a more accurate understanding of the Lunar soil composition. This paper describes a novel approach for joint estimation of Lunar albedo, camera exposure time, and photometric parameters that utilizes an accurate Lunar-Lambertian reflectance model and previously derived Lunar topography of the area visualized during the Apollo missions. The method introduced here is used in creating the largest Lunar albedo map (16% of the Lunar surface) at the resolution of 10 meters/pixel.

  3. Waveform reconstruction for an ultrasonic fiber Bragg grating sensor demodulated by an erbium fiber laser.

    PubMed

    Wu, Qi; Okabe, Yoji

    2015-02-01

    Fiber Bragg grating (FBG) demodulated by an erbium fiber laser (EFL) has been used for ultrasonic detection recently. However, due to the inherent relaxation oscillation (RO) of the EFL, the detected ultrasonic signals have large deformations, especially in the low-frequency range. We proposed a novel data processing method to reconstruct an actual ultrasonic waveform. The noise spectrum was smoothed first; the actual ultrasonic spectrum was then obtained by deconvolution in order to mitigate the influence of the RO of the EFL. We proved by experiment that this waveform reconstruction method has high precision, and demonstrated that the FBG sensor demodulated by the EFL will have large practical applications in nondestructive testing.
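    The deconvolution step described (divide out the measured response in the frequency domain, regularized against noise) can be sketched with a Wiener filter. The waveform, the stand-in relaxation-oscillation response, and all parameters below are illustrative assumptions, not the authors' data.

```python
import numpy as np

def wiener_deconvolve(measured, impulse_response, noise_power=1e-3):
    """Frequency-domain Wiener deconvolution: X = H* Y / (|H|^2 + N)."""
    n = len(measured)
    H = np.fft.rfft(impulse_response, n)
    Y = np.fft.rfft(measured, n)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + noise_power)
    return np.fft.irfft(X, n)

# Toy demo: a Gaussian-windowed ultrasonic tone burst convolved with a
# decaying low-frequency oscillation (stand-in for the laser's RO response)
fs = 1_000_000.0
t = np.arange(2048) / fs
true_wave = np.exp(-((t - 4e-4) / 8e-5) ** 2) * np.sin(2 * np.pi * 2e5 * t)
ro = np.exp(-t / 2e-4) * np.cos(2 * np.pi * 5e4 * t)
measured = np.convolve(true_wave, ro)[: t.size]
recovered = wiener_deconvolve(measured, ro)
```

    The `noise_power` term plays the role of the smoothed noise spectrum in the paper's method: it suppresses frequencies where the response has little energy instead of amplifying noise there.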

  4. Biomaterials for craniofacial reconstruction

    PubMed Central

    Neumann, Andreas; Kevenhoerster, Kevin

    2011-01-01

    Biomaterials for the reconstruction of bony defects of the skull comprise osteosynthetic materials applied after osteotomies or traumatic fractures, and materials to fill bony defects resulting from malformation, trauma or tumor resections. Other applications concern functional augmentations for dental implants or aesthetic augmentations in the facial region. For osteosynthesis, mini- and microplates made from titanium alloys provide major advantages in terms of biocompatibility, stability and individual fitting to the implant bed. The necessity of removing asymptomatic plates and screws after fracture healing is still a controversial issue: the risks and costs of secondary removal surgery must be weighed against the low rate of complications (due to corrosion products) when the material remains in situ. Resorbable osteosynthesis systems have similar mechanical stability and are especially useful in the growing skull. The huge variety of biomaterials for the reconstruction of bony defects makes it difficult to decide which material is adequate for which indication and site. An optimal biomaterial that meets every requirement (e.g. biocompatibility, stability, intraoperative fitting, product safety, low costs) does not exist. The different material types are (autogenous) bone and many alloplastics such as metals (mainly titanium), ceramics, plastics and composites. Future developments aim to improve physical and biological properties, especially regarding surface interactions. To date, tissue-engineered bone is far from routine clinical application. PMID:22073101

  5. Parallel ptychographic reconstruction

    PubMed Central

    Nashed, Youssef S. G.; Vine, David J.; Peterka, Tom; Deng, Junjing; Ross, Rob; Jacobsen, Chris

    2014-01-01

    Ptychography is an imaging method whereby a coherent beam is scanned across an object, and an image is obtained by iterative phasing of the set of diffraction patterns. It can be used to image extended objects at a resolution limited by the scattering strength of the object and the detector geometry, rather than by an optics-imposed limit. As technical advances allow larger fields to be imaged, computational challenges arise for reconstructing the correspondingly larger data volumes, yet at the same time there is also a need to deliver reconstructed images immediately so that one can evaluate the next steps to take in an experiment. Here we present a parallel method for real-time ptychographic phase retrieval. It uses a hybrid parallel strategy to divide the computation between multiple graphics processing units (GPUs) and then employs novel techniques to merge sub-datasets into a single complex phase and amplitude image. Results are shown on a simulated specimen and a real dataset from an X-ray experiment conducted at a synchrotron light source. PMID:25607174

  6. Synchronized dynamic dose reconstruction

    SciTech Connect

    Litzenberg, Dale W.; Hadley, Scott W.; Tyagi, Neelam; Balter, James M.; Ten Haken, Randall K.; Chetty, Indrin J.

    2007-01-15

    Variations in target volume position between and during treatment fractions can lead to measurable differences in the dose distribution delivered to each patient. Current methods to estimate the ongoing cumulative delivered dose distribution make idealized assumptions about individual patient motion based on average motions observed in a population of patients. In the delivery of intensity modulated radiation therapy (IMRT) with a multi-leaf collimator (MLC), errors are introduced in both the implementation and delivery processes. In addition, target motion and MLC motion can lead to dosimetric errors from interplay effects. All of these effects may be of clinical importance. Here we present a method to compute delivered dose distributions for each treatment beam and fraction, which explicitly incorporates synchronized real-time patient motion data and real-time fluence and machine configuration data. This synchronized dynamic dose reconstruction method properly accounts for the two primary classes of errors that arise from delivering IMRT with an MLC: (a) Interplay errors between target volume motion and MLC motion, and (b) Implementation errors, such as dropped segments, dose over/under shoot, faulty leaf motors, tongue-and-groove effect, rounded leaf ends, and communications delays. These reconstructed dose fractions can then be combined to produce high-quality determinations of the dose distribution actually received to date, from which individualized adaptive treatment strategies can be determined.

  7. Stereoscopic liver surface reconstruction

    PubMed Central

    Karwan, Adam; Rudnicki, Jerzy; Wróblewski, Tadeusz

    2012-01-01

    The paper presents a practical approach to measuring liver motion, both respiratory and induced by a guided laparoscopic tool, in the operating room. The presented method is based on standard operating room equipment, i.e. rigid laparoscopic cameras and a single incision laparoscopic surgery trocar. The triangulation algorithm is used and stereo correspondence points are marked manually by two independent experts. To calibrate the cameras, two perpendicular chessboards, a pinhole camera model and a Tsai algorithm are used. The data set consists of twelve real liver surgery video sequences, ten open surgery and two laparoscopic, gathered from different patients. The setup equipment and methodology are presented. The proposed evaluation method is based on both the reconstructed calibration points of the chessboards and measurements made by the Polaris Vicra tracking system, which serves as a reference. In the analysis stage we focused on two specific goals: measuring respiratory and laparoscopic-tool-guided liver motions. We have presented separate examples for the left and right liver lobes. It is possible to reconstruct liver motion using the SILS trocar. Our approach was made without additional position or movement sensors. Diffusion of cameras and laser for distance measurement seems to be less practical for in vivo laparoscopic data, but we do not exclude exploring such sensors in further research. PMID:23256023

  8. Reconstruction in Fourier space

    NASA Astrophysics Data System (ADS)

    Burden, A.; Percival, W. J.; Howlett, C.

    2015-10-01

    We present a fast iterative fast Fourier transform (FFT) based reconstruction algorithm that allows for non-parallel redshift-space distortions (RSDs). We test our algorithm on both N-body dark matter simulations and mock distributions of galaxies designed to replicate galaxy survey conditions. We compare solenoidal and irrotational components of the redshift distortion and show that an approximation of this distortion leads to a better estimate of the real-space potential (and therefore faster convergence) than ignoring the RSD when estimating the displacement field. Our iterative reconstruction scheme converges in two iterations for the mock samples corresponding to Baryon Oscillation Spectroscopic Survey CMASS Data Release 11 when we start with an approximation of the RSD. The scheme takes six iterations when the initial estimate, measured from the redshift-space overdensity, has no RSD correction. Slower convergence would be expected for surveys covering a larger angle on the sky. We show that this FFT based method provides a better estimate of the real-space displacement field than a configuration space method that uses finite difference routines to compute the potential for the same grid resolution. Finally, we show that a lognormal transform of the overdensity, used as a proxy for the linear overdensity, is beneficial in estimating the full displacement field from a dense sample of tracers. However, the lognormal transform of the overdensity does not perform well when estimating the displacements from sparser simulations with a more realistic galaxy density.
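    The displacement-field estimate at the core of such FFT-based reconstruction solves ∇²φ = δ in Fourier space and takes ψ = -∇φ. Ignoring the RSD correction discussed in the abstract, a minimal sketch (with an analytic plane-wave check) is:

```python
import numpy as np

def displacement_field(delta, boxsize=1.0):
    """Zel'dovich-style displacement psi = -grad(phi) with lap(phi) = delta,
    solved spectrally: psi_k = i k delta_k / k^2."""
    n = delta.shape[0]
    k1 = 2.0 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
    kx, ky, kz = np.meshgrid(k1, k1, k1, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0            # avoid 0/0; the DC mode carries no displacement
    dk = np.fft.fftn(delta)
    psi = []
    for kc in (kx, ky, kz):
        pk = 1j * kc * dk / k2
        pk[0, 0, 0] = 0.0
        psi.append(np.real(np.fft.ifftn(pk)))
    return psi

# Plane-wave check: delta = sin(2 pi x)  =>  psi_x = cos(2 pi x) / (2 pi)
n = 16
x = np.arange(n) / n
delta = np.sin(2 * np.pi * x)[:, None, None] * np.ones((n, n, n))
psi = displacement_field(delta)
expected = np.cos(2 * np.pi * x)[:, None, None] / (2 * np.pi)
```

    The paper's point is that this spectral solve gives a more accurate potential than finite-difference routines at the same grid resolution, and that iterating it with an RSD model converges in a few passes.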

  9. Reconstructing Star Formation Histories of Galaxies

    NASA Astrophysics Data System (ADS)

    Fritze-v. Alvensleben, U.; Lilly, T.

    2007-12-01

    We present a methodological study to find out how far back and to what precision star formation histories of galaxies can be reconstructed from CMDs, from integrated spectra and Lick indices, and from integrated multi-band photometry. Our evolutionary synthesis models GALEV allow to describe the evolution of galaxies in terms of all three approaches and we have assumed typical observational uncertainties for each of them and then investigated to what extent and accuracy different star formation histories can be discriminated. For a field in the LMC bar region with both a deep CMD from HST observations and a trailing slit spectrum across exactly the same field of view we could test our modelling results against real data.

  10. Calibration requirements for detecting the 21 cm epoch of reionization power spectrum and implications for the SKA

    NASA Astrophysics Data System (ADS)

    Barry, N.; Hazelton, B.; Sullivan, I.; Morales, M. F.; Pober, J. C.

    2016-09-01

    21 cm epoch of reionization (EoR) observations promise to transform our understanding of galaxy formation, but these observations are impossible without unprecedented levels of instrument calibration. We present end-to-end simulations of a full EoR power spectrum (PS) analysis including all of the major components of a real data processing pipeline: models of astrophysical foregrounds and EoR signal, frequency-dependent instrument effects, sky-based antenna calibration, and the full PS analysis. This study reveals that traditional sky-based per-frequency antenna calibration can only be implemented in EoR measurement analyses if the calibration model is unrealistically accurate. For reasonable levels of catalogue completeness, the calibration introduces contamination in otherwise foreground-free PS modes, precluding a PS measurement. We explore the origin of this contamination and potential mitigation techniques. We show that there is a strong joint constraint on the precision of the calibration catalogue and the inherent spectral smoothness of antennas, and that this has significant implications for the instrumental design of the SKA (Square Kilometre Array) and other future EoR observatories.
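    The sky-based per-antenna calibration being stress-tested here can be illustrated with a toy alternating-least-squares gain solve (a StEFCal-style iteration) for a single frequency, using an exactly known sky model and no noise. Everything below (antenna count, gains, model) is an illustrative assumption; the paper's finding is precisely that real catalogues are never this complete.

```python
import numpy as np

rng = np.random.default_rng(3)
nant = 8
true_g = np.exp(1j * rng.uniform(-0.5, 0.5, nant)) * rng.uniform(0.8, 1.2, nant)

# Hermitian model visibilities M_ij from a (perfectly known) sky model
M = rng.standard_normal((nant, nant)) + 1j * rng.standard_normal((nant, nant))
M = (M + M.conj().T) / 2
np.fill_diagonal(M, 0)

# Observed visibilities: V_ij = g_i g_j* M_ij
V = np.outer(true_g, true_g.conj()) * M

# Alternating per-antenna least-squares update with damping
g = np.ones(nant, dtype=complex)
for _ in range(100):
    w = g[None, :].conj() * M                  # w[i, j] = g_j^* M_ij
    g_new = (V * w.conj()).sum(axis=1) / (np.abs(w) ** 2).sum(axis=1)
    g = 0.5 * (g + g_new)

# Remove the overall phase degeneracy before comparing with the truth
g *= np.exp(1j * (np.angle(true_g[0]) - np.angle(g[0])))
```

    With a complete, noiseless model the solve recovers the gains up to the usual global phase; the contamination the paper studies appears when `M` is built from an incomplete catalogue.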

  11. Exercises in PET Image Reconstruction

    NASA Astrophysics Data System (ADS)

    Nix, Oliver

    These exercises are complementary to the theoretical lectures about positron emission tomography (PET) image reconstruction. They aim at providing some hands-on experience in PET image reconstruction and focus on demonstrating the different data preprocessing steps and reconstruction algorithms needed to obtain high-quality PET images. Normalisation and geometric, attenuation and scatter corrections are introduced. To explain the necessity of these corrections, some basics about PET scanner hardware, data acquisition and organisation are reviewed. During the course the students use a software application based on the STIR (software for tomographic image reconstruction) library 1,2 which allows them to dynamically select or deselect corrections and reconstruction methods as well as to modify their most important parameters. Following the guided tutorial, the students get an impression of the effect the individual data precorrections have on image quality and what happens if they are omitted. Several data sets in sinogram format are provided, such as line source data, Jaszczak phantom data sets with high and low statistics and NEMA whole body phantom data. The two most frequently used reconstruction algorithms in PET image reconstruction, filtered back projection (FBP) and the iterative OSEM (ordered subset expectation maximization) approach, are used to reconstruct images. The exercises should help the students gain an understanding of what causes inferior image quality and artefacts, and how to improve quality through a careful choice of reconstruction parameters.

  12. Hybrid spread spectrum radio system

    DOEpatents

    Smith, Stephen F. (London, TN); Dress, William B. (Camas, WA)

    2010-02-09

    Systems and methods are described for hybrid spread spectrum radio systems. One method includes receiving a hybrid spread spectrum signal and applying both fast frequency-hopping demodulation and direct-sequence demodulation to the direct sequence spread spectrum signal, wherein multiple frequency hops occur within a single data-bit time and each bit is represented by chip transmissions at multiple frequencies.
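    The claimed scheme, chips of each data bit spread by a direct-sequence code and transmitted on several hop frequencies within one bit time, can be sketched in a baseband simulation. The chip counts, hop plan, and noise level below are illustrative assumptions, not values from the patent.

```python
import numpy as np

rng = np.random.default_rng(7)
chips_per_bit = 8
samples_per_chip = 16
pn = rng.choice([-1.0, 1.0], size=chips_per_bit)       # direct-sequence code (shared key)
hops = rng.uniform(0.05, 0.45, size=chips_per_bit)     # hop frequency per chip (shared plan)

def modulate(bits):
    """Each bit -> PN-spread chips, each chip carried on its own hop frequency."""
    out = []
    for b in bits:
        for c in range(chips_per_bit):
            n = np.arange(samples_per_chip)
            carrier = np.cos(2 * np.pi * hops[c] * n)
            out.append((1.0 if b else -1.0) * pn[c] * carrier)
    return np.concatenate(out)

def demodulate(signal, nbits):
    """De-hop, despread, and integrate each bit; the sign of the sum decides."""
    decisions = []
    idx = 0
    for _ in range(nbits):
        acc = 0.0
        for c in range(chips_per_bit):
            n = np.arange(samples_per_chip)
            seg = signal[idx: idx + samples_per_chip]
            acc += pn[c] * np.sum(seg * np.cos(2 * np.pi * hops[c] * n))
            idx += samples_per_chip
        decisions.append(1 if acc > 0 else 0)
    return decisions

bits = [1, 0, 1, 1, 0]
noisy = modulate(bits) + 0.5 * rng.standard_normal(
    len(bits) * chips_per_bit * samples_per_chip)
```

    Because each bit's energy is integrated across several chips and hop frequencies, the bit decisions survive noise that would corrupt any single chip, which is the robustness argument for the hybrid FH/DS combination.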

  13. Acoustooptical spectrum analysis modeling

    NASA Astrophysics Data System (ADS)

    Carmody, M. J.

    1981-06-01

    A summary of Bragg deflection theory and various approaches to direct detection acoustooptic spectrum analysis (AOSA) modeling is presented. A suitable model is chosen and extended to include the effects of diffraction efficiency, transducer efficiency, irradiance profiles of incident laser illumination, aperture size of the Bragg cell, and the acoustic attenuation experienced by the acoustic wavetrain generated by the input r-f signal. A FORTRAN program is developed to model the AOSA and predict the output image plane intensity profiles. A second version of the program includes a time variable permitting dynamic simulation of the system response.

  14. Energy spectrum and transport in narrow HgTe quantum wells

    SciTech Connect

    Germanenko, A. V.; Minkov, G. M.; Rut, O. E.; Sherstobitov, A. A.; Dvoretsky, S. A.; Mikhailov, N. N.

    2015-01-15

    The results of an experimental study of the transport phenomena and the hole energy spectrum of two-dimensional systems in the quantum well of HgTe zero-gap semiconductor with normal arrangement of quantum-confinement subbands are presented. An analysis of the experimental data allows us to reconstruct the carrier energy spectrum near the hole subband extrema. The results are interpreted using the standard k·p model.

  15. Fractal reconstruction of rough membrane surface related with membrane fouling in a membrane bioreactor.

    PubMed

    Zhang, Meijia; Chen, Jianrong; Ma, Yuanjun; Shen, Liguo; He, Yiming; Lin, Hongjun

    2016-09-01

    In this paper, fractal reconstruction of a rough membrane surface with a modified Weierstrass-Mandelbrot (WM) function was conducted. The topography of the rough membrane surface was measured by atomic force microscopy (AFM), and the results showed that the membrane surface was isotropic. Accordingly, the fractal dimension and roughness of the membrane surface were calculated by the power spectrum method. The rough membrane surface was reconstructed on the MATLAB platform with the parameter values acquired from the raw AFM data. The reconstructed membrane was very similar to the real membrane morphology measured by AFM. The parameters (including average roughness and root mean square (RMS) roughness) associated with membrane morphology were calculated for the model and the real membrane, and a good match of roughness parameters between the reconstructed surface and the real membrane was found, indicating the feasibility of the newly developed method. The reconstructed membrane surface can potentially be used for interaction energy evaluation. PMID:27318159
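    A surface generator of the kind described, plus the two roughness parameters mentioned (average roughness Ra, RMS roughness Rq), might look like the sketch below. The particular 2-D form, its parameters, and the isotropizing sum over orientations are simplifications of a modified WM function, not the authors' exact model.

```python
import numpy as np

def wm_surface(n=128, D=2.4, gamma=1.5, M=10, nmax=10, seed=0):
    """Simplified 2-D Weierstrass-Mandelbrot-type surface: a superposition of
    ridges at M orientations and geometrically spaced spatial frequencies
    gamma**k, with amplitudes gamma**((D - 3) * k) set by fractal dimension D."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n)
    X, Y = np.meshgrid(x, x, indexing="ij")
    z = np.zeros((n, n))
    for m in range(M):
        angle = np.pi * m / M
        u = X * np.cos(angle) + Y * np.sin(angle)
        for k in range(nmax):
            phase = rng.uniform(0.0, 2.0 * np.pi)
            z += gamma ** ((D - 3) * k) * np.cos(2 * np.pi * gamma ** k * u + phase)
    return z

def roughness(z):
    """Average roughness Ra and RMS roughness Rq relative to the mean plane."""
    d = z - z.mean()
    return np.abs(d).mean(), np.sqrt((d ** 2).mean())

z = wm_surface()
ra, rq = roughness(z)
```

    Fitting `D` and the amplitude scale to AFM data, as the paper does via the power spectrum, makes the synthetic surface's Ra and Rq match the measured membrane.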

  17. Metrological digital audio reconstruction

    DOEpatents

    Fadeyev, Vitaliy; Haber, Carl

    2004-02-19

    Audio information stored in the undulations of grooves in a medium such as a phonograph record may be reconstructed, with little or no contact, by measuring the groove shape using precision metrology methods coupled with digital image processing and numerical analysis. The effects of damage, wear, and contamination may be compensated, in many cases, through image processing and analysis methods. The speed and data handling capacity of available computing hardware make this approach practical. Two examples used a general purpose optical metrology system to study a 50 year old 78 r.p.m. phonograph record and a commercial confocal scanning probe to study a 1920's celluloid Edison cylinder. Comparisons are presented with stylus playback of the samples and with a digitally re-mastered version of an original magnetic recording. There is also a more extensive implementation of this approach, with dedicated hardware and software.
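    One step of such a reconstruction, turning measured lateral groove displacement into an audio signal, can be sketched by differentiating the groove profile to obtain stylus velocity (which is what a magnetic pickup responds to). The scan rate and undulation below are toy values, not measurements from the cited samples.

```python
import numpy as np

def groove_to_audio(displacement, scan_rate):
    """Convert lateral groove displacement samples (e.g. from optical
    metrology along the groove) to a velocity signal via numerical
    differentiation; scan_rate is in samples per second of playback time."""
    return np.gradient(displacement) * scan_rate

# Toy demo: a 1 kHz lateral undulation "sampled along the groove" at 48 kHz
fs = 48_000
t = np.arange(4800) / fs
disp = 1e-5 * np.sin(2 * np.pi * 1000 * t)    # 10-micron undulation
audio = groove_to_audio(disp, fs)
peak = float(np.max(np.abs(audio)))           # analytic peak: 2*pi*1000*1e-5
```

    The paper's harder problems, compensating damage, wear, and contamination, happen before this step, as image processing on the measured groove shape.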

  18. Autistic spectrum disorders.

    PubMed

    Singhania, Rajeshree

    2005-04-01

    Autistic spectrum disorder is a complex developmental disorder with social and communication dysfunction at its core. It has a wide clinical spectrum with a common triad of impairments: social communication, social interaction and social imagination. Even mild or subtle difficulties can have a profound and devastating impact on the child. To provide suitable treatments and interventions, the distinctive way in which autistic children think and learn has to be understood. The core areas of social, emotional, communication and language deficits have to be addressed at all levels of functioning. The important goals of assessment include a categorical diagnosis of autism that considers the differential diagnosis, precise documentation of the child's functioning in various developmental domains, and ascertaining the presence of co-morbid conditions. The interventions have to be adapted to the individual's chronological age, developmental phase and level of functioning. The strategies for delivering a curriculum and teaching a child with autism are distinctive, and include providing structure to increase predictability and strategies to reduce the arousal of anxiety.

  19. Reconstructing the Alcatraz escape

    NASA Astrophysics Data System (ADS)

    Baart, F.; Hoes, O.; Hut, R.; Donchyts, G.; van Leeuwen, E.

    2014-12-01

In the night of June 12, 1962, three inmates used a raft made of raincoats to escape the ultimate maximum-security prison island, Alcatraz, in San Francisco, United States. History is unclear about what happened to the escapees. At what time did they step into the water? Did they survive, and if so, where did they reach land? The fate of the escapees has been the subject of much debate: did they make landfall on Angel Island, or did the current sweep them out of the bay and into the cold Pacific Ocean? In this presentation, we try to shed light on this historic case using a visualization of a high-resolution hydrodynamic simulation of the San Francisco Bay, combined with historical tidal records. By reconstructing the hydrodynamic conditions and using a particle-based simulation of the escapees, we show possible scenarios. The interactive model is visualized using both a 3D photorealistic and a web-based visualization. The "Escape from Alcatraz" scenario demonstrates the capabilities of the 3Di platform. This platform is normally used for overland flooding (1D/2D). The model engine uses a quad-tree structure, resulting in an order-of-magnitude speedup. The subgrid approach takes detailed bathymetry information into account. The inter-model variability is tested by comparing the results with the DFlow Flexible Mesh (DFlowFM) San Francisco Bay model. Interactivity is implemented by converting the models from static programs to interactive libraries, adhering to the Basic Model Interface (BMI). Interactive models are more suitable for answering exploratory research questions such as this reconstruction effort. Although these hydrodynamic simulations provide only circumstantial evidence for solving the mystery of what happened during the foggy dark night of June 12, 1962, they can be used as guidance and provide an interesting test case for applying interactive modelling.
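The particle-based scenario testing mentioned above can be sketched in miniature: release particles at a point and advect them through a time-varying current with explicit Euler steps. The synthetic semi-diurnal velocity field and the `advect` helper below are invented placeholders, not the 3Di or DFlowFM models.

```python
import numpy as np

def advect(positions, hours, dt=60.0):
    """Euler-step particles through an assumed, spatially uniform tidal
    current: an oscillating east-west component plus a steady drift."""
    pos = np.array(positions, dtype=float)   # (n, 2) positions in metres
    for step in range(int(hours * 3600 / dt)):
        t = step * dt
        tide = np.sin(2 * np.pi * t / (12.42 * 3600))   # semi-diurnal tide
        velocity = np.array([0.5 * tide, 0.1])          # m/s, assumed field
        pos += velocity * dt
    return pos

start = [[0.0, 0.0]] * 3     # three escapees at the same release point
end = advect(start, hours=6)
```

In a real reconstruction, `velocity` would be interpolated from the hydrodynamic model's gridded output at each particle position and time.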

  20. Reconstruction of an ablated breast.

    PubMed

    Scarfì, A; Ordemann, K; Hüter, J

    1986-01-01

The aim of reconstructing an ablated breast is to restore the woman's physical integrity. The technique of this operation, according to Bomert, is the sliding of a skin flap in the case of a horizontal breast scar. For the reconstruction, a silicone prosthesis is implanted, in most cases prepectorally.

  1. Reconstruction of Skull Base Defects.

    PubMed

    Klatt-Cromwell, Cristine N; Thorp, Brian D; Del Signore, Anthony G; Ebert, Charles S; Ewend, Matthew G; Zanation, Adam M

    2016-02-01

Endoscopic endonasal skull base surgery has dramatically changed and expanded over recent years due to significant advancements in instrumentation, techniques, and anatomic understanding. With these advances, the need for more robust skull base reconstructive techniques was vital. In this article, reconstructive options ranging from acellular grafts to vascular flaps are described, including the strengths, weaknesses, and common uses.

  2. Predicting success in ACL reconstruction.

    PubMed

    Shalvoy, Robert M

    2014-11-03

Anterior cruciate ligament (ACL) injury and ACL reconstruction are common in the United States. However, when compared with the standards of other orthopedic procedures today, ACL reconstruction is not predictably successful in restoring patients to their pre-injury state. Only 60-70% of reconstructed patients resume their previous level of activity, and many patients experience some degree of osteoarthritis. The reasons for such limitations are many. A recent renewal of interest in the many variables affecting ACL reconstruction, and in the varying needs of patients with ACL injury, holds promise for improving success today as well as ultimately providing a normal knee for patients after ACL reconstruction.

  3. Nasal Reconstruction: Extending the Limits

    PubMed Central

    Corsten, Marcus; Haack, Sebastian; Gubisch, Wolfgang M.; Fischer, Helmut

    2016-01-01

    Summary: Reconstructing the 3-dimensional structure of the nose requires the maintenance of its aesthetic form and function. Restoration of the correct dimension, projection, skin quality, symmetrical contour, and function remains problematic. Consequently, modern approaches of nasal reconstruction aim at rebuilding the units rather than just covering the defect. However, revising or redoing a failed or insufficient reconstruction remains very challenging and requires experience and creativity. Here, we present a very particular case with a male patient, who underwent 37 operations elsewhere and presented with a failed nasal reconstruction. We describe and illustrate the complex steps of the nasal rereconstruction, including the reconstruction of the forehead donor site, surgical delay procedures for lining, and the coverage with a third paramedian forehead flap. PMID:27536483

  4. Nasal Reconstruction: Extending the Limits.

    PubMed

    Rezaeian, Farid; Corsten, Marcus; Haack, Sebastian; Gubisch, Wolfgang M; Fischer, Helmut

    2016-07-01

    Reconstructing the 3-dimensional structure of the nose requires the maintenance of its aesthetic form and function. Restoration of the correct dimension, projection, skin quality, symmetrical contour, and function remains problematic. Consequently, modern approaches of nasal reconstruction aim at rebuilding the units rather than just covering the defect. However, revising or redoing a failed or insufficient reconstruction remains very challenging and requires experience and creativity. Here, we present a very particular case with a male patient, who underwent 37 operations elsewhere and presented with a failed nasal reconstruction. We describe and illustrate the complex steps of the nasal rereconstruction, including the reconstruction of the forehead donor site, surgical delay procedures for lining, and the coverage with a third paramedian forehead flap. PMID:27536483

  5. Convex accelerated maximum entropy reconstruction

    NASA Astrophysics Data System (ADS)

    Worley, Bradley

    2016-04-01

    Maximum entropy (MaxEnt) spectral reconstruction methods provide a powerful framework for spectral estimation of nonuniformly sampled datasets. Many methods exist within this framework, usually defined based on the magnitude of a Lagrange multiplier in the MaxEnt objective function. An algorithm is presented here that utilizes accelerated first-order convex optimization techniques to rapidly and reliably reconstruct nonuniformly sampled NMR datasets using the principle of maximum entropy. This algorithm - called CAMERA for Convex Accelerated Maximum Entropy Reconstruction Algorithm - is a new approach to spectral reconstruction that exhibits fast, tunable convergence in both constant-aim and constant-lambda modes. A high-performance, open source NMR data processing tool is described that implements CAMERA, and brief comparisons to existing reconstruction methods are made on several example spectra.
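The accelerated first-order approach can be illustrated with a toy sketch: Nesterov-accelerated projected gradient descent on a least-squares data-fidelity term plus a negative-entropy penalty (constant-lambda mode). The sampling scheme, penalty form, and all parameters below are illustrative assumptions, not the CAMERA implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "spectrum": two positive peaks on a 128-point grid.
n = 128
true_spec = np.zeros(n)
true_spec[30], true_spec[90] = 5.0, 3.0

# Nonuniform sampling: keep 64 of the 128 "time-domain" (FID) points.
fid = np.fft.ifft(true_spec)
mask = np.zeros(n, dtype=bool)
mask[rng.choice(n, 64, replace=False)] = True
data = fid[mask]

lam, step, eps = 1e-3, 1.0, 1e-8   # constant-lambda mode

def grad(f):
    """Gradient of 0.5*||sample(ifft(f)) - data||^2 + lam*sum(f log f - f)."""
    resid = np.zeros(n, dtype=complex)
    resid[mask] = np.fft.ifft(f)[mask] - data
    return np.real(np.fft.fft(resid)) / n + lam * np.log(f + eps)

# Nesterov-accelerated projected gradient descent (positivity projection).
f = np.full(n, 1e-3)
z, t = f.copy(), 1.0
for _ in range(3000):
    f_new = np.maximum(z - step * grad(z), 0.0)
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    z = np.maximum(f_new + ((t - 1.0) / t_new) * (f_new - f), 0.0)
    f, t = f_new, t_new
```

The momentum sequence `t` is the standard FISTA recursion; the entropy gradient `lam * log(f)` drives the background toward the maximum-entropy fixed point while the data term pins the sampled FID points.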

  6. Reconstructive challenges in war wounds

    PubMed Central

    Bhandari, Prem Singh; Maurya, Sanjay; Mukherjee, Mrinal Kanti

    2012-01-01

War wounds are devastating, with extensive soft tissue and osseous destruction and heavy contamination. War casualties generally reach the reconstructive surgery centre after a delay, owing to additional injuries to vital organs. This delay in transfer to a tertiary care centre is responsible for progressive deterioration of the wound. In the prevailing circumstances, a majority of war wounds undergo delayed reconstruction after a series of debridements. In recent military conflicts, hydrosurgery jet debridement and negative-pressure wound therapy have been used successfully in the preparation of war wounds. In war injuries, because of the heavy casualty load, a fast and reliable method of reconstruction is sought. Pedicled flaps provide rapid and reliable cover in extremity wounds. Large complex defects can be reconstructed using microvascular free flaps in a single stage. This article highlights the peculiarities of, and the challenges encountered in, the reconstruction of these ghastly wounds. PMID:23162233

  7. Evidence-Based ACL Reconstruction

    PubMed Central

    Rodriguez-Merchan, E. Carlos

    2015-01-01

There is controversy in the literature regarding a number of topics related to anterior cruciate ligament (ACL) reconstruction. The purpose of this article is to answer the following questions: 1) Bone-patellar tendon-bone reconstruction (BPTB-R) or hamstring reconstruction (H-R); 2) Double bundle or single bundle; 3) Allograft or autograft; 4) Early or late reconstruction; 5) Rate of return to sports after ACL reconstruction; 6) Rate of osteoarthritis after ACL reconstruction. A Cochrane Library and PubMed (MEDLINE) search of systematic reviews and meta-analyses related to ACL reconstruction was performed. The key words were: ACL reconstruction, systematic reviews and meta-analysis. The main criteria for selection were that the articles were systematic reviews and meta-analyses focused on the aforementioned questions. Sixty-nine articles were found, but only 26 were selected and reviewed because they had a high grade (I-II) of evidence. BPTB-R was associated with better postoperative knee stability but with a higher rate of morbidity. However, the results of both procedures in terms of long-term functional outcome were similar. The double-bundle ACL reconstruction technique showed better outcomes in rotational laxity, although functional recovery was similar between single-bundle and double-bundle reconstruction. Autograft yielded better results than allograft. There was no difference between early and delayed reconstruction. 82% of patients were able to return to some kind of sport participation. 28% of patients presented radiological signs of osteoarthritis with a follow-up of minimum 10 years. PMID:25692162

  8. Blob-enhanced reconstruction technique

    NASA Astrophysics Data System (ADS)

    Castrillo, Giusy; Cafiero, Gioacchino; Discetti, Stefano; Astarita, Tommaso

    2016-09-01

    A method to enhance the quality of the tomographic reconstruction and, consequently, the 3D velocity measurement accuracy, is presented. The technique is based on integrating information on the objects to be reconstructed within the algebraic reconstruction process. A first guess intensity distribution is produced with a standard algebraic method, then the distribution is rebuilt as a sum of Gaussian blobs, based on location, intensity and size of agglomerates of light intensity surrounding local maxima. The blobs substitution regularizes the particle shape allowing a reduction of the particles discretization errors and of their elongation in the depth direction. The performances of the blob-enhanced reconstruction technique (BERT) are assessed with a 3D synthetic experiment. The results have been compared with those obtained by applying the standard camera simultaneous multiplicative reconstruction technique (CSMART) to the same volume. Several blob-enhanced reconstruction processes, both substituting the blobs at the end of the CSMART algorithm and during the iterations (i.e. using the blob-enhanced reconstruction as predictor for the following iterations), have been tested. The results confirm the enhancement in the velocity measurements accuracy, demonstrating a reduction of the bias error due to the ghost particles. The improvement is more remarkable at the largest tested seeding densities. Additionally, using the blobs distributions as a predictor enables further improvement of the convergence of the reconstruction algorithm, with the improvement being more considerable when substituting the blobs more than once during the process. The BERT process is also applied to multi resolution (MR) CSMART reconstructions, permitting simultaneously to achieve remarkable improvements in the flow field measurements and to benefit from the reduction in computational time due to the MR approach. 
Finally, BERT is also tested on experimental data, obtaining an increase of the
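The blob-substitution step described above can be sketched in miniature: detect agglomerates of intensity around local maxima and rebuild the field as a sum of Gaussian blobs. Fixing the blob width is a simplification (the paper sizes each blob from the agglomerate itself), and `blob_rebuild` is a hypothetical helper, not the authors' code.

```python
import numpy as np

def blob_rebuild(field, sigma=1.0, thresh=0.1):
    """Rebuild an intensity field as a sum of Gaussian blobs centred on
    local maxima above a threshold (fixed-width simplification)."""
    ny, nx = field.shape
    rebuilt = np.zeros_like(field)
    yy, xx = np.mgrid[0:ny, 0:nx]
    for y in range(1, ny - 1):
        for x in range(1, nx - 1):
            patch = field[y - 1:y + 2, x - 1:x + 2]
            if field[y, x] >= thresh and field[y, x] == patch.max():
                # Amplitude from the peak; blob centred on the maximum.
                rebuilt += field[y, x] * np.exp(
                    -((xx - x) ** 2 + (yy - y) ** 2) / (2 * sigma ** 2))
    return rebuilt

# Synthetic "reconstruction": one particle elongated in one direction,
# plus low-level reconstruction noise below the blob threshold.
field = np.zeros((32, 32))
field[16, 16] = 1.0
field[16, 17] = 0.6   # elongation artifact
field += 0.02 * np.random.default_rng(1).random((32, 32))
clean = blob_rebuild(field)
```

The substitution regularizes the particle shape: the elongated tail at (16, 17) is absorbed into a single symmetric blob centred on the true maximum.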

  9. Revision Anterior Cruciate Ligament Reconstruction

    PubMed Central

    Wilde, Jeffrey; Bedi, Asheesh; Altchek, David W.

    2014-01-01

    Context: Reconstruction of the anterior cruciate ligament (ACL) is one of the most common surgical procedures, with more than 200,000 ACL tears occurring annually. Although primary ACL reconstruction is a successful operation, success rates still range from 75% to 97%. Consequently, several thousand revision ACL reconstructions are performed annually and are unfortunately associated with inferior clinical outcomes when compared with primary reconstructions. Evidence Acquisition: Data were obtained from peer-reviewed literature through a search of the PubMed database (1988-2013) as well as from textbook chapters and surgical technique papers. Study Design: Clinical review. Level of Evidence: Level 4. Results: The clinical outcomes after revision ACL reconstruction are largely based on level IV case series. Much of the existing literature is heterogenous with regard to patient populations, primary and revision surgical techniques, concomitant ligamentous injuries, and additional procedures performed at the time of the revision, which limits generalizability. Nevertheless, there is a general consensus that the outcomes for revision ACL reconstruction are inferior to primary reconstruction. Conclusion: Excellent results can be achieved with regard to graft stability, return to play, and functional knee instability but are generally inferior to primary ACL reconstruction. A staged approach with autograft reconstruction is recommended in any circumstance in which a single-stage approach results in suboptimal graft selection, tunnel position, graft fixation, or biological milieu for tendon-bone healing. Strength-of-Recommendation Taxonomy (SORT): Good results may still be achieved with regard to graft stability, return to play, and functional knee instability, but results are generally inferior to primary ACL reconstruction: Level B. PMID:25364483

  10. Evidence-Based ACL Reconstruction.

    PubMed

    Rodriguez-Merchan, E Carlos

    2015-01-01

There is controversy in the literature regarding a number of topics related to anterior cruciate ligament (ACL) reconstruction. The purpose of this article is to answer the following questions: 1) Bone-patellar tendon-bone reconstruction (BPTB-R) or hamstring reconstruction (H-R); 2) Double bundle or single bundle; 3) Allograft or autograft; 4) Early or late reconstruction; 5) Rate of return to sports after ACL reconstruction; 6) Rate of osteoarthritis after ACL reconstruction. A Cochrane Library and PubMed (MEDLINE) search of systematic reviews and meta-analyses related to ACL reconstruction was performed. The key words were: ACL reconstruction, systematic reviews and meta-analysis. The main criteria for selection were that the articles were systematic reviews and meta-analyses focused on the aforementioned questions. Sixty-nine articles were found, but only 26 were selected and reviewed because they had a high grade (I-II) of evidence. BPTB-R was associated with better postoperative knee stability but with a higher rate of morbidity. However, the results of both procedures in terms of long-term functional outcome were similar. The double-bundle ACL reconstruction technique showed better outcomes in rotational laxity, although functional recovery was similar between single-bundle and double-bundle reconstruction. Autograft yielded better results than allograft. There was no difference between early and delayed reconstruction. 82% of patients were able to return to some kind of sport participation. 28% of patients presented radiological signs of osteoarthritis with a follow-up of minimum 10 years. PMID:25692162

  11. Porcelain three-dimensional shape reconstruction and its color reconstruction

    NASA Astrophysics Data System (ADS)

    Yu, Xiaoyang; Wu, Haibin; Yang, Xue; Yu, Shuang; Wang, Beiyi; Chen, Deyun

    2013-01-01

In this paper, structured-light three-dimensional measurement technology was used to reconstruct the shape of porcelain, and furthermore its color was reconstructed, so that accurate reconstruction of both the shape and the color of porcelain was realized. Our shape-measurement installation drawing is given. Because the porcelain surface has complex color and is highly reflective, binary Gray code encoding is used to reduce the influence of the surface. A color camera was employed to obtain the color of the porcelain surface. Then, the comprehensive reconstruction of shape and color was realized in the Java3D runtime environment. In the reconstruction process, a space point-by-point coloration method is proposed and implemented. Our coloration method ensures pixel-level correspondence accuracy between the shape and color data. The experimental results of porcelain surface shape and color reconstruction, obtained with the proposed method and our installation, show that the depth range is 860 to 980 mm, the relative error of the shape measurement is less than 0.1%, and the reconstructed color of the porcelain surface is realistic, refined and subtle, with the same visual effect as the measured surface.
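The binary Gray code used for stripe indexing has the property that adjacent codes differ in exactly one bit, which limits decoding errors at stripe boundaries on difficult (glossy, strongly colored) surfaces. A standard binary-reflected encode/decode pair (not the authors' code) is:

```python
def gray_encode(n: int) -> int:
    # Binary-reflected Gray code: adjacent integers map to codes that
    # differ in a single bit.
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    # Invert the encoding by folding the bits back down.
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n
```

In a structured-light system, each projected pattern contributes one bit of the Gray code, and decoding the per-pixel bit sequence yields the stripe index for triangulation.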

  12. The marine diversity spectrum

    PubMed Central

    Reuman, Daniel C; Gislason, Henrik; Barnes, Carolyn; Mélin, Frédéric; Jennings, Simon

    2014-01-01

    Distributions of species body sizes within a taxonomic group, for example, mammals, are widely studied and important because they help illuminate the evolutionary processes that produced these distributions. Distributions of the sizes of species within an assemblage delineated by geography instead of taxonomy (all the species in a region regardless of clade) are much less studied but are equally important and will illuminate a different set of ecological and evolutionary processes. We develop and test a mechanistic model of how diversity varies with body mass in marine ecosystems. The model predicts the form of the ‘diversity spectrum’, which quantifies the distribution of species' asymptotic body masses, is a species analogue of the classic size spectrum of individuals, and which we have found to be a new and widely applicable description of diversity patterns. The marine diversity spectrum is predicted to be approximately linear across an asymptotic mass range spanning seven orders of magnitude. Slope −0·5 is predicted for the global marine diversity spectrum for all combined pelagic zones of continental shelf seas, and slopes for large regions are predicted to lie between −0·5 and −0·1. Slopes of −0·5 and −0·1 represent markedly different communities: a slope of −0·5 depicts a 10-fold reduction in diversity for every 100-fold increase in asymptotic mass; a slope of −0·1 depicts a 1·6-fold reduction. Steeper slopes are predicted for larger or colder regions, meaning fewer large species per small species for such regions. Predictions were largely validated by a global empirical analysis. Results explain for the first time a new and widespread phenomenon of biodiversity. Results have implications for estimating numbers of species of small asymptotic mass, where taxonomic inventories are far from complete. Results show that the relationship between diversity and body mass can be explained from the dependence of predation behaviour
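The slope arithmetic in the abstract can be checked directly: on a log-log diversity spectrum of slope s, a 100-fold increase in asymptotic mass changes species richness by the factor 100^s, i.e. a 100^(-s)-fold reduction.

```python
def fold_reduction(slope, mass_ratio=100.0):
    # Richness scales as (asymptotic mass)**slope, so a mass_ratio
    # increase in mass divides richness by mass_ratio**(-slope).
    return mass_ratio ** (-slope)
```

With slope -0.5 this gives the abstract's 10-fold reduction per 100-fold mass increase; with slope -0.1 it gives roughly a 1.6-fold reduction.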

  13. Neural reconstruction methods of restoring bladder function

    PubMed Central

    Gomez-Amaya, Sandra M.; Barbe, Mary F.; de Groat, William C.; Brown, Justin M.; Tuite, Gerald F.; Corcos, Jacques; Fecho, Susan B.; Braverman, Alan S.; Ruggieri, Michael R.

    2015-01-01

    During the past century, diverse studies have focused on the development of surgical strategies to restore function of a decentralized bladder after spinal cord or spinal root injury via repair of the original roots or by transferring new axonal sources. The techniques included end-to-end sacral root repairs, transfer of roots from other spinal segments to sacral roots, transfer of intercostal nerves to sacral roots, transfer of various somatic nerves to the pelvic or pudendal nerve, direct reinnervation of the detrusor muscle, or creation of an artificial reflex pathway between the skin and the bladder via the central nervous system. All of these surgical techniques have demonstrated specific strengths and limitations. The findings made to date already indicate appropriate patient populations for each procedure, but a comprehensive assessment of the effectiveness of each technique to restore urinary function after bladder decentralization is required to guide future research and potential clinical application. PMID:25666987

  14. Rotational spectrum of tryptophan.

    PubMed

Sanz, M Eugenia; Cabezas, Carlos; Mata, Santiago; Alonso, José L

    2014-05-28

The rotational spectrum of the natural amino acid tryptophan has been observed for the first time using a combination of laser ablation, molecular beams, and Fourier transform microwave spectroscopy. Independent analysis of the rotational spectra of individual conformers has led to a definitive identification of two different conformers of tryptophan, one of which had never been reported before. The analysis of the (14)N nuclear quadrupole coupling constants is of particular significance since it allows discrimination between structures, thus providing structural information on the orientation of the amino group. Both observed conformers are stabilized by an O-H···N hydrogen bond in the side chain and an N-H···π interaction, forming a chain that reinforces the strength of the hydrogen bonds through cooperative effects.

  15. Autism Spectrum Disorders

    PubMed Central

    Hyman, Mark; Swift, Kathie

    2012-01-01

Autism spectrum disorders (ASDs) are collectively the most commonly diagnosed pediatric neurodevelopmental condition. ASDs include autism, pervasive developmental disorder-not otherwise specified (PDD-NOS), Rett syndrome and Asperger disorder. ASD is characterized by impaired communication and social interaction and may involve developmental delays and seizure disorders. Recent parent-reported diagnosis of ASD in the United States put it at higher levels (1:91) than previously thought, with its diagnosis in boys occurring 4 to 5 times more frequently than in girls (1:58).(1) CDC estimates are currently 1:110;(1) up from 1:150 in 2007.(2) Annual medical expenditures for those affected are generally four to six times greater than for those without ASD.(1) While twin studies demonstrate that genetics play a significant role in ASD, the impact of environment should not be underestimated, given the approximate 20-fold increase in incidence over the last 20 years.(3) PMID:24278834

  16. Rotational spectrum of tryptophan

    SciTech Connect

Sanz, M. Eugenia; Cabezas, Carlos; Mata, Santiago; Alonso, José L.

    2014-05-28

The rotational spectrum of the natural amino acid tryptophan has been observed for the first time using a combination of laser ablation, molecular beams, and Fourier transform microwave spectroscopy. Independent analysis of the rotational spectra of individual conformers has led to a definitive identification of two different conformers of tryptophan, one of which had never been reported before. The analysis of the (14)N nuclear quadrupole coupling constants is of particular significance since it allows discrimination between structures, thus providing structural information on the orientation of the amino group. Both observed conformers are stabilized by an O–H···N hydrogen bond in the side chain and an N–H···π interaction, forming a chain that reinforces the strength of the hydrogen bonds through cooperative effects.

  17. Autism spectrum disorders.

    PubMed

    Fitzgerald, Kara; Hyman, Mark; Swift, Kathie

    2012-09-01

Autism spectrum disorders (ASDs) are collectively the most commonly diagnosed pediatric neurodevelopmental condition. ASDs include autism, pervasive developmental disorder-not otherwise specified (PDD-NOS), Rett syndrome and Asperger disorder. ASD is characterized by impaired communication and social interaction and may involve developmental delays and seizure disorders. Recent parent-reported diagnosis of ASD in the United States put it at higher levels (1:91) than previously thought, with its diagnosis in boys occurring 4 to 5 times more frequently than in girls (1:58).(1) CDC estimates are currently 1:110;(1) up from 1:150 in 2007.(2) Annual medical expenditures for those affected are generally four to six times greater than for those without ASD.(1) While twin studies demonstrate that genetics play a significant role in ASD, the impact of environment should not be underestimated, given the approximate 20-fold increase in incidence over the last 20 years.(3)

  18. Rotational Spectrum of Tryptophan

    NASA Astrophysics Data System (ADS)

    Sanz, M. Eugenia; Cabezas, Carlos; Mata, Santiago; Alonso, José L.

    2014-06-01

The rotational spectrum of the natural amino acid tryptophan has been observed using a recently constructed LA-MB-FTMW spectrometer, specifically designed to optimize the detection of heavier molecules in a lower frequency range. Independent analyses of the rotational spectra of individual conformers have led to a definitive identification of two different conformers of tryptophan, one of which had never been reported before. The experimental values of the 14N nuclear quadrupole coupling constants proved decisive in discriminating between the conformers. Both observed conformers are stabilized by an O-H···N hydrogen bond in the side chain and an N-H···π interaction forming a chain that reinforces the strength of hydrogen bonds through cooperative effects.

  19. Spectrum of anomalous magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Giovannini, Massimo

    2016-05-01

    The equations of anomalous magnetohydrodynamics describe an Abelian plasma where conduction and chiral currents are simultaneously present and constrained by the second law of thermodynamics. At high frequencies the magnetic currents play the leading role, and the spectrum is dominated by two-fluid effects. The system behaves instead as a single fluid in the low-frequency regime where the vortical currents induce potentially large hypermagnetic fields. After deriving the physical solutions of the generalized Appleton-Hartree equation, the corresponding dispersion relations are scrutinized and compared with the results valid for cold plasmas. Hypermagnetic knots and fluid vortices can be concurrently present at very low frequencies and suggest a qualitatively different dynamics of the hydromagnetic nonlinearities.

  20. Narrowband spread spectrum systems

    NASA Astrophysics Data System (ADS)

    Annecke, K. H.; Ottka, M.

    1984-10-01

The available military radio-frequency bands are already covered very densely by existing conventional systems, and therefore bandwidth-widening procedures can be applied as anti-jam measures within these RF bands only with small spreading factors. The problems arising from random code selection for spread spectrum systems with small spreading factors are discussed. The calculations show the dependence between certain statistical properties of classes of codewords and the number of codewords available in these classes. The bit error probabilities in the case of jamming by white Gaussian noise, narrowband and CW jammers are calculated and compared with the error probability of the class of codewords with ideal correlation properties.
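The impact of a small spreading factor can be illustrated with the textbook Gaussian approximation for a direct-sequence BPSK link under a broadband jammer: after despreading, the jammer's spectral density is reduced by the processing gain. The `bpsk_ber_with_jammer` helper and its parameterization are illustrative, not the paper's calculation.

```python
import math

def q(x):
    # Gaussian tail probability Q(x).
    return 0.5 * math.erfc(x / math.sqrt(2))

def bpsk_ber_with_jammer(ebn0_db, js_db, spreading_factor):
    """Approximate BPSK bit-error probability with a broadband jammer,
    treating the despread jammer as extra Gaussian noise whose density
    is reduced by the spreading (processing gain) factor."""
    ebn0 = 10 ** (ebn0_db / 10)
    js = 10 ** (js_db / 10)   # jammer-to-signal power ratio
    # Effective Eb/(N0 + J0): 1/(Eb/N0) + (J/S)/Gp
    eff = 1.0 / (1.0 / ebn0 + js / spreading_factor)
    return q(math.sqrt(2 * eff))
```

The formula makes the paper's constraint concrete: for a fixed jammer-to-signal ratio, shrinking the spreading factor directly inflates the effective noise density and hence the bit error probability.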

  1. Ocean color spectrum calculations

    NASA Technical Reports Server (NTRS)

    Mccluney, W. R.

    1974-01-01

There is obvious value in developing the means for measuring a number of subsurface oceanographic parameters using remotely sensed ocean color data. The first step in this effort should be the development of adequate theoretical models relating the desired oceanographic parameters to the upwelling radiances to be observed. A portion of a contributory theoretical model can be described by a modified single-scattering approach based on a simple treatment of multiple scattering. The resulting quasi-single-scattering model can be used to predict the upwelling distribution of spectral radiance emerging from the sea. The shape of the radiance spectrum predicted by this model for clear ocean water shows encouraging agreement with measurements made at the edge of the Sargasso Sea off Cape Hatteras.

  2. Broad spectrum bioactive sunscreens.

    PubMed

    Velasco, Maria Valéria Robles; Sarruf, Fernanda Daud; Salgado-Santos, Idalina Maria Nunes; Haroutiounian-Filho, Carlos Alberto; Kaneko, Telma Mary; Baby, André Rolim

    2008-11-01

The development of sunscreens that contain reduced concentrations of chemical UV filters while still possessing broad-spectrum effectiveness, through the use of natural raw materials that improve and confer UV absorption, is of great interest. Due to the structural similarities between polyphenolic compounds and organic UV filters, the former might exert photoprotective activity. The objective of the present research work was to develop bioactive sunscreen delivery systems containing rutin, Passiflora incarnata L. and Plantago lanceolata extracts, associated or not with organic and inorganic UV filters. UV transmission of the sunscreen delivery system films was measured using diffuse transmittance coupled to an integrating sphere. In vitro photoprotection efficacy was evaluated according to the following parameters: estimated sun protection factor (SPF); Boots Star Rating category; UVA/UVB ratio; and critical wavelength (lambda(c)). The sunscreen delivery systems obtained SPF values ranging from 0.972+/-0.004 to 28.064+/-2.429, and the bioactive compounds interacted with the UV filters both positively and negatively. This behavior may be attributed to: the composition of the delivery system; the presence of the inorganic UV filter and the quantitative composition of the organic UV filters; and the phytochemical composition of the P. incarnata L. and P. lanceolata extracts. Among all associations of bioactive compounds and UV filters, we found that a broad-spectrum sunscreen was accomplished when 1.68% (w/w) P. incarnata L. dry extract was combined with 7.0% (w/w) ethylhexyl methoxycinnamate, 2.0% (w/w) benzophenone-3 and 2.0% (w/w) TiO(2). This association generated an estimated SPF of 20.072+/-0.906 and improved the protective defense against UVA radiation, with the UVA/UVB ratio increasing from 0.49 to 0.52 and lambda(c) from 364 to 368.6 nm. PMID:18662760
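Estimating an in vitro SPF from diffuse transmittance follows a standard Diffey-type ratio: erythemally weighted solar irradiance without the film divided by the same quantity attenuated by the measured film transmittance. The spectra `E`, `S` and `T` below are illustrative placeholder values, not data from the study.

```python
# Hedged sketch of a Diffey-type in vitro SPF calculation from
# diffuse-transmittance data (10 nm steps, 290-400 nm).
wavelengths = list(range(290, 401, 10))   # nm
E = [1.0, 0.65, 0.12, 0.014, 0.0041, 0.0014, 0.0012,
     0.0011, 0.001, 0.0009, 0.0009, 0.0008]             # erythemal weighting
S = [0.1, 0.5, 1.0, 1.5, 1.8, 2.0, 2.1,
     2.2, 2.3, 2.3, 2.4, 2.4]                           # solar irradiance
T = [0.05, 0.06, 0.10, 0.20, 0.40, 0.55, 0.60,
     0.65, 0.70, 0.72, 0.75, 0.78]                      # film transmittance

# SPF = sum E*S*dlambda / sum E*S*T*dlambda (dlambda cancels on a
# uniform grid).
num = sum(e * s for e, s in zip(E, S))
den = sum(e * s * t for e, s, t in zip(E, S, T))
spf = num / den
```

The UVA/UVB ratio and critical wavelength reported in the abstract are computed from the same transmittance (equivalently, absorbance) spectrum with different weightings and integration limits.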

  3. Anterior Cruciate Ligament Reconstruction

    PubMed Central

    Arcuri, Francisco; Barclay, Fernando; Nacul, Ivan

    2015-01-01

Introduction: The most recent advances in ACL reconstruction try to reproduce the anatomic femoral and tibial footprints as closely as possible. Creating independent tunnels would allow optimal selection of the entry point and of the femoral tunnel obliquity, and together with an adequate reamer diameter they would reproduce the anatomy with greater certainty. Objective: To compare the radiographic parameters of the femoral and tibial tunnel positions in two groups of patients, one operated on with a transtibial technique and the other with an anatomic transportal technique. Materials and Methods: From December 2012 to December 2013, 59 patients with a primary ACL reconstruction, divided into two groups, a transtibial (TT) group of 19 patients and a transportal (TP) group of 40 patients, were prospectively evaluated with AP and lateral X-rays. Measurements included the femoral tunnel angle, the insertion site with respect to the Blumensaat line, the trans-osseous distance, the tibial tunnel position as a percentage of the tibial plateau in the AP and lateral views, and the tibial tunnel angle in the AP and lateral views. Results: The femoral tunnel angle was 45.92° in the TP group and 24.53° in the TT group, p = 0.002. The insertion site as a percentage of the Blumensaat line was 20.96 in TP and 20.74 in TT, p = 0.681. The trans-osseous distance was 3.43 cm in TP and 4.79 cm in TT, p < 0.001. The tibial tunnel position as a percentage of the AP tibial plateau was 44.35 in TP and 40.80 in TT, p = 0.076. The tibial tunnel position as a percentage of the lateral tibial plateau was 28.70 in TP and 34.53 in TT, p = 0.367. The tibial tunnel angle in the AP view was 73.48° in TP and 62.81° in TT, p = 0.002, and in the lateral view 114.69° in TP and 112.79° in TT, p = 0.427. Conclusion: It is possible to create tibial and femoral tunnels in optimal positions, but not equal between both groups. Creating independent tunnels allows a more anterior and vertical tibial tunnel

  4. Superfine resolution acoustooptic spectrum analysis

    NASA Technical Reports Server (NTRS)

    Ansari, Homayoon; Lesh, James R.

    1991-01-01

    High resolution spectrum analysis of RF signals is required in applications such as the search for extraterrestrial intelligence, RF interference monitoring, or general purpose decomposition of signals. Sub-Hertz resolution in three-dimensional acoustooptic spectrum analysis is theoretically and experimentally demonstrated. The operation of a two-dimensional acoustooptic spectrum analyzer is extended to include time integration over a sequence of CCD frames.

  5. Light propagation analysis using a translated plane angular spectrum method with the oblique plane wave incidence.

    PubMed

    Son, Hyeon-ho; Oh, Kyunghwan

    2015-05-01

    A novel angular spectrum method was proposed to numerically analyze off-axis free-space light propagation on a translated plane to an arbitrary angle. Utilizing a shifted angular spectrum method based on an oblique incident plane wave assumption, a generalized light propagation formulation was obtained in a wide range of both tilt angles and sampling intervals, which overcame the limitations of prior attempts. A detailed comparison of the proposed angular spectrum method with prior methods is numerically presented for diffractive optics and computer-generated holograms. The validity of the proposed method was confirmed experimentally by reconstructing a digital holographic image using a spatial light modulator.
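    The classical on-axis angular spectrum method that this paper generalizes can be sketched in a few lines of NumPy: Fourier-transform the field, multiply by the free-space transfer function exp(i·kz·z), and transform back. This is a minimal illustration of the baseline technique only, not the paper's translated/shifted variant; the grid size, wavelength, and beam waist below are arbitrary choices.

```python
import numpy as np

def angular_spectrum_propagate(u0, wavelength, dx, z):
    """Classical on-axis angular spectrum propagation: FFT the field,
    apply the free-space transfer function exp(i*kz*z), inverse FFT.
    Evanescent components (kz^2 < 0) are discarded."""
    n = u0.shape[0]
    k = 2 * np.pi / wavelength
    fx = np.fft.fftfreq(n, d=dx)                 # spatial frequencies (1/m)
    FX, FY = np.meshgrid(fx, fx)
    kz_sq = k**2 - (2 * np.pi * FX)**2 - (2 * np.pi * FY)**2
    prop = np.where(kz_sq >= 0,
                    np.exp(1j * np.sqrt(np.maximum(kz_sq, 0.0)) * z),
                    0.0)
    return np.fft.ifft2(np.fft.fft2(u0) * prop)

# Propagate a 50 um Gaussian beam by 1 mm at 633 nm on a 10 um grid.
n, dx, wl = 256, 10e-6, 633e-9
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
u0 = np.exp(-(X**2 + Y**2) / (2 * (50e-6)**2))
u1 = angular_spectrum_propagate(u0, wl, dx, 1e-3)
# Because |exp(i*kz*z)| = 1 on the propagating band, total power is conserved.
```

Since the transfer function is unit-modulus on the propagating band, the propagation is unitary for band-limited fields, which makes power conservation a convenient sanity check on any implementation.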

  6. Orthotopic neobladder reconstruction

    PubMed Central

    Chang, Dwayne T. S.; Lawrentschuk, Nathan

    2015-01-01

    Orthotopic neobladder reconstruction is becoming an increasingly common urinary diversion following cystectomy for bladder cancer. This is in recognition of the potential benefits of neobladder surgery over creation of an ileal conduit related to quality of life (QoL), such as avoiding the need to form a stoma with its cosmetic, psychological and other potential complications. The PubMed database was searched using relevant search terms for articles published electronically between January 1994 and April 2014. Full-text articles in English or with English translation were assessed for relevance to the topic before being included in the review. Patients with neobladders have comparable or better post-operative sexual function than those with ileal conduits. They also have comparable QoL to those with ileal conduits. Orthotopic neobladder is a good alternative to ileal conduit in suitable patients who do not want a stoma and are motivated to comply with neobladder training. However, the selection of a neobladder as the urinary diversion of choice requires that patients have good renal and liver functions and are likely to be compliant with neobladder training. With benefits also come potential risks of neobladder formation. These include electrolyte abnormalities and nocturnal incontinence. This short review highlights current aspects of neobladder formation and its potential advantages. PMID:25657535

  7. Total airway reconstruction.

    PubMed

    Connor, Matthew P; Barrera, Jose E; Eller, Robert; McCusker, Scott; O'Connor, Peter

    2013-02-01

    We present a case of obstructive sleep apnea (OSA) that required multilevel surgical correction of the airway, review the relevant literature, and discuss the role supraglottic laryngeal collapse can play in OSA. A 34-year-old man presented to a tertiary otolaryngology clinic for treatment of OSA. He previously had nasal and palate surgeries and a Repose tongue suspension. His residual apnea hypopnea index (AHI) was 67. He had a dysphonia associated with a true vocal cord paralysis following resection of a benign neck mass in childhood. He also complained of inspiratory stridor with exercise and intolerance to continuous positive airway pressure. Physical examination revealed craniofacial hypoplasia, full base of tongue, and residual nasal airway obstruction. On laryngoscopy, the paretic aryepiglottic fold arytenoid complex prolapsed into the laryngeal inlet with each breath. This was more pronounced with greater respiratory effort. Surgical correction required a series of operations including awake tracheostomy, supraglottoplasty, midline glossectomy, genial tubercle advancement, maxillomandibular advancement, and reconstructive rhinoplasty. His final AHI was 1.9. Our patient's supraglottic laryngeal collapse constituted an area of obstruction not typically evaluated in OSA surgery. In conjunction with treating nasal, palatal, and hypopharyngeal subsites, our patient's supraglottoplasty represented a key component of his success. This case illustrates the need to evaluate the entire upper airway in a complicated case of OSA. PMID:22965285

  8. Facial Reconstruction and Rehabilitation.

    PubMed

    Guntinas-Lichius, Orlando; Genther, Dane J; Byrne, Patrick J

    2016-01-01

    Extracranial infiltration of the facial nerve by salivary gland tumors is the most frequent cause of facial palsy secondary to malignancy. Nevertheless, facial palsy related to salivary gland cancer is uncommon. Therefore, reconstructive facial reanimation surgery is not a routine undertaking for most head and neck surgeons. The primary aims of facial reanimation are to restore tone, symmetry, and movement to the paralyzed face. Such restoration should improve the patient's objective motor function and subjective quality of life. The surgical procedures for facial reanimation rely heavily on long-established techniques, but many advances and improvements have been made in recent years. In the past, published experiences on strategies for optimizing functional outcomes in facial paralysis patients were primarily based on small case series and described a wide variety of surgical techniques. However, in recent years, larger series have been published from high-volume centers with significant and specialized experience in surgical and nonsurgical reanimation of the paralyzed face that have informed modern treatment. This chapter reviews the most important diagnostic methods used for the evaluation of facial paralysis to optimize the planning of each individual's treatment and discusses surgical and nonsurgical techniques for facial rehabilitation based on the contemporary literature.

  9. Crystallographic image reconstruction problem

    NASA Astrophysics Data System (ADS)

    ten Eyck, Lynn F.

    1993-11-01

    The crystallographic X-ray diffraction experiment gives the amplitudes of the Fourier series expansion of the electron density distribution within the crystal. The 'phase problem' in crystallography is the determination of the phase angles of the Fourier coefficients required to calculate the Fourier synthesis and reveal the molecular structure. The magnitude of this task varies enormously as the size of the structures ranges from a few atoms to thousands of atoms, and the number of Fourier coefficients ranges from hundreds to hundreds of thousands. The issue is further complicated for large structures by limited resolution. This problem is solved for 'small' molecules (up to 200 atoms and a few thousand Fourier coefficients) by methods based on probabilistic models which depend on atomic resolution. These methods generally fail for larger structures such as proteins. The phase problem for protein molecules is generally solved either by laborious experimental methods or by exploiting known similarities to solved structures. Various direct methods have been attempted for very large structures over the past 15 years, with gradually improving results -- but so far no complete success. This paper reviews the features of the crystallographic image reconstruction problem which render it recalcitrant, and describes recent encouraging progress in the application of maximum entropy methods to this problem.
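    The centrality of the phases can be seen in a toy 1-D analogue (illustrative only; real crystallographic data are 3-D and vastly larger): the "structure" is recovered exactly when the measured amplitudes are combined with the correct phases, but is destroyed when the phases are randomized.

```python
import numpy as np

# Toy 1-D analogue of the phase problem. A sparse "electron density" of
# three point atoms is Fourier transformed; diffraction measures only the
# amplitudes |F|, while the phases must be supplied to synthesize the map.
rng = np.random.default_rng(0)
density = np.zeros(64)
density[[5, 20, 41]] = [1.0, 2.0, 1.5]           # three "atoms"

F = np.fft.fft(density)                           # complex structure factors
amps, phases = np.abs(F), np.angle(F)

# Amplitudes + correct phases: exact Fourier synthesis of the density.
recon = np.fft.ifft(amps * np.exp(1j * phases)).real

# Amplitudes + random phases: the same "measured data", structure gone.
scrambled = np.fft.ifft(amps * np.exp(1j * rng.uniform(0, 2 * np.pi, 64))).real
```

The scrambled map carries the same amplitude information yet bears no resemblance to the original density, which is exactly why phase determination, not amplitude measurement, is the bottleneck the abstract describes.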

  10. Tomographic reconstruction of binary fields

    NASA Astrophysics Data System (ADS)

    Roux, Stéphane; Leclerc, Hugo; Hild, François

    2012-09-01

    A novel algorithm is proposed for reconstructing binary images from their projections along a set of different orientations. Based on a nonlinear transformation of the projection data, classical back-projection procedures can be used iteratively to converge to the sought image. A multiscale implementation allows for a faster convergence. The algorithm is tested on images up to 1 Mb definition, and an error-free reconstruction is achieved with a very limited number of projection data, saving a factor of about 100 on the number of projections required for classical reconstruction algorithms.
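    The flavor of iterative back-projection for binary images can be shown with a deliberately tiny stand-in (this is not the paper's algorithm or its nonlinear transformation): a SIRT-style scheme that reconstructs a binary image from just two axis-aligned projections, its row sums and column sums, followed by a final binary threshold.

```python
import numpy as np

# Tiny SIRT-style demo: recover a binary image from two axis-aligned
# projections (row sums and column sums), then threshold to binary.
img = np.zeros((8, 8))
img[2:5, 1:5] = 1.0                          # a 3x4 binary rectangle
p_rows, p_cols = img.sum(axis=1), img.sum(axis=0)

f = np.zeros_like(img)
for _ in range(500):
    # Back-project the projection residuals, averaged over the two views.
    f += ((p_rows - f.sum(axis=1))[:, None] / 8 +
          (p_cols - f.sum(axis=0))[None, :] / 8) / 2

binary = (f > 0.5).astype(img.dtype)         # exploit the binary prior
```

Two projections determine only very simple shapes; the point of the sketch is the loop structure (forward-project, back-project the residual, repeat) plus the binarization step that exploits the binary prior, which is also what lets the paper's method get away with so few projections.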

  11. Verification of rain-flow reconstructions of a variable amplitude load history

    NASA Astrophysics Data System (ADS)

    Clothiaux, John D.; Dowling, Norman E.

    1992-10-01

    The suitability of using rain-flow reconstructions as an alternative to an original loading spectrum for component fatigue life testing is investigated. A modified helicopter maneuver history is used for the rain-flow cycle counting and history regenerations. Experimental testing on a notched test specimen over a wide range of loads produces similar lives for the original history and the reconstructions. The test lives also agree with a simplified local strain analysis performed on the specimen utilizing the rain-flow cycle count. The rain-flow reconstruction technique is shown to be a viable test spectrum alternative to storing the complete original load history, especially in saving computer storage space and processing time. Descriptions of the regeneration method, the simplified life prediction analysis, and the experimental methods are included in the investigation.
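    A minimal three-point rainflow extractor conveys the counting step that underlies such reconstructions. This is a sketch only: it counts closed cycles and returns the unclosed residue, omitting the half-cycle bookkeeping of ASTM E1049 and the regeneration step itself.

```python
def rainflow_full_cycles(reversals):
    """Count closed rainflow cycles in a sequence of turning points.

    Returns (cycle_ranges, residue). Simplified sketch: full closed
    cycles only; ASTM E1049's half-cycle rules for the residue and
    the history-regeneration step are not implemented.
    """
    stack, cycles = [], []
    for r in reversals:
        stack.append(r)
        # A cycle closes when the newest range engulfs the previous one.
        while len(stack) >= 3 and abs(stack[-1] - stack[-2]) >= abs(stack[-2] - stack[-3]):
            cycles.append(abs(stack[-2] - stack[-3]))
            del stack[-3:-1]           # drop the two points of the closed cycle
    return cycles, stack

# The small inner excursion (1 -> 4, range 3) closes first,
# then the outer 0 -> 5 -> 0 cycle (range 5); the final 0 is residue.
cycles, residue = rainflow_full_cycles([0, 5, 1, 4, 0])
```

In a simplified local strain analysis like the one the abstract mentions, each counted range would then feed a strain-life curve and a damage sum such as Miner's rule.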

  12. Hologram-reconstruction signal enhancement

    NASA Technical Reports Server (NTRS)

    Mezrich, R. S.

    1977-01-01

    The principle of heterodyne detection is used to combine the object beam and the reconstructed virtual-image beam. All light valves in the page composer are opened, and the virtual-image beam is allowed to interfere with the light from the valves.

  13. [Reconstructing oneself after an amputation].

    PubMed

    Anne, M

    2015-03-01

    Anne, in her forties, was the victim of a serious road traffic accident resulting in the amputation of her foot. Demonstrating strength and modesty, she shares with us the main stages of her journey towards her reconstruction. PMID:26145134

  14. Vermilion Reconstruction with Genital Mucosa

    PubMed Central

    Weyandt, Gerhard H.; Woeckel, Achim; Kübler, Alexander C.

    2016-01-01

    Summary: Functional and aesthetical reconstruction, especially of the upper lip after ablative tumor surgery, can be very challenging. The skin of the lip might be sufficiently reconstructed by transpositional flaps from the nasolabial or facial area. Large defects of the lip mucosa, including the vestibule, are even more challenging due to the fact that flaps from the inner lining of the oral cavity often lead to functional impairments. We present a case of multiple vermilion and skin resections of the upper lip. At the last step, we had to resect even the whole vermilion mucosa, including parts of the oral mucosa of the vestibule, leaving a bare orbicularis oris muscle. To reconstruct the mucosal layer, we used a mucosal graft from the labia minora and placed it on the compromised lip and the former transpositional flaps for the reconstructed skin of the upper lip with very good functional and aesthetic results. PMID:27579226

  15. Vermilion Reconstruction with Genital Mucosa.

    PubMed

    Müller-Richter, Urs D A; Weyandt, Gerhard H; Woeckel, Achim; Kübler, Alexander C

    2016-05-01

    Functional and aesthetical reconstruction, especially of the upper lip after ablative tumor surgery, can be very challenging. The skin of the lip might be sufficiently reconstructed by transpositional flaps from the nasolabial or facial area. Large defects of the lip mucosa, including the vestibule, are even more challenging due to the fact that flaps from the inner lining of the oral cavity often lead to functional impairments. We present a case of multiple vermilion and skin resections of the upper lip. At the last step, we had to resect even the whole vermilion mucosa, including parts of the oral mucosa of the vestibule, leaving a bare orbicularis oris muscle. To reconstruct the mucosal layer, we used a mucosal graft from the labia minora and placed it on the compromised lip and the former transpositional flaps for the reconstructed skin of the upper lip with very good functional and aesthetic results. PMID:27579226

  16. SYNTH: A spectrum synthesizer

    NASA Astrophysics Data System (ADS)

    Hensley, W. K.; McKinnon, A. D.; Miley, H. S.; Panisko, M. E.; Savard, R. M.

    1993-10-01

    A computer code has been written at the Pacific Northwest Laboratory (PNL) to synthesize the results of typical gamma ray spectroscopy experiments. The code, dubbed SYNTH, allows a user to specify physical characteristics of a gamma ray source, the quantity of the nuclides producing the radiation, the source-to-detector distance and the presence of absorbers, the type and size of the detector, and the electronic set up used to gather the data. In the process of specifying the parameters needed to synthesize a spectrum, several interesting intermediate results are produced, including a photopeak transmission function versus energy, a detector efficiency curve, and a weighted list of gamma and x rays produced from a set of nuclides. All of these intermediate results are available for graphical inspection and for printing. SYNTH runs on personal computers. It is menu driven and can be customized to user specifications. SYNTH contains robust support for coaxial germanium detectors and some support for sodium iodide detectors. SYNTH is not a finished product. A number of additional developments are planned. However, the existing code has been compared carefully to spectra obtained from National Institute of Standards and Technology (NIST) certified standards with very favorable results. Examples of the use of SYNTH and several spectral results are presented.
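    The core of such a synthesizer, placing Gaussian photopeaks on a channel grid with an energy-dependent detector resolution, can be sketched as follows. All parameters here are illustrative assumptions, not SYNTH's calibrated models, which also include transmission, efficiency, and continuum components.

```python
import numpy as np

def synth_spectrum(peaks, n_channels=2048, kev_per_ch=1.0,
                   fwhm_a=1.0, fwhm_b=0.03):
    """Sketch of photopeak synthesis: each (energy_keV, counts) line
    becomes a Gaussian whose width follows a simple, hypothetical
    FWHM(E) = a + b*sqrt(E) detector-resolution model."""
    channels = np.arange(n_channels)
    spectrum = np.zeros(n_channels)
    for energy, counts in peaks:
        fwhm_kev = fwhm_a + fwhm_b * np.sqrt(energy)
        sigma_ch = fwhm_kev / 2.355 / kev_per_ch   # FWHM -> std dev, channels
        mu_ch = energy / kev_per_ch
        g = np.exp(-0.5 * ((channels - mu_ch) / sigma_ch) ** 2)
        spectrum += counts * g / g.sum()           # peak area = line counts
    return spectrum

# Cs-137 and Co-60 photopeak energies in keV; count values are arbitrary.
spec = synth_spectrum([(661.7, 1e5), (1173.2, 5e4), (1332.5, 5e4)])
```

Normalizing each Gaussian before scaling makes the area under each synthesized peak equal the requested counts, which keeps the total spectrum integral equal to the summed line intensities, a useful invariant when comparing against measured standards.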

  17. PINS Spectrum Identification Guide

    SciTech Connect

    A.J. Caffrey

    2012-03-01

    The Portable Isotopic Neutron Spectroscopy—PINS, for short—system identifies the chemicals inside munitions and containers without opening them, a decided safety advantage if the fill chemical is a hazardous substance like a chemical warfare agent or an explosive. The PINS Spectrum Identification Guide is intended as a reference for technical professionals responsible for the interpretation of PINS gamma-ray spectra. The guide is divided into two parts. The three chapters that constitute Part I cover the science and technology of PINS. Neutron activation analysis is the focus of Chapter 1. Chapter 2 explores PINS hardware, software, and related operational issues. Gamma-ray spectral analysis basics are introduced in Chapter 3. The six chapters of Part II cover the identification of PINS spectra in detail. Like the PINS decision tree logic, these chapters are organized by chemical element: phosphorus-based chemicals, chlorine-based chemicals, etc. These descriptions of hazardous, toxic, and/or explosive chemicals conclude with a chapter on the identification of the inert chemicals, e.g. sand, used to fill practice munitions.

  18. Fetal alcohol spectrum disorders.

    PubMed

    Dörrie, Nora; Föcker, Manuel; Freunscht, Inga; Hebebrand, Johannes

    2014-10-01

    Prenatal alcohol exposure (PAE) is one of the most prevalent and modifiable risk factors for somatic, behavioral, and neurological abnormalities. Affected individuals exhibit a wide range of such features referred to as fetal alcohol spectrum disorders (FASD). These are characterized by a more or less specific pattern of minor facial dysmorphic features, growth deficiency and central nervous system symptoms. Nevertheless, whereas the diagnosis of the full-blown fetal alcohol syndrome does not pose a major challenge, only a tentative diagnosis of FASD can be reached if only mild features are present and/or maternal alcohol consumption during pregnancy cannot be verified. The respective disorders have lifelong implications. The teratogenic mechanisms induced by PAE can lead to various additional somatic findings and structural abnormalities of cerebrum and cerebellum. At the functional level, cognition, motor coordination, attention, language development, executive functions, memory, social perception and emotion processing are impaired to a variable extent. The long-term development is characterized by disruption and failure in many domains; an age-adequate independency is frequently not achieved. In addition to primary prevention, individual therapeutic interventions and tertiary prevention are warranted; provision of extensive education to affected subjects and their caregivers is crucial. Protective environments are often required to prevent negative consequences such as delinquency, indebtedness or experience of physical/sexual abuse.

  19. Fetal Alcohol Spectrum Disorders.

    PubMed

    Williams, Janet F; Smith, Vincent C

    2015-11-01

    Prenatal exposure to alcohol can damage the developing fetus and is the leading preventable cause of birth defects and intellectual and neurodevelopmental disabilities. In 1973, fetal alcohol syndrome was first described as a specific cluster of birth defects resulting from alcohol exposure in utero. Subsequently, research unequivocally revealed that prenatal alcohol exposure causes a broad range of adverse developmental effects. Fetal alcohol spectrum disorder (FASD) is the general term that encompasses the range of adverse effects associated with prenatal alcohol exposure. The diagnostic criteria for fetal alcohol syndrome are specific, and comprehensive efforts are ongoing to establish definitive criteria for diagnosing the other FASDs. A large and growing body of research has led to evidence-based FASD education of professionals and the public, broader prevention initiatives, and recommended treatment approaches based on the following premises:

    ▪ Alcohol-related birth defects and developmental disabilities are completely preventable when pregnant women abstain from alcohol use.
    ▪ Neurocognitive and behavioral problems resulting from prenatal alcohol exposure are lifelong.
    ▪ Early recognition, diagnosis, and therapy for any condition along the FASD continuum can result in improved outcomes.
    ▪ During pregnancy:
      ◦ no amount of alcohol intake should be considered safe;
      ◦ there is no safe trimester to drink alcohol;
      ◦ all forms of alcohol, such as beer, wine, and liquor, pose similar risk; and
      ◦ binge drinking poses dose-related risk to the developing fetus.
