Low-Cost, High-Speed Back-End Processing System for High-Frequency Ultrasound B-Mode Imaging
Chang, Jin Ho; Sun, Lei; Yen, Jesse T.; Shung, K. Kirk
2009-01-01
For real-time visualization of the mouse heart (6 to 13 beats per second), a back-end processing system involving high-speed signal processing functions to form and display images has been developed. This back-end system was designed with new signal processing algorithms to achieve a frame rate of more than 400 images per second. These algorithms were implemented in a simple and cost-effective manner with a single field-programmable gate array (FPGA) and software programs written in C++. The operating speed of the back-end system was investigated by recording the time required for transferring an image to a personal computer. Experimental results showed that the back-end system is capable of producing 433 images per second. To evaluate the imaging performance of the back-end system, a complete imaging system was built. This imaging system, which consisted of a recently reported high-speed mechanical sector scanner assembled with the back-end system, was tested by imaging a wire phantom, a pig eye (in vitro), and a mouse heart (in vivo). It was shown that this system is capable of providing high spatial resolution images with fast temporal resolution. PMID:19574160
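As a point of reference for the back-end functions summarized above, the sketch below illustrates generic B-mode back-end steps (envelope detection, decimation for display, log compression) in Python. It is not the paper's FPGA/C++ implementation; the Hilbert-transform envelope detector, data sizes, and dynamic-range value are assumptions chosen for illustration.

```python
# Generic B-mode back-end steps on RF scan lines (not the paper's FPGA algorithms):
# envelope detection, decimation for display, and log compression.
import numpy as np
from scipy.signal import hilbert

def bmode_image(rf, dyn_range_db=50.0, decim=4):
    """rf: 2-D array (scan lines x RF samples). Returns a log-compressed envelope image."""
    env = np.abs(hilbert(rf, axis=1))                 # envelope via the analytic signal
    env = env[:, ::decim]                             # simple decimation for display
    env = env / env.max()                             # normalize before log compression
    img = 20.0 * np.log10(env + 1e-6)                 # convert to dB
    return np.clip(img, -dyn_range_db, 0.0) + dyn_range_db  # map into [0, dyn_range_db]

rf = np.random.randn(100, 2048)                       # synthetic RF data: 100 scan lines
print(bmode_image(rf).shape)                          # (100, 512)
```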
Integrated circuits for volumetric ultrasound imaging with 2-D CMUT arrays.
Bhuyan, Anshuman; Choe, Jung Woo; Lee, Byung Chul; Wygant, Ira O; Nikoozadeh, Amin; Oralkan, Ömer; Khuri-Yakub, Butrus T
2013-12-01
Real-time volumetric ultrasound imaging systems require transmit and receive circuitry to generate ultrasound beams and process received echo signals. The complexity of building such a system is high because the front-end electronics need to be very close to the transducer. A large number of elements also need to be interfaced to the back-end system, and image processing of a large dataset could affect the imaging volume rate. In this work, we present a 3-D imaging system using capacitive micromachined ultrasonic transducer (CMUT) technology that addresses many of the challenges in building such a system. We demonstrate two approaches to integrating the transducer and the front-end electronics. The transducer is a 5-MHz CMUT array with an 8 mm × 8 mm aperture size. The aperture consists of 1024 elements (32 × 32) with an element pitch of 250 μm. An integrated circuit (IC) consists of a transmit beamformer and receive circuitry to improve the noise performance of the overall system. The assembly was interfaced with an FPGA and a back-end system (comprising a data acquisition system and a PC). The FPGA provided the digital I/O signals for the IC, and the back-end system was used to process the received RF echo data (from the IC) and reconstruct the volume image using a phased array imaging approach. Imaging experiments were performed using wire and spring targets, a ventricle model, and a human prostate. Real-time volumetric images were captured at 5 volumes per second and are presented in this paper.
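For readers unfamiliar with the reconstruction step mentioned above, the following is a minimal sketch of receive delay-and-sum beamforming for a single image point. The element pitch follows the abstract (250 μm), but the sampling rate, speed of sound, plane-wave transmit assumption, and synthetic data are illustrative and are not the paper's parameters.

```python
# Minimal receive delay-and-sum sketch for one image point (illustrative values).
import numpy as np

C = 1540.0          # assumed speed of sound in tissue, m/s
FS = 40e6           # assumed RF sampling rate, Hz
PITCH = 250e-6      # element pitch from the abstract, m

def das_point(rf, elem_x, focus):
    """rf: (n_elements, n_samples) echo data; elem_x: element x positions (m);
    focus: (x, z) focal point (m). Returns the beamformed sample at that point."""
    x, z = focus
    out = 0.0
    for i, ex in enumerate(elem_x):
        dist = z + np.hypot(x - ex, z)      # plane-wave transmit path + receive path
        idx = int(round(dist / C * FS))     # round-trip delay converted to a sample index
        if idx < rf.shape[1]:
            out += rf[i, idx]
    return out

elem_x = (np.arange(32) - 15.5) * PITCH      # one 32-element row, centered at x = 0
rf = np.random.randn(32, 4096)               # synthetic echo data
print(das_point(rf, elem_x, focus=(0.0, 0.02)))
```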
40 CFR 63.497 - Back-end process provisions-monitoring provisions for control and recovery devices.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 9 2010-07-01 2010-07-01 false Back-end process provisions-monitoring... Polymers and Resins § 63.497 Back-end process provisions—monitoring provisions for control and recovery devices. (a) An owner or operator complying with the residual organic HAP limitations in § 63.494(a) using...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 9 2011-07-01 2011-07-01 false Back-end process provisions-monitoring... Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.497 Back-end process... limitations. (a) An owner or operator complying with the residual organic HAP limitations in § 63.494(a)(1...
BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.
Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel
2015-06-02
Bioinformaticians face a range of difficulties to get locally-installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools, using an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS intermediates the access to registered tools by providing front-end and back-end web services. Programmers can install applications in HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that displays the registered applications and clients. Applications registered with BOWS can be accessed from virtually any programming language through web services, or by using the standard Java clients. The back-end can run in HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
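The back-end polling pattern described above can be sketched roughly as follows. The endpoint URL, JSON field names, and REST-style interface are hypothetical; BOWS's actual web-service interface is not reproduced here.

```python
# Hedged sketch of the polling pattern: a worker on an HPC cluster asks the
# back-end web service for a queued job, runs it, and posts the result back.
import subprocess
import requests

SERVICE = "https://example.org/bows-backend"   # hypothetical endpoint

def poll_and_run():
    job = requests.get(f"{SERVICE}/next-job", timeout=30).json()
    if not job:
        return  # nothing queued
    # Run the registered application command (trusted input assumed for this sketch).
    result = subprocess.run(job["command"], shell=True, capture_output=True, text=True)
    requests.post(f"{SERVICE}/results/{job['id']}",
                  json={"stdout": result.stdout, "status": result.returncode},
                  timeout=30)

# Call poll_and_run() periodically, e.g. from cron or a loop with time.sleep(60).
```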
A multitasking, multisinked, multiprocessor data acquisition front end
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, R.; Au, R.; Molen, A.V.
1989-10-01
The authors have developed a generalized data acquisition front end system which is based on MC68020 processors running a commercial real time kernel (pSOS), and implemented primarily in a high level language (C). This system has been attached to the back end on-line computing system at NSCL via our high performance ETHERNET protocol. Data may be simultaneously sent to any number of back end systems. Fixed fraction sampling along links to back end computing is also supported. A nonprocedural program generator simplifies the development of experiment specific code.
40 CFR 63.493 - Back-end process provisions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Standards for Hazardous Air Pollutant Emissions: Group I Polymers and Resins § 63.493 Back-end process provisions. Owners and operators of new and existing affected sources shall comply with the requirements in...
Web-Accessible Scientific Workflow System for Performance Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roelof Versteeg; Roelof Versteeg; Trevor Rowe
2006-03-01
We describe the design and implementation of a web accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition with server side data management and information visualization through flexible browser based data access tools. Component technologies include a rich browser-based client (using dynamic Javascript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third party applications which are invoked by the back-end using webservices. This environment allows for reproducible, transparent result generation by a diverse user base. It has been implemented for several monitoring systems with different degrees of complexity.
NASA Astrophysics Data System (ADS)
Mandal, Swagata; Saini, Jogender; Zabołotny, Wojciech M.; Sau, Suman; Chakrabarti, Amlan; Chattopadhyay, Subhasis
2017-03-01
Due to the dramatic increase of data volume in modern high energy physics (HEP) experiments, a robust high-speed data acquisition (DAQ) system is very much needed to gather the data generated during different nuclear interactions. As the DAQ works under a harsh radiation environment, there is a fair chance of data corruption due to various energetic particles such as alpha particles, beta particles, or neutrons. Hence, a major challenge in the development of DAQ in the HEP experiment is to establish an error resilient communication system between front-end sensors or detectors and back-end data processing computing nodes. Here, we have implemented the DAQ using a field-programmable gate array (FPGA) due to some of its inherent advantages over the application-specific integrated circuit. A novel orthogonal concatenated code and cyclic redundancy check (CRC) have been used to mitigate the effects of data corruption in the user data. Scrubbing with a 32-b CRC has been used against errors in the configuration memory of the FPGA. Data from front-end sensors reach the back-end processing nodes through multiple stages that may add an uncertain amount of delay to the different data packets. We have also proposed a novel memory management algorithm that helps to process the data at the back-end computing nodes, removing the added path delays. To the best of our knowledge, the proposed FPGA-based DAQ utilizing an optical link with channel coding and efficient memory management modules can be considered the first of its kind. Performance estimation of the implemented DAQ system is done based on resource utilization, bit error rate, efficiency, and robustness to radiation.
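The CRC protection mentioned above can be illustrated with a small sketch of appending and verifying a 32-bit CRC on a data frame. It uses Python's zlib CRC-32 for convenience; the DAQ's actual CRC polynomial and the orthogonal concatenated code are not reproduced.

```python
# Append a CRC-32 to a payload and verify it on reception to detect corruption.
import zlib

def append_crc(payload: bytes) -> bytes:
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def check_crc(frame: bytes) -> bool:
    payload, received = frame[:-4], int.from_bytes(frame[-4:], "big")
    return zlib.crc32(payload) == received

frame = append_crc(b"front-end sensor words")
print(check_crc(frame))                       # True: frame intact
print(check_crc(frame[:-1] + b"\x00"))        # False: corruption detected
```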
Back-end and interface implementation of the STS-XYTER2 prototype ASIC for the CBM experiment
NASA Astrophysics Data System (ADS)
Kasinski, K.; Szczygiel, R.; Zabolotny, W.
2016-11-01
Each front-end readout ASIC for High-Energy Physics experiments requires robust and effective hit data streaming and control mechanisms. A new STS-XYTER2 full-size prototype chip for the Silicon Tracking System and Muon Chamber detectors in the Compressed Baryonic Matter experiment at the Facility for Antiproton and Ion Research (FAIR, Germany) is a 128-channel time and amplitude measuring solution for silicon microstrip and gas detectors. It operates at a 250 kHit/s/channel hit rate, each hit producing 27 bits of information (5-bit amplitude, 14-bit timestamp, position and diagnostics data). The chip back-end implements fast front-end channel read-out, timestamp-wise hit sorting, and data streaming via a scalable interface implementing the dedicated protocol (STS-HCTSP) for chip control and hit transfer with data bandwidth from 9.7 MHit/s up to 47 MHit/s. It also includes multiple options for link diagnostics, failure detection, and throttling features. The back-end is designed to operate with the data acquisition architecture based on the CERN GBTx transceivers. This paper presents the details of the back-end and interface design and its implementation in the UMC 180 nm CMOS process.
A Failing Grade for the German End-of-Life Vehicles Take-Back System
ERIC Educational Resources Information Center
Nakajima, Nina; Vanderburg, Willem H.
2005-01-01
The German end-of-life vehicle take-back system is described and analyzed in terms of its impact on the environment and the car companies involved. It is concluded that although this system is often cited as an example of a successful take-back scheme, it is not one that maximizes the value recovered from end-of-life vehicles. As a result,…
VLBI2010 Receiver Back End Comparison
NASA Technical Reports Server (NTRS)
Petrachenko, Bill
2013-01-01
VLBI2010 requires a receiver back-end to convert analog RF signals from the receiver front end into channelized digital data streams to be recorded or transmitted electronically. The back end functions are typically performed in two steps: conversion of analog RF inputs into IF bands (see Table 2), and conversion of IF bands into channelized digital data streams (see Tables 1a, 1b and 1c). The latter IF systems are now completely digital and generically referred to as digital back ends (DBEs). In Table 2 two RF conversion systems are compared, and in Tables 1a, 1b, and 1c nine DBE systems are compared. Since DBE designs are advancing rapidly, the data in these tables are only guaranteed to be current near the update date of this document.
A hybrid single-end-access MZI and Φ-OTDR vibration sensing system with high frequency response
NASA Astrophysics Data System (ADS)
Zhang, Yixin; Xia, Lan; Cao, Chunqi; Sun, Zhenhong; Li, Yanting; Zhang, Xuping
2017-01-01
A hybrid single-end-access Mach-Zehnder interferometer (MZI) and phase sensitive OTDR (Φ-OTDR) vibration sensing system is proposed and demonstrated experimentally. In our system, narrow optical pulses and a continuous wave are injected into the fiber through its front end at the same time. At the rear end of the fiber, a frequency-shift mirror (FSM) is designed to back-propagate the continuous wave modulated by the external vibration. Thus the Rayleigh backscattering signals (RBS) and the back-propagated continuous wave interfere with the reference light at the same end of the sensing fiber, and a single-end-access configuration is achieved. The RBS can be separated from the interference signal (IS) through digital signal processing, because the two have different intermediate frequencies in a frequency-division-multiplexing arrangement. The two schemes do not interfere with each other. The experimental results show 10 m spatial resolution and up to 1.2 MHz frequency response along a 6.35 km long fiber. This newly designed single-end-access setup can locate vibration events and respond to high-frequency events, and can be widely used in health monitoring for civil infrastructure and transportation.
Front-End and Back-End Database Design and Development: Scholar's Academy Case Study
ERIC Educational Resources Information Center
Parks, Rachida F.; Hall, Chelsea A.
2016-01-01
This case study consists of a real database project for a charter school--Scholar's Academy--and provides background information on the school and its cafeteria processing system. Also included are functional requirements and some illustrative data. Students are tasked with the design and development of a database for the purpose of improving the…
Architecture of PAU survey camera readout electronics
NASA Astrophysics Data System (ADS)
Castilla, Javier; Cardiel-Sas, Laia; De Vicente, Juan; Illa, Joseph; Jimenez, Jorge; Maiorino, Marino; Martinez, Gustavo
2012-07-01
PAUCam is a new camera for studying the physics of the accelerating universe. The camera will consist of eighteen 2Kx4K HPK CCDs: sixteen for science and two for guiding. The camera will be installed at the prime focus of the WHT (William Herschel Telescope). In this contribution, the architecture of the readout electronics system is presented. Back-End and Front-End electronics are described. The Back-End consists of clock, bias and video processing boards, mounted on Monsoon crates. The Front-End is based on patch panel boards. These boards are plugged outside the camera feed-through panel for signal distribution. Inside the camera, individual preamplifier boards plus kapton cables complete the path to connect to each CCD. The overall signal distribution and grounding scheme is shown in this paper.
NASA Astrophysics Data System (ADS)
Celicourt, P.; Piasecki, M.
2014-12-01
The high cost of hydro-meteorological data acquisition, communication and publication systems along with limited qualified human resources is considered the main reason why hydro-meteorological data collection remains a challenge, especially in developing countries. Despite significant advances in sensor network technologies, which gave birth to open hardware and software, low-cost (less than $50) and low-power (on the order of a few milliwatts) sensor platforms in the last two decades, sensor and sensor network deployment remains a labor-intensive, time consuming, cumbersome, and thus expensive task. These factors give rise to the need to develop an affordable, simple to deploy, scalable and self-organizing end-to-end (from sensor to publication) system suitable for deployment in such countries. The envisioned system will consist of a few Sensed-And-Programmed Arduino-based sensor nodes with low-cost sensors measuring parameters relevant to hydrological processes and a Raspberry Pi micro-computer hosting the in-the-field back-end data management. The latter comprises the Python/Django implementation of the CUAHSI Observations Data Model (ODM), namely DjangODM, backed by a PostgreSQL database server. We are also developing a Python-based data processing script which will be paired with the data autoloading capability of Django to populate the DjangODM database with the incoming data. To publish the data, WOFpy (WaterOneFlow Web Services in Python), developed by the Texas Water Development Board for 'Water Data for Texas', will be used; it can produce WaterML web services from a variety of back-end database installations such as SQLite, MySQL, and PostgreSQL. A step further would be the development of an appealing online visualization tool using Python statistics and analytics tools (Scipy, Numpy, Pandas) showing the spatial distribution of variables across an entire watershed as a time-variant layer on top of a basemap.
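The ingestion step described above (parsing incoming sensor records and loading them into an ODM-style database) might look roughly like the sketch below. SQLite is used so the example is self-contained; the real system targets DjangODM on PostgreSQL, whose models and field names are not reproduced, and the record format shown is assumed.

```python
# Self-contained sketch: parse a sensor record and store it in an ODM-style table.
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE datavalues (
    site_code TEXT, variable_code TEXT, local_datetime TEXT, value REAL)""")

def ingest(line: str) -> None:
    """Assumed record format: site,variable,ISO-8601 datetime,value"""
    site, var, ts, val = line.strip().split(",")
    datetime.fromisoformat(ts)          # validate the timestamp before storing
    conn.execute("INSERT INTO datavalues VALUES (?,?,?,?)", (site, var, ts, float(val)))

ingest("RIV01,stage_height,2014-12-01T06:00:00+00:00,1.37")
print(conn.execute("SELECT * FROM datavalues").fetchall())
```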
Tele-healthcare for diabetes management: A low cost automatic approach.
Benaissa, M; Malik, B; Kanakis, A; Wright, N P
2012-01-01
In this paper, a telemedicine system for managing diabetic patients with better care is presented. The system is an end-to-end solution which relies on the integration of a front end (the patient unit) and a back-end web server. A key feature of the developed system is its very low-cost, automated approach. The front-end of the system is capable of reading glucose measurements from any glucose meter and sending them automatically via existing networks to the back-end server. The back-end is designed and developed using an n-tier web client architecture based on the model-view-controller design pattern and open-source technology, a cost-effective solution. The back-end helps the health-care provider with data analysis, data visualization and decision support, and allows them to send feedback and therapeutic advice to patients from anywhere using a browser-enabled device. This system will be evaluated during trials which will be conducted in collaboration with a local hospital in a phased manner.
Source-Constrained Recall: Front-End and Back-End Control of Retrieval Quality
ERIC Educational Resources Information Center
Halamish, Vered; Goldsmith, Morris; Jacoby, Larry L.
2012-01-01
Research on the strategic regulation of memory accuracy has focused primarily on monitoring and control processes used to edit out incorrect information after it is retrieved (back-end control). Recent studies, however, suggest that rememberers also enhance accuracy by preventing the retrieval of incorrect information in the first place (front-end…
NASA Astrophysics Data System (ADS)
Kramer, J. L. A. M.; Ullings, A. H.; Vis, R. D.
1993-05-01
A real-time data acquisition system for microprobe analysis has been developed at the Free University of Amsterdam. The system is composed of two parts: a front-end real-time and a back-end monitoring system. The front-end consists of a VMEbus based system which reads out a CAMAC crate. The back-end is implemented on a Sun work station running the UNIX operating system. This separation allows the integration of a minimal, and consequently very fast, real-time executive within the sophisticated possibilities of advanced UNIX work stations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Requirements for the Back-End Process Provisions 8 Table 8 to Subpart U of Part 63 Protection of Environment...: Group I Polymers and Resins Pt. 63, Subpt. U, Table 8 Table 8 to Subpart U of Part 63—Summary of... be monitored Requirements Compliance Using Stripping Technology, Demonstrated through Periodic...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Requirements for the Back-End Process Provisions 8 Table 8 to Subpart U of Part 63 Protection of Environment...: Group I Polymers and Resins Pt. 63, Subpt. U, Table 8 Table 8 to Subpart U of Part 63—Summary of... be monitored Requirements Compliance Using Stripping Technology, Demonstrated through Periodic...
Code of Federal Regulations, 2014 CFR
2014-07-01
... Requirements for the Back-End Process Provisions 8 Table 8 to Subpart U of Part 63 Protection of Environment...: Group I Polymers and Resins Pt. 63, Subpt. U, Table 8 Table 8 to Subpart U of Part 63—Summary of... be monitored Requirements Compliance Using Stripping Technology, Demonstrated through Periodic...
Code of Federal Regulations, 2012 CFR
2012-07-01
... Requirements for the Back-End Process Provisions 8 Table 8 to Subpart U of Part 63 Protection of Environment...: Group I Polymers and Resins Pt. 63, Subpt. U, Table 8 Table 8 to Subpart U of Part 63—Summary of... be monitored Requirements Compliance Using Stripping Technology, Demonstrated through Periodic...
Code of Federal Regulations, 2013 CFR
2013-07-01
... Requirements for the Back-End Process Provisions 8 Table 8 to Subpart U of Part 63 Protection of Environment...: Group I Polymers and Resins Pt. 63, Subpt. U, Table 8 Table 8 to Subpart U of Part 63—Summary of... be monitored Requirements Compliance Using Stripping Technology, Demonstrated through Periodic...
NASA Technical Reports Server (NTRS)
White, Mark
2012-01-01
New space missions will increasingly rely on more advanced technologies because of system requirements for higher performance, particularly in instruments and high-speed processing. Component-level reliability challenges with scaled CMOS in spacecraft systems from a bottom-up perspective have been presented. Fundamental front-end and back-end processing reliability issues with more aggressively scaled parts have been discussed. Effective thermal management from the system level to the component level (top-down) is a key element in the overall design of reliable systems. Thermal management in space systems must consider a wide range of issues, including thermal loading of many different components, and frequent temperature cycling of some systems. Both perspectives (top-down and bottom-up) play a large role in robust, reliable spacecraft system design.
Qualification and Reliability for MEMS and IC Packages
NASA Technical Reports Server (NTRS)
Ghaffarian, Reza
2004-01-01
Advanced IC electronic packages are moving toward miniaturization through two different approaches, front-end and back-end processes, each with its own challenges. Successful use of more of the back-end processes at the front end, e.g., microelectromechanical systems (MEMS) Wafer Level Packages (WLP), enables reducing size and cost. Use of direct flip-chip die is the most efficient approach if and when the issues of known good die and board/assembly are resolved. Wafer level packages solve the issue of known good die by enabling package test, but they have their own limitations, e.g., the I/O limitation, additional cost, and reliability. From the back-end approach, system-in-a-package (SIAP/SIP) development is a response to an increasing demand for package and die integration of different functions into one unit to reduce size and cost and improve functionality. MEMS add another challenging dimension to electronic packaging since they include moving mechanical elements. Conventional qualification and reliability need to be modified and expanded in most cases in order to detect new, unknown failures. This paper will review four standards, already released or being developed, that specifically address the issues of qualification and reliability of assembled packages. Exposures to thermal cycles, monotonic bend tests, mechanical shock, and drop are covered in these specifications. Finally, mechanical and thermal cycle qualification data generated for a MEMS accelerometer will be presented. The MEMS device was an element of an inertial measurement unit (IMU) qualified for the NASA Mars Exploration Rovers (MERs), Spirit and Opportunity, which are currently roving the Martian surface.
Back-illuminate fiber system research for multi-object fiber spectroscopic telescope
NASA Astrophysics Data System (ADS)
Zhou, Zengxiang; Liu, Zhigang; Hu, Hongzhuan; Wang, Jianping; Zhai, Chao; Chu, Jiaru
2016-07-01
Using a parallel-controlled fiber positioner as the spectroscopic receiver is an efficient observation approach for spectral surveys; it has been used in LAMOST and has been proposed for CFHT and the rebuilt Mayall telescope. During telescope observation, the position of each fiber strongly influences how efficiently the spectra are coupled into the fiber and delivered to the spectrograph. When the fibers are back-illuminated at the spectrograph end, they emit light at the positioner end, so CCD cameras can capture images of the fiber tips covering the focal plane, calculate precise position information by the light-centroid method, and feed it back to the control system. After many years of research, back-illuminated fiber measurement has proven the best method for acquiring precise fiber positions. In LAMOST, a fiber back-illumination system was developed and combined with the low-resolution spectrograph instruments; it provides uniform light output to the fibers, meets the requirements of the CCD camera measurement, and is controlled by the high-level observation system, which can shut it down during telescope observation. This paper introduces the design of the back-illumination system and tests of different light sources. After optimization, the illumination system is comparable to an integrating sphere and meets the conditions for fiber position measurement.
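The light-centroid method mentioned above can be illustrated with a short sketch that locates a back-illuminated fiber tip in a CCD frame by an intensity-weighted centroid. The threshold and the synthetic image are illustrative only.

```python
# Intensity-weighted centroid of a fiber-tip spot in a CCD frame.
import numpy as np

def fiber_centroid(img, threshold=0.2):
    """img: 2-D CCD frame. Returns the (x, y) intensity-weighted centroid of the spot."""
    spot = np.where(img > threshold * img.max(), img, 0.0)  # keep only the bright spot
    total = spot.sum()
    ys, xs = np.indices(img.shape)
    return (xs * spot).sum() / total, (ys * spot).sum() / total

# Synthetic frame with a Gaussian spot near pixel (42.3, 17.8):
yy, xx = np.mgrid[0:64, 0:64]
frame = np.exp(-((xx - 42.3) ** 2 + (yy - 17.8) ** 2) / 8.0)
print(fiber_centroid(frame))    # approximately (42.3, 17.8)
```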
VO-KOREL: A Fourier Disentangling Service of the Virtual Observatory
NASA Astrophysics Data System (ADS)
Škoda, Petr; Hadrava, Petr; Fuchs, Jan
2012-04-01
VO-KOREL is a web service exploiting Virtual Observatory technology to provide astronomers with an intuitive graphical front-end and a distributed computing back-end running the most recent version of the Fourier disentangling code KOREL. The system integrates the ideas of the e-shop basket, preserving the privacy of every user through transfer encryption and access authentication, with features of a laboratory notebook, allowing easy housekeeping of both input parameters and final results, and it explores the newly emerging technology of cloud computing. The web-based front-end allows the user to submit data and parameter files, edit parameters, manage a job list, resubmit or cancel running jobs and, mainly, watch the text and graphical results of a disentangling process, while the main part of the back-end is a simple job-queue submission system executing multiple instances of the FORTRAN code KOREL in parallel. This may be easily extended for GRID-based deployment on massively parallel computing clusters. A short introduction to the underlying technologies is given, briefly mentioning advantages as well as bottlenecks of the design used.
Perl Extension to the Bproc Library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grunau, Daryl W.
2004-06-07
The Beowulf Distributed Process Space (Bproc) software stack comprises UNIX/Linux kernel modifications and a support library by which a cluster of machines, each running its own private kernel, can present itself as a unified process space to the user. A Bproc cluster contains a single front-end machine and many back-end nodes which receive and run processes given to them by the front-end. Any process which is migrated to a back-end node is also visible as a ghost process on the front-end, and may be controlled there using traditional UNIX semantics (e.g. ps(1), kill(1), etc). This software is a Perl extension to the Bproc library which enables the Perl programmer to make direct calls to functions within the Bproc library. See http://www.clustermatic.org, http://bproc.sourceforge.net, and http://www.perl.org
NASA Astrophysics Data System (ADS)
Araoka, Daisuke; Nishio, Yoshiro; Gamo, Toshitaka; Yamaoka, Kyoko; Kawahata, Hodaka
2016-10-01
The Li concentration and isotopic composition (δ7Li) in submarine vent fluids are important for oceanic Li budget and potentially useful for investigating hydrothermal systems deep under the seafloor because hydrothermal vent fluids are highly enriched in Li relative to seawater. Although Li isotopic geochemistry has been studied at mid-ocean-ridge (MOR) hydrothermal sites, in arc and back-arc settings Li isotopic composition has not been systematically investigated. Here we determined the δ7Li and 87Sr/86Sr values of 11 end-member fluids from 5 arc and back-arc hydrothermal systems in the western Pacific and examined Li behavior during high-temperature water-rock interactions in different geological settings. In sediment-starved hydrothermal systems (Manus Basin, Izu-Bonin Arc, Mariana Trough, and North Fiji Basin), the Li concentrations (0.23-1.30 mmol/kg) and δ7Li values (+4.3‰ to +7.2‰) of the end-member fluids are explained mainly by dissolution-precipitation model during high-temperature seawater-rock interactions at steady state. Low Li concentrations are attributable to temperature-related apportioning of Li in rock into the fluid phase and phase separation process. Small variation in Li among MOR sites is probably caused by low-temperature alteration process by diffusive hydrothermal fluids under the seafloor. In contrast, the highest Li concentrations (3.40-5.98 mmol/kg) and lowest δ7Li values (+1.6‰ to +2.4‰) of end-member fluids from the Okinawa Trough demonstrate that the Li is predominantly derived from marine sediments. The variation of Li in sediment-hosted sites can be explained by the differences in degree of hydrothermal fluid-sediment interactions associated with the thickness of the marine sediment overlying these hydrothermal sites.
Low-Power Analog Processing for Sensing Applications: Low-Frequency Harmonic Signal Classification
White, Daniel J.; William, Peter E.; Hoffman, Michael W.; Balkir, Sina
2013-01-01
A low-power analog sensor front-end is described that reduces the energy required to extract environmental sensing spectral features without using the Fast Fourier Transform (FFT) or wavelet transforms. An Analog Harmonic Transform (AHT) allows selection of only the features needed by the back-end, in contrast to the FFT, where all coefficients must be calculated simultaneously. We also show that the FFT coefficients can be easily calculated from the AHT results by a simple back-substitution. The scheme is tailored for low-power, parallel analog implementation in an integrated circuit (IC). Two different applications are tested with an ideal front-end model and compared to existing studies with the same data sets. Results from the military vehicle classification and identification of machine-bearing fault applications show that the front-end suits a wide range of harmonic signal sources. Analog-related errors are modeled to evaluate the feasibility of and to set design parameters for an IC implementation to maintain good system-level performance. Design of a preliminary transistor-level integrator circuit in a 0.13 μm complementary metal-oxide-semiconductor (CMOS) integrated circuit process showed the ability to use online self-calibration to reduce fabrication errors to a sufficiently low level. Estimated power dissipation is about three orders of magnitude less than similar vehicle classification systems that use commercially available FFT spectral extraction. PMID:23892765
Active Hearing Mechanisms Inspire Adaptive Amplification in an Acoustic Sensor System.
Guerreiro, Jose; Reid, Andrew; Jackson, Joseph C; Windmill, James F C
2018-06-01
Over many millions of years of evolution, nature has developed some of the most adaptable sensors and sensory systems possible, capable of sensing, conditioning and processing signals in a very power- and size-effective manner. By looking into biological sensors and systems as a source of inspiration, this paper presents the study of a bioinspired concept of signal processing at the sensor level. By exploiting a feedback control mechanism between a front-end acoustic receiver and back-end neuronal based computation, a nonlinear amplification with hysteretic behavior is created. Moreover, the transient response of the front-end acoustic receiver can also be controlled and enhanced. A theoretical model is proposed and the concept is prototyped experimentally through an embedded system setup that can provide dynamic adaptations of a sensory system comprising a MEMS microphone placed in a closed-loop feedback system. It faithfully mimics the mosquito's active hearing response as a function of the input sound intensity. This is an adaptive acoustic sensor system concept that can be exploited by sensor and system designers within acoustics and ultrasonic engineering fields.
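A toy sketch of the adaptive-gain idea follows, assuming a simple two-threshold (hysteretic) rule in which the back-end raises the front-end gain for quiet inputs and compresses loud ones. The thresholds, step size, and gain limits are invented for illustration and are not the paper's controller.

```python
# Two-threshold adaptive gain with hysteresis (illustrative values only).
def update_gain(gain, level_db, low_db=-60.0, high_db=-30.0,
                g_min=1.0, g_max=100.0, step=1.5):
    if level_db < low_db and gain < g_max:
        gain *= step          # quiet input: amplify more (active-hearing-style boost)
    elif level_db > high_db and gain > g_min:
        gain /= step          # loud input: compress to protect dynamic range
    return min(max(gain, g_min), g_max)

gain = 10.0
for level in [-70, -70, -50, -20, -20, -65]:
    gain = update_gain(gain, level)
    print(f"level {level:4d} dB -> gain {gain:6.1f}")
```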
Redundancy, Self-Motion, and Motor Control
Martin, V.; Scholz, J. P.; Schöner, G.
2011-01-01
Outside the laboratory, human movement typically involves redundant effector systems. How the nervous system selects among the task-equivalent solutions may provide insights into how movement is controlled. We propose a process model of movement generation that accounts for the kinematics of goal-directed pointing movements performed with a redundant arm. The key element is a neuronal dynamics that generates a virtual joint trajectory. This dynamics receives input from a neuronal timer that paces end-effector motion along its path. Within this dynamics, virtual joint velocity vectors that move the end effector are dynamically decoupled from velocity vectors that do not. Moreover, the sensed real joint configuration is coupled back into this neuronal dynamics, updating the virtual trajectory so that it yields to task-equivalent deviations from the dynamic movement plan. Experimental data from participants who perform in the same task setting as the model are compared in detail to the model predictions. We discover that joint velocities contain a substantial amount of self-motion that does not move the end effector. This is caused by the low impedance of muscle joint systems and by coupling among muscle joint systems due to multiarticulatory muscles. Back-coupling amplifies the induced control errors. We establish a link between the amount of self-motion and how curved the end-effector path is. We show that models in which an inverse dynamics cancels interaction torques predict too little self-motion and too straight end-effector paths. PMID:19718817
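The split implied above between task motion and self-motion can be sketched with a Jacobian pseudoinverse for a planar three-link arm. The link lengths, joint angles, and velocities below are arbitrary; this is a generic kinematic illustration, not the paper's process model.

```python
# Decompose joint velocities into task motion (range space of the Jacobian)
# and self-motion (null space), which leaves the end effector stationary.
import numpy as np

L = np.array([0.3, 0.3, 0.2])            # link lengths (m), illustrative

def jacobian(q):
    """2x3 Jacobian of the planar end-effector position w.r.t. joint angles."""
    s = np.cumsum(q)                      # absolute link angles
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -np.sum(L[i:] * np.sin(s[i:]))
        J[1, i] = np.sum(L[i:] * np.cos(s[i:]))
    return J

q = np.array([0.3, 0.4, 0.5])             # joint configuration (rad)
qdot = np.array([0.2, -0.1, 0.3])         # measured joint velocities (rad/s)
J = jacobian(q)
task = np.linalg.pinv(J) @ (J @ qdot)      # component that moves the end effector
self_motion = qdot - task                  # component that does not
print(np.round(J @ self_motion, 12))       # ~0: self-motion leaves the end effector still
print(self_motion)
```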
Readout, first- and second-level triggers of the new Belle silicon vertex detector
NASA Astrophysics Data System (ADS)
Friedl, M.; Abe, R.; Abe, T.; Aihara, H.; Asano, Y.; Aso, T.; Bakich, A.; Browder, T.; Chang, M. C.; Chao, Y.; Chen, K. F.; Chidzik, S.; Dalseno, J.; Dowd, R.; Dragic, J.; Everton, C. W.; Fernholz, R.; Fujii, H.; Gao, Z. W.; Gordon, A.; Guo, Y. N.; Haba, J.; Hara, K.; Hara, T.; Harada, Y.; Haruyama, T.; Hasuko, K.; Hayashi, K.; Hazumi, M.; Heenan, E. M.; Higuchi, T.; Hirai, H.; Hitomi, N.; Igarashi, A.; Igarashi, Y.; Ikeda, H.; Ishino, H.; Itoh, K.; Iwaida, S.; Kaneko, J.; Kapusta, P.; Karawatzki, R.; Kasami, K.; Kawai, H.; Kawasaki, T.; Kibayashi, A.; Koike, S.; Korpar, S.; Križan, P.; Kurashiro, H.; Kusaka, A.; Lesiak, T.; Limosani, A.; Lin, W. C.; Marlow, D.; Matsumoto, H.; Mikami, Y.; Miyake, H.; Moloney, G. R.; Mori, T.; Nakadaira, T.; Nakano, Y.; Natkaniec, Z.; Nozaki, S.; Ohkubo, R.; Ohno, F.; Okuno, S.; Onuki, Y.; Ostrowicz, W.; Ozaki, H.; Peak, L.; Pernicka, M.; Rosen, M.; Rozanska, M.; Sato, N.; Schmid, S.; Shibata, T.; Stamen, R.; Stanič, S.; Steininger, H.; Sumisawa, K.; Suzuki, J.; Tajima, H.; Tajima, O.; Takahashi, K.; Takasaki, F.; Tamura, N.; Tanaka, M.; Taylor, G. N.; Terazaki, H.; Tomura, T.; Trabelsi, K.; Trischuk, W.; Tsuboyama, T.; Uchida, K.; Ueno, K.; Ueno, K.; Uozaki, N.; Ushiroda, Y.; Vahsen, S.; Varner, G.; Varvell, K.; Velikzhanin, Y. S.; Wang, C. C.; Wang, M. Z.; Watanabe, M.; Watanabe, Y.; Yamada, Y.; Yamamoto, H.; Yamashita, Y.; Yamashita, Y.; Yamauchi, M.; Yanai, H.; Yang, R.; Yasu, Y.; Yokoyama, M.; Ziegler, T.; Žontar, D.
2004-12-01
A major upgrade of the Silicon Vertex Detector (SVD 2.0) of the Belle experiment at the KEKB factory was installed along with new front-end and back-end electronics systems during the summer shutdown period in 2003 to cope with higher particle rates, improve the track resolution and meet the increasing requirements of radiation tolerance. The SVD 2.0 detector modules are read out by VA1TA chips which provide "fast or" (hit) signals that are combined by the back-end FADCTF modules into coarse but immediate level-0 track trigger signals at rates of several tens of kHz. Moreover, the digitized detector signals are compared to threshold lookup tables in the FADCTFs to pass on hit information on a single-strip basis to the subsequent level-1.5 trigger system, which reduces the rate below the kHz range. Both the FADCTF and level-1.5 electronics make use of parallel real-time processing in Field Programmable Gate Arrays (FPGAs), while further data acquisition and event building is done by PC farms running Linux. The new readout system hardware is described and the first results obtained with cosmics are shown.
Interferometric direction finding with a metamaterial detector
NASA Astrophysics Data System (ADS)
Venkatesh, Suresh; Shrekenhamer, David; Xu, Wangren; Sonkusale, Sameer; Padilla, Willie; Schurig, David
2013-12-01
We present measurements and analysis demonstrating useful direction finding of sources in the S band (2-4 GHz) using a metamaterial detector. An augmented metamaterial absorber that supports magnitude and phase measurement of the incident electric field, within each unit cell, is described. The metamaterial is implemented in a commercial printed circuit board process with off-board back-end electronics. We also discuss on-board back-end implementation strategies. Direction finding performance is analyzed for the fabricated metamaterial detector using simulated data and the standard algorithm, MUltiple SIgnal Classification (MUSIC). The performance of this complete system is characterized by its angular resolution as a function of radiation density at the detector. Sources with power outputs typical of mobile communication devices can be resolved at kilometer distances with sub-degree resolution and high frame rates.
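For reference, a minimal NumPy sketch of the MUSIC estimator for a uniform linear array is shown below. The element count, spacing, and noise level are illustrative and do not correspond to the metamaterial detector.

```python
# MUSIC pseudospectrum for a uniform linear array (illustrative parameters).
import numpy as np

def music_spectrum(snapshots, n_sources, d_over_lambda=0.5,
                   angles=np.linspace(-90, 90, 361)):
    """snapshots: (n_elements, n_snapshots) complex field samples."""
    n = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    _, vecs = np.linalg.eigh(R)                               # eigenvalues ascending
    En = vecs[:, : n - n_sources]                             # noise subspace
    k = np.arange(n)[:, None]
    A = np.exp(-2j * np.pi * d_over_lambda * k * np.sin(np.deg2rad(angles)))
    return angles, 1.0 / np.linalg.norm(En.conj().T @ A, axis=0) ** 2

# Synthetic test: one source at +20 degrees on an 8-element half-wavelength array.
n, snaps, theta = 8, 200, np.deg2rad(20.0)
steer = np.exp(-2j * np.pi * 0.5 * np.arange(n) * np.sin(theta))
x = steer[:, None] * np.exp(2j * np.pi * np.random.rand(snaps)) + 0.05 * (
    np.random.randn(n, snaps) + 1j * np.random.randn(n, snaps))
ang, p = music_spectrum(x, n_sources=1)
print(ang[np.argmax(p)])    # close to 20
```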
Code of Federal Regulations, 2011 CFR
2011-07-01
... current process operating conditions. (iii) Design analysis based on accepted chemical engineering... quantity are production records, measurement of stream characteristics, and engineering calculations. (5...-end process operations using engineering assessment. Engineering assessment includes, but is not...
Code of Federal Regulations, 2013 CFR
2013-07-01
... current process operating conditions. (iii) Design analysis based on accepted chemical engineering... quantity are production records, measurement of stream characteristics, and engineering calculations. (5...-end process operations using engineering assessment. Engineering assessment includes, but is not...
Code of Federal Regulations, 2014 CFR
2014-07-01
... current process operating conditions. (iii) Design analysis based on accepted chemical engineering... quantity are production records, measurement of stream characteristics, and engineering calculations. (5...-end process operations using engineering assessment. Engineering assessment includes, but is not...
Code of Federal Regulations, 2012 CFR
2012-07-01
... current process operating conditions. (iii) Design analysis based on accepted chemical engineering... quantity are production records, measurement of stream characteristics, and engineering calculations. (5...-end process operations using engineering assessment. Engineering assessment includes, but is not...
Integrating RFID technique to design mobile handheld inventory management system
NASA Astrophysics Data System (ADS)
Huang, Yo-Ping; Yen, Wei; Chen, Shih-Chung
2008-04-01
An RFID-based mobile handheld inventory management system is proposed in this paper. Differing from the manual inventory management method, the proposed system works on the personal digital assistant (PDA) with an RFID reader. The system identifies electronic tags on the properties and checks the property information in the back-end database server through a ubiquitous wireless network. The system also provides a set of functions to manage the back-end inventory database and assigns different levels of access privilege according to various user categories. In the back-end database server, to prevent improper or illegal accesses, the server not only stores the inventory database and user privilege information, but also keeps track of the user activities in the server including the login and logout time and location, the records of database accessing, and every modification of the tables. Some experimental results are presented to verify the applicability of the integrated RFID-based mobile handheld inventory management system.
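The back-end lookup-and-audit behavior described above might be sketched as follows; the table schema, tag ID, and field names are hypothetical and not taken from the paper.

```python
# Look up a scanned tag in an inventory table and record the access in an audit log.
import sqlite3
from datetime import datetime, timezone

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE inventory (tag_id TEXT PRIMARY KEY, item TEXT, location TEXT)")
db.execute("CREATE TABLE audit (tag_id TEXT, user TEXT, when_utc TEXT)")
db.execute("INSERT INTO inventory VALUES ('E20034120613', 'projector', 'room 204')")

def check_tag(tag_id: str, user: str):
    db.execute("INSERT INTO audit VALUES (?,?,?)",
               (tag_id, user, datetime.now(timezone.utc).isoformat()))
    row = db.execute("SELECT item, location FROM inventory WHERE tag_id=?",
                     (tag_id,)).fetchone()
    return row if row else "unknown tag"

print(check_tag("E20034120613", "auditor01"))   # ('projector', 'room 204')
```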
Cantrell, Jennifer; Ganz, Ollie; Ilakkuvan, Vinu; Tacelosky, Michael; Kreslake, Jennifer; Moon-Howard, Joyce; Aidala, Angela; Vallone, Donna; Anesetti-Rothermel, Andrew; Kirchner, Thomas R
2015-01-01
In tobacco control and other fields, point-of-sale surveillance of the retail environment is critical for understanding industry marketing of products and informing public health practice. Innovations in mobile technology can improve existing, paper-based surveillance methods, yet few studies describe in detail how to operationalize the use of technology in public health surveillance. The aims of this paper are to share implementation strategies and lessons learned from 2 tobacco, point-of-sale surveillance projects to inform and prepare public health researchers and practitioners to implement new mobile technologies in retail point-of-sale surveillance systems. From 2011 to 2013, 2 point-of-sale surveillance pilot projects were conducted in Washington, DC, and New York, New York, to capture information about the tobacco retail environment and test the feasibility of a multimodal mobile data collection system, which included capabilities for audio or video recording data, electronic photographs, electronic location data, and a centralized back-end server and dashboard. We established a preimplementation field testing process for both projects, which involved a series of rapid and iterative tests to inform decisions and establish protocols around key components of the project. Important components of field testing included choosing a mobile phone that met project criteria, establishing an efficient workflow and accessible user interfaces for each component of the system, training and providing technical support to fieldworkers, and developing processes to integrate data from multiple sources into back-end systems that can be utilized in real-time. A well-planned implementation process is critical for successful use and performance of multimodal mobile surveillance systems. Guidelines for implementation include (1) the need to establish and allow time for an iterative testing framework for resolving technical and logistical challenges; (2) developing a streamlined workflow and user-friendly interfaces for data collection; (3) allowing for ongoing communication, feedback, and technology-related skill-building among all staff; and (4) supporting infrastructure for back-end data systems. Although mobile technologies are evolving rapidly, lessons learned from these case studies are essential for ensuring that the many benefits of new mobile systems for rapid point-of-sale surveillance are fully realized.
NASA Astrophysics Data System (ADS)
Krawczyk, Rafał Dominik; Czarski, Tomasz; Linczuk, Paweł; Wojeński, Andrzej; Kolasiński, Piotr; Gąska, Michał; Chernyshova, Maryna; Mazon, Didier; Jardin, Axel; Malard, Philippe; Poźniak, Krzysztof; Kasprowicz, Grzegorz; Zabołotny, Wojciech; Kowalska-Strzeciwilk, Ewa; Malinowski, Karol
2018-06-01
This article presents novel software-defined, server-based solutions introduced in the fast, real-time computation system for soft X-ray diagnostics of the WEST (Tungsten Environment in Steady-state Tokamak) reactor in Cadarache, France. The objective of the research was to provide fast processing of data at high throughput and with low latencies for investigating the interplay between particle transport and magnetohydrodynamic activity. The long-term objective is to implement in the future a fast feedback signal in the reactor control mechanisms to sustain the fusion reaction. The implemented electronic measurement device is anticipated to be deployed in WEST. A standalone software-defined computation engine was designed to handle data collected at high rates in the server back-end of the system. Signals are obtained from the front-end field-programmable gate array mezzanine cards that acquire and select data from the gas electron multiplier detector. A fast, in-house library for plasma diagnostics was written in C++. It originated from reference offline MATLAB implementations, which were redesigned for runtime analysis during the experiment in the novel online modes of operation. The implementation allowed benchmarking, evaluation, and optimization of plasma processing algorithms, with the possibility of checking consistency against reference computations written in MATLAB. The back-end software and hardware architecture are presented together with the data evaluation mechanisms. The online modes of operation for WEST are discussed. The results concerning the performance of the processing and the introduced functionality are presented.
'Second generation' Internet e-health: the gladiator for HIPAA compliance?
Korpman, R A; Rose, J S
2001-01-01
The Health Insurance Portability and Accountability Act (HIPAA) is intended to simplify administrative processes and improve health information security. There are a number of traditional ways to address the expense and complexities of simplification, but none of them are bargains or beauties to behold: (1) Do-it-yourself encryption; (2) new back-end system purchases; (3) legacy system re-programming; or (4) onerous paper documentation. The good news is that 'second generation' e-health solutions are emerging that act as internal "wrappers" for health plan or provider data systems. They provide both an interface for end-users and a layer of security for organizational information and allow detailed patient-related data to remain at the system owner's physical location. These second-generation solutions don't just 'connect' data; they actually 'understand' the information, and can use data elements to invoke necessary rules, processing pathways, or personalization for specific stakeholders as required by HIPAA.
NASA Astrophysics Data System (ADS)
Lee, Kijeong; Park, Byungjoo; Park, Gil-Cheol
Radio frequency identification (RFID) is a generic term used to describe a system that transmits the identity (in the form of a unique serial number) of an object or person wirelessly, using radio waves. However, the RFID system has security threats related to its technical components; for example, illegal RFID readers can read a tag's ID, since tags respond to most RFID readers, a threat that needs in-depth attention. Previous studies offer some ideas on how to minimize these security threats, such as studying the security protocols between tag, reader, and back-end DB. In this research, the team proposes an RFID Tag ID Subdivision Scheme to authenticate only permitted tags in a USN (Ubiquitous Sensor Network). Using the proposed scheme, the back-end DB authenticates selected tags only, to minimize security threats such as eavesdropping and to decrease traffic in the back-end DB.
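A minimal sketch of the subdivision idea follows, assuming permitted tags are identified by hypothetical ID ranges that the back-end DB is willing to authenticate; the actual scheme's ID structure and protocol are not reproduced.

```python
# Accept only tags whose ID falls inside a registered subdivision (hypothetical ranges).
PERMITTED_RANGES = [("3000A000", "3000AFFF"), ("3000C000", "3000C7FF")]

def is_permitted(tag_id: str) -> bool:
    return any(lo <= tag_id.upper() <= hi for lo, hi in PERMITTED_RANGES)

for tag in ["3000A123", "3000B999", "3000C0FF"]:
    print(tag, "accepted" if is_permitted(tag) else "rejected")
```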
CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling
NASA Astrophysics Data System (ADS)
Rose, B. E. J.
2015-12-01
Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However, CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
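A minimal usage sketch of the kind of interactive experiment CLIMLAB supports is shown below. It assumes the climlab package is installed; the class and method names follow its documented energy-balance-model interface, though details may differ between versions.

```python
# Spin up a simple zonal-mean energy balance model and inspect its state.
import climlab

ebm = climlab.EBM()           # energy balance model with default parameters
ebm.integrate_years(5)        # step the model forward five years
print(float(ebm.Ts.mean()))   # (unweighted) mean surface temperature, degrees C
```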
Common command-and-control user interface for current force UGS
NASA Astrophysics Data System (ADS)
Stolovy, Gary H.
2009-05-01
The Current Force Unattended Ground Sensors (UGS) comprise the OmniSense, Scorpion, and Silent Watch systems. As deployed by U.S. Army Central Command in 2006, sensor reports from the three systems were integrated into a common Graphical User Interface (GUI), with three separate vendor-specific applications for Command-and-Control (C2) functions. This paper describes the requirements, system architecture, implementation, and testing of an upgrade to the Processing, Exploitation, and Dissemination back-end server to incorporate common remote Command-and-Control capabilities.
Efficient heart beat detection using embedded system electronics
NASA Astrophysics Data System (ADS)
Ramasamy, Mouli; Oh, Sechang; Varadan, Vijay K.
2014-04-01
The present day bio-technical field concentrates on developing various types of innovative ambulatory and wearable devices to monitor several bio-physical, physio-pathological, bio-electrical and bio-potential factors to assess a human body's health condition without intruding on quotidian activities. One of the most important aspects of this evolving technology is monitoring heart beat rate and electrocardiogram (ECG), from which many other subsidiary results can be derived. Conventionally, the devices and systems consume a lot of power since the acquired signals are always processed on the receiver end. Because of this back-end processing, the unprocessed raw data is transmitted, resulting in usage of more power, memory and processing time. This paper proposes an innovative technique where the acquired signals are processed by a microcontroller in the front end of the module and just the processed signal is then transmitted wirelessly to the display unit. Therefore, power consumption is considerably reduced and clearer data analysis is performed within the module. This also avoids the need for the user to be educated about usage of the device and signal/system analysis, since only the number of heart beats will be displayed at the user end. Additionally, the proposed concept also eliminates other disadvantages like obtrusiveness, high power consumption and size. To demonstrate the above factors, a commercial controller board was used to extend the monitoring method by using the saved ECG data from a computer.
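The front-end beat counting described above can be sketched with a simple threshold-plus-refractory-period detector that converts a sampled pulse waveform into beats per minute. The sampling rate, threshold, and refractory period are illustrative; the module's actual firmware is not reproduced.

```python
# Threshold-plus-refractory-period beat counter (illustrative parameters).
import math

def beats_per_minute(samples, fs=250.0, threshold=0.6, refractory_s=0.25):
    beats, last_beat = 0, -1e9
    for i, v in enumerate(samples):
        t = i / fs
        if v > threshold and (t - last_beat) > refractory_s:
            beats += 1                    # count a beat and start the refractory window
            last_beat = t
    return beats * 60.0 / (len(samples) / fs)

# Synthetic 9.6-second signal with a pulse every 0.8 s (75 beats per minute):
sig = [1.0 if i % int(0.8 * 250) == 0 else 0.1 * math.sin(i / 5.0) for i in range(2400)]
print(beats_per_minute(sig))   # ~75
```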
End-to-end communication test on variable length packet structures utilizing AOS testbed
NASA Technical Reports Server (NTRS)
Miller, Warner H.; Sank, V.; Fong, Wai; Miko, J.; Powers, M.; Folk, John; Conaway, B.; Michael, K.; Yeh, Pen-Shu
1994-01-01
This paper describes a communication test, which successfully demonstrated the transfer of losslessly compressed images in an end-to-end system. These compressed images were first formatted into variable-length Consultative Committee for Space Data Systems (CCSDS) packets in the Advanced Orbiting System Testbed (AOST). The CCSDS data structures were transferred from the AOST to the Radio Frequency Simulations Operations Center (RFSOC), via a fiber optic link, where data was then transmitted through the Tracking and Data Relay Satellite System (TDRSS). The received data acquired at the White Sands Complex (WSC) was transferred back to the AOST where the data was captured and decompressed back to the original images. This paper describes the compression algorithm, the AOST configuration, key flight components, data formats, and the communication link characteristics and test results.
Back-end of the fuel cycle - Indian scenario
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wattal, P.K.
Nuclear power has a key role in meeting the energy demands of India. This can be sustained by ensuring robust technology for the back end of the fuel cycle. Considering the modest indigenous resources of U and a huge Th reserve, India has adopted a three-stage Nuclear Power Programme (NPP) based on a 'closed fuel cycle' approach. This option of 'Recovery and Recycle' serves the twin objectives of ensuring an adequate supply of nuclear fuel and reducing the long-term radio-toxicity of the wastes. Reprocessing of the spent fuel by the Purex process is currently employed. High Level Liquid Waste (HLW) generated during reprocessing is vitrified and undergoes interim storage. Back-end technologies are constantly modified to address waste volume minimization and radio-toxicity reduction. Long-term management of HLW in the Indian context would involve partitioning of long-lived minor actinides and recovery of valuable fission products, specifically cesium. Recovery of minor actinides from HLW and their recycle is highly desirable for the sustained growth of India's NPPs. In this context, a programme for developing and deploying partitioning technologies on an industrial scale is being pursued. The partitioned elements could be transmuted in Fast Reactors (FRs) or Accelerator Driven Systems (ADS) as an integral part of a sustainable Indian NPP. (authors)
Mantle Flow and Melting Processes Beneath Back-Arc Basins
NASA Astrophysics Data System (ADS)
Hall, P. S.
2007-12-01
The chemical systematics of back-arc basin basalts suggest that multiple mechanisms of melt generation and transport operate simultaneously beneath the back-arc, resulting in a continuum of melts ranging from a relatively dry, MORB-like end-member to a wet, slab-influenced end-member [e.g., Kelley et al., 2006; Langmuir et al., 2006]. Potential melting processes at work include adiabatic decompression melting akin to that at mid-ocean ridges, diapiric upwelling of hydrous and/or partially molten mantle from above the subducting lithospheric slab [e.g., Marsh, 1979; Hall and Kincaid, 2001; Gerya and Yuen, 2003], and melting of back-arc mantle due to a continuous flux of slab-derived hydrous fluid [Kelley et al., 2006]. In this study, we examine the potential for each of these melting mechanisms to contribute to the observed distribution of melts in back-arc basins within the context of upper mantle flow (driven by plate motions) beneath back-arcs, which ultimately controls temperatures within the melting region. Mantle velocities and temperatures are derived from numerical geodynamic models of subduction with back-arc spreading that explicitly include adiabatic decompression melting through a Lagrangian particle scheme and a parameterization of hydrous melting. Dynamical feedback from the melting process occurs through latent heating and viscosity increases related to dehydration. A range of parameters, including subduction rate and trench-back-arc separation distances, is explored. The thermal evolution of individual diapirs is modeled numerically as they traverse the mantle, from nucleation above the subducting slab to melting beneath the back-arc spreading center, and a range of diapir sizes and densities is considered.
Brain Computer Interface on Track to Home.
Miralles, Felip; Vargiu, Eloisa; Dauwalder, Stefan; Solà, Marc; Müller-Putz, Gernot; Wriessnegger, Selina C; Pinegger, Andreas; Kübler, Andrea; Halder, Sebastian; Käthner, Ivo; Martin, Suzanne; Daly, Jean; Armstrong, Elaine; Guger, Christoph; Hintermüller, Christoph; Lowish, Hannah
2015-01-01
The novel BackHome system offers individuals with disabilities a range of useful services available via brain-computer interfaces (BCIs), to help restore their independence. This is the first time such technology is ready to be deployed in the real world, that is, at the target end users' home. This has been achieved by the development of practical electrodes, easy-to-use software, and delivering telemonitoring and home support capabilities which have been conceived, implemented, and tested within a user-centred design approach. The final BackHome system is the result of a 3-year-long process involving extensive user engagement to maximize effectiveness, reliability, robustness, and ease of use of a home based BCI system. The system comprises ergonomic and hassle-free BCI equipment; one-click software services for Smart Home control, cognitive stimulation, and web browsing; and remote telemonitoring and home support tools to enable independent home use for nonexpert caregivers and users. BackHome aims to successfully bring BCIs to the home of people with limited mobility to restore their independence and ultimately improve their quality of life.
Brain Computer Interface on Track to Home
Miralles, Felip; Dauwalder, Stefan; Müller-Putz, Gernot; Wriessnegger, Selina C.; Pinegger, Andreas; Kübler, Andrea; Halder, Sebastian; Käthner, Ivo; Guger, Christoph; Lowish, Hannah
2015-01-01
The novel BackHome system offers individuals with disabilities a range of useful services available via brain-computer interfaces (BCIs), to help restore their independence. This is the first time such technology is ready to be deployed in the real world, that is, at the target end users' home. This has been achieved by the development of practical electrodes, easy-to-use software, and delivering telemonitoring and home support capabilities which have been conceived, implemented, and tested within a user-centred design approach. The final BackHome system is the result of a 3-year-long process involving extensive user engagement to maximize effectiveness, reliability, robustness, and ease of use of a home based BCI system. The system comprises ergonomic and hassle-free BCI equipment; one-click software services for Smart Home control, cognitive stimulation, and web browsing; and remote telemonitoring and home support tools to enable independent home use for nonexpert caregivers and users. BackHome aims to successfully bring BCIs to the home of people with limited mobility to restore their independence and ultimately improve their quality of life. PMID:26167530
40 CFR 63.494 - Back-end process provisions-residual organic HAP limitations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... technology or control or recovery devices. (1) For styrene butadiene rubber produced by the emulsion process... rubber produced by any process other than a solution or emulsion process, polybutadiene rubber produced...
NASA Astrophysics Data System (ADS)
Tsukamoto, O.; Utsunomiya, A.
2007-10-01
We propose an HTS bulk bearing flywheel energy system (FWES) with a rotor shaft stabilization system using feed-back control of the armature currents of the motor-generator. In the proposed system the rotor shaft has a pivot bearing at one end of the shaft and an HTS bulk bearing (SMB) at the other end. The fluctuation of the rotor shaft with SMB is damped by feed-back control of the armature currents of the motor-generator, sensing the position of the rotor shaft. The method has the merit that the fluctuations are damped without active magnetic control bearings or extra devices that may degrade the energy storage efficiency and add cost. The principle of the method was demonstrated by an experiment using a model permanent magnet motor.
Upgraded Readout Electronics for the ATLAS Liquid Argon Calorimeters at the High Luminosity LHC
NASA Astrophysics Data System (ADS)
Andeen, Timothy R.; ATLAS Liquid Argon Calorimeter Group
2012-12-01
The ATLAS liquid-argon calorimeters produce a total of 182,486 signals which are digitized and processed by the front-end and back-end electronics at every triggered event. In addition, the front-end electronics sum analog signals to provide coarsely grained energy sums, called trigger towers, to the first-level trigger system, which is optimized for nominal LHC luminosities. However, the pile-up background expected during the high luminosity phases of the LHC will be increased by factors of 3 to 7. An improved spatial granularity of the trigger primitives is therefore proposed in order to improve the identification performance for trigger signatures, like electrons or photons, at high background rejection rates. For the first upgrade phase in 2018, new Liquid Argon Trigger Digitizer Boards are being designed to receive higher granularity signals, digitize them on detector and send them via fast optical links to a new, off-detector digital processing system. The digital processing system applies digital filtering and identifies significant energy depositions. The refined trigger primitives are then transmitted to the first level trigger system to extract improved trigger signatures. The general concept of the upgraded liquid-argon calorimeter readout together with the various electronics components to be developed for such a complex system is presented. The research activities and architectural studies undertaken by the ATLAS Liquid Argon Calorimeter Group are described, particularly details of the on-going design of mixed-signal front-end electronics, of radiation tolerant optical-links, and of the high-speed off-detector digital processing system.
Web-based DAQ systems: connecting the user and electronics front-ends
NASA Astrophysics Data System (ADS)
Lenzi, Thomas
2016-12-01
Web technologies are quickly evolving and are gaining in computational power and flexibility, allowing for a paradigm shift in the field of Data Acquisition (DAQ) systems design. Modern web browsers offer the possibility to create intricate user interfaces and are able to process and render complex data. Furthermore, new web standards such as WebSockets allow for fast real-time communication between the server and the user with minimal overhead. Those improvements make it possible to move the control and monitoring operations from the back-end servers directly to the user and to the front-end electronics, thus reducing the complexity of the data acquisition chain. Moreover, web-based DAQ systems offer greater flexibility, accessibility, and maintainability on the user side than traditional applications which often lack portability and ease of use. As proof of concept, we implemented a simplified DAQ system on a mid-range Spartan6 Field Programmable Gate Array (FPGA) development board coupled to a digital front-end readout chip. The system is connected to the Internet and can be accessed from any web browser. It is composed of custom code to control the front-end readout and of a dual soft-core Microblaze processor to communicate with the client.
Status of the photomultiplier-based FlashCam camera for the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Pühlhofer, G.; Bauer, C.; Eisenkolb, F.; Florin, D.; Föhr, C.; Gadola, A.; Garrecht, F.; Hermann, G.; Jung, I.; Kalekin, O.; Kalkuhl, C.; Kasperek, J.; Kihm, T.; Koziol, J.; Lahmann, R.; Manalaysay, A.; Marszalek, A.; Rajda, P. J.; Reimer, O.; Romaszkan, W.; Rupinski, M.; Schanz, T.; Schwab, T.; Steiner, S.; Straumann, U.; Tenzer, C.; Vollhardt, A.; Weitzel, Q.; Winiarski, K.; Zietara, K.
2014-07-01
The FlashCam project is preparing a camera prototype around a fully digital FADC-based readout system for the medium-sized telescopes (MST) of the Cherenkov Telescope Array (CTA). The FlashCam design is the first fully digital readout system for Cherenkov cameras, based on commercial FADCs and FPGAs as key components for digitization and triggering, and a high-performance camera server as the back end. It provides the option to easily implement different types of trigger algorithms as well as digitization and readout scenarios using identical hardware, by simply changing the firmware on the FPGAs. The readout of the front-end modules into the camera server is Ethernet-based, using standard Ethernet switches and a custom, raw Ethernet protocol. In the current implementation of the system, data transfer and back-end processing rates of 3.8 GB/s and 2.4 GB/s have been achieved, respectively. Together with the dead-time-free front-end event buffering on the FPGAs, this permits the cameras to operate at trigger rates of up to several tens of kHz. In the horizontal architecture of FlashCam, the photon detector plane (PDP), consisting of photon detectors, preamplifiers, high-voltage, control, and monitoring systems, is a self-contained unit, mechanically detached from the front-end modules. It interfaces to the digital readout system via analogue signal transmission. The horizontal integration of FlashCam is expected not only to be more cost efficient, it also allows PDPs with different types of photon detectors to be adapted to the FlashCam readout system. By now, a 144-pixel "mini-camera" setup, fully equipped with photomultipliers, PDP electronics, and digitization/trigger electronics, has been realized and extensively tested. Preparations for the full-scale 1764-pixel camera mechanics and a cooling system are ongoing. The paper describes the status of the project.
Ripple-aware optical proximity correction fragmentation for back-end-of-line designs
NASA Astrophysics Data System (ADS)
Wang, Jingyu; Wilkinson, William
2018-01-01
Accurate characterization of image rippling is critical in early detection of back-end-of-line (BEOL) patterning weakpoints, as most defects are strongly associated with excessive rippling that does not get effectively compensated by optical proximity correction (OPC). We correlate image contour with design shapes to account for design geometry-dependent rippling signature, and explore the best practice of OPC fragmentation for BEOL geometries. Specifically, we predict the optimum contour as allowed by the lithographic process and illumination conditions and locate ripple peaks, valleys, and inflection points. This allows us to identify potential process weakpoints and segment the mask accordingly to achieve the best correction results.
Singh, Ravendra; Román-Ospino, Andrés D; Romañach, Rodolfo J; Ierapetritou, Marianthi; Ramachandran, Rohit
2015-11-10
The pharmaceutical industry is strictly regulated, and precise and accurate control of the end product quality is necessary to ensure the effectiveness of the drug products. For such control, the process and raw material variability ideally needs to be fed forward in real time into an automatic control system so that proactive action can be taken before it affects the end product quality. Variations in raw material properties (e.g., particle size), feeder hopper level, amount of lubrication, milling and blending action, and applied shear in different processing stages can affect the blend density significantly and thereby tablet weight, hardness and dissolution. Therefore, real-time monitoring of powder bulk density variability and its incorporation into the automatic control system, so that its effect can be mitigated proactively and efficiently, is highly desired. However, real-time monitoring of powder bulk density is still a challenging task because of different levels of complexity. In this work, powder bulk density, which has a significant effect on the critical quality attributes (CQAs), has been monitored in real time in a pilot-plant facility using a NIR sensor. The sensitivity of the powder bulk density to critical process parameters (CPPs) and CQAs has been analyzed, and thereby a feed-forward controller has been designed. The measured signal can be used for feed-forward control so that corrective actions on the density variations can be taken before they can influence the product quality. The coupled feed-forward/feed-back control system demonstrates improved control performance and improvements in the final product quality in the presence of process and raw material variations. Copyright © 2015 Elsevier B.V. All rights reserved.
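A minimal numeric sketch of the coupled feed-forward/feed-back idea described above; the plant model, gains and disturbance signal are assumptions for illustration, not the paper's process model.

```python
import numpy as np

def simulate(steps=200, use_feedforward=True):
    setpoint = 100.0              # target tablet weight (arbitrary units)
    weight = 100.0
    integral = 0.0
    kp, ki = 0.4, 0.05            # feedback PI gains (assumed)
    kff = 0.8                     # feed-forward gain on measured density deviation (assumed)
    history = []
    for k in range(steps):
        density_dev = 0.1 * np.sin(2 * np.pi * k / 50)   # measured (NIR) bulk density deviation
        error = setpoint - weight
        integral += error
        u_fb = kp * error + ki * integral                 # feedback action
        u_ff = -kff * density_dev if use_feedforward else 0.0
        # Toy plant: weight drifts with the density disturbance and responds to actuation.
        weight += 0.5 * (u_fb + u_ff) + 0.5 * density_dev
        history.append(weight)
    return np.std(np.array(history) - setpoint)

print("Weight variability, feedback only    :", round(simulate(use_feedforward=False), 3))
print("Weight variability, with feed-forward:", round(simulate(use_feedforward=True), 3))
```

Because the measured disturbance is compensated before it propagates to the product attribute, the feed-forward case shows markedly lower variability than feedback alone, which is the qualitative point the abstract makes.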
An extension of the OpenModelica compiler for using Modelica models in a discrete event simulation
Nutaro, James
2014-11-03
In this article, a new back-end and run-time system is described for the OpenModelica compiler. This new back-end transforms a Modelica model into a module for the adevs discrete event simulation package, thereby extending adevs to encompass complex, hybrid dynamical systems. The new run-time system that has been built within the adevs simulation package supports models with state-events and time-events that comprise high-index differential-algebraic systems. Finally, although the procedure for effecting this transformation is based on adevs and the Discrete Event System Specification, it can be adapted to any discrete event simulation package.
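A generic illustration (not the adevs or OpenModelica API) of what a hybrid run-time loop must do: integrate the continuous dynamics and handle a state event, here a zero crossing with discrete re-initialisation, as in a Modelica-style bouncing-ball model.

```python
def simulate_bouncing_ball(t_end=5.0, dt=1e-3):
    g, restitution = 9.81, 0.8
    h, v, t = 1.0, 0.0, 0.0          # height (m), velocity (m/s), time (s)
    events = []
    while t < t_end:
        h_new = h + v * dt           # continuous integration step
        v_new = v - g * dt
        if h_new <= 0.0 and v_new < 0.0:      # state event: ball hits the floor
            events.append(round(t, 3))
            h_new = 0.0
            v_new = -restitution * v_new      # discrete re-initialisation of the state
        h, v, t = h_new, v_new, t + dt
    return events

print("Bounce event times (s):", simulate_bouncing_ball())
```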
OPeNDAP Server4: Building a High-Performance Server for the DAP by Leveraging Existing Software
NASA Astrophysics Data System (ADS)
Potter, N.; West, P.; Gallagher, J.; Garcia, J.; Fox, P.
2006-12-01
OPeNDAP has been working in conjunction with NCAR/ESSL/HAO to develop a modular, high-performance data server that will be the successor to the current OPeNDAP data server. The new server, called Server4, is really two servers: a 'Back-End' data server which reads information from various types of data sources and packages the results in DAP objects, and a 'Front-End' which receives client DAP requests and then decides how to use features of the Back-End data server to build the correct responses. This architecture can be configured in several interesting ways: the Front- and Back-End components can be run on either the same or different machines, depending on security and performance needs; new Front-End software can be written to support other network data access protocols; and local applications can interact directly with the Back-End data server. This new server's Back-End component will use the server infrastructure developed by HAO for use with the Earth System Grid II project. Extensions needed to use it as part of the new OPeNDAP server were minimal. The HAO server was modified so that it loads 'data handlers' at run-time. Each data handler module only needs to satisfy a simple interface, which both enables the existing data handlers written for the old OPeNDAP server to be used directly and simplifies writing new handlers from scratch. The Back-End server leverages high-performance features developed for the ESG II project, so applications that can interact with it directly can read large volumes of data efficiently. The Front-End module of Server4 uses the Java Servlet system in place of the Common Gateway Interface (CGI) used in the past. New Front-End modules can be written to support different network data access protocols, so that the same server will ultimately be able to support more than the DAP/2.0 protocol. As an example, we will discuss a SOAP interface that is currently in development. In addition to support for DAP/2.0 and prototypical support for a SOAP interface, the new server includes support for the THREDDS cataloging protocol. THREDDS is tightly integrated into the Front-End of Server4. The Server4 Front-End can make full use of advanced THREDDS features such as attribute specification and inheritance, and custom catalogs which segue into automatically generated catalogs, as well as providing a default behavior which requires almost no catalog configuration.
Vortex Generators to Control Boundary Layer Interactions
NASA Technical Reports Server (NTRS)
Babinsky, Holger (Inventor); Loth, Eric (Inventor); Lee, Sang (Inventor)
2014-01-01
Devices for generating streamwise vorticity in a boundary layer include various forms of vortex generators. One form of a split-ramp vortex generator includes a first ramp element and a second ramp element with front ends and back ends, ramp surfaces extending between the front ends and the back ends, and vertical surfaces extending between the front ends and the back ends adjacent the ramp surfaces. A flow channel is between the first ramp element and the second ramp element. The back ends of the ramp elements have a height greater than a height of the front ends, and the front ends of the ramp elements have a width greater than a width of the back ends.
Development of the Subaru-Mitaka-Okayama-Kiso Archive System
NASA Astrophysics Data System (ADS)
Baba, Hajime; Yasuda, Naoki; Ichikawa, Shin-Ichi; Yagi, Masafumi; Iwamoto, Nobuyuki; Takata, Tadafumi; Horaguchi, Toshihiro; Taga, Masatoshi; Watanabe, Masaru; Ozawa, Tomohiko; Hamabe, Masaru
We have developed the Subaru-Mitaka-Okayama-Kiso-Archive (SMOKA) public science archive system which provides access to the data of the Subaru Telescope, the 188 cm telescope at Okayama Astrophysical Observatory, and the 105 cm Schmidt telescope at Kiso Observatory/University of Tokyo. SMOKA is the successor of the MOKA3 system. The user can browse the Quick-Look Images, Header Information (HDI) and the ASCII Table Extension (ATE) of each frame from the search result table. A request for data can be submitted in a simple manner. The system is developed with Java Servlets for the back-end, and Java Server Pages (JSP) for content display. The advantage of JSPs is the separation of the front-end presentation from the middle and back-end tiers, which led to efficient development of the system. The SMOKA homepage is publicly available online.
Satellite-Tracking Millimeter-Wave Reflector Antenna System For Mobile Satellite-Tracking
NASA Technical Reports Server (NTRS)
Densmore, Arthur C. (Inventor); Jamnejad, Vahraz (Inventor); Woo, Kenneth E. (Inventor)
2001-01-01
A miniature dual-band two-way mobile satellite-tracking antenna system mounted on a movable vehicle includes a miniature parabolic reflector dish having an elliptical aperture with major and minor elliptical axes aligned horizontally and vertically, respectively, to maximize azimuthal directionality and minimize elevational directionality to an extent corresponding to expected pitch excursions of the movable ground vehicle. A feed-horn has a back end and an open front end facing the reflector dish and has vertical side walls opening out from the back end to the front end at a lesser horn angle and horizontal top and bottom walls opening out from the back end to the front end at a greater horn angle. An RF circuit couples two different signal bands between the feed-horn and the user. An antenna attitude controller maintains an antenna azimuth direction relative to the satellite by rotating it in azimuth in response to sensed yaw motions of the movable ground vehicle so as to compensate for the yaw motions to within a pointing error angle. The controller sinusoidally dithers the antenna through a small azimuth dither angle greater than the pointing error angle while sensing a signal from the satellite received at the reflector dish, and deduces the pointing angle error from dither-induced fluctuations in the received signal.
A satellite-tracking millimeter-wave reflector antenna system for mobile satellite-tracking
NASA Technical Reports Server (NTRS)
Densmore, Arthur C. (Inventor); Jamnejad, Vahraz (Inventor); Woo, Kenneth E. (Inventor)
1995-01-01
A miniature dual-band two-way mobile satellite tracking antenna system mounted on a movable ground vehicle includes a miniature parabolic reflector dish having an elliptical aperture with major and minor elliptical axes aligned horizontally and vertically, respectively, to maximize azimuthal directionality and minimize elevational directionality to an extent corresponding to expected pitch excursions of the movable ground vehicle. A feed-horn has a back end and an open front end facing the reflector dish and has vertical side walls opening out from the back end to the front end at a lesser horn angle and horizontal top and bottom walls opening out from the back end to the front end at a greater horn angle. An RF circuit couples two different signal bands between the feed-horn and the user. An antenna attitude controller maintains an antenna azimuth direction relative to the satellite by rotating it in azimuth in response to sensed yaw motions of the movable ground vehicle so as to compensate for the yaw motions to within a pointing error angle. The controller sinusoidally dithers the antenna through a small azimuth dither angle greater than the pointing error angle while sensing a signal from the satellite received at the reflector dish, and deduces the pointing angle error from dither-induced fluctuations in the received signal.
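A hedged numeric sketch of the dither-and-demodulate idea in the two patent abstracts above: a small sinusoidal azimuth dither modulates the received satellite signal, and the first-harmonic (in-phase) fluctuation reveals the sign and approximate size of the pointing error. The beam width, dither amplitude and Gaussian beam model are assumptions, not values from the patents.

```python
import numpy as np

beam_sigma_deg = 0.85        # Gaussian beam width parameter (assumed)
dither_amp_deg = 0.2         # dither amplitude (assumed)
true_error_deg = 0.35        # actual pointing error, unknown to the controller
phase = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)

offset = true_error_deg + dither_amp_deg * np.sin(phase)
power = np.exp(-0.5 * (offset / beam_sigma_deg) ** 2)   # received satellite signal

# Lock-in demodulation: the component of the signal in phase with the dither is
# proportional to the beam-pattern slope, i.e. to the pointing error.
inphase = 2.0 * np.mean(power * np.sin(phase))
estimate = -inphase * beam_sigma_deg ** 2 / (dither_amp_deg * np.mean(power))

print(f"true pointing error  : {true_error_deg:+.3f} deg")
print(f"estimated from dither: {estimate:+.3f} deg")
```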
An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.
Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V
2014-07-01
We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology
Deodhar, Suruchi; Bisset, Keith R.; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V.
2014-01-01
We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity. PMID:25530914
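A minimal sketch of the start/pause/roll-back pattern described in the two abstracts above: snapshot the simulation state periodically so an analyst can rewind and try a different intervention. The compartmental model, variable names and parameter values are invented for illustration, not the environment's actual implementation.

```python
import copy

class EpidemicSim:
    def __init__(self, susceptible=990, infected=10, beta=0.3, gamma=0.1):
        self.state = {"day": 0, "S": susceptible, "I": infected, "R": 0}
        self.beta, self.gamma = beta, gamma
        self.checkpoints = {}

    def step(self):
        s = self.state
        n = s["S"] + s["I"] + s["R"]
        new_inf = self.beta * s["S"] * s["I"] / n
        new_rec = self.gamma * s["I"]
        s["S"] -= new_inf
        s["I"] += new_inf - new_rec
        s["R"] += new_rec
        s["day"] += 1

    def checkpoint(self):
        self.checkpoints[self.state["day"]] = copy.deepcopy(self.state)

    def roll_back(self, day):
        self.state = copy.deepcopy(self.checkpoints[day])

sim = EpidemicSim()
for _ in range(30):
    if sim.state["day"] % 10 == 0:
        sim.checkpoint()
    sim.step()
print("Infected on day 30:", round(sim.state["I"], 1))

sim.roll_back(20)                 # rewind to day 20 and try an intervention
sim.beta = 0.15                   # e.g. a distancing intervention halves transmission
for _ in range(10):
    sim.step()
print("Infected on day 30 with intervention from day 20:", round(sim.state["I"], 1))
```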
The Development of Design Tools for Fault Tolerant Quantum Dot Cellular Automata Based Logic
NASA Technical Reports Server (NTRS)
Armstrong, Curtis D.; Humphreys, William M.
2003-01-01
We are developing software to explore the fault tolerance of quantum dot cellular automata gate architectures in the presence of manufacturing variations and device defects. The Topology Optimization Methodology using Applied Statistics (TOMAS) framework extends the capabilities of A Quantum Interconnected Network Array Simulator (AQUINAS) by adding front-end and back-end software and creating an environment that integrates all of these components. The front-end tools establish all simulation parameters, configure the simulation system, automate the Monte Carlo generation of simulation files, and execute the simulation of these files. The back-end tools perform automated data parsing, statistical analysis and report generation.
Single-snapshot 2D color measurement by plenoptic imaging system
NASA Astrophysics Data System (ADS)
Masuda, Kensuke; Yamanaka, Yuji; Maruyama, Go; Nagai, Sho; Hirai, Hideaki; Meng, Lingfei; Tosic, Ivana
2014-03-01
Plenoptic cameras enable capture of directional light ray information, thus allowing applications such as digital refocusing, depth estimation, or multiband imaging. One of the most common plenoptic camera architectures contains a microlens array at the conventional image plane and a sensor at the back focal plane of the microlens array. We leverage the multiband imaging (MBI) function of this camera and develop a single-snapshot, single-sensor high color fidelity camera. Our camera is based on a plenoptic system with XYZ filters inserted in the pupil plane of the main lens. To achieve high color measurement precision with this system, we perform an end-to-end optimization of the system model that includes light source information, object information, optical system information, plenoptic image processing and color estimation processing. Optimized system characteristics are exploited to build an XYZ plenoptic colorimetric camera prototype that achieves high color measurement precision. We describe an application of our colorimetric camera to color shading evaluation of displays and show that it achieves color accuracy of ΔE<0.01.
Electronic hardware design of electrical capacitance tomography systems.
Saied, I; Meribout, M
2016-06-28
Electrical tomography techniques for process imaging are very prominent for industrial applications, such as the oil and gas industry and chemical refineries, owing to their ability to provide the flow regime of a flowing fluid with relatively high throughput. Among the various techniques, electrical capacitance tomography (ECT) is gaining popularity due to its non-invasive nature and its capability to differentiate between different phases based on their permittivity distribution. In recent years, several hardware designs provided for ECT systems have improved the measurement resolution to around attofarads (aF, 10^-18 F) or increased the number of channels, which needs to be large for applications that require a significant amount of data. In terms of image acquisition time, some recent systems achieve a throughput of a few hundred frames per second, while data processing can be completed in only a few milliseconds per frame. This paper outlines the concept and main features of the most recent front-end and back-end electronic circuits dedicated to ECT systems. In this paper, multiple-excitation capacitance polling, a front-end electronic technique, shows promising results for achieving fast data acquisition speeds in ECT systems. A highly parallel field-programmable gate array (FPGA) based architecture for a fast reconstruction algorithm is also described. This article is part of the themed issue 'Supersensing through industrial process tomography'. © 2016 The Author(s).
NASA Technical Reports Server (NTRS)
1972-01-01
Information backing up the key features of the manipulator system concept and detailed technical information on the subsystems are presented. Space station assembly and shuttle cargo handling tasks are emphasized in the concept analysis because they involve shuttle berthing, transferring the manipulator boom between shuttle and station, station assembly, and cargo handling. Emphasis is also placed on maximizing commonality in the system areas of manipulator booms, general purpose end effectors, control and display, data processing, telemetry, dedicated computers, and control station design.
ERIC Educational Resources Information Center
Blaszak, Barbara J.
2010-01-01
The author is always looking for material to use in her campaign to end the educational "back-atcha" cycle. It is a multi-generational process, wherein unsophisticated fact-centered high school instruction turns out students resilient against understanding historical discipline despite their college courses; these students then go on to…
BrainIACS: a system for web-based medical image processing
NASA Astrophysics Data System (ADS)
Kishore, Bhaskar; Bazin, Pierre-Louis; Pham, Dzung L.
2009-02-01
We describe BrainIACS, a web-based medical image processing system that enables algorithm developers to quickly create extensible user interfaces for their algorithms. Designed to address the challenges faced by algorithm developers in providing user-friendly graphical interfaces, BrainIACS is completely implemented using freely available, open-source software. The system, which is based on a client-server architecture, utilizes an AJAX front-end written using the Google Web Toolkit (GWT) and Java Servlets running on Apache Tomcat as its back-end. To enable developers to quickly and simply create user interfaces for configuring their algorithms, the interfaces are described using XML and are parsed by our system to create the corresponding user interface elements. Most of the commonly found elements such as check boxes, drop down lists, input boxes, radio buttons, tab panels and group boxes are supported. Some elements such as the input box support input validation. Changes to the user interface such as addition and deletion of elements are performed by editing the XML file or by using the system's user interface creator. In addition to user interface generation, the system also provides its own interfaces for data transfer, previewing of input and output files, and algorithm queuing. As the system is programmed using Java (and finally JavaScript after compilation of the front-end code), it is platform independent, with the only requirements being that a Servlet implementation be available and that the processing algorithms can execute on the server platform.
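A hedged sketch of the XML-to-interface idea described above: a small XML description of an algorithm's parameters is parsed into user-interface element definitions. The tag and attribute names below are invented for illustration and are not BrainIACS's actual schema.

```python
import xml.etree.ElementTree as ET

ui_xml = """
<algorithm name="SkullStripping">
    <checkbox id="smooth" label="Apply smoothing" default="true"/>
    <inputbox id="iterations" label="Iterations" type="int" default="5"/>
    <dropdown id="atlas" label="Atlas">
        <option>adult</option>
        <option>pediatric</option>
    </dropdown>
</algorithm>
"""

def parse_ui(xml_text):
    """Turn an XML parameter description into a list of widget definitions."""
    root = ET.fromstring(xml_text)
    widgets = []
    for elem in root:
        widget = {"kind": elem.tag, **elem.attrib}
        options = [opt.text for opt in elem.findall("option")]
        if options:
            widget["options"] = options
        widgets.append(widget)
    return root.get("name"), widgets

name, widgets = parse_ui(ui_xml)
print(f"User interface for {name}:")
for w in widgets:
    print(" ", w)
```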
Wideband Agile Digital Microwave Radiometer
NASA Technical Reports Server (NTRS)
Gaier, Todd C.; Brown, Shannon T.; Ruf, Christopher; Gross, Steven
2012-01-01
The objectives of this work were to take the initial steps needed to develop a field programmable gate array (FPGA)-based wideband digital radiometer backend (>500 MHz bandwidth) that will enable passive microwave observations with minimal performance degradation in a radiofrequency-interference (RFI)-rich environment. As manmade RF emissions increase over time and fill more of the microwave spectrum, microwave radiometer science applications will be increasingly impacted in a negative way, and the current generation of spaceborne microwave radiometers that use broadband analog back ends will become severely compromised or unusable over an increasing fraction of time on orbit. There is a need to develop a digital radiometer back end that, for each observation period, uses digital signal processing (DSP) algorithms to identify the maximum amount of RFI-free spectrum across the radiometer band to preserve bandwidth to minimize radiometer noise (which is inversely related to the bandwidth). Ultimately, the objective is to incorporate all processing necessary in the back end to take contaminated input spectra and produce a single output value free of manmade signals to minimize data rates for spaceborne radiometer missions. But, to meet these objectives, several intermediate processing algorithms had to be developed, and their performance characterized relative to typical brightness temperature accuracy requirements for current and future microwave radiometer missions, including those for measuring salinity, soil moisture, and snow pack.
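A sketch of the kind of back-end DSP step described above: flag RFI-contaminated channels in a power spectrum with a robust (median/MAD) threshold and average only the clean channels. The spectrum here is synthetic, not flight or laboratory data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels = 512
spectrum = rng.normal(loc=100.0, scale=1.0, size=n_channels)   # thermal noise floor
spectrum[[40, 41, 300]] += 25.0                                # narrow-band RFI spikes

median = np.median(spectrum)
mad = np.median(np.abs(spectrum - median))
clean = np.abs(spectrum - median) < 5.0 * 1.4826 * mad         # robust 5-sigma cut

print(f"Flagged {np.count_nonzero(~clean)} of {n_channels} channels as RFI")
print(f"Band average with RFI   : {spectrum.mean():.2f}")
print(f"Band average, clean only: {spectrum[clean].mean():.2f}")
```

Keeping as many clean channels as possible preserves bandwidth and therefore radiometric sensitivity, which is the trade-off the abstract highlights.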
A computer-based time study system for timber harvesting operations
Jingxin Wang; Joe McNeel; John Baumgras
2003-01-01
A computer-based time study system was developed for timber harvesting operations. Object-oriented techniques were used to model and design the system. The front-end of the time study system resides on MS Windows CE and the back-end is supported by MS Access. The system consists of three major components: a handheld system, a data transfer interface, and data storage...
Visual EKF-SLAM from Heterogeneous Landmarks †
Esparza-Jiménez, Jorge Othón; Devy, Michel; Gordillo, José L.
2016-01-01
Many applications require the localization of a moving object, e.g., a robot, using sensory data acquired from embedded devices. Simultaneous localization and mapping from vision performs both the spatial and temporal fusion of these data on a map when a camera moves in an unknown environment. Such a SLAM process executes two interleaved functions: the front-end detects and tracks features from images, while the back-end interprets features as landmark observations and estimates both the landmarks and the robot positions with respect to a selected reference frame. This paper describes a complete visual SLAM solution, combining both point and line landmarks on a single map. The proposed method has an impact on both the back-end and the front-end. The contributions comprise the use of heterogeneous landmark-based EKF-SLAM (the management of a map composed of both point and line landmarks); a comparison between landmark parametrizations and an evaluation of how the heterogeneity improves the accuracy of camera localization; the development of a front-end active-search process for linear landmarks integrated into SLAM; and the experimentation methodology. PMID:27070602
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarud, J.; Phillips, S.
This presentation provides a technoeconomic comparison of three biofuels - ethanol, methanol, and gasoline - produced by gasification of woody biomass residues. The presentation includes a brief discussion of the three fuels evaluated; discussion of equivalent feedstock and front end processes; discussion of back end processes for each fuel; process comparisons of efficiencies, yields, and water usage; and economic assumptions and results, including a plant gate price (PGP) for each fuel.
Design of Instant Messaging System of Multi-language E-commerce Platform
NASA Astrophysics Data System (ADS)
Yang, Heng; Chen, Xinyi; Li, Jiajia; Cao, Yaru
2017-09-01
This paper studies the message subsystem of an instant messaging system based on a multi-language e-commerce platform, in order to design instant messaging for a multi-language environment, present information reflecting national characteristics, and apply national languages to e-commerce. To develop an attractive and friendly interface for the front end of the message system and reduce development cost, the mature jQuery framework is adopted. The high-performance Tomcat server is adopted at the back end to process user requests, the MySQL database is adopted for persistent storage of user data, and meanwhile an Oracle database is adopted as the message buffer for system optimization. Moreover, AJAX technology is adopted for the client to actively pull the newest data from the server at specified intervals. In practical application, the system has strong reliability, good extensibility, short response time, high system throughput and high user concurrency.
Kazi, Rubina S; Banarjee, Reema M; Deshmukh, Arati B; Patil, Gouri V; Jagadeeshaprasad, Mashanipalya G; Kulkarni, Mahesh J
2017-03-06
Advanced Glycation End products (AGEs) are implicated in the aging process. Thus, reducing AGEs by using glycation inhibitors may help in attenuating the aging process. In this study, using the Saccharomyces cerevisiae yeast system, we show that Aminoguanidine (AMG), a well-known glycation inhibitor, decreases the AGE modification of proteins under non-calorie restriction (NR) (2% glucose) and extends chronological lifespan (CLS) similar to that of the calorie restriction (CR) condition (0.5% glucose). Proteomic analysis revealed that AMG back regulates the expression of differentially expressed proteins, especially those involved in mitochondrial respiration, in the NR condition, suggesting that it switches metabolism from fermentation to respiration, mimicking CR. AMG-induced back regulation of differentially expressed proteins could possibly be due to its chemical effect or indirectly to glycation inhibition. To delineate this, Metformin (MET), a structural analog of AMG and a mild glycation inhibitor, and Hydralazine (HYD), another potent glycation inhibitor but not a structural analog of AMG, were used. HYD was more effective than MET in mimicking AMG, suggesting that glycation inhibition was responsible for restoration of differentially expressed proteins. Thus glycation inhibitors, particularly AMG, HYD and MET, extend yeast CLS by reducing AGEs, modulating the expression of proteins involved in mitochondrial respiration and possibly by scavenging glucose. This study reports the role of glycation in the aging process. In the non-caloric restriction condition, carbohydrates such as glucose promote protein glycation and reduce CLS, while glycation inhibitors such as AMG, HYD and MET mimic the caloric restriction condition by back regulating deregulated proteins involved in mitochondrial respiration, which could facilitate a shift of metabolism from fermentation to respiration and extend yeast CLS. These findings suggest that glycation inhibitors are potential molecules for use in the management of aging. Copyright © 2017 Elsevier B.V. All rights reserved.
A Web-Based Information System for Field Data Management
NASA Astrophysics Data System (ADS)
Weng, Y. H.; Sun, F. S.
2014-12-01
A web-based field data management system has been designed and developed to allow field geologists to store, organize, manage, and share field data online. System requirements were analyzed and clearly defined first regarding what data are to be stored, who the potential users are, and what system functions are needed in order to deliver the right data in the right way to the right user. A 3-tiered architecture was adopted to create this secure, scalable system that consists of a web browser at the front end, a database at the back end, and a functional logic server in the middle. Specifically, HTML, CSS, and JavaScript were used to implement the user interface in the front-end tier, the Apache web server runs PHP scripts, and a MySQL server is used for the back-end database. The system accepts various types of field information, including image, audio, video, numeric, and text. It allows users to select data and populate them on either Google Earth or Google Maps for the examination of the spatial relations. It also makes the sharing of field data easy by converting them into XML format that is both human-readable and machine-readable, and thus ready for reuse.
Willwand, K; Baldauf, A Q; Deleu, L; Mumtsidu, E; Costello, E; Beard, P; Rommelaere, J
1997-10-01
The right-end telomere of replicative form (RF) DNA of the autonomous parvovirus minute virus of mice (MVM) consists of a sequence that is self-complementary except for a three nucleotide loop around the axis of symmetry and an interior bulge of three unpaired nucleotides on one strand (designated the right-end 'bubble'). This right-end inverted repeat can exist in the form of a folded-back strand (hairpin conformation) or in an extended form, base-paired to a copy strand (duplex conformation). We recently reported that the right-end telomere is processed in an A9 cell extract supplemented with the MVM nonstructural protein NS1. This processing is shown here to result from the NS1-dependent nicking of the complementary strand at a unique position 21 nt inboard of the folded-back genomic 5' end. DNA species terminating in duplex or hairpin configurations, or in a mutated structure that has lost the right-end bulge, are all cleaved in the presence of NS1, indicating that features distinguishing these structures are not prerequisites for nicking under the in vitro conditions tested. Cleavage of the hairpin structure is followed by strand-displacement synthesis, generating the right-end duplex conformation, while processing of the duplex structure leads to the release of free right-end telomeres. In the majority of molecules, displacement synthesis at the right terminus stops a few nucleotides before reaching the end of the template strand, possibly due to NS1 which is covalently bound to this end. A fraction of the right-end duplex product undergoes melting and re-folding into hairpin structures (formation of a 'rabbit-ear' structure).
An isocenter estimation tool for proton gantry alignment
NASA Astrophysics Data System (ADS)
Hansen, Peter; Hu, Dongming
2017-12-01
A novel tool has been developed to automate the process of locating the isocenter, center of rotation, and sphere of confusion of a proton therapy gantry. The tool uses a Radian laser tracker to estimate how the coordinate frame of the front-end beam-line components changes as the gantry rotates. The coordinate frames serve as an empirical model of gantry flexing. Using this model, the alignment of the front and back-end beam-line components can be chosen to minimize the sphere of confusion, improving the overall beam positioning accuracy of the gantry. This alignment can be performed without the beam active, improving the efficiency of installing new systems at customer sites.
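A hedged sketch, with synthetic measurements rather than the tool's actual algorithm, of estimating a gantry's centre of rotation from tracker points taken at several gantry angles, using an algebraic least-squares circle fit; the residual scatter serves as a rough proxy for the sphere of confusion.

```python
import numpy as np

rng = np.random.default_rng(1)
true_center = np.array([10.0, -3.0])        # mm, in the tracker frame (assumed)
radius = 1200.0                             # mm, tracked point's distance from the axis (assumed)
angles = np.deg2rad(np.arange(0, 360, 30))
points = true_center + radius * np.column_stack((np.cos(angles), np.sin(angles)))
points += rng.normal(scale=0.05, size=points.shape)   # ~50 um measurement noise

# Kasa fit: solve x^2 + y^2 = 2*cx*x + 2*cy*y + c for (cx, cy, c).
x, y = points[:, 0], points[:, 1]
A = np.column_stack((2 * x, 2 * y, np.ones_like(x)))
b = x ** 2 + y ** 2
(cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
fit_radius = np.sqrt(c + cx ** 2 + cy ** 2)
residuals = np.hypot(x - cx, y - cy) - fit_radius

print(f"Estimated centre of rotation: ({cx:.3f}, {cy:.3f}) mm")
print(f"RMS circle residual (sphere-of-confusion proxy): {np.std(residuals) * 1000:.1f} um")
```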
40 CFR 63.494 - Back-end process provisions-residual organic HAP and emission limitations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... produced by the emulsion process, polybutadiene rubber and styrene butadiene rubber produced by the... styrene butadiene rubber produced by the emulsion process: (i) A monthly weighted average of 0.40 kg... than a solution or emulsion process, polybutadiene rubber produced by any process other than a solution...
Open ended intelligence: the individuation of intelligent agents
NASA Astrophysics Data System (ADS)
Weinbaum Weaver, David; Veitas, Viktoras
2017-03-01
Artificial general intelligence is a field of research aiming to distil the principles of intelligence that operate independently of a specific problem domain and utilise these principles in order to synthesise systems capable of performing any intellectual task a human being is capable of and beyond. While "narrow" artificial intelligence which focuses on solving specific problems such as speech recognition, text comprehension, visual pattern recognition and robotic motion has shown impressive breakthroughs lately, understanding general intelligence remains elusive. We propose a paradigm shift from intelligence perceived as a competence of individual agents defined in relation to an a priori given problem domain or a goal, to intelligence perceived as a formative process of self-organisation. We call this process open-ended intelligence. Starting with a brief introduction of the current conceptual approach, we expose a number of serious limitations that are traced back to the ontological roots of the concept of intelligence. Open-ended intelligence is then developed as an abstraction of the process of human cognitive development, so its application can be extended to general agents and systems. We introduce and discuss three facets of the idea: the philosophical concept of individuation, sense-making and the individuation of general cognitive agents. We further show how open-ended intelligence can be framed in terms of a distributed, self-organising network of interacting elements and how such process is scalable. The framework highlights an important relation between coordination and intelligence and a new understanding of values.
User Centric Job Monitoring - a redesign and novel approach in the STAR experiment
NASA Astrophysics Data System (ADS)
Arkhipkin, D.; Lauret, J.; Zulkarneeva, Y.
2014-06-01
User Centric Monitoring (or UCM) has been a long-awaited feature in STAR, whereby programs, workflows and system "events" could be logged, broadcast and later analyzed. UCM allows available job monitoring information from various resources to be collected, filtered and presented to users in a user-centric view rather than an administrative-centric point of view. The first attempt at and implementation of a UCM approach was made in STAR in 2004 using a log4cxx plug-in back-end; it then evolved with an attempt to push toward a scalable database back-end (2006) and finally a Web-Service approach (2010, CSW4DB SBIR). The latter proved to be incomplete and did not address the evolving needs of the experiment, where streamlined messages for online (data acquisition) purposes as well as continuous support for data mining and event analysis need to coexist and be unified in a seamless approach. The code also proved to be hardly maintainable. This paper presents the next evolutionary step of the UCM toolkit, a redesign and redirection of our latest attempt, acknowledging and integrating recent technologies in a simpler, maintainable and yet scalable manner. The extended version of the job logging package is built upon a three-tier approach based on Task, Job and Event, and features a Web-Service based logging API, a responsive AJAX-powered user interface, and a database back-end relying on MongoDB, which is uniquely suited to STAR's needs. In addition, we present details of the integration of this logging package with the STAR offline and online software frameworks. Leveraging the reported experience of the ATLAS and CMS experiments in using the ESPER engine, we discuss and show how such an approach has been implemented in STAR for meta-data event triggering, stream processing and filtering. An ESPER-based solution seems to fit well into the online data acquisition system where many systems are monitored.
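A hedged sketch of the three-tier Task / Job / Event logging idea using pymongo; the collection and field names are illustrative rather than STAR's actual schema, and a MongoDB instance is assumed to be reachable on localhost.

```python
from datetime import datetime, timezone
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["ucm"]

# Three tiers: a Task groups Jobs, and each Job emits Events.
task_id = db.tasks.insert_one({"user": "alice", "name": "example analysis",
                               "created": datetime.now(timezone.utc)}).inserted_id
job_id = db.jobs.insert_one({"task_id": task_id, "site": "farm",
                             "state": "RUNNING"}).inserted_id
db.events.insert_one({"job_id": job_id, "level": "INFO",
                      "message": "reconstruction started",
                      "time": datetime.now(timezone.utc)})

# A user-centric view: all events belonging to one task's jobs, in time order.
job_ids = [j["_id"] for j in db.jobs.find({"task_id": task_id})]
for event in db.events.find({"job_id": {"$in": job_ids}}).sort("time"):
    print(event["level"], event["message"])
```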
NASA Astrophysics Data System (ADS)
Naldi, G.; Bartolini, M.; Mattana, A.; Pupillo, G.; Hickish, J.; Foster, G.; Bianchi, G.; Lingua, A.; Monari, J.; Montebugnoli, S.; Perini, F.; Rusticelli, S.; Schiaffino, M.; Virone, G.; Zarb Adami, K.
In radio astronomy, Field Programmable Gate Array (FPGA) technology is largely used for the implementation of digital signal processing techniques applied to antenna arrays. This is mainly due to the good trade-off among computing resources, power consumption and cost offered by FPGA chips compared to other technologies such as ASICs, GPUs and CPUs. In recent years several digital backend systems based on such devices have been developed at the Medicina radio astronomical station (INAF-IRA, Bologna, Italy). Instruments like an FX correlator, a direct imager, a beamformer and a multi-beam system have been successfully designed and realized on CASPER (Collaboration for Astronomy Signal Processing and Electronics Research, https://casper.berkeley.edu) processing boards. In this paper we present the experience gained in this kind of application.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
...-end repository to manage various reporting, pooling, and risk management activities associated with... records is to serve as a central back-end repository to house loan origination and servicing, security...
DataSpread: Unifying Databases and Spreadsheets.
Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin ChenChuan; Parameswaran, Aditya
2015-08-01
Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current "pane" (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first and early prototype of the DataSpread, and will give the attendees a sense for the enormous data exploration capabilities offered by unifying spreadsheets and databases.
DataSpread: Unifying Databases and Spreadsheets
Bendre, Mangesh; Sun, Bofan; Zhang, Ding; Zhou, Xinyan; Chang, Kevin ChenChuan; Parameswaran, Aditya
2015-01-01
Spreadsheet software is often the tool of choice for ad-hoc tabular data management, processing, and visualization, especially on tiny data sets. On the other hand, relational database systems offer significant power, expressivity, and efficiency over spreadsheet software for data management, while lacking in the ease of use and ad-hoc analysis capabilities. We demonstrate DataSpread, a data exploration tool that holistically unifies databases and spreadsheets. It continues to offer a Microsoft Excel-based spreadsheet front-end, while in parallel managing all the data in a back-end database, specifically, PostgreSQL. DataSpread retains all the advantages of spreadsheets, including ease of use, ad-hoc analysis and visualization capabilities, and a schema-free nature, while also adding the advantages of traditional relational databases, such as scalability and the ability to use arbitrary SQL to import, filter, or join external or internal tables and have the results appear in the spreadsheet. DataSpread needs to reason about and reconcile differences in the notions of schema, addressing of cells and tuples, and the current “pane” (which exists in spreadsheets but not in traditional databases), and support data modifications at both the front-end and the back-end. Our demonstration will center on our first and early prototype of the DataSpread, and will give the attendees a sense for the enormous data exploration capabilities offered by unifying spreadsheets and databases. PMID:26900487
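A minimal sketch, with an assumed schema rather than DataSpread's actual design, of keeping spreadsheet cells in a relational back-end keyed by positional address, so a front-end can fetch an arbitrary "pane" of cells with ordinary SQL.

```python
import sqlite3

def addr_to_rc(addr):
    """Convert an A1-style address like 'B3' to zero-based (row, col)."""
    col, digits = 0, ""
    for ch in addr:
        if ch.isalpha():
            col = col * 26 + (ord(ch.upper()) - ord("A") + 1)
        else:
            digits += ch
    return int(digits) - 1, col - 1

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cells (sheet TEXT, r INT, c INT, value TEXT, "
             "PRIMARY KEY (sheet, r, c))")

for addr, value in [("A1", "item"), ("B1", "qty"), ("A2", "bolts"), ("B2", "40")]:
    r, c = addr_to_rc(addr)
    conn.execute("INSERT INTO cells VALUES (?, ?, ?, ?)", ("Sheet1", r, c, value))

# Fetch the visible pane (rows 0..49, columns 0..9), as a front-end would.
pane = conn.execute("SELECT r, c, value FROM cells WHERE sheet=? "
                    "AND r BETWEEN 0 AND 49 AND c BETWEEN 0 AND 9 "
                    "ORDER BY r, c", ("Sheet1",)).fetchall()
print(pane)
```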
NEP processing, operations, and disposal
NASA Technical Reports Server (NTRS)
Stancati, Mike
1993-01-01
Several recent studies by ASAO/NPO staff members at LeRC and by other organizations have highlighted the potential benefits of using Nuclear Electric Propulsion (NEP) as the primary transportation means for some of the proposed missions of the Space Exploration Initiative. These include the potential to reduce initial mass in orbit and Mars transit time. Modular NEP configurations also introduce fully redundant main propulsion to Mars flight systems, adding several abort or fall-back options not otherwise available. Recent studies have also identified mission operations, such as on-orbit assembly, refurbishment, and reactor disposal, as important discriminators for propulsion system evaluation. This study is intended to identify and assess 'end-to-end' operational issues associated with using NEP for transporting crews and cargo between Earth and Mars. Some consideration of lunar cargo transfer is included as well.
Speech to Text Translation for Malay Language
NASA Astrophysics Data System (ADS)
Al-khulaidi, Rami Ali; Akmeliawati, Rini
2017-11-01
A speech recognition system is a front-end and back-end process that receives an audio signal uttered by a speaker and converts it into a text transcription. Such systems are used in several fields, including therapeutic technology, education, social robotics, and computer entertainment. In control tasks, which are the intended use of the proposed system, speed of performance and response matters because the system must integrate with other control platforms, such as voice-controlled robots. This creates a need for flexible platforms that can be easily adapted to the functionality of their surroundings, unlike software such as MATLAB and Phoenix that requires recorded audio and repeated training for every entry. In this paper, a speech recognition system for the Malay language is implemented using Microsoft Visual Studio C#. Ninety Malay phrases were tested by ten speakers of both genders in different contexts. The results show that the overall accuracy (calculated from a confusion matrix) is satisfactory at 92.69%.
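For readers unfamiliar with the accuracy figure quoted above, the sketch below shows how overall accuracy is typically computed from a confusion matrix (correct recognitions on the diagonal divided by all trials); the 3×3 matrix is invented for illustration and is not the paper's data.

```python
# A minimal sketch of computing overall recognition accuracy from a confusion
# matrix; the matrix below is a made-up illustration, not the paper's results.
import numpy as np

confusion = np.array([
    [28,  1,  1],   # rows: true phrase, columns: recognized phrase
    [ 2, 27,  1],
    [ 0,  2, 28],
])

accuracy = np.trace(confusion) / confusion.sum()
print(f"Overall accuracy: {accuracy:.2%}")
```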
Real-Time Payload Control and Monitoring on the World Wide Web
NASA Technical Reports Server (NTRS)
Sun, Charles; Windrem, May; Givens, John J. (Technical Monitor)
1998-01-01
World Wide Web (W3) technologies such as the Hypertext Transfer Protocol (HTTP) and the Java object-oriented programming environment offer a powerful, yet relatively inexpensive, framework for distributed application software development. This paper describes the design of a real-time payload control and monitoring system that was developed with W3 technologies at NASA Ames Research Center. Based on Java Development Toolkit (JDK) 1.1, the system uses an event-driven "publish and subscribe" approach to inter-process communication and graphical user-interface construction. A C Language Integrated Production System (CLIPS) compatible inference engine provides the back-end intelligent data processing capability, while Oracle Relational Database Management System (RDBMS) provides the data management function. Preliminary evaluation shows acceptable performance for some classes of payloads, with Java's portability and multimedia support identified as the most significant benefit.
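The event-driven "publish and subscribe" approach mentioned above can be illustrated with a minimal in-process event bus; the topic name and message fields below are invented, and the sketch is written in Python rather than the JDK 1.1 environment used by the authors.

```python
# A minimal in-process publish/subscribe sketch illustrating the event-driven
# pattern described; topic names and payloads are illustrative only.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

bus = EventBus()
bus.subscribe("payload/temperature", lambda m: print("GUI update:", m))
bus.publish("payload/temperature", {"sensor": "T1", "value_c": 21.4})
```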
The VLBA correlator: Real-time in the distributed era
NASA Technical Reports Server (NTRS)
Wells, D. C.
1992-01-01
The correlator is the signal processing engine of the Very Long Baseline Array (VLBA). Radio signals are recorded on special wideband (128 Mb/s) digital recorders at the 10 telescopes, with sampling times controlled by hydrogen maser clocks. The magnetic tapes are shipped to the Array Operations Center in Socorro, New Mexico, where they are played back simultaneously into the correlator. Real-time software and firmware control the playback drives to achieve synchronization, compute models of the wavefront delay, control the numerous modules of the correlator, and record FITS files of the fringe visibilities at the back-end of the correlator. In addition to the more than 3000 custom VLSI chips which handle the massive data flow of the signal processing, the correlator contains a total of more than 100 programmable computers, 8-, 16- and 32-bit CPUs. Code is downloaded into front-end CPUs depending on operating mode. Low-level code is assembly language, high-level code is C running under an RT OS. We use VxWorks on Motorola MVME147 CPUs. Code development is on a complex of SPARC workstations connected to the RT CPUs by Ethernet. The overall management of the correlation process is dependent on a database management system. We use Ingres running on a Sparcstation-2. We transfer logging information from the database of the VLBA Monitor and Control System to our database using Ingres/NET. Job scripts are computed and are transferred to the real-time computers using NFS, and correlation job execution logs and status flow back by the same route. Operator status and control displays use windows on workstations, interfaced to the real-time processes by network protocols. The extensive network protocol support provided by VxWorks is invaluable. The VLBA Correlator's dependence on network protocols is an example of the radical transformation of the real-time world over the past five years. Real-time is becoming more like conventional computing. Paradoxically, 'conventional' computing is also adopting practices from the real-time world: semaphores, shared memory, light-weight threads, and concurrency. This appears to be a convergence of thinking.
Interchangeable whole-body and nose-only exposure system
Cannon, W.C.; Allemann, R.T.; Moss, O.R.; Decker, J.R. Jr.
1992-03-31
An exposure system for experimental animals includes a container for a single animal which has a double wall. The animal is confined within the inner wall. Gaseous material enters a first end, flows over the entire animal, then back between the walls and out the first end. The system also includes an arrangement of valve-controlled manifolds for supplying gaseous material to, and exhausting it from, the containers. 6 figs.
Interchangeable whole-body and nose-only exposure system
Cannon, William C.; Allemann, Rudolph T.; Moss, Owen R.; Decker, Jr., John R.
1992-01-01
An exposure system for experimental animals includes a container for a single animal which has a double wall. The animal is confined within the inner wall. Gaseous material enters a first end, flows over the entire animal, then back between the walls and out the first end. The system also includes an arrangement of valve-controlled manifolds for supplying gaseous material to, and exhausting it from, the containers.
Ti, Lian Kah; Ang, Sophia Bee Leng; Saw, Sharon; Sethi, Sunil Kumar; Yip, James W L
2012-08-01
Timely reporting and acknowledgement are crucial steps in critical laboratory results (CLR) management. The authors previously showed that an automated pathway incorporating short messaging system (SMS) texts, auto-escalation, and manual telephone back-up improved the rate and speed of physician acknowledgement compared with manual telephone calling alone. This study investigated if it also improved the rate and speed of physician intervention to CLR and whether utilising the manual back-up affected intervention rates. Data from seven audits between November 2007 and January 2011 were analysed. These audits were carried out to assess the robustness of CLR reporting process in the authors' institution. Comparisons were made in the rate and speed of acknowledgement and intervention between the audits performed before and after automation. Using the automation audits, the authors compared intervention data between communication with SMS only and when manual intervention was required. 1680 CLR were reported during the audit periods. Automation improved the rate (100% vs 84.2%; p<0.001) and speed (median 12 min vs 23 min; p<0.001) of CLR acknowledgement. It also improved the rate (93.7% vs 84.0%, p<0.001) and speed (median 21 min vs 109 min; p<0.001) of CLR intervention. From the automation audits, the use of SMS only did not improve physician intervention rates. The automated communication pathway improved physician intervention rate and time in tandem with improved acknowledgement rate and time when compared with manual telephone calling. The use of manual intervention to augment automation did not adversely affect physician intervention rate, implying that an end-to-end pathway was more important than automation alone.
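A minimal sketch of the kind of auto-escalation logic described above: an SMS notification that falls back to a manual telephone call if no acknowledgement arrives within a time limit. The function names, timings, and polling scheme are illustrative assumptions, not the institution's actual pathway.

```python
# A minimal sketch of SMS-first notification with manual telephone back-up;
# send_sms and phone_backup are stubs, and timings are illustrative.
import time

def send_sms(physician, message):
    print(f"SMS to {physician}: {message}")

def phone_backup(physician, message):
    print(f"Manual telephone call to {physician}: {message}")

def report_critical_result(physician, message, acknowledged, timeout_s=600, poll_s=60):
    send_sms(physician, message)
    waited = 0
    while waited < timeout_s:
        if acknowledged():
            return "acknowledged via SMS pathway"
        time.sleep(poll_s)
        waited += poll_s
    phone_backup(physician, message)
    return "escalated to manual back-up"
```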
40 CFR 63.499 - Back-end process provisions-reporting.
Code of Federal Regulations, 2011 CFR
2011-07-01
... design (i.e., steam-assisted, air-assisted, or non-assisted); all visible emission readings, heat content... specify appropriate reporting and recordkeeping requirements as part of the review of the Precompliance...
40 CFR 63.499 - Back-end process provisions-reporting.
Code of Federal Regulations, 2013 CFR
2013-07-01
... design (i.e., steam-assisted, air-assisted, or non-assisted); all visible emission readings, heat content... specify appropriate reporting and recordkeeping requirements as part of the review of the Precompliance...
40 CFR 63.499 - Back-end process provisions-reporting.
Code of Federal Regulations, 2012 CFR
2012-07-01
... design (i.e., steam-assisted, air-assisted, or non-assisted); all visible emission readings, heat content... specify appropriate reporting and recordkeeping requirements as part of the review of the Precompliance...
40 CFR 63.499 - Back-end process provisions-reporting.
Code of Federal Regulations, 2014 CFR
2014-07-01
... design (i.e., steam-assisted, air-assisted, or non-assisted); all visible emission readings, heat content... specify appropriate reporting and recordkeeping requirements as part of the review of the Precompliance...
4. EXTERIOR OF SOUTH END OF BUILDING 108 SHOWING STORM ...
4. EXTERIOR OF SOUTH END OF BUILDING 108 SHOWING STORM PORCH ADDITION AND WINDOWS ALONG BACK (WEST SIDE) OF HOUSE. NOTE ORIGINAL SHORT CHIMNEY AT CREST OF ROOF. VIEW TO NORTH. - Rush Creek Hydroelectric System, Clubhouse Cottage, Rush Creek, June Lake, Mono County, CA
Sensor mount assemblies and sensor assemblies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David H
2012-04-10
Sensor mount assemblies and sensor assemblies are provided. In an embodiment, by way of example only, a sensor mount assembly includes a busbar, a main body, a backing surface, and a first finger. The busbar has a first end and a second end. The main body is overmolded onto the busbar. The backing surface extends radially outwardly relative to the main body. The first finger extends axially from the backing surface, and the first finger has a first end, a second end, and a tooth. The first end of the first finger is disposed on the backing surface, and the tooth is formed on the second end of the first finger.
40 CFR 63.499 - Back-end process provisions-reporting.
Code of Federal Regulations, 2010 CFR
2010-07-01
... heater. (i) The flare design (i.e., steam-assisted, air-assisted, or non-assisted); all visible emission... Administrator will specify appropriate reporting and recordkeeping requirements as part of the review of the...
New architecture for utility scale electricity from concentrator photovoltaics
NASA Astrophysics Data System (ADS)
Angel, Roger; Connors, Thomas; Davison, Warren; Olbert, Blain; Sivanandam, Suresh
2010-08-01
The paper describes a new system architecture optimized for utility-scale generation with concentrating photovoltaic cells (CPV) at fossil fuel price. We report on-sun tests of the architecture and development at the University of Arizona of the manufacturing processes adapted for high volume production. The new system takes advantage of triple-junction cells to convert concentrated sunlight into electricity. These commercially available cells have twice the conversion efficiency of silicon panels (40%) and one-tenth the cost per watt, when used at 1000x concentration. Telescope technology is adapted to deliver concentrated light to the cells at minimum cost. The architecture combines three novel elements: large (3.1 m x 3.1 m square) dish reflectors made as back-silvered glass monoliths; 2.5 kW receivers at each dish focus, each one incorporating a spherical field lens to deliver uniform illumination to multiple cells; and a lightweight steel spaceframe structure to hold multiple dish/receiver units in coalignment and oriented to the sun. Development of the process for replicating single-piece reflector dishes is well advanced at the Steward Observatory Mirror Lab. End-to-end system tests have been completed with single cells. A lightweight steel spaceframe to hold and track eight dish/receiver units to generate 20 kW has been completed. A single 2.5 kW receiver is presently under construction, and is expected to be operated in an end-to-end on-sun test with a monolithic dish before the end of 2010. The University of Arizona has granted an exclusive license to REhnu, LLC to commercialize this technology.
Guermandi, Marco; Bigucci, Alessandro; Franchi Scarselli, Eleonora; Guerrieri, Roberto
2015-01-01
We present a system for the acquisition of EEG signals based on active electrodes and implementing a Driving Right Leg circuit (DgRL). DgRL allows for single-ended amplification and analog-to-digital conversion, still guaranteeing a common mode rejection in excess of 110 dB. This allows the system to acquire high-quality EEG signals essentially removing network interference for both wet and dry-contact electrodes. The front-end amplification stage is integrated on the electrode, minimizing the system's sensitivity to electrode contact quality, cable movement and common mode interference. The A/D conversion stage can be either integrated in the remote back-end or placed on the head as well, allowing for an all-digital communication to the back-end. Noise integrated in the band from 0.5 to 100 Hz is between 0.62 and 1.3 μV, depending on the configuration. Current consumption for the amplification and A/D conversion of one channel is 390 μA. Thanks to its low noise, the high level of interference suppression and its quick setup capabilities, the system is particularly suitable for use outside clinical environments, such as in home care, brain-computer interfaces or consumer-oriented applications.
Orbital friction stir weld system
NASA Technical Reports Server (NTRS)
Ding, R. Jeffrey (Inventor); Carter, Robert W. (Inventor)
2001-01-01
This invention is an apparatus for joining the ends of two cylindrical (i.e., pipe-shaped) sections together with a friction stir weld. The apparatus holds the two cylindrical sections together and provides back-side weld support as it makes a friction stir weld around the circumference of the joined ends.
System for stabilizing cable phase delay utilizing a coaxial cable under pressure
NASA Technical Reports Server (NTRS)
Clements, P. A. (Inventor)
1974-01-01
Stabilizing the phase delay of signals passing through a pressurizable coaxial cable is disclosed. Signals from an appropriate source at a selected frequency, e.g., 100 MHz, are sent through the controlled cable from a first cable end to a second cable end which, electrically, is open or heavily mismatched at 100 MHz, thereby reflecting 100 MHz signals back to the first cable end. Thereat, the phase difference between the reflected-back signals and the signals from the source is detected by a phase detector. The output of the latter is used to control the flow of gas to or from the cable, thereby controlling the cable pressure, which in turn affects the cable phase delay.
40 CFR 63.494 - Back-end process provisions-residual organic HAP and emission limitations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... producing butyl rubber, epichlorohydrin elastomer, neoprene, and nitrile butadiene rubber shall not exceed... processes at affected sources producing butyl rubber, epichlorohydrin elastomer, neoprene, and nitrile... submitted in accordance with § 63.499(f)(1). (i) For butyl rubber, the organic HAP emission limitation shall...
Polarization-insensitive techniques for optical signal processing
NASA Astrophysics Data System (ADS)
Salem, Reza
2006-12-01
This thesis investigates polarization-insensitive methods for optical signal processing. Two signal processing techniques are studied: clock recovery based on two-photon absorption in silicon and demultiplexing based on cross-phase modulation in highly nonlinear fiber. The clock recovery system is tested at an 80 Gb/s data rate for both back-to-back and transmission experiments. The demultiplexer is tested at a 160 Gb/s data rate in a back-to-back experiment. We experimentally demonstrate methods for eliminating polarization dependence in both systems. Our experimental results are confirmed by theoretical and numerical analysis.
NASA Astrophysics Data System (ADS)
Diehl, T.; Waldhauser, F.; Schaff, D. P.; Engdahl, E. R.
2009-12-01
The Andaman Sea region in the Northeast Indian Ocean is characterized by a complex extensional back-arc basin, which connects the Sumatra Fault System in the south with the Sagaing fault in the north. The Andaman back-arc is generally classified as a convergent pull-apart basin (leaky-transform) rather than a typical extensional back-arc basin. Oblique subduction of the Indian-Australian plate results in strike-slip faulting parallel to the trench axis, formation of a sliver plate and back-arc pull-apart extension. Active spreading occurs predominantly along a NE-SW oriented ridge-segment bisecting the Central Andaman basin at the SW end of the back-arc. Existing models of the Andaman back-arc system are mainly derived from bathymetry maps, seismic surveys, magnetic anomalies, and seismotectonic analysis. The latter are typically based on global bulletin locations provided by the NEIC or ISC. These bulletin locations, however, usually have low spatial resolution (especially in focal depth), which hampers a detailed seismotectonic interpretation. In order to better study the seismotectonic processes of the Andaman Sea region, specifically its role during the recent 2004 M9.3 earthquake, we improve on existing hypocenter locations by applying the double-difference algorithm to regional and teleseismic data. Differential times used for the relocation process are computed from phase picks listed in the ISC and NEIC bulletins, and from cross-correlating regional and teleseismic waveforms. EHB hypocenter solutions are used as reference locations to improve the initial locations in the ISC/NEIC catalog during double-difference processing. The final DD solutions show significantly reduced scatter in event locations along the back arc ridge. The various observed focal mechanisms tend to cluster by type and, in addition, the structure and orientation of individual clusters are generally consistent with available CMT solutions for individual events and reveal the detailed distribution of predominantly normal, strike slip, and dip slip faulting associated with the extensional tectonics that dominate the Andaman Sea. The refined plate boundary, together with recent high-resolution bathymetry and seismic-survey data in the Central Andaman basin, are interpreted with respect to the dynamics and evolution of the back arc system. A spatio-temporal analysis of the two largest swarms (NE of Nicobar Islands in January 2005 and in the Central basin in March 2006) shows that events align along NE-SW oriented structures, with events migrating in time from NE to SW in both swarms. The SW propagation of seismogenic faults may indicate magmatic intrusion or spreading events that originate from sources located northeast of the swarms. The detailed analysis of the geometry and temporal evolution of these swarms allows for improved estimates of the regional stress field of the back-arc system and a better understanding of its dynamic behaviour following the December 2004 Mw 9.3 earthquake.
Determination of end point of primary drying in freeze-drying process control.
Patel, Sajal M; Doen, Takayuki; Pikal, Michael J
2010-03-01
Freeze-drying is a relatively expensive process requiring long processing time, and hence one of the key objectives during freeze-drying process development is to minimize the primary drying time, which is the longest of the three steps in freeze-drying. However, increasing the shelf temperature into secondary drying before all of the ice is removed from the product will likely cause collapse or eutectic melt. Thus, from product quality as well as process economics standpoint, it is very critical to detect the end of primary drying. Experiments were conducted with 5% mannitol and 5% sucrose as model systems. The apparent end point of primary drying was determined by comparative pressure measurement (i.e., Pirani vs. MKS Baratron), dew point, Lyotrack (gas plasma spectroscopy), water concentration from tunable diode laser absorption spectroscopy, condenser pressure, pressure rise test (manometric temperature measurement or variations of this method), and product thermocouples. Vials were pulled out from the drying chamber using a sample thief during late primary and early secondary drying to determine percent residual moisture either gravimetrically or by Karl Fischer, and the cake structure was determined visually for melt-back, collapse, and retention of cake structure at the apparent end point of primary drying (i.e., onset, midpoint, and offset). By far, the Pirani is the best choice of the methods tested for evaluation of the end point of primary drying. Also, it is a batch technique, which is cheap, steam sterilizable, and easy to install without requiring any modification to the existing dryer.
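A minimal sketch of comparative pressure measurement as an end-point indicator: because a Pirani gauge over-reads when water vapor dominates the chamber gas, the end of primary drying can be flagged when its reading converges toward the capacitance manometer (Baratron). The threshold and pressure traces below are illustrative, not the study's data.

```python
# A minimal sketch of comparative pressure measurement: flag the end of
# primary drying when the Pirani/Baratron ratio falls toward 1. All numbers
# below are illustrative.
def primary_drying_endpoint(times, pirani_mtorr, baratron_mtorr, ratio_threshold=1.05):
    """Return the first time at which Pirani/Baratron falls below the threshold."""
    for t, p, b in zip(times, pirani_mtorr, baratron_mtorr):
        if p / b < ratio_threshold:
            return t
    return None

times = [0, 1, 2, 3, 4, 5]          # hours, illustrative
pirani = [95, 94, 90, 75, 62, 61]   # over-reads while water vapor dominates
baratron = [60, 60, 60, 60, 60, 60]
print("Apparent end point (h):", primary_drying_endpoint(times, pirani, baratron))
```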
Fuzzy Logic Based Autonomous Parallel Parking System with Kalman Filtering
NASA Astrophysics Data System (ADS)
Panomruttanarug, Benjamas; Higuchi, Kohji
This paper presents an emulation of fuzzy logic control schemes for an autonomous parallel parking system in a backward maneuver. There are four infrared sensors sending the distance data to a microcontroller for generating an obstacle-free parking path. Two of them mounted on the front and rear wheels on the parking side are used as the inputs to the fuzzy rules to calculate a proper steering angle while backing. The other two attached to the front and rear ends serve for avoiding collision with other cars along the parking space. At the end of parking processes, the vehicle will be in line with other parked cars and positioned in the middle of the free space. Fuzzy rules are designed based upon a wall following process. Performance of the infrared sensors is improved using Kalman filtering. The design method needs extra information from ultrasonic sensors. Starting from modeling the ultrasonic sensor in 1-D state space forms, one makes use of the infrared sensor as a measurement to update the predicted values. Experimental results demonstrate the effectiveness of sensor improvement.
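A minimal one-dimensional Kalman filter sketch in the spirit of the sensor-improvement step described above: a simple state prediction is updated with each infrared distance reading. The noise variances and measurement sequence are illustrative assumptions, not the paper's values.

```python
# A minimal scalar Kalman filter: predict with a constant-distance model,
# update with each infrared measurement. q, r, and the readings are made up.
def kalman_1d(measurements, x0=0.0, p0=1.0, q=0.01, r=0.25):
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict (constant-distance model; process noise q)
        p = p + q
        # Update with infrared measurement z (measurement noise r)
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

print(kalman_1d([1.02, 0.97, 1.05, 1.01, 0.99]))
```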
Catalytic Ignition and Upstream Reaction Propagation in Monolith Reactors
NASA Technical Reports Server (NTRS)
Struk, Peter M.; Dietrich, Daniel L.; Miller, Fletcher J.; T'ien, James S.
2007-01-01
Using numerical simulations, this work demonstrates a concept called back-end ignition for lighting-off and pre-heating a catalytic monolith in a power generation system. In this concept, a downstream heat source (e.g. a flame) or resistive heating in the downstream portion of the monolith initiates a localized catalytic reaction which subsequently propagates upstream and heats the entire monolith. The simulations used a transient numerical model of a single catalytic channel which characterizes the behavior of the entire monolith. The model treats both the gas and solid phases and includes detailed homogeneous and heterogeneous reactions. An important parameter in the model for back-end ignition is upstream heat conduction along the solid. The simulations used both dry and wet CO chemistry as a model fuel for the proof-of-concept calculations; the presence of water vapor can trigger homogenous reactions, provided that gas-phase temperatures are adequately high and there is sufficient fuel remaining after surface reactions. With sufficiently high inlet equivalence ratio, back-end ignition occurs using the thermophysical properties of both a ceramic and metal monolith (coated with platinum in both cases), with the heat-up times significantly faster for the metal monolith. For lower equivalence ratios, back-end ignition occurs without upstream propagation. Once light-off and propagation occur, the inlet equivalence ratio could be reduced significantly while still maintaining an ignited monolith as demonstrated by calculations using complete monolith heating.
Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models.
Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A
2014-01-01
Multiple software programs are available for designing and running large scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools that could increase model development time, IT costs and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for the models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic and web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the systems, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients.
Virtual Systems Pharmacology (ViSP) software for simulation from mechanistic systems-level models
Ermakov, Sergey; Forster, Peter; Pagidala, Jyotsna; Miladinov, Marko; Wang, Albert; Baillie, Rebecca; Bartlett, Derek; Reed, Mike; Leil, Tarek A.
2014-01-01
Multiple software programs are available for designing and running large scale system-level pharmacology models used in the drug development process. Depending on the problem, scientists may be forced to use several modeling tools that could increase model development time, IT costs and so on. Therefore, it is desirable to have a single platform that allows setting up and running large-scale simulations for the models that have been developed with different modeling tools. We developed a workflow and a software platform in which a model file is compiled into a self-contained executable that is no longer dependent on the software that was used to create the model. At the same time the full model specifics are preserved by presenting all model parameters as input parameters for the executable. This platform was implemented as a model-agnostic, therapeutic-area-agnostic and web-based application with a database back-end that can be used to configure, manage and execute large-scale simulations for multiple models by multiple users. The user interface is designed to be easily configurable to reflect the specifics of the model and the user's particular needs, and the back-end database has been implemented to store and manage all aspects of the systems, such as Models, Virtual Patients, User Interface Settings, and Results. The platform can be adapted and deployed on an existing cluster or cloud computing environment. Its use was demonstrated with a metabolic disease systems pharmacology model that simulates the effects of two antidiabetic drugs, metformin and fasiglifam, in type 2 diabetes mellitus patients. PMID:25374542
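The compile-to-executable workflow described above can be sketched as follows, with each virtual patient's parameters passed to a self-contained model binary; the executable name and command-line flags are hypothetical, not ViSP's actual interface.

```python
# A minimal sketch of running a self-contained model executable once per
# parameter set. "./model_executable" and its flags are hypothetical.
import subprocess

virtual_patients = [
    {"dose_mg": 500, "body_weight_kg": 70},
    {"dose_mg": 1000, "body_weight_kg": 85},
]

for i, params in enumerate(virtual_patients):
    args = ["./model_executable"]  # hypothetical compiled model
    for name, value in params.items():
        args += [f"--{name}", str(value)]
    args += ["--output", f"result_{i}.csv"]
    subprocess.run(args, check=True)
```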
A practical approach for inexpensive searches of radiology report databases.
Desjardins, Benoit; Hamilton, R Curtis
2007-06-01
We present a method to perform full text searches of radiology reports for the large number of departments that do not have this ability as part of their radiology or hospital information system. A tool written in Microsoft Access (front-end) has been designed to search a server (back-end) containing the indexed weekly backup copy of the full relational database extracted from a radiology information system (RIS). This front-end/back-end approach has been implemented in a large academic radiology department, and is used for teaching, research and administrative purposes. The weekly second backup of the 80 GB, 4 million record RIS database takes 2 hours. Further indexing of the exported radiology reports takes 6 hours. Individual searches typically take less than 1 minute on the indexed database and 30-60 minutes on the nonindexed database. Guidelines to properly address privacy and institutional review board issues are closely followed by all users. This method has potential to improve teaching, research, and administrative programs within radiology departments that cannot afford more expensive technology.
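A minimal sketch of why the indexed back-end copy makes searches fast: full-text indexing of report text. SQLite's FTS5 module stands in here for the Access/RIS setup described above, and the report rows are invented; this assumes an SQLite build with FTS5 enabled.

```python
# A minimal full-text-indexing sketch using SQLite FTS5 with made-up reports;
# not the Microsoft Access / RIS implementation described in the abstract.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE reports USING fts5(accession, report_text)")
conn.executemany(
    "INSERT INTO reports VALUES (?, ?)",
    [("A001", "No acute intracranial hemorrhage."),
     ("A002", "Findings consistent with pulmonary embolism.")],
)

for row in conn.execute(
        "SELECT accession FROM reports WHERE reports MATCH ?", ("embolism",)):
    print(row[0])
```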
OECD/NEA Ongoing activities related to the nuclear fuel cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cornet, S.M.; McCarthy, K.; Chauvin, N.
2013-07-01
As part of its role in encouraging international collaboration, the OECD Nuclear Energy Agency is coordinating a series of projects related to the Nuclear Fuel Cycle. The Nuclear Science Committee (NSC) Working Party on Scientific Issues of the Nuclear Fuel Cycle (WPFC) comprises five different expert groups covering all aspects of the fuel cycle from front to back-end. Activities related to fuels, materials, physics, separation chemistry, and fuel cycles scenarios are being undertaken. By publishing state-of-the-art reports and organizing workshops, the groups are able to disseminate recent research advancements to the international community. Current activities mainly focus on advanced nuclear systems, and experts are working on analyzing results and establishing challenges associated to the adoption of new materials and fuels. By comparing different codes, the Expert Group on Advanced Fuel Cycle Scenarios is aiming at gaining further understanding of the scientific issues and specific national needs associated with the implementation of advanced fuel cycles. At the back end of the fuel cycle, separation technologies (aqueous and pyrochemical processing) are being assessed. Current and future activities comprise studies on minor actinides separation and post Fukushima studies. Regular workshops are also organized to discuss recent developments on Partitioning and Transmutation. In addition, the Nuclear Development Committee (NDC) focuses on the analysis of the economics of nuclear power across the fuel cycle in the context of changes of electricity markets, social acceptance and technological advances and assesses the availability of the nuclear fuel and infrastructure required for the deployment of existing and future nuclear power. The Expert Group on the Economics of the Back End of the Nuclear Fuel Cycle (EBENFC), in particular, is looking at assessing economic and financial issues related to the long term management of spent nuclear fuel. (authors)
Solid State Lighting Program (Falcon)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meeks, Steven
2012-06-30
Over the past two years, KLA-Tencor and partners successfully developed and deployed software and hardware tools that increase product yield for High Brightness LED (HBLED) manufacturing and reduce product development and factory ramp times. This report summarizes our development effort and details of how the results of the Solid State Light Program (Falcon) have started to help HBLED manufacturers optimize process control by enabling them to flag and correct identified killer defect conditions at any point of origin in the process manufacturing flow. This constitutes a quantum leap in yield management over current practice. Current practice consists of die dispositioning which is just rejection of bad die at end of process based upon probe tests, loosely assisted by optical in-line monitoring for gross process deficiencies. For the first time, and as a result of our Solid State Lighting Program, our LED manufacturing partners have obtained the software and hardware tools that optimize individual process steps to control killer defects at the point in the processes where they originate. Products developed during our two year program enable optimized inspection strategies for many product lines to minimize cost and maximize yield. The Solid State Lighting Program was structured in three phases: i) the development of advanced imaging modes that achieve clear separation between LED defect types, improves signal to noise and scan rates, and minimizes nuisance defects for both front end and back end inspection tools, ii) the creation of defect source analysis (DSA) software that connect the defect maps from back-end and front-end HBLED manufacturing tools to permit the automatic overlay and traceability of defects between tools and process steps, suppress nuisance defects, and identify the origin of killer defects with process step and conditions, and iii) working with partners (Philips Lumileds) on product wafers, obtain a detailed statistical correlation of automated defect and DSA map overlay to failed die identified using end product probe test results. Results from our two year effort have led to “automated end-to-end defect detection” with full defect traceability and the ability to unambiguously correlate device killer defects to optically detected features and their point of origin within the process. Success of the program can be measured by yield improvements at our partner’s facilities and new product orders.
Code of Federal Regulations, 2012 CFR
2012-07-01
... methods of determining this quantity are production records, measurement of stream characteristics, and... HAP (or TOC, minus methane and ethane) emissions in all process vent streams and primary and secondary... heater. (B) Paragraph (b)(5)(iii) of this section is applicable, except that TOC (minus methane and...
Code of Federal Regulations, 2014 CFR
2014-07-01
... methods of determining this quantity are production records, measurement of stream characteristics, and... HAP (or TOC, minus methane and ethane) emissions in all process vent streams and primary and secondary... heater. (B) Paragraph (b)(5)(iii) of this section is applicable, except that TOC (minus methane and...
Code of Federal Regulations, 2011 CFR
2011-07-01
... methods of determining this quantity are production records, measurement of stream characteristics, and... HAP (or TOC, minus methane and ethane) emissions in all process vent streams and primary and secondary... heater. (B) Paragraph (b)(5)(iii) of this section is applicable, except that TOC (minus methane and...
Code of Federal Regulations, 2013 CFR
2013-07-01
... methods of determining this quantity are production records, measurement of stream characteristics, and... HAP (or TOC, minus methane and ethane) emissions in all process vent streams and primary and secondary... heater. (B) Paragraph (b)(5)(iii) of this section is applicable, except that TOC (minus methane and...
Code of Federal Regulations, 2010 CFR
2010-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... engineering assessment, as described in paragraph (c)(2) of this section. (1) The owner or operator may choose... run. (2) The owner or operator may use engineering assessment to demonstrate compliance with the...
Code of Federal Regulations, 2013 CFR
2013-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... engineering assessment, as described in paragraph (c)(2) of this section. (1) The owner or operator may choose... run. (2) The owner or operator may use engineering assessment to demonstrate compliance with the...
Code of Federal Regulations, 2012 CFR
2012-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... engineering assessment, as described in paragraph (c)(2) of this section. (1) The owner or operator may choose... run. (2) The owner or operator may use engineering assessment to demonstrate compliance with the...
Code of Federal Regulations, 2011 CFR
2011-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... engineering assessment, as described in paragraph (c)(2) of this section. (1) The owner or operator may choose... run. (2) The owner or operator may use engineering assessment to demonstrate compliance with the...
40 CFR 63.498 - Back-end process provisions-recordkeeping.
Code of Federal Regulations, 2010 CFR
2010-07-01
... be the crumb rubber dry weight of the rubber leaving the stripper. (iv) The organic HAP content of... stripper. (B) For solution processes, this quantity shall be the crumb rubber dry weight of the crumb rubber leaving the stripper. (iii) The hourly average of all stripper parameter results; (iv) If one or...
PALM-3000: EXOPLANET ADAPTIVE OPTICS FOR THE 5 m HALE TELESCOPE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dekany, Richard; Bouchez, Antonin; Baranec, Christoph
2013-10-20
We describe and report first results from PALM-3000, the second-generation astronomical adaptive optics (AO) facility for the 5.1 m Hale telescope at Palomar Observatory. PALM-3000 has been engineered for high-contrast imaging and emission spectroscopy of brown dwarfs and large planetary mass bodies at near-infrared wavelengths around bright stars, but also supports general natural guide star use to V ≈ 17. Using its unique 66 × 66 actuator deformable mirror, PALM-3000 has thus far demonstrated residual wavefront errors of 141 nm rms under ∼1″ seeing conditions. PALM-3000 can provide phase conjugation correction over a 6.4″ × 6.4″ working region at λ = 2.2 μm, or full electric field (amplitude and phase) correction over approximately one-half of this field. With optimized back-end instrumentation, PALM-3000 is designed to enable 10⁻⁷ contrast at 1″ angular separation, including post-observation speckle suppression processing. While continued optimization of the AO system is ongoing, we have already successfully commissioned five back-end instruments and begun a major exoplanet characterization survey, Project 1640.
NASA Astrophysics Data System (ADS)
Smuga-Otto, M. J.; Garcia, R. K.; Knuteson, R. O.; Martin, G. D.; Flynn, B. M.; Hackel, D.
2006-12-01
The University of Wisconsin-Madison Space Science and Engineering Center (UW-SSEC) is developing tools to help scientists realize the potential of high spectral resolution instruments for atmospheric science. Upcoming satellite spectrometers like the Cross-track Infrared Sounder (CrIS), experimental instruments like the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and proposed instruments like the Hyperspectral Environmental Suite (HES) within the GOES-R project will present a challenge in the form of the overwhelmingly large amounts of continuously generated data. Current and near-future workstations will have neither the storage space nor computational capacity to cope with raw spectral data spanning more than a few minutes of observations from these instruments. Schemes exist for processing raw data from hyperspectral instruments currently in testing, that involve distributed computation across clusters. Data, which for an instrument like GIFTS can amount to over 1.5 Terabytes per day, is carefully managed on Storage Area Networks (SANs), with attention paid to proper maintenance of associated metadata. The UW-SSEC is preparing a demonstration integrating these back-end capabilities as part of a larger visualization framework, to assist scientists in developing new products from high spectral data, sourcing data volumes they could not otherwise manage. This demonstration focuses on managing storage so that only the data specifically needed for the desired product are pulled from the SAN, and on running computationally expensive intermediate processing on a back-end cluster, with the final product being sent to a visualization system on the scientist's workstation. Where possible, existing software and solutions are used to reduce cost of development. The heart of the computing component is the GIFTS Information Processing System (GIPS), developed at the UW-SSEC to allow distribution of processing tasks such as conversion of raw GIFTS interferograms into calibrated radiance spectra, and retrieving temperature and water vapor content atmospheric profiles from these spectra. The hope is that by demonstrating the capabilities afforded by a composite system like the one described here, scientists can be convinced to contribute further algorithms in support of this model of computing and visualization.
Dicentric breakage at telomere fusions
Pobiega, Sabrina; Marcand, Stéphane
2010-01-01
Nonhomologous end-joining (NHEJ) inhibition at telomeres ensures that native chromosome ends do not fuse together. But the occurrence and consequences of rare telomere fusions are not well understood. It is notably unclear whether a telomere fusion could be processed to restore telomere ends. Here we address the behavior of individual dicentrics formed by telomere fusion in the yeast Saccharomyces cerevisiae. Our approach was to first stabilize and amplify fusions between two chromosomes by temporarily inactivating one centromere. Next we analyzed dicentric breakage following centromere reactivation. Unexpectedly, dicentrics often break at the telomere fusions during progression through mitosis, a process that restores the parental chromosomes. This unforeseen result suggests a rescue pathway able to process telomere fusions and to back up NHEJ inhibition at telomeres. PMID:20360388
Xie, Fagen; Lee, Janet; Munoz-Plaza, Corrine E; Hahn, Erin E; Chen, Wansu
2017-01-01
Surgical pathology reports (SPR) contain rich clinical diagnosis information. The text information extraction system (TIES) is an end-to-end application leveraging natural language processing technologies and focused on the processing of pathology and/or radiology reports. We deployed the TIES system and integrated SPRs into the TIES system on a daily basis at Kaiser Permanente Southern California. The breast cancer cases diagnosed in December 2013 from the Cancer Registry (CANREG) were used to validate the performance of the TIES system. The National Cancer Institute Metathesaurus (NCIM) concept terms and codes to describe breast cancer were identified through the Unified Medical Language System Terminology Service (UTS) application. The identified NCIM codes were used to search for the coded SPRs in the back-end datastore directly. The identified cases were then compared with the breast cancer patients pulled from CANREG. A total of 437 breast cancer concept terms and 14 combinations of "breast" and "cancer" terms were identified from the UTS application. A total of 249 breast cancer cases diagnosed in December 2013 were pulled from CANREG. Out of these 249 cases, 241 were successfully identified by the TIES system from a total of 457 reports. The TIES system also identified an additional 277 cases that were not part of the validation sample. Out of the 277 cases, 11% were determined as highly likely to be cases after manual examinations, and 86% were in CANREG but were diagnosed in months other than December of 2013. The study demonstrated that the TIES system can effectively identify potential breast cancer cases in our care setting. Identified potential cases can be easily confirmed by reviewing the corresponding annotated reports through the front-end visualization interface. The TIES system is a great tool for identifying various potential cancer cases in a timely manner and on a regular basis in support of clinical research studies.
A miniature bidirectional telemetry system for in-vivo gastric slow wave recordings
Farajidavar, Aydin; O’Grady, Gregory; Rao, Smitha M.N.; Cheng, Leo K; Abell, Thomas; Chiao, J.-C.
2012-01-01
Stomach contractions are initiated and coordinated by an underlying electrical activity (slow waves), and electrical dysrhythmias accompany motility diseases. Electrical recordings taken directly from the stomach provide the most valuable data, but face technical constraints. Serosal or mucosal electrodes have cables that traverse the abdominal wall, or a natural orifice, causing discomfort and possible infection, and restricting mobility. These problems motivated the development of a wireless system. The bidirectional telemetric system constitutes a front-end transponder, a back-end receiver and a graphical user interface. The front-end module conditions the analog signals, then digitizes and loads the data into a radio for transmission. Data receipt at the back-end is acknowledged via a transceiver function. The system was validated in a bench-top study, then validated in-vivo using serosal electrodes connected simultaneously to a commercial wired system. The front-end module was 35×35×27 mm3 and weighed 20 g. Bench-top tests demonstrated reliable communication within a distance range of 30 m, power consumption of 13.5 mW, and 124-hour operation when utilizing a 560-mAh, 3-V battery. In-vivo, slow wave frequencies were recorded identically with the wireless and wired reference systems (2.4 cycles/min), automated activation time detection was modestly better for the wireless system (5% vs 14% false positive rate), and signal amplitudes were modestly higher via the wireless system (462 vs 386 μV; p<0.001). This telemetric system for slow wave acquisition is reliable, power efficient, readily portable and potentially implantable. The device will enable chronic monitoring and evaluation of slow wave patterns in animals and patients. PMID:22635054
Equalizer design techniques for dispersive cables with application to the SPS wideband kicker
NASA Astrophysics Data System (ADS)
Platt, Jason; Hofle, Wolfgang; Pollock, Kristin; Fox, John
2017-10-01
A wide-band vertical instability feedback control system in development at CERN requires 1-1.5 GHz of bandwidth for the entire processing chain, from the beam pickups through the feedback signal digital processing to the back-end power amplifiers and kicker structures. Dispersive effects in cables, amplifiers, pickup and kicker elements can result in distortions in the time domain signal as it proceeds through the processing system, and deviations from linear phase response reduce the allowable bandwidth for the closed-loop feedback system. We have developed an equalizer analog circuit that compensates for these dispersive effects. Here we present a design technique for the construction of an analog equalizer that incorporates the effect of parasitic circuit elements in the equalizer to increase the fidelity of the implemented equalizer. Finally, we show results from the measurement of an assembled backend equalizer that corrects for dispersive elements in the cables over a bandwidth of 10-1000 MHz.
77 FR 60651 - Airworthiness Directives; BAE Systems (Operations) Limited Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-04
... of the wing leading edge. This proposed AD would require a detailed inspection of the end caps on the... tube, and ice accretion on the wing leading edge or run-back ice, which could lead to a reduction in... leading edge anti- icing piccolo tube end caps on two aircraft. This was discovered during routine zonal...
4. EXTERIOR OF SOUTH END OF BUILDING 104 SHOWING 1LIGHT ...
4. EXTERIOR OF SOUTH END OF BUILDING 104 SHOWING 1-LIGHT SIDE EXIT DOOR AND ORIGINAL WOOD-FRAMED SLIDING GLASS KITCHEN WINDOWS AT PHOTO CENTER, AND TALL RUSTIC STYLE CHIMNEY WITH GABLE FRAME ON BACK WALL OF HOUSE. VIEW TO NORTHEAST. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA
78 FR 7259 - Airworthiness Directives; BAE SYSTEMS (OPERATIONS) LIMITED Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... wing leading edge. This AD requires a detailed inspection of the end caps on the anti-icing piccolo... on the wing leading edge or run-back ice, which could lead to a reduction in the stall margin on... the loss of the wing leading edge anti- icing piccolo tube end caps on two aircraft. This was...
Characteristic of a Digital Correlation Radiometer Back End with Finite Wordlength
NASA Technical Reports Server (NTRS)
Biswas, Sayak K.; Hyde, David W.; James, Mark W.; Cecil, Daniel J.
2017-01-01
The performance characteristic of a digital correlation radiometer signal processing back end (DBE) is analyzed using a simulator. The particular design studied here corresponds to the airborne Hurricane Imaging radiometer which was jointly developed by the NASA Marshall Space Flight Center, University of Michigan, University of Central Florida and NOAA. Laboratory and flight test data is found to be in accord with the simulation results. Overall design seems to be optimum for the typical input signal dynamic range. It was found that the performance of the digital kurtosis could be improved by lowering the DBE input power level. An unusual scaling between digital correlation channels observed in the instrument data is confirmed to be a DBE characteristic.
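The digital kurtosis mentioned above is commonly used in radiometer back ends to detect non-Gaussian (interfering) signals; the sketch below computes the sample kurtosis of a synthetic thermal-noise-like input, which should come out near 3. The signal is simulated, not HIRAD data.

```python
# A minimal sketch of the kurtosis statistic: a Gaussian (thermal-noise-like)
# input gives kurtosis near 3, and deviations flag interference. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, 100_000)        # thermal-noise-like input
centered = samples - samples.mean()
kurtosis = np.mean(centered**4) / np.mean(centered**2)**2
print(f"Kurtosis: {kurtosis:.3f}  (approximately 3 for Gaussian noise)")
```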
European Space Software Repository ESSR
NASA Astrophysics Data System (ADS)
Livschitz, Jakob; Blommestijn, Robert
2016-08-01
This paper and the accompanying presentation describe the status of the ESSR (European Space Software Repository); see [1]. They describe the development phases, outline the web portal functionality, and explain the process steps behind it. Not only the front-end but also the back-end is discussed. The ESSR web portal went live internally at ESA on May 15th, 2015, and world-wide on September 19th, 2015. Currently the ESSR is in operation.
The ATLAS Public Web Pages: Online Management of HEP External Communication Content
NASA Astrophysics Data System (ADS)
Goldfarb, S.; Marcelloni, C.; Eli Phoboo, A.; Shaw, K.
2015-12-01
The ATLAS Education and Outreach Group is in the process of migrating its public online content to a professionally designed set of web pages built on the Drupal [1] content management system. Development of the front-end design passed through several key stages, including audience surveys, stakeholder interviews, usage analytics, and a series of fast design iterations, called sprints. Implementation of the web site involves application of the html design using Drupal templates, refined development iterations, and the overall population of the site with content. We present the design and development processes and share the lessons learned along the way, including the results of the data-driven discovery studies. We also demonstrate the advantages of selecting a back-end supported by content management, with a focus on workflow. Finally, we discuss usage of the new public web pages to implement outreach strategy through implementation of clearly presented themes, consistent audience targeting and messaging, and the enforcement of a well-defined visual identity.
Code of Federal Regulations, 2014 CFR
2014-07-01
... accepted chemical engineering principles, measurable process parameters, or physical or chemical laws or... engineering assessment, as described in paragraph (c)(2) of this section. (1) The owner or operator may choose... sample run. (2) The owner or operator may use engineering assessment to demonstrate compliance with the...
Lok, U-Wai; Li, Pai-Chi
2016-03-01
Graphics processing unit (GPU)-based software beamforming has advantages over hardware-based beamforming of easier programmability and a faster design cycle, since complicated imaging algorithms can be efficiently programmed and modified. However, the need for a high data rate when transferring ultrasound radio-frequency (RF) data from the hardware front end to the software back end limits the real-time performance. Data compression methods can be applied to the hardware front end to mitigate the data transfer issue. Nevertheless, most decompression processes cannot be performed efficiently on a GPU, thus becoming another bottleneck of the real-time imaging. Moreover, lossless (or nearly lossless) compression is desirable to avoid image quality degradation. In a previous study, we proposed a real-time lossless compression-decompression algorithm and demonstrated that it can reduce the overall processing time because the reduction in data transfer time is greater than the computation time required for compression/decompression. This paper analyzes the lossless compression method in order to understand the factors limiting the compression efficiency. Based on the analytical results, a nearly lossless compression is proposed to further enhance the compression efficiency. The proposed method comprises a transformation coding method involving modified lossless compression that aims at suppressing amplitude data. The simulation results indicate that the compression ratio (CR) of the proposed approach can be enhanced from nearly 1.8 to 2.5, thus allowing a higher data acquisition rate at the front end. The spatial and contrast resolutions with and without compression were almost identical, and the process of decompressing the data of a single frame on a GPU took only several milliseconds. Moreover, the proposed method has been implemented in a 64-channel system that we built in-house to demonstrate the feasibility of the proposed algorithm in a real system. It was found that channel data from a 64-channel system can be transferred using the standard USB 3.0 interface in most practical imaging applications.
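A minimal sketch of measuring a compression ratio (CR) on channel data, in the spirit of the front-end compression discussed above; zlib stands in for the paper's transform-plus-lossless coder and the RF samples are synthetic, so the 1.8-2.5 CR figures are not expected to be reproduced.

```python
# A minimal sketch of computing a compression ratio on synthetic 16-bit RF-like
# samples; zlib is a stand-in coder, not the paper's method.
import zlib
import numpy as np

rng = np.random.default_rng(1)
rf = rng.normal(0, 200, 4096).astype(np.int16)   # synthetic RF channel samples
raw = rf.tobytes()
compressed = zlib.compress(raw, level=9)
print(f"Compression ratio: {len(raw) / len(compressed):.2f}")
```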
Teaching with a Dual-Channel Classroom Feedback System in the Digital Classroom Environment
ERIC Educational Resources Information Center
Yu, Yuan-Chih
2017-01-01
Teaching with a classroom feedback system can benefit both teaching and learning practices of interactivity. In this paper, we propose a dual-channel classroom feedback system integrated with a back-end e-Learning system. The system consists of learning agents running on the students' computers and a teaching agent running on the instructor's…
Wong, Jessica J; McGregor, Marion; Mior, Silvano A; Loisel, Patrick
2014-01-01
The purpose of this study was to develop a model that evaluates the impact of policy changes on the number of workers' compensation lost-time back claims in Ontario, Canada, over a 30-year timeframe. The model was used to test the hypothesis that a theory- and policy-driven model would be sufficient in reproducing historical claims data in a robust manner and that policy changes would have a major impact on modeled data. The model was developed using system dynamics methods in the Vensim simulation program. The theoretical effects of policies for compensation benefit levels and experience rating fees were modeled. The model was built and validated using historical claims data from 1980 to 2009. Sensitivity analysis was used to evaluate the modeled data at extreme end points of variable input and timeframes. The degree of predictive value of the modeled data was measured by the coefficient of determination, root mean square error, and Theil's inequality coefficients. Correlation between modeled data and actual data was found to be meaningful (R(2) = 0.934), and the modeled data were stable at extreme end points. Among the effects explored, policy changes were found to be relatively minor drivers of back claims data, accounting for a 13% improvement in error. Simulation results suggested that unemployment, number of no-lost-time claims, number of injuries per worker, and recovery rate from back injuries outside of claims management to be sensitive drivers of back claims data. A robust systems-based model was developed and tested for use in future policy research in Ontario's workers' compensation. The study findings suggest that certain areas within and outside the workers' compensation system need to be considered when evaluating and changing policies around back claims. © 2014. Published by National University of Health Sciences All rights reserved.
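For reference, the sketch below computes the fit statistics named above (coefficient of determination, root mean square error, and one common form of Theil's inequality coefficient) on two short made-up series standing in for modeled versus historical claims data.

```python
# A minimal sketch of R^2, RMSE, and one common form of Theil's inequality
# coefficient; the two series are illustrative, not Ontario claims data.
import numpy as np

actual = np.array([100.0, 95.0, 90.0, 88.0, 85.0])
modeled = np.array([98.0, 96.0, 89.0, 87.0, 86.0])

rmse = np.sqrt(np.mean((modeled - actual) ** 2))
r2 = 1 - np.sum((actual - modeled) ** 2) / np.sum((actual - actual.mean()) ** 2)
theil_u = rmse / (np.sqrt(np.mean(actual ** 2)) + np.sqrt(np.mean(modeled ** 2)))
print(f"R^2={r2:.3f}  RMSE={rmse:.2f}  Theil's U={theil_u:.4f}")
```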
Algorithm for fast event parameters estimation on GEM acquired data
NASA Astrophysics Data System (ADS)
Linczuk, Paweł; Krawczyk, Rafał D.; Poźniak, Krzysztof T.; Kasprowicz, Grzegorz; Wojeński, Andrzej; Chernyshova, Maryna; Czarski, Tomasz
2016-09-01
We present a study of a software-hardware environment for developing fast, high-throughput, low-latency computation methods that can be used as the back end in High Energy Physics (HEP) and other High Performance Computing (HPC) systems fed by large volumes of input from electronic sensor-based front ends. The paper discusses and tests parallelization possibilities on Intel HPC solutions, with consideration of applications in Gas Electron Multiplier (GEM) measurement systems.
An open-source data storage and visualization back end for experimental data.
Nielsen, Kenneth; Andersen, Thomas; Jensen, Robert; Nielsen, Jane H; Chorkendorff, Ib
2014-04-01
In this article, a flexible free and open-source software system for data logging and presentation will be described. The system is highly modular and adaptable and can be used in any laboratory in which continuous and/or ad hoc measurements require centralized storage. A presentation component for the data back end has furthermore been written that enables live visualization of data on any device capable of displaying Web pages. The system consists of three parts: data-logging clients, a data server, and a data presentation Web site. The logging of data from independent clients leads to high resilience to equipment failure, whereas the central storage of data dramatically eases backup and data exchange. The visualization front end allows direct monitoring of acquired data to see live progress of long-duration experiments. This enables the user to alter experimental conditions based on these data and to interfere with the experiment if needed. The data stored consist both of specific measurements and of continuously logged system parameters. The latter is crucial to a variety of automation and surveillance features, and three cases of such features are described: monitoring system health, getting status of long-duration experiments, and implementation of instant alarms in the event of failure.
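As a concrete illustration of the logging-client idea described above (a sketch of our own; the endpoint URL, codename, and payload fields are hypothetical and not the project's actual API), a minimal client could push timestamped measurements to a central data server like this:

```python
# Hedged sketch of a data-logging client: each independent client pushes its own
# measurements to a central data server. Endpoint and fields are hypothetical.
import json
import time
import urllib.request

SERVER = "http://dataserver.example.org/api/log"  # hypothetical endpoint

def log_point(codename: str, value: float) -> None:
    """POST one timestamped measurement to the central data server."""
    payload = json.dumps({"codename": codename,
                          "time": time.time(),
                          "value": value}).encode()
    req = urllib.request.Request(SERVER, data=payload,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)

# Example usage: log a chamber pressure reading once a minute.
# while True:
#     log_point("chamber_pressure", read_pressure())
#     time.sleep(60)
```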
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
A front-end wafer-level microsystem packaging technique with micro-cap array
NASA Astrophysics Data System (ADS)
Chiang, Yuh-Min
2002-09-01
The back-end packaging process is the remaining challenge for the micromachining industry to commercialize microsystem technology (MST) devices at low cost. This dissertation presents a novel wafer level protection technique as a final step of the front-end fabrication process for MSTs. It facilitates improved manufacturing throughput and automation in package assembly, wafer level testing of devices, and enhanced device performance. The method involves the use of a wafer-sized micro-cap array, which consists of an assortment of small caps micro-molded onto a material with adjustable shapes and sizes to serve as protective structures against the hostile environments during packaging. The micro-cap array is first constructed by a micromachining process with micro-molding technique, then sealed to the device wafer at wafer level. Epoxy-based wafer-level micro cap array has been successfully fabricated and showed good compatibility with conventional back-end packaging processes. An adhesive transfer technique was demonstrated to seal the micro cap array with a MEMS device wafer. No damage or gross leak was observed while wafer dicing or later during a gross leak test. Applications of the micro cap array are demonstrated on MEMS, microactuators fabricated using CRONOS MUMPS process. Depending on the application needs, the micro-molded cap can be designed and modified to facilitate additional component functions, such as optical, electrical, mechanical, and chemical functions, which are not easily achieved in the device by traditional means. Successful fabrication of a micro cap array comprised with microlenses can provide active functions as well as passive protection. An optical tweezer array could be one possibility for applications of a micro cap with microlenses. The micro cap itself could serve as micro well for DNA or bacteria amplification as well.
IRRADIATION METHOD AND APPARATUS
Cabell, C.P.
1962-12-18
A method and apparatus are described for charging fuel bodies into a process tube of a reactor. According to this method, fresh fuel elements are introduced into one end of the tube, forcing used fuel elements out the other end. When sufficient fuel has been discharged, a reel and tape arrangement is employed to pull the column of bodies back into the center of the tube. Due provision is made for providing shielding in the tube. (AEC)
Wang, Feng; Huisman, Jaco; Meskers, Christina E M; Schluep, Mathias; Stevels, Ab; Hagelüken, Christian
2012-11-01
E-waste is a complex waste category containing both hazardous and valuable substances. It demands a cost-efficient treatment system which simultaneously liberates and refines target fractions in an environmentally sound way. In most developing countries there is a lack of systems covering all steps from disposal to final processing due to limited infrastructure and access to technologies and investment. This paper introduces the 'Best-of-2-Worlds' philosophy (Bo2W), which provides a network and pragmatic solution for e-waste treatment in emerging economies. It seeks technical and logistic integration of 'best' pre-processing in developing countries to manually dismantle e-waste and 'best' end-processing to treat hazardous and complex fractions in international state-of-the-art end-processing facilities. A series of dismantling trials was conducted on waste desktop computers, IT equipment, and large and small household appliances, in order to compare the environmental and economic performance of the Bo2W philosophy with other conventional recycling scenarios. The assessment showed that the Bo2W scenario is more eco-efficient than mechanical separation scenarios and other local treatment solutions. For equipment containing substantial hazardous substances, this requires support from domestic legislation mandating the removal and safe handling of such fractions, together with proper financing to cover the costs. Experience from Bo2W pilot projects in China and India highlighted key societal factors influencing successful implementation. These include market size, informal competitors, availability of national e-waste legislation, formal take-back systems, financing, and trust between industrial players. The Bo2W philosophy can serve as a pragmatic and environmentally responsible transition before establishment of end-processing facilities in developing countries is made feasible. The executive models of Bo2W should be flexibly differentiated for various countries by adjusting to local conditions related to operational scale, level of centralized operations, dismantling depth, combination with mechanical processing and optimized logistics to international end-processors. Copyright © 2012 Elsevier Ltd. All rights reserved.
77 FR 26736 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-07
... an Internet Push methodology, in an effort to obtain early response rate indicators for the 2020... contact strategies involving optimizing the Internet push strategy are proposed, such as implementing... reducing and/or eliminating back-end processing. Affected Public: Individuals or households. Frequency: One...
NASA Astrophysics Data System (ADS)
Smith, A. D.; Vaziri, S.; Rodriguez, S.; Östling, M.; Lemme, M. C.
2015-06-01
A chip to wafer scale, CMOS compatible method of graphene device fabrication has been established, which can be integrated into the back end of the line (BEOL) of conventional semiconductor process flows. In this paper, we present experimental results of graphene field effect transistors (GFETs) which were fabricated using this wafer scalable method. The carrier mobilities in these transistors reach up to several hundred cm² V⁻¹ s⁻¹. Further, these devices exhibit current saturation regions similar to graphene devices fabricated using mechanical exfoliation. The overall performance of the GFETs cannot yet compete with record values reported for devices based on mechanically exfoliated material. Nevertheless, this large scale approach is an important step towards reliability and variability studies as well as optimization of device aspects such as electrical contacts and dielectric interfaces with statistically relevant numbers of devices. It is also an important milestone towards introducing graphene into wafer scale process lines.
Jia, Qi; den Dulk-Ras, Amke; Shen, Hexi; Hooykaas, Paul J J; de Pater, Sylvia
2013-07-01
Besides the KU-dependent classical non-homologous end-joining (C-NHEJ) pathway, an alternative NHEJ pathway first identified in mammalian systems, which is often called the back-up NHEJ (B-NHEJ) pathway, was also found in plants. In mammalian systems PARP was found to be one of the essential components in B-NHEJ. Here we investigated whether PARP1 and PARP2 were also involved in B-NHEJ in Arabidopsis. To this end Arabidopsis parp1, parp2 and parp1parp2 (p1p2) mutants were isolated and functionally characterized. The p1p2 double mutant was crossed with the C-NHEJ ku80 mutant resulting in the parp1parp2ku80 (p1p2k80) triple mutant. As expected, because of their role in single strand break repair (SSBR) and base excision repair (BER), the p1p2 and p1p2k80 mutants were shown to be sensitive to treatment with the DNA damaging agent MMS. End-joining assays in cell-free leaf protein extracts of the different mutants using linear DNA substrates with different ends reflecting a variety of double strand breaks were performed. The results showed that compatible 5'-overhangs were accurately joined in all mutants, that KU80 protected the ends preventing the formation of large deletions and that PARP proteins were involved in microhomology mediated end joining (MMEJ), one of the characteristics of B-NHEJ.
Research interface on a programmable ultrasound scanner.
Shamdasani, Vijay; Bae, Unmin; Sikdar, Siddhartha; Yoo, Yang Mo; Karadayi, Kerem; Managuli, Ravi; Kim, Yongmin
2008-07-01
Commercial ultrasound machines in the past did not provide the ultrasound researchers access to raw ultrasound data. Lack of this ability has impeded evaluation and clinical testing of novel ultrasound algorithms and applications. Recently, we developed a flexible ultrasound back-end where all the processing for the conventional ultrasound modes, such as B, M, color flow and spectral Doppler, was performed in software. The back-end has been incorporated into a commercial ultrasound machine, the Hitachi HiVision 5500. The goal of this work is to develop an ultrasound research interface on the back-end for acquiring raw ultrasound data from the machine. The research interface has been designed as a software module on the ultrasound back-end. To increase the amount of raw ultrasound data that can be spooled in the limited memory available on the back-end, we have developed a method that can losslessly compress the ultrasound data in real time. The raw ultrasound data could be obtained in any conventional ultrasound mode, including duplex and triplex modes. Furthermore, use of the research interface does not decrease the frame rate or otherwise affect the clinical usability of the machine. The lossless compression of the ultrasound data in real time can increase the amount of data spooled by approximately 2.3 times, thus allowing more than 6s of raw ultrasound data to be acquired in all the modes. The interface has been used not only for early testing of new ideas with in vitro data from phantoms, but also for acquiring in vivo data for fine-tuning ultrasound applications and conducting clinical studies. We present several examples of how newer ultrasound applications, such as elastography, vibration imaging and 3D imaging, have benefited from this research interface. Since the research interface is entirely implemented in software, it can be deployed on existing HiVision 5500 ultrasound machines and may be easily upgraded in the future. The developed research interface can aid researchers in the rapid testing and clinical evaluation of new ultrasound algorithms and applications. Additionally, we believe that our approach would be applicable to designing research interfaces on other ultrasound machines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zimmerman, T.
1997-12-01
This paper is distilled from a talk given at the 3rd International Meeting on Front End Electronics in Taos, N.M. on Nov. 7, 1997. It is based on experience gained by designing and testing the SVX3 128 channel silicon strip detector readout chip. The SVX3 chip organization is shown in Fig. 1. The Front End section consists of an integrator and analog pipeline designed at Fermilab, and the Back End section is an ADC plus sparsification and readout logic designed at LBL. SVX3 is a deadtimeless readout chip, which means that the front end is acquiring low level analog signals while the back end is digitizing and reading out digital signals. It is thus a true mixed signal chip, and demands close attention to avoid disastrous coupling from the digital to the analog sections. SVX3 is designed in a bulk CMOS process (i.e., the circuits sit in a silicon substrate). In such a process, the substrate becomes a potential coupling path. This paper discusses the effect of the substrate resistivity on coupling, and also goes into a more general discussion of grounding and referencing in mixed signal designs and how low resistivity substrates can be used to advantage. Finally, an alternative power supply current conduction method for ASICs is presented as an additional advantage which can be obtained with low resistivity substrates. 1 ref., 13 figs., 1 tab.
CERN data services for LHC computing
NASA Astrophysics Data System (ADS)
Espinal, X.; Bocchi, E.; Chan, B.; Fiorot, A.; Iven, J.; Lo Presti, G.; Lopez, J.; Gonzalez, H.; Lamanna, M.; Mascetti, L.; Moscicki, J.; Pace, A.; Peters, A.; Ponce, S.; Rousseau, H.; van der Ster, D.
2017-10-01
Dependability, resilience, adaptability and efficiency. Growing requirements require tailoring storage services and novel solutions. Unprecedented volumes of data coming from the broad number of experiments at CERN need to be quickly available in a highly scalable way for large-scale processing and data distribution while in parallel they are routed to tape for long-term archival. These activities are critical for the success of HEP experiments. Nowadays we operate at high incoming throughput (14GB/s during 2015 LHC Pb-Pb run and 11PB in July 2016) and with concurrent complex production work-loads. In parallel our systems provide the platform for the continuous user and experiment driven work-loads for large-scale data analysis, including end-user access and sharing. The storage services at CERN cover the needs of our community: EOS and CASTOR as a large-scale storage; CERNBox for end-user access and sharing; Ceph as data back-end for the CERN OpenStack infrastructure, NFS services and S3 functionality; AFS for legacy distributed-file-system services. In this paper we will summarise the experience in supporting LHC experiments and the transition of our infrastructure from static monolithic systems to flexible components providing a more coherent environment with pluggable protocols, tuneable QoS, sharing capabilities and fine grained ACLs management while continuing to guarantee dependable and robust services.
SPIDR, a general-purpose readout system for pixel ASICs
NASA Astrophysics Data System (ADS)
van der Heijden, B.; Visser, J.; van Beuzekom, M.; Boterenbrood, H.; Kulis, S.; Munneke, B.; Schreuder, F.
2017-02-01
The SPIDR (Speedy PIxel Detector Readout) system is a flexible general-purpose readout platform that can be easily adapted to test and characterize new and existing detector readout ASICs. It is originally designed for the readout of pixel ASICs from the Medipix/Timepix family, but other types of ASICs or front-end circuits can be read out as well. The SPIDR system consists of an FPGA board with memory and various communication interfaces, FPGA firmware, CPU subsystem and an API library on the PC . The FPGA firmware can be adapted to read out other ASICs by re-using IP blocks. The available IP blocks include a UDP packet builder, 1 and 10 Gigabit Ethernet MAC's and a "soft core" CPU . Currently the firmware is targeted at the Xilinx VC707 development board and at a custom board called Compact-SPIDR . The firmware can easily be ported to other Xilinx 7 series and ultra scale FPGAs. The gap between an ASIC and the data acquisition back-end is bridged by the SPIDR system. Using the high pin count VITA 57 FPGA Mezzanine Card (FMC) connector only a simple chip carrier PCB is required. A 1 and a 10 Gigabit Ethernet interface handle the connection to the back-end. These can be used simultaneously for high-speed data and configuration over separate channels. In addition to the FMC connector, configurable inputs and outputs are available for synchronization with other detectors. A high resolution (≈ 27 ps bin size) Time to Digital converter is provided for time stamping events in the detector. The SPIDR system is frequently used as readout for the Medipix3 and Timepix3 ASICs. Using the 10 Gigabit Ethernet interface it is possible to read out a single chip at full bandwidth or up to 12 chips at a reduced rate. Another recent application is the test-bed for the VeloPix ASIC, which is developed for the Vertex Detector of the LHCb experiment. In this case the SPIDR system processes the 20 Gbps scrambled data stream from the VeloPix and distributes it over four 10 Gigabit Ethernet links, and in addition provides the slow and fast control for the chip.
Scientific & Intelligence Exascale Visualization Analysis System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Money, James H.
SIEVAS provides an immersive visualization framework for connecting multiple systems in real time for data science. SIEVAS provides the ability to connect multiple COTS and GOTS products in a seamless fashion for data fusion, data analysis, and viewing. It provides this capability by using a combination of microservices, real-time messaging, and a web-service-compliant back-end system.
The battle between Unix and Windows NT.
Anderson, H J
1997-02-01
For more than a decade, Unix has been the dominant back-end operating system in health care. But that prominent position is being challenged by Windows NT, touted by its developer, Microsoft Corp., as the operating system of the future. CIOs and others are attempting to figure out which system is the best choice in the long run.
NASA Astrophysics Data System (ADS)
Mattson, E.; Versteeg, R.; Ankeny, M.; Stormberg, G.
2005-12-01
Long term performance monitoring has been identified by DOE, DOD and EPA as one of the most challenging and costly elements of contaminated site remedial efforts. Such monitoring should provide timely and actionable information relevant to a multitude of stakeholder needs. This information should be obtained in a manner which is auditable, cost effective and transparent. Over the last several years INL staff has designed and implemented a web accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition from diverse sensors (geophysical, geochemical and hydrological) with server side data management and information visualization through flexible browser based data access tools. Component technologies include a rich browser-based client (using dynamic javascript and html/css) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third party applications which are invoked by the back-end using webservices. This system has been implemented and is operational for several sites, including the Ruby Gulch Waste Rock Repository (a capped mine waste rock dump on the Gilt Edge Mine Superfund Site), the INL Vadose Zone Research Park and an alternative cover landfill. Implementations for other vadose zone sites are currently in progress. These systems allow for autonomous performance monitoring through automated data analysis and report generation. This performance monitoring has allowed users to obtain insights into system dynamics, regulatory compliance and residence times of water. Our system uses modular components for data selection and graphing and WSDL compliant webservices for external functions such as statistical analyses and model invocations. Thus, implementing this system for novel sites and extending functionality (e.g. adding novel models) is relatively straightforward. As system access requires a standard web browser and uses intuitive functionality, stakeholders with diverse degrees of technical insight can use this system with little or no training.
An alternative method of closed silicone intubation of the lacrimal system.
Henderson, P N; McNab, A A
1996-05-01
An alternative method of closed lacrimal intubation is described, the basis of which is to place the end of a piece of silicone tubing over the end of a small-diameter metal introducer, stretch the silicone tubing back along the introducer, and then pass the introducer together with the tubing through the lacrimal system into the nasal cavity. The tubing is visualized in the inferior meatus, from where it is retrieved, and then the introducer is withdrawn. The other end of the tubing is passed in a similar fashion. The technique is easily mastered, inexpensive, and less traumatic than other described techniques.
Uranium oxide fuel cycle analysis in VVER-1000 with VISTA simulation code
NASA Astrophysics Data System (ADS)
Mirekhtiary, Seyedeh Fatemeh; Abbasi, Akbar
2018-02-01
The VVER-1000 nuclear power plant generates about 20-25 tons of spent fuel per year. In this research, the transmutation of uranium oxide (UOX) fuel was calculated using the nuclear fuel cycle simulation system (VISTA) code, and the back-end components of the fuel cycle were evaluated. The back-end component calculations cover Spent Fuel (SF), Actinide Inventory (AI) and Fission Product (FP) radioisotopes. The computed SF, AI and FP values were 23.792178 ton/y, 22.811139 ton/y and 0.981039 ton/y, respectively; expressed as rounded values, the spent fuel, major actinides, minor actinides and fission products amounted to 23.8 ton/year, 22.795 ton/year, 0.024 ton/year and 0.981 ton/year, respectively.
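As a quick consistency check (our own arithmetic, not part of the original abstract), the reported spent-fuel mass is simply the sum of the actinide inventory and fission-product masses:

```latex
\mathrm{SF} \;=\; \mathrm{AI} + \mathrm{FP}
           \;=\; 22.811139 + 0.981039
           \;=\; 23.792178\ \text{ton/y}
```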
Requirements Document for Development of a Livermore Tomography Tools Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seetho, I. M.
In this document, we outline an exercise performed at LLNL to evaluate the user interface deficits of a LLNL-developed CT reconstruction software package, Livermore Tomography Tools (LTT). We observe that a difficult-to-use command line interface and the lack of support functions compound to generate a bottleneck in the CT reconstruction process when input parameters to key functions are not well known. Through the exercise of systems engineering best practices, we generate key performance parameters for a LTT interface refresh, and specify a combination of back-end (“test-mode” functions) and front-end (graphical user interface visualization and command scripting tools) solutions to LTT's poor user interface that aim to mitigate issues and lower costs associated with CT reconstruction using LTT. Key functional and non-functional requirements and risk mitigation strategies for the solution are outlined and discussed.
Electronic Data Interchange in Procurement
1990-04-01
contract management and order processing systems. This conversion of automated information to paper and back to automated form is not only slow and...automated purchasing computer and the contractor’s order processing computer through telephone lines, as illustrated in Figure 1-1. Computer-to-computer...into the contractor’s order processing or contract management system. This approach - converting automated information to paper and back to automated
A MoTe2 based light emitting diode and photodetector for silicon photonic integrated circuits
NASA Astrophysics Data System (ADS)
Bie, Ya-Qing; Heuck, M.; Grosso, G.; Furchi, M.; Cao, Y.; Zheng, J.; Navarro-Moratalla, E.; Zhou, L.; Taniguchi, T.; Watanabe, K.; Kong, J.; Englund, D.; Jarillo-Herrero, P.
A key challenge in photonics today is to address the interconnects bottleneck in high-speed computing systems. Silicon photonics has emerged as a leading architecture, partly because many components such as waveguides, interferometers and modulators, could be integrated on silicon-based processors. However, light sources and photodetectors present continued challenges. Common approaches for light source include off-chip or wafer-bonded lasers based on III-V materials, but studies show advantages for directly modulated light sources. The most advanced photodetectors in silicon photonics are based on germanium growth which increases system cost. The emerging two dimensional transition metal dichalcogenides (TMDs) offer a path for optical interconnects components that can be integrated with the CMOS processing by back-end-of-the-line processing steps. Here we demonstrate a silicon waveguide-integrated light source and photodetector based on a p-n junction of bilayer MoTe2, a TMD semiconductor with infrared band gap. The state-of-the-art fabrication technology provides new opportunities for integrated optoelectronic systems.
Code of Federal Regulations, 2010 CFR
2010-07-01
... total organic HAP (or TOC, minus methane and ethane) emissions in all process vent streams and primary... TOC (minus methane and ethane) may be measured instead of total organic HAP. (C) The mass rates shall... and outlet of the control device shall be the sum of all total organic HAP (or TOC, minus methane and...
Spark gap switch system with condensable dielectric gas
Thayer, III, William J.
1991-01-01
A spark gap switch system is disclosed which is capable of operating at a high pulse rate, comprising an insulated switch housing having a purging gas entrance port and a gas exit port, a pair of spaced apart electrodes each having one end thereof within the housing and defining a spark gap therebetween, an easily condensable and preferably low molecular weight insulating gas flowing through the switch housing, a heat exchanger/condenser for condensing the insulating gas after it exits from the housing, a pump for recirculating the condensed insulating gas as a liquid back to the housing, and a heat exchanger/evaporator to vaporize at least a portion of the condensed insulating gas back into a vapor prior to flowing the insulating gas back into the housing.
Development of management information system for land in mine area based on MapInfo
NASA Astrophysics Data System (ADS)
Wang, Shi-Dong; Liu, Chuang-Hua; Wang, Xin-Chuang; Pan, Yan-Yu
2008-10-01
MapInfo is currently a popular GIS software package. This paper introduces the characteristics of MapInfo and the secondary GIS development methods it offers, namely development based on MapBasic, on OLE automation, and on the MapX control. Taking the development of a land management information system for a mine area as an example, the paper discusses the method of developing GIS applications based on MapX and describes the development of the land management information system in detail, including the development environment, the overall design, the design and realization of every function module, and a simple application of the system. The system uses MapX 5.0 and Visual Basic 6.0 as the development platform, takes SQL Server 2005 as the back-end database, and adopts Matlab 6.5 for back-end numerical calculation. On the basis of an integrated design, the system provides eight modules: start-up, layer control, spatial query, spatial analysis, data editing, application models, document management, and results output. The system can be used in mine areas for cadastral management, land use structure optimization, land reclamation, land evaluation, analysis and forecasting of land and environmental disruption in the mine area, thematic mapping, and so on.
A graphene barristor using nitrogen profile controlled ZnO Schottky contacts.
Hwang, Hyeon Jun; Chang, Kyoung Eun; Yoo, Won Beom; Shim, Chang Hoo; Lee, Sang Kyung; Yang, Jin Ho; Kim, So-Young; Lee, Yongsu; Cho, Chunhum; Lee, Byoung Hun
2017-02-16
We have successfully demonstrated a graphene-ZnO:N Schottky barristor. The barrier height between graphene and ZnO:N could be modulated by a buried gate electrode in the range of 0.5-0.73 eV, and an on-off ratio of up to 10⁷ was achieved. By using a nitrogen-doped ZnO film as a Schottky contact material, the stability problem of previously reported graphene barristors could be greatly alleviated and a facile route to build a top-down processed graphene barristor was realized with a very low heat cycle. This device will be instrumental when implementing logic functions in systems requiring high-performance logic devices fabricated with a low temperature fabrication process such as back-end integrated logic devices or flexible devices on soft substrates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Norsworthy, R.
A rating system was developed for several coating types used for underground pipeline systems. Consideration included soil stress, adhesion, surface preparation, cathodic protection (CP) shielding, CP requirements, handling and construction, repair, field joint system, bends and other components, and the application process. Polyethylene- and polyvinyl chloride-backed tapes, woven polyolefin geotextile fabric (WGF)-backed tapes, hot-applied tapes, petrolatum- and wax-based tapes, and shrink sleeves were evaluated. WGF-backed tapes had the highest rating.
An Autonomic Framework for Integrating Security and Quality of Service Support in Databases
ERIC Educational Resources Information Center
Alomari, Firas
2013-01-01
The back-end databases of multi-tiered applications are a major data security concern for enterprises. The abundance of these systems and the emergence of new and different threats require multiple and overlapping security mechanisms. Therefore, providing multiple and diverse database intrusion detection and prevention systems (IDPS) is a critical…
Working with Pedagogical Agents: Understanding the "Back End" of an Intelligent Tutoring System
ERIC Educational Resources Information Center
Wolfe, Christopher; Widmer, Colin L.; Weil, Audrey M.; Cedillos-Whynott, Elizabeth M.
2015-01-01
Students in an undergraduate psychology course on Learning and Cognition used SKO (formerly AutoTutor Lite), an Intelligent Tutoring System, to create interactive lessons in which a pedagogic agent (animated avatar) engages users in a tutorial dialogue. After briefly describing the technology and underlying psychological theory, data from an…
Back-Arc Opening in the Western End of the Okinawa Trough Revealed From GNSS/Acoustic Measurements
NASA Astrophysics Data System (ADS)
Chen, Horng-Yue; Ikuta, Ryoya; Lin, Cheng-Horng; Hsu, Ya-Ju; Kohmi, Takeru; Wang, Chau-Chang; Yu, Shui-Beih; Tu, Yoko; Tsujii, Toshiaki; Ando, Masataka
2018-01-01
We measured seafloor movement using a Global Navigation Satellite Systems (GNSS)/Acoustic technique at the south of the rifting valley in the western end of the Okinawa Trough back-arc basin, 60 km east of northeastern corner of Taiwan. The horizontal position of the seafloor benchmark, measured eight times between July 2012 and May 2016, showed a southeastward movement suggesting a back-arc opening of the Okinawa Trough. The average velocity of the seafloor benchmark shows a block motion together with Yonaguni Island. The westernmost part of the Ryukyu Arc rotates clockwise and is pulled apart from the Taiwan Island, which should cause the expansion of the Yilan Plain, Taiwan. Comparing the motion of the seafloor benchmark with adjacent seismicity, we suggest a gentle episodic opening of the rifting valley accompanying a moderate seismic activation, which differs from the case in the segment north off-Yonaguni Island where a rapid dyke intrusion occurs with a significant seismic activity.
The eikonal function: the common concept in ray optics and particle mechanics
NASA Astrophysics Data System (ADS)
Krautter, Martin
1993-04-01
The habit of teaching the movements of masses first, and the propagation of light later, as an electromagnetic phenomenon, was widespread. Looking further back into the history of physics, however, we see the concepts for understanding light rays appear earlier, and later their successful application to particle trajectories, leading to the highly developed celestial mechanics towards the end of the 19th century. And then, in 1905, Karl Schwarzschild transferred the technique of `canonical coordinates,' named so by C.G.J. Jacobi in 1837, back to light rays in imaging systems. I would like to point to the chief steps in this evolution. The learning process for handling both particle and wave propagation aspects continues up to our time: Richard Feynman, 1918 - 1988. We may judge each contribution: whether it opens our mind to a unifying theory, or whether it hardens partial understanding. And we can notice where the understanding of light propagation led the evolution, and how the theory for movement of masses caught up.
Kang, Jeeun; Yoon, Changhan; Lee, Jaejin; Kye, Sang-Bum; Lee, Yongbae; Chang, Jin Ho; Kim, Gi-Duck; Yoo, Yangmo; Song, Tai-kyong
2016-04-01
In this paper, we present a novel system-on-chip (SOC) solution for a portable ultrasound imaging system (PUS) for point-of-care applications. The PUS-SOC includes all of the signal processing modules (i.e., the transmit and dynamic receive beamformer modules, mid- and back-end processors, and color Doppler processors) as well as an efficient architecture for hardware-based imaging methods (e.g., dynamic delay calculation, multi-beamforming, and coded excitation and compression). The PUS-SOC was fabricated using a UMC 130-nm NAND process and has 16.8 GFLOPS of computing power with a total equivalent gate count of 12.1 million, which is comparable to a Pentium-4 CPU. The size and power consumption of the PUS-SOC are 27×27 mm² and 1.2 W, respectively. Based on the PUS-SOC, a prototype hand-held US imaging system was implemented. Phantom experiments demonstrated that the PUS-SOC can provide appropriate image quality for point-of-care applications with a compact PDA size (200×120×45 mm³) and 3 hours of battery life.
Detering, B.A.; Donaldson, A.D.; Fincke, J.R.; Kong, P.C.; Berry, R.A.
1999-08-10
A fast quench reactor includes a reactor chamber having a high temperature heating means such as a plasma torch at its inlet and a means of rapidly expanding a reactant stream, such as a restrictive convergent-divergent nozzle, at its outlet end. Metal halide reactants are injected into the reactor chamber. Reducing gas is added at different stages in the process to form a desired end product and prevent back reactions. The resulting heated gaseous stream is then rapidly cooled by expansion of the gaseous stream. 8 figs.
Detering, Brent A.; Donaldson, Alan D.; Fincke, James R.; Kong, Peter C.; Berry, Ray A.
1999-01-01
A fast quench reactor includes a reactor chamber having a high temperature heating means such as a plasma torch at its inlet and a means of rapidly expanding a reactant stream, such as a restrictive convergent-divergent nozzle, at its outlet end. Metal halide reactants are injected into the reactor chamber. Reducing gas is added at different stages in the process to form a desired end product and prevent back reactions. The resulting heated gaseous stream is then rapidly cooled by expansion of the gaseous stream.
Infrared thermography quantitative image processing
NASA Astrophysics Data System (ADS)
Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB
2017-11-01
Infrared thermography is an imaging technique that has the ability to provide a map of the temperature distribution of an object's surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body's surface can contribute to the diagnosis and follow up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and of temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first order statistical parameters describing the temperature distribution. Analysis and comparison of these indices result in evaluating the temperature distribution pattern of the back trunk expected in subjects who are healthy with regard to spinal problems.
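The sketch below illustrates the general kind of computation involved (our own example, not the study's actual indices): first-order statistics for two symmetric regions of interest and a simple left/right asymmetry measure.

```python
# Hedged sketch: first-order statistics of two symmetric temperature ROIs and a
# simple asymmetry index. The synthetic ROIs below are invented for illustration.
import numpy as np

def roi_stats(roi: np.ndarray) -> dict:
    """First-order statistics of a temperature ROI (values in deg C)."""
    mean, std = float(roi.mean()), float(roi.std())
    skewness = float(((roi - mean) ** 3).mean() / std ** 3)
    return {"mean": mean, "std": std, "skewness": skewness}

def asymmetry_index(left: np.ndarray, right: np.ndarray) -> float:
    """Absolute difference of mean ROI temperatures, one simple index."""
    return abs(float(left.mean()) - float(right.mean()))

rng = np.random.default_rng(1)
left = 32.0 + 0.4 * rng.standard_normal((40, 40))     # synthetic left ROI
right = 32.6 + 0.4 * rng.standard_normal((40, 40))    # synthetic right ROI
print(roi_stats(left), roi_stats(right), asymmetry_index(left, right))
```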
Strategies for distant speech recognition in reverberant environments
NASA Astrophysics Data System (ADS)
Delcroix, Marc; Yoshioka, Takuya; Ogawa, Atsunori; Kubo, Yotaro; Fujimoto, Masakiyo; Ito, Nobutaka; Kinoshita, Keisuke; Espi, Miquel; Araki, Shoko; Hori, Takaaki; Nakatani, Tomohiro
2015-12-01
Reverberation and noise are known to severely affect the automatic speech recognition (ASR) performance of speech recorded by distant microphones. Therefore, we must deal with reverberation if we are to realize high-performance hands-free speech recognition. In this paper, we review a recognition system that we developed at our laboratory to deal with reverberant speech. The system consists of a speech enhancement (SE) front-end that employs long-term linear prediction-based dereverberation followed by noise reduction. We combine our SE front-end with an ASR back-end that uses neural networks for acoustic and language modeling. The proposed system achieved top scores on the ASR task of the REVERB challenge. This paper describes the different technologies used in our system and presents detailed experimental results that justify our implementation choices and may provide hints for designing distant ASR systems.
BioMon: A Google Earth Based Continuous Biomass Monitoring System (Demo Paper)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju
2009-01-01
We demonstrate a Google Earth based novel visualization system for continuous monitoring of biomass at regional and global scales. This system is integrated with a back-end spatiotemporal data mining system that continuously detects changes using high temporal resolution MODIS images. In addition to the visualization, we demonstrate novel query features of the system that provides insights into the current conditions of the landscape.
Clinical terminology support for a national ambulatory practice outcomes research network.
Ricciardi, Thomas N; Lieberman, Michael I; Kahn, Michael G; Masarie, F E
2005-01-01
The Medical Quality Improvement Consortium (MQIC) is a nationwide collaboration of 74 healthcare delivery systems, consisting of 3755 clinicians, who contribute de-identified clinical data from the same commercial electronic medical record (EMR) for quality reporting, outcomes research and clinical research in public health and practice benchmarking. Despite the existence of a common, centrally-managed, shared terminology for core concepts (medications, problem lists, observation names), a substantial "back-end" information management process is required to ensure terminology and data harmonization for creating multi-facility clinically-acceptable queries and comparable results. We describe the information architecture created to support terminology harmonization across this data-sharing consortium and discuss the implications for large scale data sharing envisioned by proponents for the national adoption of ambulatory EMR systems.
An integrated dexterous robotic testbed for space applications
NASA Technical Reports Server (NTRS)
Li, Larry C.; Nguyen, Hai; Sauer, Edward
1992-01-01
An integrated dexterous robotic system was developed as a testbed to evaluate various robotics technologies for advanced space applications. The system configuration consisted of a Utah/MIT Dexterous Hand, a PUMA 562 arm, a stereo vision system, and a multiprocessing computer control system. In addition to these major subsystems, a proximity sensing system was integrated with the Utah/MIT Hand to provide capability for non-contact sensing of a nearby object. A high-speed fiber-optic link was used to transmit digitized proximity sensor signals back to the multiprocessing control system. The hardware system was designed to satisfy the requirements for both teleoperated and autonomous operations. The software system was designed to exploit parallel processing capability, pursue functional modularity, incorporate artificial intelligence for robot control, allow high-level symbolic robot commands, maximize reusable code, minimize compilation requirements, and provide an interactive application development and debugging environment for the end users. An overview is presented of the system hardware and software configurations, and implementation is discussed of subsystem functions.
ICE-Based Custom Full-Mesh Network for the CHIME High Bandwidth Radio Astronomy Correlator
NASA Astrophysics Data System (ADS)
Bandura, K.; Cliche, J. F.; Dobbs, M. A.; Gilbert, A. J.; Ittah, D.; Mena Parra, J.; Smecher, G.
2016-03-01
New generation radio interferometers encode signals from thousands of antenna feeds across large bandwidth. Channelizing and correlating this data requires networking capabilities that can handle unprecedented data rates with reasonable cost. The Canadian Hydrogen Intensity Mapping Experiment (CHIME) correlator processes 8 bits from N = 2,048 digitizer inputs across 400 MHz of bandwidth. Measured in N² × bandwidth, it is the largest radio correlator that is currently commissioning. Its digital back-end must exchange and reorganize the 6.6 terabit/s produced by its 128 digitizing and channelizing nodes, and feed it to the 256 graphics processing unit (GPU) node spatial correlator in a way that each node obtains data from all digitizer inputs but across a small fraction of the bandwidth (i.e. ‘corner-turn’). In order to maximize performance and reliability of the corner-turn system while minimizing cost, a custom networking solution has been implemented. The system makes use of Field Programmable Gate Array (FPGA) transceivers to implement direct, passive copper, full-mesh, high speed serial connections between sixteen circuit boards in a crate, to exchange data between crates, and to offload the data to a cluster of 256 GPU nodes using standard 10 Gbit/s Ethernet links. The GPU nodes complete the corner-turn by combining data from all crates and then computing visibilities. Eye diagrams and frame error counters confirm error-free operation of the corner-turn network in both the currently operating CHIME Pathfinder telescope (a prototype for the full CHIME telescope) and a representative fraction of the full CHIME hardware, providing an end-to-end system validation. An analysis of an equivalent corner-turn system built with Ethernet switches instead of custom passive data links is provided.
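The quoted aggregate rate is consistent with a back-of-envelope check (our own arithmetic, which assumes each channelized complex sample is carried in 8 bits, e.g. 4-bit real plus 4-bit imaginary):

```latex
\underbrace{2048}_{\text{inputs}}\times
\underbrace{400\times10^{6}\ \mathrm{s^{-1}}}_{\text{complex samples per input}}\times
\underbrace{8\ \mathrm{bit}}_{\text{per complex sample}}
\;\approx\; 6.6\times10^{12}\ \mathrm{bit/s} \;=\; 6.6\ \mathrm{Tb/s}
```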
Counter measures to effectively reduce end flare
NASA Astrophysics Data System (ADS)
Moneke, Matthias; Groche, Peter
2017-10-01
Roll forming is a manufacturing process, whose profitability is predicated on its high output. When roll formed profiles are cut to length, process related residual stresses are released and increased deformation at the profile ends at the cut-off occurs, also known as end flare. U-profiles typically show a flaring in at the lead end and a flaring out at the tail end. Due to this deformation, deviations from the dimensional accuracy can occur, which cause problems during further processing of the parts. Additional operations are necessary to compensate for the end flare, thereby increasing plant deployment time and production costs. Recent research focused on the cause of the residual stresses and it was shown, that a combination of residual longitudinal stresses and residual shear stresses are responsible for end flare. By exploiting this knowledge, it is possible to determine, depending on the flaring of the profile, in which part of the profile residual longitudinal or residual shear stresses are prevalent and which counter measures can specifically counteract the responsible residual stresses. For this purpose numerical and experimental investigations on a U-, Hat- and C-Profile were conducted. It could be shown that overbending and bending back of the profile is most effective in reducing end flare. Another developed method is lowering and elevating the profile to reduce residual longitudinal stresses.
Honarmand, Kavan; Minaskanian, Rafael; Maboudi, Seyed Ebrahim; Oskouei, Ali E
2018-01-01
[Purpose] Sitting position is the dominant position for a professional pianist. There are many static and dynamic forces which affect musculoskeletal system during sitting. In prolonged sitting, these forces are harmful. The aim of this study was to compare pianists' back extensor muscles activity during playing piano while sitting on a regular piano bench and a chair with back rest. [Subjects and Methods] Ten professional piano players (mean age 25.4 ± 5.28, 60% male, 40% female) performed similar tasks for 5 hours in two sessions: one session sitting on a regular piano bench and the other sitting on a chair with back rest. In each session, muscular activity was assessed in 3 ways: 1) recording surface electromyography of the back-extensor muscles at the beginning and end of each session, 2) isometric back extension test, and 3) musculoskeletal discomfort questionnaire. [Results] There were significantly lesser muscular activity, more ability to perform isometric back extension and better personal comfort while sitting on a chair with back rest. [Conclusion] Decreased muscular activity and perhaps fatigue during prolonged piano playing on a chair with back rest may reduce acquired musculoskeletal disorders amongst professional pianists.
Honarmand, Kavan; Minaskanian, Rafael; Maboudi, Seyed Ebrahim; Oskouei, Ali E.
2018-01-01
[Purpose] Sitting position is the dominant position for a professional pianist. There are many static and dynamic forces which affect musculoskeletal system during sitting. In prolonged sitting, these forces are harmful. The aim of this study was to compare pianists’ back extensor muscles activity during playing piano while sitting on a regular piano bench and a chair with back rest. [Subjects and Methods] Ten professional piano players (mean age 25.4 ± 5.28, 60% male, 40% female) performed similar tasks for 5 hours in two sessions: one session sitting on a regular piano bench and the other sitting on a chair with back rest. In each session, muscular activity was assessed in 3 ways: 1) recording surface electromyography of the back-extensor muscles at the beginning and end of each session, 2) isometric back extension test, and 3) musculoskeletal discomfort questionnaire. [Results] There were significantly lesser muscular activity, more ability to perform isometric back extension and better personal comfort while sitting on a chair with back rest. [Conclusion] Decreased muscular activity and perhaps fatigue during prolonged piano playing on a chair with back rest may reduce acquired musculoskeletal disorders amongst professional pianists. PMID:29410569
40 CFR 63.498 - Back-end process provisions-recordkeeping.
Code of Federal Regulations, 2011 CFR
2011-07-01
... be the crumb rubber dry weight of the rubber leaving the stripper. (iv) The organic HAP content of... be the crumb rubber dry weight of the crumb rubber leaving the stripper. (iii) The hourly average of... test runs. (1) The uncontrolled residual organic HAP content in the latex or dry crumb rubber, as...
40 CFR 63.498 - Back-end process provisions-recordkeeping.
Code of Federal Regulations, 2012 CFR
2012-07-01
... be the crumb rubber dry weight of the rubber leaving the stripper. (iv) The organic HAP content of... be the crumb rubber dry weight of the crumb rubber leaving the stripper. (iii) The hourly average of... test runs. (1) The uncontrolled residual organic HAP content in the latex or dry crumb rubber, as...
40 CFR 63.498 - Back-end process provisions-recordkeeping.
Code of Federal Regulations, 2014 CFR
2014-07-01
... be the crumb rubber dry weight of the rubber leaving the stripper. (iv) The organic HAP content of... be the crumb rubber dry weight of the crumb rubber leaving the stripper. (iii) The hourly average of... test runs. (1) The uncontrolled residual organic HAP content in the latex or dry crumb rubber, as...
40 CFR 63.498 - Back-end process provisions-recordkeeping.
Code of Federal Regulations, 2013 CFR
2013-07-01
... be the crumb rubber dry weight of the rubber leaving the stripper. (iv) The organic HAP content of... be the crumb rubber dry weight of the crumb rubber leaving the stripper. (iii) The hourly average of... test runs. (1) The uncontrolled residual organic HAP content in the latex or dry crumb rubber, as...
Design of overload vehicle monitoring and response system based on DSP
NASA Astrophysics Data System (ADS)
Yu, Yan; Liu, Yiheng; Zhao, Xuefeng
2014-03-01
Overloaded vehicles cause much more damage to the road surface than regular ones. Many roads and bridges are equipped with structural health monitoring (SHM) systems to provide early warning of such damage and to evaluate the safety of the road or bridge. However, because of the complex nature of SHM systems, they are expensive to manufacture, difficult to install, and not well suited for ordinary bridges and roads. Based on this application background, this paper designs a compact structural health monitoring system based on a DSP, which is highly integrated, low-power, easy to install and inexpensive to manufacture. The designed system is made up of sensor arrays, a charge amplifier module, a DSP processing unit, an alarm system for overload, and an estimator of damage to the road and bridge structure. The signals coming from the sensor arrays pass through the charge amplifier. The DSP processing unit receives the amplified signals, determines whether each one indicates an overload, and converts analog variables into digital ones so that they are compatible with the back-end digital circuit for further processing. The system also restricts overweight vehicles by capturing an image of the offending vehicle, issuing an alarm, and transferring the collected pressure data to a remote data center for further monitoring analysis by the rain-flow counting method.
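To make the front half of that pipeline concrete, the sketch below (our own illustration; the threshold, units and pulse shape are hypothetical, not the paper's design) flags an overload from a sampled axle-pressure signal before the alarm and image-capture stages would be triggered.

```python
# Hedged sketch only: threshold a sampled axle-pressure signal to flag an
# overload event. Threshold, units and the simulated pulse are illustrative.
import numpy as np

OVERLOAD_THRESHOLD = 100.0  # kN, hypothetical legal axle limit

def detect_overload(pressure: np.ndarray, threshold: float = OVERLOAD_THRESHOLD):
    """Return the peak axle load and whether it exceeds the threshold."""
    peak = float(pressure.max())
    return peak, peak > threshold

# Simulated pressure pulse from one passing axle (kN).
t = np.linspace(0.0, 1.0, 500)
pulse = 120.0 * np.exp(-((t - 0.5) ** 2) / 0.005)
peak, overloaded = detect_overload(pulse)
print(f"peak = {peak:.1f} kN, overloaded = {overloaded}")
```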
Synchronization of chaotic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pecora, Louis M.; Carroll, Thomas L.
2015-09-15
We review some of the history and early work in the area of synchronization in chaotic systems. We start with our own discovery of the phenomenon, but go on to establish the historical timeline of this topic back to the earliest known paper. The topic of synchronization of chaotic systems has always been intriguing, since chaotic systems are known to resist synchronization because of their positive Lyapunov exponents. The convergence of the two systems to identical trajectories is a surprise. We show how people originally thought about this process and how the concept of synchronization changed over the years to a more geometric view using synchronization manifolds. We also show that building synchronizing systems leads naturally to engineering more complex systems whose constituents are chaotic, but which can be tuned to output various chaotic signals. We finally end up at a topic that is still in very active exploration today and that is synchronization of dynamical systems in networks of oscillators.
From Good to Great: Creating a Fires-Centric VMU Culture
2011-04-07
Conclusion: Solutions ranging from low to high impact are available. At the low impact end of the spectrum, the summer 2011 transition... back to the end of Predator's advanced concept technology demonstration (ACTD) phase. During the Predator's 30 month ACTD the Army was largely...that the Marine Corps is experiencing the same growing pains that the Air Force experienced with its weaponized unmanned aerial systems (UAS). Major
Li, Chun-Ta; Weng, Chi-Yao; Lee, Cheng-Chi
2015-08-01
Radio Frequency Identification (RFID) based solutions are widely used for providing many healthcare applications, including patient monitoring, object traceability, drug administration systems and telecare medicine information systems (TMIS). In order to reduce malpractices and ensure patient privacy, in 2015, Srivastava et al. proposed a hash based RFID tag authentication protocol in TMIS. Their protocol uses lightweight hash operations and a synchronized secret value shared between the back-end server and the tag, which is more secure and efficient than other related RFID authentication protocols. Unfortunately, in this paper, we demonstrate that Srivastava et al.'s tag authentication protocol has a serious security problem: an adversary may use a stolen/lost reader to connect to the medical back-end server that stores information associated with tagged objects, and through this privacy breach the adversary could maliciously reveal medical data obtained via the stolen/lost reader. Therefore, we propose a secure and efficient RFID tag authentication protocol to overcome such security flaws and improve the system efficiency. Compared with Srivastava et al.'s protocol, the proposed protocol not only inherits the advantages of Srivastava et al.'s authentication protocol for TMIS but also provides better security with high system efficiency.
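For readers unfamiliar with the building block involved, the sketch below shows a generic hash-based challenge-response exchange between a back-end server and a tag sharing a synchronized secret. It is a minimal illustration of the primitive only; it is not Srivastava et al.'s protocol nor the improved protocol proposed in the paper.

```python
# Generic hash-based challenge-response between a back-end server and a tag.
# Illustration of the primitive only, not the paper's protocol.
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    """Hash a tuple of byte strings with a fixed separator."""
    return hashlib.sha256(b"|".join(parts)).digest()

class Tag:
    def __init__(self, secret: bytes):
        self.secret = secret
    def respond(self, challenge: bytes) -> bytes:
        return h(self.secret, challenge)

class BackEndServer:
    def __init__(self, tag_secret: bytes):
        self.tag_secret = tag_secret
    def authenticate(self, tag: Tag) -> bool:
        challenge = secrets.token_bytes(16)          # fresh nonce per run
        return tag.respond(challenge) == h(self.tag_secret, challenge)

tag = Tag(secret=b"shared-tag-secret")
server = BackEndServer(tag_secret=b"shared-tag-secret")
print(server.authenticate(tag))  # True when the synchronized secret matches
```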
Launchable and Retrievable Tetherobot
NASA Technical Reports Server (NTRS)
Younse, Paulo; Aghazarian, Hrand
2010-01-01
A proposed robotic system for scientific exploration of rough terrain would include a stationary or infrequently moving larger base robot, to which would be tethered a smaller hopping robot of the type described in the immediately preceding article. The two-robot design would extend the reach of the base robot, making it possible to explore nearby locations that might otherwise be inaccessible or too hazardous for the base robot. The system would include a launching mechanism and a motor-driven reel on the larger robot. The outer end of the tether would be attached to the smaller robot; the inner end of the tether would be attached to the reel. The figure depicts the launching and retrieval process. The launching mechanism would aim and throw the smaller robot toward a target location, and the tether would be paid out from the reel as the hopping robot flew toward the target. Upon completion of exploratory activity at the target location, the smaller robot would be made to hop and, in a coordinated motion, the tether would be wound onto the reel to pull the smaller robot back to the larger one.
Micro-position sensor using faraday effect
McElfresh, Michael [Livermore, CA; Lucas, Matthew [Pittsburgh, PA; Silveira, Joseph P [Tracy, CA; Groves, Scott E [Brentwood, CA
2007-02-27
A micro-position sensor and sensing system using the Faraday Effect. The sensor uses a permanent magnet to provide a magnetic field, and a magneto-optic material positioned in the magnetic field for rotating the plane of polarization of polarized light transmitted through the magneto-optic material. The magnet is independently movable relative to the magneto-optic material so as to rotate the plane of polarization of the polarized light as a function of the relative position of the magnet. In this manner, the position of the magnet relative to the magneto-optic material may be determined from the rotated polarized light. The sensing system also includes a light source, such as a laser or LED, for producing polarized light, and an optical fiber which is connected to the light source and to the magneto-optic material at a sensing end of the optical fiber. Processing electronics, such as a polarimeter, are also provided for determining the Faraday rotation of the plane of polarization of the back-reflected polarized light to determine the position of the magnet relative to the sensing end of the optical fiber.
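As background (a textbook relation, not taken from the patent), the Faraday rotation that such a sensor exploits is, in LaTeX notation,

    \theta = V \, B_{\parallel} \, L

where \theta is the rotation of the polarization plane, V the Verdet constant of the magneto-optic material, B_{\parallel} the magnetic field component along the light path, and L the optical path length in the material. Because B_{\parallel} at the magneto-optic element changes with the magnet's position, a polarimetric measurement of \theta can be mapped back to that position.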
Combining Fog Computing with Sensor Mote Machine Learning for Industrial IoT.
Lavassani, Mehrzad; Forsström, Stefan; Jennehag, Ulf; Zhang, Tingting
2018-05-12
Digitalization is a global trend becoming ever more important to our connected and sustainable society. This trend also affects industry where the Industrial Internet of Things is an important part, and there is a need to conserve spectrum as well as energy when communicating data to a fog or cloud back-end system. In this paper we investigate the benefits of fog computing by proposing a novel distributed learning model on the sensor device and simulating the data stream in the fog, instead of transmitting all raw sensor values to the cloud back-end. To save energy and to communicate as few packets as possible, the updated parameters of the learned model at the sensor device are communicated in longer time intervals to a fog computing system. The proposed framework is implemented and tested in a real world testbed in order to make quantitative measurements and evaluate the system. Our results show that the proposed model can achieve a 98% decrease in the number of packets sent over the wireless link, and the fog node can still simulate the data stream with an acceptable accuracy of 97%. We also observe an end-to-end delay of 180 ms in our proposed three-layer framework. Hence, the framework shows that a combination of fog and cloud computing with a distributed data modeling at the sensor device for wireless sensor networks can be beneficial for Industrial Internet of Things applications.
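A minimal sketch of the parameter-forwarding idea described above is given below in Python: the mote keeps a local statistical model and ships only its parameters to the fog node, which then regenerates an approximate stream. The choice of a running mean/variance model and Gaussian resampling is an assumption for illustration, not the authors' actual learning model.

    import random

    class SensorModel:
        """Running mean/variance on the mote (Welford); only parameters leave the node."""
        def __init__(self):
            self.n, self.mean, self.m2 = 0, 0.0, 0.0
        def update(self, x: float):
            self.n += 1
            d = x - self.mean
            self.mean += d / self.n
            self.m2 += d * (x - self.mean)
        def params(self):
            var = self.m2 / (self.n - 1) if self.n > 1 else 0.0
            return self.mean, var

    def fog_simulate(mean: float, var: float, k: int):
        """Fog node reproduces an approximate stream from the reported parameters."""
        return [random.gauss(mean, var ** 0.5) for _ in range(k)]

    model = SensorModel()
    for _ in range(1000):                       # raw samples stay on the sensor
        model.update(random.gauss(21.5, 0.3))   # e.g. a temperature reading
    mu, var = model.params()                    # one small packet instead of 1000
    reconstructed = fog_simulate(mu, var, 1000)

The point of the sketch is the packet arithmetic: one small parameter packet replaces the many raw readings it summarizes, which is the kind of reduction the abstract reports.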
Combining Fog Computing with Sensor Mote Machine Learning for Industrial IoT
Lavassani, Mehrzad; Jennehag, Ulf; Zhang, Tingting
2018-01-01
Digitalization is a global trend becoming ever more important to our connected and sustainable society. This trend also affects industry where the Industrial Internet of Things is an important part, and there is a need to conserve spectrum as well as energy when communicating data to a fog or cloud back-end system. In this paper we investigate the benefits of fog computing by proposing a novel distributed learning model on the sensor device and simulating the data stream in the fog, instead of transmitting all raw sensor values to the cloud back-end. To save energy and to communicate as few packets as possible, the updated parameters of the learned model at the sensor device are communicated in longer time intervals to a fog computing system. The proposed framework is implemented and tested in a real world testbed in order to make quantitative measurements and evaluate the system. Our results show that the proposed model can achieve a 98% decrease in the number of packets sent over the wireless link, and the fog node can still simulate the data stream with an acceptable accuracy of 97%. We also observe an end-to-end delay of 180 ms in our proposed three-layer framework. Hence, the framework shows that a combination of fog and cloud computing with a distributed data modeling at the sensor device for wireless sensor networks can be beneficial for Industrial Internet of Things applications. PMID:29757227
Method and apparatus for air-coupled transducer
NASA Technical Reports Server (NTRS)
Song, Junho (Inventor); Chimenti, Dale E. (Inventor)
2010-01-01
An air-coupled transducer includes an ultrasonic transducer body having a radiation end with a backing fixture at the radiation end. There is a flexible backplate conformingly fit to the backing fixture and a thin membrane (preferably a metallized polymer) conformingly fit to the flexible backplate. In one embodiment, the backing fixture is spherically curved and the flexible backplate is spherically curved. The flexible backplate is preferably patterned with pits or depressions.
Apparatus and method for removing particulate deposits from high temperature filters
Nakaishi, Curtis V.; Holcombe, Norman T.; Micheli, Paul L.
1992-01-01
Combustion of a fuel-air mixture is used to provide a high-temperature and high-pressure pulse of gaseous combustion products for the back-flush cleaning of ceramic filter elements contained in a barrier filter system and utilized to separate particulates from particulate-laden process gases at high temperature and high pressure. The volume of gaseous combustion products provided by the combustion of the fuel-air mixture is preferably divided into a plurality of streams, each passing through a sonic orifice and conveyed to the open end of each filter element as a high-pressure pulse which passes through the filter elements and dislodges the dust cake supported on a surface of the filter element.
Linear Back-Drive Differentials
NASA Technical Reports Server (NTRS)
Waydo, Peter
2003-01-01
Linear back-drive differentials have been proposed as alternatives to conventional gear differentials for applications in which there is only limited rotational motion (e.g., oscillation). The finite nature of the rotation makes it possible to optimize a linear back-drive differential in ways that would not be possible for gear differentials or other differentials that are required to be capable of unlimited rotation. As a result, relative to gear differentials, linear back-drive differentials could be more compact and less massive, could contain fewer complex parts, and could be less sensitive to variations in the viscosities of lubricants. Linear back-drive differentials would operate according to established principles of power ball screws and linear-motion drives, but would utilize these principles in an innovative way. One major characteristic of such mechanisms that would be exploited in linear back-drive differentials is the possibility of designing them to drive or back-drive with similar efficiency and energy input: in other words, such a mechanism can be designed so that a rotating screw can drive a nut linearly or the linear motion of the nut can cause the screw to rotate. A linear back-drive differential (see figure) would include two collinear shafts connected to two parts that are intended to engage in limited opposing rotations. The linear back-drive differential would also include a nut that would be free to translate along its axis but not to rotate. The inner surface of the nut would be right-hand threaded at one end and left-hand threaded at the opposite end to engage corresponding right- and left-handed threads on the shafts. A rotation and torque introduced into the system via one shaft would drive the nut in linear motion. The nut, in turn, would back-drive the other shaft, creating a reaction torque. Balls would reduce friction, making it possible for the shaft/nut coupling on each side to operate with 90 percent efficiency.
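In standard ball-screw terms (generic textbook relations, not taken from the article, and assuming equal leads l on both ends), the coupling works as follows:

    x = \frac{l}{2\pi}\,\theta_1 = -\frac{l}{2\pi}\,\theta_2
    \;\;\Rightarrow\;\; \theta_2 = -\theta_1,
    \qquad
    T \approx \frac{F\,l}{2\pi\,\eta}

Here x is the nut translation, \theta_1 and \theta_2 are the shaft rotations (opposite in sign because of the opposite-hand threads), F is the axial load on the nut, and \eta is the screw efficiency. With \eta near 0.9, driving and back-driving require comparable torque, which is the property the proposed differential exploits.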
Indium-oxide nanoparticles for RRAM devices compatible with CMOS back-end-of-line
NASA Astrophysics Data System (ADS)
León Pérez, Edgar A. A.; Guenery, Pierre-Vincent; Abouzaid, Oumaïma; Ayadi, Khaled; Brottet, Solène; Moeyaert, Jérémy; Labau, Sébastien; Baron, Thierry; Blanchard, Nicholas; Baboux, Nicolas; Militaru, Liviu; Souifi, Abdelkader
2018-05-01
We report on the fabrication and characterization of Resistive Random Access Memory (RRAM) devices based on nanoparticles in MIM structures. Our approach is based on the use of indium oxide (In2O3) nanoparticles embedded in a dielectric matrix using fully CMOS-compatible fabrication processes in view of back-end-of-line integration for non-volatile memory (NVM) applications. A bipolar switching behavior has been observed using current-voltage (I-V) measurements for all devices. Very high ION/IOFF ratios of up to 10^8 have been obtained. Our results provide insights for further integration of In2O3 nanoparticle-based devices for NVM applications.
Design Aspects of the VLBI2010 System - Progress Report of the IVS VLBI2010 Committee
NASA Technical Reports Server (NTRS)
Petrachenko, Bill; Niell, Arthur; Behrend, Dirk; Corey, Brian; Boehm, Johannes; Chralot, Patrick; Collioud, Arnaud; Gipson, John; Haas, Ruediger; Hobiger, Thomas;
2009-01-01
This report summarizes the progress made in developing the next generation VLBI system, dubbed the VLBI2010 system. The VLBI2010 Committee of the International VLBI Service for Geodesy and Astrometry (IVS) worked on the design aspects of the new system. The report covers Monte Carlo simulations showing the impact of the new operating modes on the final products. A section on system considerations describes the implications for the VLBI2010 system parameters by considering the new modes and system-related issues such as sensitivity, antenna slew rate, delay measurement error, RFI, frequency requirements, antenna deformation, and source structure corrections. This is followed by a description of all major subsystems and recommendations for the network, station, and antenna. Then aspects of the feed, polarization processing, calibration, digital back end, and correlator subsystems are covered. A section is dedicated to the NASA proof-of-concept demonstration. Finally, sections on operational considerations, on risks and fallback options, and on the next steps complete the report.
New instrumentation for the 1.2m Southern Millimeter Wave Telescope (SMWT)
NASA Astrophysics Data System (ADS)
Vasquez, P.; Astudillo, P.; Rodriguez, R.; Monasterio, D.; Reyes, N.; Finger, R.; Mena, F. P.; Bronfman, L.
2016-07-01
Here we describe the status of the upgrade program that is being carried out to modernize the Southern 1.2 m Millimeter Wave Telescope. The telescope was built during the early 1980s to complete the first Galactic survey of molecular clouds in the CO(1-0) line. After fruitful operation at CTIO, the telescope was relocated to the Cerro Calán Observatory of the Universidad de Chile. The new site has an altitude of 850 m and allows observations in the millimeter range throughout the year. The telescope was upgraded, including a new building to house operations, a new control system, and new receiver and back-end technologies. The new front end is a sideband-separating receiver based on a HEMT amplifier and sub-harmonic mixers. It is cooled with liquid nitrogen to diminish its noise temperature. The back end is a digital spectrometer based on the Reconfigurable Open Architecture Computing Hardware (ROACH). The new spectrometer includes IF hybridization capabilities to avoid analog hybrids and, therefore, improve the sideband rejection ratio of the receiver.
Electro-optical detector for use in a wide mass range mass spectrometer
NASA Technical Reports Server (NTRS)
Giffin, Charles E. (Inventor)
1976-01-01
An electro-optical detector is disclosed for use in a wide mass range mass spectrometer (MS). In the latter, the focal plane is at or very near the exit end of the magnetic analyzer, so that a strong magnetic field of the order of 1000 G or more is present at the focal plane location. The novel detector includes a microchannel electron multiplier array (MCA) which is positioned at the focal plane to convert ion beams, which are focused by the MS at the focal plane, into corresponding electron beams that are then accelerated to form visual images on a conductive phosphored surface. These visual images are then converted into images on the target of a vidicon camera or the like for electronic processing. Due to the strong magnetic field at the focal plane, in one embodiment of the invention, the MCA with front and back parallel ends is placed so that its front end forms an angle of not less than several degrees, preferably on the order of 10°-20°, with respect to the focal plane, with the center line of the front end preferably located in the focal plane. In another embodiment the MCA is wedge-shaped, with its back end at an angle of about 10°-20° with respect to the front end. In this embodiment the MCA is placed so that its front end is located at the focal plane.
Global EOS: exploring the 300-ms-latency region
NASA Astrophysics Data System (ADS)
Mascetti, L.; Jericho, D.; Hsu, C.-Y.
2017-10-01
EOS, the CERN open-source distributed disk storage system, provides the high-performance storage solution for HEP analysis and the back end for various workflows. Recently EOS became the back end of CERNBox, the cloud synchronisation service for CERN users. EOS can be used to take advantage of wide-area distributed installations: for the last few years CERN EOS has used a common deployment across two computer centres (Geneva-Meyrin and Budapest-Wigner) about 1,000 km apart (∼20 ms latency) with about 200 PB of disk (JBOD). In late 2015, the CERN-IT Storage group and AARNET (Australia) set up a challenging R&D project: a single EOS instance between CERN and AARNET with more than 300 ms latency (16,500 km apart). This paper reports on the successful deployment and operation of a distributed storage system between Europe (Geneva, Budapest), Australia (Melbourne) and later Asia (ASGC Taipei), allowing different types of data placement and data access across these four sites.
The Need for Integrating the Back End of the Nuclear Fuel Cycle in the United States of America
Bonano, Evaristo J.; Kalinina, Elena A.; Swift, Peter N.
2018-02-26
Current practice for commercial spent nuclear fuel management in the United States of America (US) includes storage of spent fuel in both pools and dry storage cask systems at nuclear power plants. Most storage pools are filled to their operational capacity, and management of the approximately 2,200 metric tons of spent fuel newly discharged each year requires transferring older and cooler fuel from pools into dry storage. In the absence of a repository that can accept spent fuel for permanent disposal, projections indicate that the US will have approximately 134,000 metric tons of spent fuel in dry storage by mid-century when the last plants in the current reactor fleet are decommissioned. Current designs for storage systems rely on large dual-purpose (storage and transportation) canisters that are not optimized for disposal. Various options exist in the US for improving integration of management practices across the entire back end of the nuclear fuel cycle.
The Need for Integrating the Back End of the Nuclear Fuel Cycle in the United States of America
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonano, Evaristo J.; Kalinina, Elena A.; Swift, Peter N.
Current practice for commercial spent nuclear fuel management in the United States of America (US) includes storage of spent fuel in both pools and dry storage cask systems at nuclear power plants. Most storage pools are filled to their operational capacity, and management of the approximately 2,200 metric tons of spent fuel newly discharged each year requires transferring older and cooler fuel from pools into dry storage. In the absence of a repository that can accept spent fuel for permanent disposal, projections indicate that the US will have approximately 134,000 metric tons of spent fuel in dry storage by mid-century when the last plants in the current reactor fleet are decommissioned. Current designs for storage systems rely on large dual-purpose (storage and transportation) canisters that are not optimized for disposal. Various options exist in the US for improving integration of management practices across the entire back end of the nuclear fuel cycle.
Internal monitoring of GBTx emulator using IPbus for CBM experiment
NASA Astrophysics Data System (ADS)
Mandal, Swagata; Zabolotny, Wojciech; Sau, Suman; Chkrabarti, Amlan; Saini, Jogender; Chattopadhyay, Subhasis; Pal, Sushanta Kumar
2015-09-01
The Compressed Baryonic Matter (CBM) experiment is a part of the Facility for Antiproton and Ion Research (FAIR) in Darmstadt at GSI. The CBM experiment requires precisely time-synchronized, fault-tolerant, self-triggered electronics for its Data Acquisition (DAQ) system, which must support high data rates (up to several TB/s). As part of the implementation of the DAQ system for the Muon Chamber (MUCH), one of the important detectors in the CBM experiment, an FPGA-based Gigabit Transceiver (GBTx) emulator has been implemented. The readout chain for MUCH consists of XYTER chips (front-end electronics) directly connected to the detector, the GBTx emulator, the Data Processing Board (DPB), and the First Level Event Selector board (FLIB) with a back-end software interface. The GBTx emulator is connected to the XYTER emulator through an LVDS (Low Voltage Differential Signalling) line at the front end, and at the back end it is connected to the DPB through a 4.8 Gbps optical link. IPbus over Ethernet is used for internal monitoring of the registers within the GBTx. The IPbus implementation uses the User Datagram Protocol (UDP) at the transport layer of the OSI model so that the GBTx can be controlled remotely. A Python script is used on the computer side to drive the IPbus controller.
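A rough sketch of what such a Python-side register monitor could look like is given below; it assumes the uHAL Python bindings of the IPbus software suite, and the device URI, address-table file, and register names are hypothetical placeholders rather than values taken from the paper.

    import uhal

    # Open an IPbus/UDP connection to the GBTx emulator
    # (host, port, and address table below are placeholders).
    hw = uhal.getDevice("gbtx_emulator",
                        "ipbusudp-2.0://192.168.1.10:50001",
                        "file://gbtx_address_table.xml")

    # Queue reads of a few hypothetical status registers, then send one UDP transaction.
    status = hw.getNode("STATUS").read()
    link_up = hw.getNode("LINK_UP").read()
    hw.dispatch()

    print("status  = 0x%08x" % int(status.value()))
    print("link up = %d" % int(link_up.value()))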
NASA Astrophysics Data System (ADS)
Hazza, Muataz Hazza F. Al; Adesta, Erry Y. T.; Riza, Muhammad
2013-12-01
High-speed milling has many advantages, such as a higher removal rate and high productivity. However, higher cutting speeds increase the flank wear rate and thus reduce the cutting tool life. Therefore, estimating and predicting the flank wear length at an early stage reduces the risk of unacceptable tooling costs. This research presents a neural network model for predicting and simulating the flank wear in the CNC end milling process. A set of sparse experiments on finish end milling of AISI H13 at a hardness of 48 HRC was conducted to measure the flank wear length. The measured data were then used to train the developed neural network model. An artificial neural network (ANN) was applied to predict the flank wear length. The neural network contains twenty hidden layers in a feed-forward, back-propagation hierarchy. The neural network was designed with the MATLAB Neural Network Toolbox. The results show a high correlation between the predicted and the observed flank wear, which indicates the validity of the model.
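For illustration only, a feed-forward network trained by back-propagation can be set up in a few lines. The sketch below uses Python and scikit-learn rather than the MATLAB Neural Network Toolbox reported in the abstract, and the cutting parameters, wear values, and network size are invented placeholders, not the paper's data.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    # Hypothetical records: [cutting speed (m/min), feed (mm/tooth), depth of cut (mm)]
    X = np.array([[150, 0.05, 0.2], [200, 0.05, 0.2], [250, 0.10, 0.3],
                  [300, 0.10, 0.3], [350, 0.15, 0.4], [400, 0.15, 0.4]])
    y = np.array([0.04, 0.06, 0.09, 0.13, 0.18, 0.25])  # flank wear length (mm), invented

    scaler = StandardScaler().fit(X)
    # Feed-forward network trained by back-propagation (small hidden layers as placeholders).
    model = MLPRegressor(hidden_layer_sizes=(20, 20), activation="tanh",
                         solver="lbfgs", max_iter=5000, random_state=0)
    model.fit(scaler.transform(X), y)

    print(model.predict(scaler.transform([[320, 0.12, 0.35]])))  # predicted flank wear (mm)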
Non-Markovian quantum feedback networks II: Controlled flows
NASA Astrophysics Data System (ADS)
Gough, John E.
2017-06-01
The concept of a controlled flow of a dynamical system, especially when the controlling process feeds information back about the system, is of central importance in control engineering. In this paper, we build on the ideas presented by Bouten and van Handel [Quantum Stochastics and Information: Statistics, Filtering and Control (World Scientific, 2008)] and develop a general theory of quantum feedback. We elucidate the relationship between the controlling processes, Z, and the measured processes, Y, and to this end we make a distinction between what we call the input picture and the output picture. We should note that the input-output relations for the noise fields have additional terms not present in the standard theory but that the relationship between the control processes and measured processes themselves is internally consistent—we do this for the two main cases of quadrature measurement and photon-counting measurement. The theory is general enough to include a modulating filter which post-processes the measurement readout Y before returning to the system. This opens up the prospect of applying very general engineering feedback control techniques to open quantum systems in a systematic manner, and we consider a number of specific modulating filter problems. Finally, we give a brief argument as to why most of the rules for making instantaneous feedback connections [J. Gough and M. R. James, Commun. Math. Phys. 287, 1109 (2009)] ought to apply for controlled dynamical networks as well.
Nursing process decision support system for urology ward.
Hao, Angelica Te-Hui; Wu, Lee-Pin; Kumar, Ajit; Jian, Wen-Shan; Huang, Li-Fang; Kao, Ching-Chiu; Hsu, Chien-Yeh
2013-07-01
We developed a nursing process decision support system (NPDSS) based on three clinical pathways, including benign prostatic hypertrophy, inguinal hernia, and urinary tract stone. NPDSS included six major nursing diagnoses - acute pain, impaired urinary elimination, impaired skin integrity, anxiety, infection risk, and risk of falling. This paper aims to describe the design, development and validation process of the NPDSS. We deployed the Delphi method to reach consensus on the decision support rules of NPDSS. A team of nine expert nurses from a medical center in Taiwan was involved in the Delphi method. The Cronbach's α method was used for examining the reliability of the questionnaire used in the Delphi method. Visual Basic 6.0 as the front end and Microsoft Access 2003 as the back end were used to develop the system. A team of six nursing experts was asked to evaluate the usability of the developed system. A 5-point Likert scale questionnaire was used for the evaluation. The sensitivity and specificity of NPDSS were validated using 150 nursing charts. The study showed consistency between the diagnoses of the developed system (NPDSS) and the nursing charts. The sensitivities of the nursing diagnoses including acute pain, impaired urinary elimination, risk of infection, and risk of falling were 96.9%, 98.1%, 94.9%, and 89.9%, respectively; and the specificities were 88%, 49.5%, 62%, and 88%, respectively. We did not calculate the sensitivity and specificity of impaired skin integrity and anxiety due to insufficient sample size. NPDSS can help nurses in making nursing diagnosis decisions. In addition, it can help them generate nursing diagnoses based on patient-specific data, individualized care plans, and implementation within their usual nursing workflow. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
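For reference, the reported percentages follow the usual chart-review definitions (a standard note, not part of the paper):

    \text{sensitivity} = \frac{TP}{TP + FN}, \qquad
    \text{specificity} = \frac{TN}{TN + FP}

where TP, FN, TN, and FP count the charts in which the system's diagnosis respectively agreed with, missed, correctly excluded, or falsely raised the diagnosis recorded in the nursing chart.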
DAPHNE silicon photonics technological platform for research and development on WDM applications
NASA Astrophysics Data System (ADS)
Baudot, Charles; Fincato, Antonio; Fowler, Daivid; Perez-Galacho, Diego; Souhaité, Aurélie; Messaoudène, Sonia; Blanc, Romuald; Richard, Claire; Planchot, Jonathan; De-Buttet, Come; Orlando, Bastien; Gays, Fabien; Mezzomo, Cécilia; Bernard, Emilie; Marris-Morini, Delphine; Vivien, Laurent; Kopp, Christophe; Boeuf, Frédéric
2016-05-01
A new technological platform aimed at making prototypes and feasibility studies has been set up at STMicroelectronics using 300 mm wafer foundry facilities. The technology, called DAPHNE (Datacom Advanced PHotonic Nanoscale Environment), is devoted to developing and evaluating new devices and sub-systems, in particular for wavelength division multiplexing (WDM) applications and ring resonator based applications. Developed in the course of the PLAT4M FP7 European project, DAPHNE is a flexible platform that fits R&D needs perfectly. The fabrication flow enables the processing of photonic integrated circuits using a 300 nm silicon-on-insulator (SOI), partial etches of 150 nm and 50 nm, and a total silicon etch. Consequently, two varieties of rib waveguides and one strip waveguide can be fabricated simultaneously with auto-alignment properties. The process variability on the 150 nm partially etched silicon and the thin 50 nm slab region is less than 6 nm in both cases. Using a variety of different implantation configurations and a back end of line of 5 metal layers, active devices are fabricated both in germanium and silicon. An available far back-end-of-line process consists of making 20 μm diameter copper posts on top of the electrical pads so that an electronic integrated circuit can be bonded on top of the photonic die by 3D integration. Besides these fabrication process options, DAPHNE is equipped with a library of standard cells for optical routing and multiplexing. Moreover, typical Mach-Zehnder modulators based on silicon pn junctions are also available for optical signal modulation. For signal detection, germanium photodetectors also exist as standard cells. The measured single-mode propagation losses are 3.5 dB/cm for strip, 3.7 dB/cm for deep-rib (50 nm slab) and 1.4 dB/cm for standard rib (150 nm slab) waveguides. Transition tapers between different waveguide structures have losses as low as 0.006 dB.
Motta, Mario; Zhang, Shiwei
2017-11-14
We address the computation of ground-state properties of chemical systems and realistic materials within the auxiliary-field quantum Monte Carlo method. The phase constraint to control the Fermion phase problem requires the random walks in Slater determinant space to be open-ended with branching. This in turn makes it necessary to use back-propagation (BP) to compute averages and correlation functions of operators that do not commute with the Hamiltonian. Several BP schemes are investigated, and their optimization with respect to the phaseless constraint is considered. We propose a modified BP method for the computation of observables in electronic systems, discuss its numerical stability and computational complexity, and assess its performance by computing ground-state properties in several molecular systems, including small organic molecules.
Code of Federal Regulations, 2013 CFR
2013-07-01
... thermocouple, ultra-violet beam sensor, or infrared sensor) capable of continuously detecting the presence of a..., as appropriate. (1) Where an incinerator is used, a temperature monitoring device equipped with a... temperature monitoring device shall be installed in the firebox or in the ductwork immediately downstream of...
Code of Federal Regulations, 2014 CFR
2014-07-01
... thermocouple, ultra-violet beam sensor, or infrared sensor) capable of continuously detecting the presence of a..., as appropriate. (1) Where an incinerator is used, a temperature monitoring device equipped with a... temperature monitoring device shall be installed in the firebox or in the ductwork immediately downstream of...
Code of Federal Regulations, 2012 CFR
2012-07-01
... thermocouple, ultra-violet beam sensor, or infrared sensor) capable of continuously detecting the presence of a..., as appropriate. (1) Where an incinerator is used, a temperature monitoring device equipped with a... temperature monitoring device shall be installed in the firebox or in the ductwork immediately downstream of...
National supply-chain survey of drug manufacturer back orders.
Wellman, G S
2001-07-01
The impact of manufacturer back orders on the supply chain for pharmaceuticals in the institutional setting was studied. A questionnaire was distributed during May and June 2000 to 600 institutional pharmacies affiliated with a major national drug and supply group purchasing organization. The instrument included questions on basic institutional demographics, perceptions about the frequency of manufacturer back orders for pharmaceuticals, the quality of communication with manufacturers and wholesalers about back orders, the two most significant back orders that had occurred in the 12 months preceding the survey, and the reasons for and impact of back orders. A total of 170 usable surveys were returned (net response rate, 28.3%). Reported manufacturer back orders included an array of drug classes, including blood products, antimicrobials, antiarrhythmics, benzodiazepine antagonists, thrombolytics, corticosteroids, and antihypertensives. Respondents perceived significant back orders as increasing in frequency. Communication by manufacturers and wholesalers about back orders was reported to be relatively poor. A raw-material shortage was the most common reason given by manufacturers for back orders (36.5%), followed by a regulatory issue (23.2%). In most cases (92%), medical staff members had to be contacted, indicating an interruption in the normal drug distribution process. In over a third of instances, respondents stated that the back order resulted in less optimal therapy. A survey found that manufacturer back orders for pharmaceuticals were increasing in frequency and that information flow within the supply chain was insufficient to meet the needs of end users.
In vivo electrode implanting system
NASA Technical Reports Server (NTRS)
Collins, Jr., Earl R. (Inventor)
1989-01-01
A cylindrical intramuscular implantable electrode is provided with a strip of fabric secured around it. The fabric is woven from a polyester fiber having loops of the fiber protruding. The end of the main cylindrical body is provided with a blunt conductive nose, and the opposite end is provided with a smaller diameter rear section with an annular groove to receive tips of fingers extending from a release tube. The fingers are formed to spring outwardly and move the fingertips out of the annular groove in order to release the electrode from the release tube when a sheath over the electrode is drawn back sufficiently. The sheath compresses the fingers of the release tube and the fabric loops until it is drawn back. Muscle tissue grows into the loops to secure the electrode in place after the sheath is drawn back. The entire assembly of electrode, release tube and sheath can be inserted into the patient's muscle to the desired position through a hypodermic needle. The release tube may be used to manipulate the electrode in the patient's muscle to an optimum position before the electrode is released.
Integrated seat frame and back support
Martin, Leo
1999-01-01
An integrated seating device comprises a seat frame having a front end and a rear end. The seat frame has a double wall defining an exterior wall and an interior wall. The rear end of the seat frame has a slot cut through both the exterior wall and the interior wall. The front end of the seat frame has a slot cut through just the interior wall thereof. A back support of generally L shape has a horizontal member and a generally vertical member which is substantially perpendicular to the horizontal member. The horizontal member is sized to be threaded through the rear slot and is fitted into the front slot. Welded slat means secures the back support to the seat frame to result in an integrated seating device.
Sampling Error in a Particulate Mixture: An Analytical Chemistry Experiment.
ERIC Educational Resources Information Center
Kratochvil, Byron
1980-01-01
Presents an undergraduate experiment demonstrating sampling error. The sampling system selected is a mixture of potassium hydrogen phthalate and sucrose; a self-zeroing, automatically refillable buret is used to minimize the titration time of multiple samples, and a dilute back-titrant is employed to obtain high end-point precision. (CS)
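The underlying statistics are the usual binomial sampling relation (a standard result, not quoted from the article, and assuming particles of roughly equal size and density): if a sample contains n particles of which a fraction p are the analyte (here potassium hydrogen phthalate), the relative standard deviation of the sampled composition is approximately

    \sigma_r \approx \sqrt{\frac{1-p}{n\,p}}

so coarser mixtures (fewer particles per sample mass) produce visibly larger scatter in the titration results, which is what the experiment demonstrates.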
DOE Office of Scientific and Technical Information (OSTI.GOV)
BERG, MICHAEL; RILEY, MARSHALL
System assessments typically yield large quantities of data from disparate sources for an analyst to scrutinize for issues. Netmeld is used to parse input from different file formats, store the data in a common format, allow users to easily query it, and enable analysts to tie different analysis tools together using a common back-end.
VizieR Online Data Catalog: Molecular clumps in W51 giant molecular cloud (Parsons+, 2012)
NASA Astrophysics Data System (ADS)
Parsons, H.; Thompson, M. A.; Clark, J. S.; Chrysostomou, A.
2013-04-01
The W51 GMC was mapped using the Heterodyne Array Receiver Programme (HARP) receiver with the back-end digital autocorrelator spectrometer Auto-Correlation Spectral Imaging System (ACSIS) on the James Clerk Maxwell Telescope (JCMT). Data were taken in 2008 May. (2 data files).
LWS/SET End-to-End Data System
NASA Technical Reports Server (NTRS)
Giffin, Geoff; Sherman, Barry; Colon, Gilberto (Technical Monitor)
2002-01-01
This paper describes the concept for the End-to-End Data System that will support NASA's Living With a Star Space Environment Testbed missions. NASA has initiated the Living With a Star (LWS) Program to develop a better scientific understanding to address the aspects of the connected Sun-Earth system that affect life and society. A principal goal of the program is to bridge the gap between science, engineering, and user application communities. The Space Environment Testbed (SET) Project is one element of LWS. The Project will enable future science, operational, and commercial objectives in space and atmospheric environments by improving engineering approaches to the accommodation and/or mitigation of the effects of solar variability on technological systems. The end-to-end data system allows investigators to access the SET control center, command their experiments, and receive data from their experiments back at their home facility, using the Internet. The logical functioning of the major components of the end-to-end data system is described, including the GSFC Payload Operations Control Center (POCC), SET Payloads, the GSFC SET Simulation Lab, SET Experiment PI Facilities, and Host Systems. Host Spacecraft Operations Control Centers (SOCC) and the Host Spacecraft are essential links in the end-to-end data system, but are not directly under the control of the SET Project. Formal interfaces will be established between these entities and elements of the SET Project. The paper describes data flow through the system, from PI facilities connecting to the SET operations center via the Internet, communications to SET carriers and experiments via host systems, to telemetry returns to investigators from their flight experiments. It also outlines the techniques that will be used to meet mission requirements, while holding development and operational costs to a minimum. Additional information is included in the original extended abstract.
Hoffman, Shannon L.; Johnson, Molly B.; Zou, Dequan; Van Dillen, Linda R.
2012-01-01
Patterns of lumbar posture and motion are associated with low back pain (LBP). Research suggests LBP subgroups demonstrate different patterns during common tasks. This study assessed differences in end-range lumbar flexion during two tasks between two LBP subgroups classified according to the Movement System Impairment model. Additionally, the impact of gender differences on subgroup differences was assessed. Kinematic data were collected. Subjects in the Rotation (Rot) and Rotation with Extension (RotExt) LBP subgroups were asked to sit slumped and bend forward from standing. Lumbar end-range flexion was calculated. Subjects reported symptom behavior during each test. Compared to the RotExt subgroup, the Rot subgroup demonstrated greater end-range lumbar flexion during slumped sitting and a trend towards greater end-range lumbar flexion with forward bending. Compared to females, males demonstrated greater end-range lumbar flexion during slumped sitting and forward bending. A greater proportion of people in the Rot subgroup reported symptoms with each test compared to the RotExt subgroup. Males and females were equally likely to report symptoms with each test. Gender differences were not responsible for LBP subgroup differences. Subgrouping people with LBP provides insight into differences in lumbar motion within the LBP population. Results suggesting potential consistent differences across flexion-related tasks support the presence of stereotypical movement patterns that are related to LBP. PMID:22261650
Using business analytics to improve outcomes.
Rivera, Jose; Delaney, Stephen
2015-02-01
Orlando Health has brought its hospital and physician practice revenue cycle systems into better balance using four sets of customized analytics: Physician performance analytics gauge the total net revenue for every employed physician. Patient-pay analytics provide financial risk scores for all patients on both the hospital and physician practice sides. Revenue management analytics bridge the gap between the back-end central business office and front-end physician practice managers and administrators. Enterprise management analytics allow the hospitals and physician practices to share important information about common patients.
Pharmacists' views on implementing a disease state management program for low back pain.
Abdel Shaheed, Christina; Maher, Christopher G; Williams, Kylie A; McLachlan, Andrew J
2016-01-01
Pharmacists have the potential to take a lead role in the primary care management of people with acute low back pain. The aim of this study was to investigate pharmacists' views on implementing a care program for people with acute low back pain in the community pharmacy. Recruitment of pharmacists for this study took place between July 2012 and March 2013. A convenience sample of 30 pharmacists who collaborated in recruiting participants for a low back pain clinical trial in Sydney (n=15 pharmacist recruiters and n=15 non-recruiters) completed an open-ended questionnaire. There was no marked variation in responses between the two groups. Participating pharmacists were receptive to the idea of implementing a care program for people with low back pain, highlighting the need for adequate reimbursement and adequate training of staff to ensure it is successful. Pharmacists identified that the follow up of people receiving such a service is dependent on several factors such as effective reminder systems and the proximity of patients to the pharmacy.
On-demand Simulation of Atmospheric Transport Processes on the AlpEnDAC Cloud
NASA Astrophysics Data System (ADS)
Hachinger, S.; Harsch, C.; Meyer-Arnek, J.; Frank, A.; Heller, H.; Giemsa, E.
2016-12-01
The "Alpine Environmental Data Analysis Centre" (AlpEnDAC) develops a data-analysis platform for high-altitude research facilities within the "Virtual Alpine Observatory" project (VAO). This platform, with its web portal, will support use cases going much beyond data management: On user request, the data are augmented with "on-demand" simulation results, such as air-parcel trajectories for tracing down the source of pollutants when they appear in high concentration. The respective back-end mechanism uses the Compute Cloud of the Leibniz Supercomputing Centre (LRZ) to transparently calculate results requested by the user, as far as they have not yet been stored in AlpEnDAC. The queuing-system operation model common in supercomputing is replaced by a model in which Virtual Machines (VMs) on the cloud are automatically created/destroyed, providing the necessary computing power immediately on demand. From a security point of view, this allows to perform simulations in a sandbox defined by the VM configuration, without direct access to a computing cluster. Within few minutes, the user receives conveniently visualized results. The AlpEnDAC infrastructure is distributed among two participating institutes [front-end at German Aerospace Centre (DLR), simulation back-end at LRZ], requiring an efficient mechanism for synchronization of measured and augmented data. We discuss our iRODS-based solution for these data-management tasks as well as the general AlpEnDAC framework. Our cloud-based offerings aim at making scientific computing for our users much more convenient and flexible than it has been, and to allow scientists without a broad background in scientific computing to benefit from complex numerical simulations.
High Temporal Resolution Mapping of Seismic Noise Sources Using Heterogeneous Supercomputers
NASA Astrophysics Data System (ADS)
Paitz, P.; Gokhberg, A.; Ermert, L. A.; Fichtner, A.
2017-12-01
The time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems like earthquake fault zones, volcanoes, geothermal and hydrocarbon reservoirs. We present results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service providing seismic noise source maps for Central Europe with high temporal resolution. We use source imaging methods based on the cross-correlation of seismic noise records from all seismic stations available in the region of interest. The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept to provide the interested researchers worldwide with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for the generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise source mapping itself rests on the measurement of logarithmic amplitude ratios in suitably pre-processed noise correlations, and the use of simplified sensitivity kernels. During the implementation we addressed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service-oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
A new data acquisition system for the CMS Phase 1 pixel detector
NASA Astrophysics Data System (ADS)
Kornmayer, A.
2016-12-01
A new pixel detector will be installed in the CMS experiment during the extended technical stop of the LHC at the beginning of 2017. The new pixel detector, built from four layers in the barrel region and three layers on each end of the forward region, is equipped with upgraded front-end readout electronics, specifically designed to handle the high particle hit rates created in the LHC environment. The DAQ back-end was entirely redesigned to handle the increased number of readout channels, the higher data rates per channel and the new digital data format. Based entirely on the microTCA standard, new front-end controller (FEC) and front-end driver (FED) cards have been developed, prototyped and produced with custom optical link mezzanines mounted on the FC7 AMC and custom firmware. At the same time as the new detector is being assembled, the DAQ system is set up and its integration into the CMS central DAQ system tested by running the pilot blade detector already installed in CMS. This work describes the DAQ system, integration tests and gives an outline for the activities up to commissioning the final system at CMS in 2017.
Bennett, Charles L.
2010-06-15
A solar thermal power generator includes an inclined elongated boiler tube positioned in the focus of a solar concentrator for generating steam from water. The boiler tube is connected at one end to receive water from a pressure vessel as well as connected at an opposite end to return steam back to the vessel in a fluidic circuit arrangement that stores energy in the form of heated water in the pressure vessel. An expander, condenser, and reservoir are also connected in series to respectively produce work using the steam passed either directly (above a water line in the vessel) or indirectly (below a water line in the vessel) through the pressure vessel, condense the expanded steam, and collect the condensed water. The reservoir also supplies the collected water back to the pressure vessel at the end of a diurnal cycle when the vessel is sufficiently depressurized, so that the system is reset to repeat the cycle the following day. The circuital arrangement of the boiler tube and the pressure vessel operates to dampen flow instabilities in the boiler tube, damp out the effects of solar transients, and provide thermal energy storage which enables time shifting of power generation to better align with the higher demand for energy during peak energy usage periods.
Botelho, Anabela; Ferreira Dias, Marta; Ferreira, Carla; Pinto, Lígia M Costa
2016-10-01
This paper aims to ascertain the efficacy and acceptability of five incentive schemes for the take-back of waste electrical and electronic equipment in Portugal, focusing on consumers' perspectives. It assesses users' perceptions of these items, evaluating the motivations and interests they have concerning the market for waste electrical and electronic equipment. Results indicate, on the one hand, a lack of awareness by consumers about the process of take-back of their equipment. On the other hand, results show that information conditions and socio-demographic factors affect consumers' motivations for returning electrical and electronic equipment at the end of its life. In this context, it can be concluded that, in Portugal, the market for the recovery of waste electrical and electronic equipment is still in its infancy. © The Author(s) 2016.
NASA Astrophysics Data System (ADS)
Sanford, James L.; Schlig, Eugene S.; Prache, Olivier; Dove, Derek B.; Ali, Tariq A.; Howard, Webster E.
2002-02-01
The IBM Research Division and eMagin Corp. have jointly developed a low-power VGA direct-view active matrix OLED display, fabricated on a crystalline silicon CMOS chip. The display is incorporated in IBM prototype wristwatch computers running the Linux operating system. IBM designed the silicon chip and eMagin developed the organic stack and performed the back-end-of-line processing and packaging. Each pixel is driven by a constant current source controlled by a CMOS RAM cell, and the display receives its data from the processor memory bus. This paper describes the OLED technology and packaging, and outlines the design of the pixel and display electronics and the processor interface. Experimental results are presented.
Motivated to Retrieve: How Often Are You Willing to Go Back to the Well when the Well Is Dry?
ERIC Educational Resources Information Center
Dougherty, Michael R.; Harbison, J. Isaiah
2007-01-01
Despite the necessity of the decision to terminate memory search in many real-world memory tasks, little experimental work has investigated the underlying processes. In this study, the authors investigated termination decisions in free recall by providing participants an open-ended retrieval interval and requiring them to press a stop button when…
Using Neural Net Technology To Enhance the Efficiency of a Computer Adaptive Testing Application.
ERIC Educational Resources Information Center
Van Nelson, C.; Henriksen, Larry W.
The potential for computer adaptive testing (CAT) has been well documented. In order to improve the efficiency of this process, it may be possible to utilize a neural network, or more specifically, a back propagation neural network. The paper asserts that in order to accomplish this end, it must be shown that grouping examinees by ability as…
Code of Federal Regulations, 2010 CFR
2010-07-01
... is to be determined using the Methods specified in paragraph (e) of this section. (4) The quantity of... methods of determining this quantity are production records, measurement of stream characteristics, and... paragraph (d)(1)(i), (d)(1)(ii), or (d)(1)(iii) of this section. (i) When the latex is not blended with...
40 CFR 63.494 - Back-end process provisions-residual organic HAP and emission limitations.
Code of Federal Regulations, 2014 CFR
2014-07-01
... be measured after the stripping operation (or the reactor(s), if the plant has no stripper(s)), as... operation (or the reactor(s), if the plant has no stripper(s)). The limitation shall be calculated and... = Controlled emissions in 2009, Mg/yr P2009 = Total elastomer product leaving the stripper in 2009, Mg/yr...
40 CFR 63.494 - Back-end process provisions-residual organic HAP and emission limitations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... be measured after the stripping operation (or the reactor(s), if the plant has no stripper(s)), as... operation (or the reactor(s), if the plant has no stripper(s)). The limitation shall be calculated and... = Controlled emissions in 2009, Mg/yr P2009 = Total elastomer product leaving the stripper in 2009, Mg/yr...
A MEMS-based, wireless, biometric-like security system
NASA Astrophysics Data System (ADS)
Cross, Joshua D.; Schneiter, John L.; Leiby, Grant A.; McCarter, Steven; Smith, Jeremiah; Budka, Thomas P.
2010-04-01
We present a system for secure identification applications that is based upon biometric-like MEMS chips. The MEMS chips have unique frequency signatures resulting from fabrication process variations. The MEMS chips possess something analogous to a "voiceprint". The chips are vacuum encapsulated, rugged, and suitable for low-cost, high-volume mass production. Furthermore, the fabrication process is fully integrated with standard CMOS fabrication methods. One is able to operate the MEMS-based identification system similarly to a conventional RFID system: the reader (essentially a custom network analyzer) detects the power reflected across a frequency spectrum from a MEMS chip in its vicinity. We demonstrate prototype "tags" - MEMS chips placed on a credit card-like substrate - to show how the system could be used in standard identification or authentication applications. We have integrated power scavenging to provide DC bias for the MEMS chips through the use of a 915 MHz source in the reader and an RF-DC conversion circuit on the tag. The system enables a high level of protection against typical RFID hacking attacks. There is no need for signal encryption, so back-end infrastructure is minimal. We believe this system would make a viable low-cost, high-security system for a variety of identification and authentication applications.
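As a toy illustration of the kind of back-end matching such a reader might perform (comparing the measured resonance peaks of a chip against enrolled frequency signatures), the Python fragment below may be useful; the signature values, peak count, tolerance, and scoring rule are all invented for illustration and are not taken from the paper.

    # Toy matcher: identify a tag by comparing its measured resonance peaks (MHz)
    # against enrolled signatures. All numbers below are invented placeholders.
    ENROLLED = {
        "tag_A": [912.31, 923.87, 941.02],
        "tag_B": [910.95, 925.44, 939.60],
    }

    def match(measured, tolerance_mhz=0.25):
        best_id, best_err = None, float("inf")
        for tag_id, sig in ENROLLED.items():
            if len(sig) != len(measured):
                continue
            err = sum(abs(m - s) for m, s in zip(sorted(measured), sorted(sig)))
            if err < best_err:
                best_id, best_err = tag_id, err
        # Accept only if the peaks agree within the tolerance on average.
        return best_id if best_err / len(measured) <= tolerance_mhz else None

    print(match([912.28, 923.91, 941.05]))   # -> "tag_A"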
Transactional interactive multimedia banner
NASA Astrophysics Data System (ADS)
Shae, Zon-Yin; Wang, Xiping; von Kaenel, Juerg
2000-05-01
Advertising in TV broadcasting has shown that multimedia is a very effective means to present merchandise and attract shoppers. This has been applied to the Web by including animated multimedia banner ads on web pages. However, the issues of coupling interactive browsing, shopping, and secure transactions, e.g. from inside a multimedia banner, have only recently started to be explored. Currently there is an explosively growing number of back-end services available on the Internet (e.g., business-to-business (B2B) commerce, business-to-consumer (B2C) commerce, and infomercial services). These services are mostly accessible through static HTML web pages at a few specific web portals. In this paper, we investigate the feasibility of using interactive multimedia banners as pervasive access points for B2C, B2B, and infomercial services. We present a system architecture that involves a layer of middleware agents functioning as the bridge between the interactive multimedia banners and back-end services.
Experience with Multi-Tier Grid MySQL Database Service Resiliency at BNL
NASA Astrophysics Data System (ADS)
Wlodek, Tomasz; Ernst, Michael; Hover, John; Katramatos, Dimitrios; Packard, Jay; Smirnov, Yuri; Yu, Dantong
2011-12-01
We describe the use of F5's BIG-IP smart switch technology (3600 Series and Local Traffic Manager v9.0) to provide load balancing and automatic fail-over to multiple Grid services (GUMS, VOMS) and their associated back-end MySQL databases. This resiliency is introduced in front of the external application servers and also for the back-end database systems, which is what makes it "multi-tier". The combination of solutions chosen to ensure high availability of the services, in particular the database replication and fail-over mechanism, is discussed in detail. The paper explains the design and configuration of the overall system, including virtual servers, machine pools, and health monitors (which govern routing), as well as the master-slave database scheme and fail-over policies and procedures. Pre-deployment planning and stress testing are outlined. Integration of the systems with our Nagios-based facility monitoring and alerting is also described, and the application characteristics of GUMS and VOMS that enable effective clustering are explained. We then summarize our practical experiences and real-world scenarios resulting from operating a major US Grid center, and assess the applicability of our approach to other Grid services in the future.
The Mechanism of Gene Targeting in Human Somatic Cells
Kan, Yinan; Ruis, Brian; Lin, Sherry; Hendrickson, Eric A.
2014-01-01
Gene targeting in human somatic cells is of importance because it can be used to either delineate the loss-of-function phenotype of a gene or correct a mutated gene back to wild-type. Both of these outcomes require a form of DNA double-strand break (DSB) repair known as homologous recombination (HR). The mechanism of HR leading to gene targeting, however, is not well understood in human cells. Here, we demonstrate that a two-end, ends-out HR intermediate is valid for human gene targeting. Furthermore, the resolution step of this intermediate occurs via the classic DSB repair model of HR while synthesis-dependent strand annealing and Holliday Junction dissolution are, at best, minor pathways. Moreover, and in contrast to other systems, the positions of Holliday Junction resolution are evenly distributed along the homology arms of the targeting vector. Most unexpectedly, we demonstrate that when a meganuclease is used to introduce a chromosomal DSB to augment gene targeting, the mechanism of gene targeting is inverted to an ends-in process. Finally, we demonstrate that the anti-recombination activity of mismatch repair is a significant impediment to gene targeting. These observations significantly advance our understanding of HR and gene targeting in human cells. PMID:24699519
Supporting Ecological Research With a Flexible Satellite Sensornet Gateway
NASA Astrophysics Data System (ADS)
Silva, F.; Rundel, P. W.; Graham, E. A.; Falk, A.; Ye, W.; Pradkin, Y.; Deschon, A.; Bhatt, S.; McHenry, T.
2007-12-01
Wireless sensor networks are a promising technology for ecological research due to their capability to make continuous and in-situ measurements. However, there are some challenges for the wide adoption of this technology by scientists, who may have various research focuses. First, the observation system needs to be rapidly and easily deployable at different remote locations. Second, the system needs to be flexible enough to meet the requirements of different applications and easily reconfigurable by scientists, who may not always be technology experts. To address these challenges, we designed and implemented a flexible satellite gateway for using sensor networks. Our first prototype is being deployed at Stunt Ranch in the Santa Monica Mountains to support biological research at UCLA. In this joint USC/ISI-UCLA deployment, scientists are interested in a long-term investigation of the influence of the 2006-07 southern California drought conditions on the water relations of important chaparral shrub and tree species that differ in their depth of rooting. Rainfall over this past hydrologic year in southern California has been less than 25% of normal, making it the driest year on record. In addition to core measurements of air temperature, relative humidity, wind speed, solar irradiance, rainfall, and soil moisture, we use constant-heating sap flow sensors to continuously monitor the flow of water through the xylem of replicated stems of four species to compare their access to soil moisture with plant water stress. Our gateway consists of a front-end data acquisition system and a back-end data storage system, connected by a long-haul satellite communication link. At the front-end, all environmental sensors are connected to a Compact RIO, a rugged data acquisition platform developed by National Instruments. Sap flow sensors are deployed in several locations that are 20 to 50 meters away from the Compact RIO. At each plant, a Hobo datalogger is used to collect sap flow sensor readings. A Crossbow mote interfaces with the Hobo datalogger to collect data from it and send the data to the Compact RIO through wireless communication. The Compact RIO relays the sensor data to the back- end system over the satellite link. The back-end system stores the data in a database and provides interfaces for easy data retrieval and system reconfiguration. We have developed data exchange and management protocols for reliable data transfer and storage. We have also developed tools to support remote operation, such as system health monitoring and user reconfiguration. Our design emphasizes a modular software architecture that is flexible, to support various scientific applications. This poster illustrates our system design and describes our first deployment at Stunt Ranch. Stunt Ranch is a 310-acre reserve in the Santa Monica Mountains, located within the Santa Monica Mountains National Recreation Area of the National Park Service. The reserve includes mixed communities of chaparral, live oak woodland, and riparian habitats. Stunt Ranch is managed by UCLA as part of the University of California Natural Reserve System.
Paging memory from random access memory to backing storage in a parallel computer
Archer, Charles J; Blocksome, Michael A; Inglett, Todd A; Ratterman, Joseph D; Smith, Brian E
2013-05-21
Paging memory from random access memory (`RAM`) to backing storage in a parallel computer that includes a plurality of compute nodes, including: executing a data processing application on a virtual machine operating system in a virtual machine on a first compute node; providing, by a second compute node, backing storage for the contents of RAM on the first compute node; and swapping, by the virtual machine operating system in the virtual machine on the first compute node, a page of memory from RAM on the first compute node to the backing storage on the second compute node.
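The idea described above can be illustrated with a minimal, purely hypothetical Python simulation: a guest operating system on one compute node evicts pages of RAM to a second node that acts as its backing store. This is an illustrative sketch of the concept only, not the patented implementation; all class and method names are invented.

```python
# Illustrative sketch: paging RAM pages from one compute node to a second node
# that serves as backing storage. Names and structure are hypothetical.

class BackingStoreNode:
    """Second compute node: holds evicted pages keyed by page number."""
    def __init__(self):
        self.store = {}

    def write_page(self, page_no, data):
        self.store[page_no] = data

    def read_page(self, page_no):
        return self.store.pop(page_no)


class VirtualMachineOS:
    """Guest OS on the first compute node with a small, fixed-size RAM."""
    def __init__(self, ram_pages, backing_node):
        self.ram_capacity = ram_pages
        self.ram = {}                 # page_no -> data currently resident
        self.backing = backing_node

    def touch(self, page_no, data=None):
        if page_no not in self.ram:
            if len(self.ram) >= self.ram_capacity:
                # Evict an arbitrary resident page to the remote backing store.
                victim, victim_data = self.ram.popitem()
                self.backing.write_page(victim, victim_data)
            # Fault the requested page back in (from backing store if present).
            self.ram[page_no] = (self.backing.read_page(page_no)
                                 if page_no in self.backing.store else data)
        return self.ram[page_no]


backing = BackingStoreNode()
vm = VirtualMachineOS(ram_pages=2, backing_node=backing)
vm.touch(0, b"page zero")
vm.touch(1, b"page one")
vm.touch(2, b"page two")      # forces an eviction over the "network"
print(sorted(backing.store))  # pages swapped out to the second node
```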
Szmalec, Arnaud; Vandierendonck, André
2007-08-01
The present study proposes a new executive task, the one-back choice reaction time (RT) task, and implements the selective interference paradigm to estimate the executive demands of the processing components involved in this task. Based on the similarities between a one-back choice RT task and the n-back updating task, it was hypothesized that one-back delaying of a choice reaction involves executive control. In three experiments, framed within Baddeley's (1986) working-memory model, a one-back choice RT task, a choice RT task, articulatory suppression, and matrix tapping were performed concurrently with primary tasks involving verbal, visuospatial, and executive processing. The results demonstrate that one-back delaying of a choice reaction interferes with tasks requiring executive control, while the potential interference at the level of the verbal or visuospatial working memory slave systems remains minimal.
The MeqTrees software system and its use for third-generation calibration of radio interferometers
NASA Astrophysics Data System (ADS)
Noordam, J. E.; Smirnov, O. M.
2010-12-01
Context. The formulation of the radio interferometer measurement equation (RIME) for a generic radio telescope by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. Aims: The MeqTrees software system is designed to implement numerical models, and to solve for arbitrary subsets of their parameters. It may be applied to many problems, but was originally geared towards implementing Measurement Equations in radio astronomy for the purposes of simulation and calibration. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool that facilitates rapid experimentation, and exchange of ideas (and scripts). Methods: MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a Python-based Tree Definition Language (TDL), then rapidly executed on the back-end. The use of TDL facilitates an extremely short turn-around time (hours rather than weeks or months) for experimentation with new ideas. This is also helped by unprecedented visualization capabilities for all final and intermediate results. A flexible data model and a number of important optimizations in the back-end ensure that the numerical performance is comparable to that of hand-written code. Results: MeqTrees is already widely used as the simulation tool for new instruments (LOFAR, SKA) and technologies (focal plane arrays). It has demonstrated that it can achieve a noise-limited dynamic range in excess of a million, on WSRT data. It is the only package that is specifically designed to handle what we propose to call third-generation calibration (3GC), which is needed for the new generation of giant radio telescopes, but can also improve the calibration of existing instruments.
Engineering of Data Acquiring Mobile Software and Sustainable End-User Applications
NASA Technical Reports Server (NTRS)
Smith, Benton T.
2013-01-01
The criteria by which data-acquiring software and its supporting infrastructure should be designed should take the following two points into account: the reusability and organization of stored online and remote data and content, and an assessment of whether abandoning a platform-optimized design in favor of a multi-platform solution significantly reduces the performance of an end-user application. Furthermore, in-house applications that control or process instrument-acquired data for end-users should be designed with a communication and control interface such that the application's modules can be reused as plug-in modular components in larger software systems. The above principles are applied using two loosely related projects: a mobile application, and a website containing live and simulated data. For the intelligent-devices mobile application AIDM, the end-user interface has a platform- and data-type-optimized design, while the database and back-end applications store this information in an organized manner and restrict access to that data to authorized end-user application(s). Finally, the content for the website was derived from a database so that the content can be included in, and kept uniform across, all applications accessing it. With these projects ongoing, I have concluded from my research that the methods presented are feasible for both projects, and that a multi-platform design for the mobile application only marginally drops its performance.
Semantic Repositories for eGovernment Initiatives: Integrating Knowledge and Services
NASA Astrophysics Data System (ADS)
Palmonari, Matteo; Viscusi, Gianluigi
In recent years, public sector investments in eGovernment initiatives have depended on making existing governmental ICT systems and infrastructures more reliable. Furthermore, we are witnessing a change in the focus of public sector management, from the disaggregation, competition and performance measurement typical of the New Public Management (NPM) to new models of governance aiming at the reintegration of services under a new perspective on bureaucracy, namely a holistic approach to policy making which exploits the extensive digitalization of administrative operations. In this scenario, major challenges relate to supporting effective access to information both at the front-end level, by means of highly modular and customizable content provision, and at the back-end level, by means of information integration initiatives. Repositories of information about data and services that exploit semantic models and technologies can support these goals by bridging the gap between the data-level representations and the human-level knowledge involved in accessing information and in searching for services. Moreover, semantic repository technologies can reach a new level of automation for different tasks involved in interoperability programs, related both to data integration techniques and to service-oriented computing approaches. In this chapter, we discuss the above topics by referring to techniques and experiences where repositories based on conceptual models and ontologies are used at different levels in eGovernment initiatives: at the back-end level to produce a comprehensive view of the information managed in the public administrations' (PA) information systems, and at the front-end level to support effective service delivery.
NASA Astrophysics Data System (ADS)
Park, Sungkyung; Park, Chester Sungchung
2018-03-01
A composite radio receiver back-end and digital front-end, made up of a delta-sigma analogue-to-digital converter (ADC) with a high-speed low-noise sampling clock generator, and a fractional sample rate converter (FSRC), is proposed and designed for a multi-mode reconfigurable radio. The proposed radio receiver architecture contributes to saving the chip area and thus lowering the design cost. To enable inter-radio access technology handover and ultimately software-defined radio reception, a reconfigurable radio receiver consisting of a multi-rate ADC with its sampling clock derived from a local oscillator, followed by a rate-adjustable FSRC for decimation, is designed. Clock phase noise and timing jitter are examined to support the effectiveness of the proposed radio receiver. A FSRC is modelled and simulated with a cubic polynomial interpolator based on Lagrange method, and its spectral-domain view is examined in order to verify its effect on aliasing, nonlinearity and signal-to-noise ratio, giving insight into the design of the decimation chain. The sampling clock path and the radio receiver back-end data path are designed in a 90-nm CMOS process technology with 1.2V supply.
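The cubic Lagrange interpolator mentioned above for the fractional sample rate converter can be illustrated with a short, generic sketch. This is not the authors' hardware design; the signal, sample rates and resampling ratio below are assumed purely for demonstration.

```python
import numpy as np

def lagrange_cubic_fsrc(x, ratio):
    """Resample x by the fractional ratio (f_out/f_in) using 4-point
    (cubic) Lagrange interpolation, as commonly used in FSRC decimators."""
    n_out = int(len(x) * ratio)
    y = np.zeros(n_out)
    for m in range(n_out):
        t = m / ratio                 # position of output sample on input grid
        n = int(np.floor(t))          # base input index
        mu = t - n                    # fractional offset in [0, 1)
        # Four neighbouring input samples x[n-1..n+2], clipped at the edges.
        idx = np.clip([n - 1, n, n + 1, n + 2], 0, len(x) - 1)
        xm1, x0, x1, x2 = x[idx]
        # Cubic Lagrange basis evaluated at mu (reproduces x0 at mu=0, x1 at mu=1).
        y[m] = (-mu * (mu - 1) * (mu - 2) / 6 * xm1
                + (mu + 1) * (mu - 1) * (mu - 2) / 2 * x0
                - (mu + 1) * mu * (mu - 2) / 2 * x1
                + (mu + 1) * mu * (mu - 1) / 6 * x2)
    return y

# Example with made-up rates: decimate a 10 MHz-sampled tone to 3.84 MHz.
fs_in, fs_out = 10e6, 3.84e6
t = np.arange(1000) / fs_in
x = np.sin(2 * np.pi * 100e3 * t)
y = lagrange_cubic_fsrc(x, fs_out / fs_in)
```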
Foundation: Transforming data bases into knowledge bases
NASA Technical Reports Server (NTRS)
Purves, R. B.; Carnes, James R.; Cutts, Dannie E.
1987-01-01
One approach to transforming information stored in relational data bases into knowledge based representations and back again is described. This system, called Foundation, allows knowledge bases to take advantage of vast amounts of pre-existing data. A benefit of this approach is inspection, and even population, of data bases through an intelligent knowledge-based front-end.
Minimising back reflections from the common path objective in a fundus camera
NASA Astrophysics Data System (ADS)
Swat, A.
2016-11-01
Eliminating back reflections is critical in the design of a fundus camera with an internal illuminating system. As there is very little light reflected from the retina, even excellent antireflective coatings do not provide sufficient suppression of ghost reflections; therefore the number of surfaces in the optics common to the illuminating and imaging paths shall be minimised. Typically a single aspheric objective is used. In this paper an alternative approach, an objective with all-spherical surfaces, is presented. As more surfaces are required, a more sophisticated method is needed to get rid of back reflections. Typically, back-reflection analysis comprises treating successive objective surfaces as mirrors, and reflections from the objective surfaces are traced back through the imaging path. This approach can be applied in both sequential and nonsequential ray tracing. It is good enough for a system check but not very suitable for the early optimisation process in the optical system design phase. Standard ghost-control merit function operands are also available in sequential ray tracing, for example in the Zemax system, but these do not allow a back ray-trace in an alternative optical path (illumination vs. imaging). What is proposed in the paper is a complete method to incorporate ghost-reflected energy into the ray-tracing merit function in sequential mode, which is more efficient in the optimisation process. Although developed for the specific case of a fundus camera, the method might be utilised in a wider range of applications where ghost control is critical.
A new ATLAS muon CSC readout system with system on chip technology on ATCA platform
Claus, R.
2015-10-23
The ATLAS muon Cathode Strip Chamber (CSC) back-end readout system has been upgraded during the LHC 2013-2015 shutdown to be able to handle the higher Level-1 trigger rate of 100 kHz and the higher occupancy at Run 2 luminosity. The readout design is based on the Reconfiguration Cluster Element (RCE) concept for high bandwidth generic DAQ implemented on the ATCA platform. The RCE design is based on the new System on Chip Xilinx Zynq series with a processor-centric architecture with ARM processor embedded in FPGA fabric and high speed I/O resources together with auxiliary memories to form a versatile DAQ building block that can host applications tapping into both software and firmware resources. The Cluster on Board (COB) ATCA carrier hosts RCE mezzanines and an embedded Fulcrum network switch to form an online DAQ processing cluster. More compact firmware solutions on the Zynq for G-link, S-link and TTC allowed the full system of 320 G-links from the 32 chambers to be processed by 6 COBs in one ATCA shelf through software waveform feature extraction to output 32 S-links. Furthermore, the full system was installed in Sept. 2014. We will present the RCE/COB design concept, the firmware and software processing architecture, and the experience from the intense commissioning towards LHC Run 2.
A new ATLAS muon CSC readout system with system on chip technology on ATCA platform
NASA Astrophysics Data System (ADS)
Claus, R.; ATLAS Collaboration
2016-07-01
The ATLAS muon Cathode Strip Chamber (CSC) back-end readout system has been upgraded during the LHC 2013-2015 shutdown to be able to handle the higher Level-1 trigger rate of 100 kHz and the higher occupancy at Run 2 luminosity. The readout design is based on the Reconfiguration Cluster Element (RCE) concept for high bandwidth generic DAQ implemented on the ATCA platform. The RCE design is based on the new System on Chip Xilinx Zynq series with a processor-centric architecture with ARM processor embedded in FPGA fabric and high speed I/O resources together with auxiliary memories to form a versatile DAQ building block that can host applications tapping into both software and firmware resources. The Cluster on Board (COB) ATCA carrier hosts RCE mezzanines and an embedded Fulcrum network switch to form an online DAQ processing cluster. More compact firmware solutions on the Zynq for G-link, S-link and TTC allowed the full system of 320 G-links from the 32 chambers to be processed by 6 COBs in one ATCA shelf through software waveform feature extraction to output 32 S-links. The full system was installed in Sept. 2014. We will present the RCE/COB design concept, the firmware and software processing architecture, and the experience from the intense commissioning towards LHC Run 2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lippek, H.E.; Schuller, C.R.
1979-03-01
A study was conducted to identify major legal and institutional problems and issues in the transportation of spent fuel and associated processing wastes at the back end of the LWR nuclear fuel cycle. (Most of the discussion centers on the transportation of spent fuel, since this activity will involve virtually all of the legal and institutional problems likely to be encountered in moving waste materials, as well.) Actions or approaches that might be pursued to resolve the problems identified in the analysis are suggested. Two scenarios for the industrial-scale transportation of spent fuel and radioactive wastes, taken together, highlight most of the major problems and issues of a legal and institutional nature that are likely to arise: (1) utilizing the Allied General Nuclear Services (AGNS) facility at Barnwell, SC, as a temporary storage facility for spent fuel; and (2) utilizing AGNS for full-scale commercial reprocessing of spent LWR fuel.
Kepler Mission: End-to-End System Demonstration
NASA Technical Reports Server (NTRS)
Borucki, William; Koch, D.; Dunham, E.; Jenkins, J.; Witteborn, F.; Updike, T.; DeVincenzi, Donald L. (Technical Monitor)
2000-01-01
A test facility has been constructed to demonstrate the capability of differential ensemble photometry to detect transits of Earth-size planets orbiting solar-like stars. The main objective is to determine the effects of various noise sources on the capability of a CCD photometer to maintain a system relative precision of $1 \times 10^{-5}$ for $m_v = 12$ stars in the presence of system-induced noise sources. The facility includes a simulated star field, fast optics to simulate the telescope, a thinned back-illuminated CCD similar to those to be used on the spacecraft and computers to perform the onboard control, data processing and extraction. The test structure is thermally and mechanically isolated so that each source of noise can be introduced in a controlled fashion and evaluated for its contribution to the total noise budget. The effects of pointing errors or a changing thermal environment are imposed by piezo-electric devices. Transits are injected by heating small wires crossing apertures in the star plate. Signals as small as those from terrestrial-size transits of solar-like stars are introduced to demonstrate that such planets can be detected under realistic noise conditions. Examples of imposing several noise sources and the resulting detectabilities are presented. These show that a CCD photometer using a differential ensemble photometric approach can readily detect signals associated with Earth-size transits.
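Differential ensemble photometry itself reduces to dividing the target star's light curve by the mean light curve of an ensemble of reference stars so that common-mode instrumental variations cancel. The sketch below illustrates the idea on simulated data; the injected transit depth, noise levels and star counts are invented and have nothing to do with the test facility's actual measurements.

```python
import numpy as np

# Minimal sketch of differential ensemble photometry on simulated light curves.
rng = np.random.default_rng(0)
n_frames, n_refs = 2000, 20
common_mode = 1.0 + 0.002 * np.sin(np.arange(n_frames) / 50.0)   # drift shared by all stars

refs = common_mode * (1.0 + 1e-4 * rng.standard_normal((n_refs, n_frames)))
target = common_mode.copy()
target[900:950] *= 1.0 - 8.4e-5          # injected Earth-size-like transit dip

ensemble = refs.mean(axis=0)             # ensemble reference light curve
relative = target / ensemble             # differential (relative) photometry
relative /= np.median(relative)          # normalize to unity out of transit

print("depth recovered: %.1e" % (1.0 - relative[900:950].mean()))
```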
Design of a web portal for interdisciplinary image retrieval from multiple online image resources.
Kammerer, F J; Frankewitsch, T; Prokosch, H-U
2009-01-01
Images play an important role in medicine. Finding the desired images within the multitude of online image databases is a time-consuming and frustrating process. Existing websites do not meet all the requirements for an ideal learning environment for medical students. This work intends to establish a new web portal providing a centralized access point to a selected number of online image databases. A back-end system locates images on given websites and extracts relevant metadata. The images are indexed using UMLS and the MetaMap system provided by the US National Library of Medicine. Specially developed functions allow the creation of individual navigation structures. The front-end system suits the specific needs of medical students. A navigation structure consisting of several medical fields, university curricula and the ICD-10 was created. The images may be accessed via the given navigation structure or using different search functions. Cross-references are provided by the semantic relations of the UMLS. Over 25,000 images were identified and indexed. A pilot evaluation among medical students showed good first results concerning the acceptance of the developed navigation structures and search features. The integration of the images from different sources into the UMLS semantic network offers a quick and easy-to-use learning environment.
Future Tense: Lessons from the Best and Worst Cases in Afghanistan from Pakistan’s Perspective
2017-03-01
10 to 15 or more years would likely be the key to success in the Afghanistan end game. Otherwise, Afghanistan will slip back into a situation much like the ... treasure hunt, or what has been referred to as the "New Great Game." The history of Afghanistan reaches back centuries and is full of wars.
High sensitivity pH sensing on the BEOL of industrial FDSOI transistors
NASA Astrophysics Data System (ADS)
Rahhal, Lama; Ayele, Getenet Tesega; Monfray, Stéphane; Cloarec, Jean-Pierre; Fornacciari, Benjamin; Pardoux, Eric; Chevalier, Celine; Ecoffey, Serge; Drouin, Dominique; Morin, Pierre; Garnier, Philippe; Boeuf, Frederic; Souifi, Abdelkader
2017-08-01
In this work we demonstrate the use of Fully Depleted Silicon On Insulator (FDSOI) transistors as pH sensors with a 23 nm silicon nitride sensing layer built in the Back-End-Of-Line (BEOL). The back end process to deposit the sensing layer and fabricate the electrical structures needed for testing is detailed. A series of tests employing different pH buffer solutions has been performed on transistors of different geometries, controlled via the back gate. The main findings show a shift of the drain current (ID) as a function of the back gate voltage (VB) when different pH buffer solutions are probed in the range of pH 6 to pH 8. This shift is observed at VB voltages swept from 0 V to 3 V, demonstrating the sensor operation at low voltage. A high sensitivity of up to 250 mV/pH unit (more than 4-fold larger than Nernstian response) is observed on FDSOI MOS transistors of 0.06 μm gate length and 0.08 μm gate width.
The Physicochemical Hydrodynamics of Vascular Plants
NASA Astrophysics Data System (ADS)
Stroock, Abraham D.; Pagay, Vinay V.; Zwieniecki, Maciej A.; Michele Holbrook, N.
2014-01-01
Plants live dangerously, but gracefully. To remain hydrated, they exploit liquid water in the thermodynamically metastable state of negative pressure, similar to a rope under tension. This tension allows them to pull water out of the soil and up to their leaves. When this liquid rope breaks, owing to cavitation, they catch the ends to keep it from unraveling and then bind it back together. In parallel, they operate a second vascular system for the circulation of metabolites through their tissues, this time with positive pressures and flow that passes from leaf to root. In this article, we review the current state of understanding of water management in plants with an emphasis on the rich coupling of transport phenomena, thermodynamics, and active biological processes. We discuss efforts to replicate plant function in synthetic systems and point to opportunities for physical scientists and engineers to benefit from and contribute to the study of plants.
TELICS—A Telescope Instrument Control System for Small/Medium Sized Astronomical Observatories
NASA Astrophysics Data System (ADS)
Srivastava, Mudit K.; Ramaprakash, A. N.; Burse, Mahesh P.; Chordia, Pravin A.; Chillal, Kalpesh S.; Mestry, Vilas B.; Das, Hillol K.; Kohok, Abhay A.
2009-10-01
For any modern astronomical observatory, it is essential to have an efficient interface between the telescope and its back-end instruments. However, for small and medium-sized observatories, this requirement is often limited by tight financial constraints. Therefore a simple yet versatile and low-cost control system is required for such observatories to minimize cost and effort. Here we report the development of a modern, multipurpose instrument control system TELICS (Telescope Instrument Control System) to integrate the controls of various instruments and devices mounted on the telescope. TELICS consists of an embedded hardware unit known as a common control unit (CCU) in combination with Linux-based data acquisition and user interface. The hardware of the CCU is built around the ATmega 128 microcontroller (Atmel Corp.) and is designed with a backplane, master-slave architecture. A Qt-based graphical user interface (GUI) has been developed and the back-end application software is based on C/C++. TELICS provides feedback mechanisms that give the operator good visibility and a quick-look display of the status and modes of instruments as well as data. TELICS has been used for regular science observations since 2008 March on the 2 m, f/10 IUCAA Telescope located at Girawali in Pune, India.
Wisdom Appliance Control System
NASA Astrophysics Data System (ADS)
Hendrick; Jheng, Jyun-Teng; Tsai, Chen-Chai; Liou, Jia-Wei; Wang, Zhi-Hao; Jong, Gwo-Jia
2017-07-01
Intelligent (wisdom) appliances involve security, home care, convenience, and energy saving, and the home automation system remains one of their core units. It uses micro-processing electronics technology to centralize and control home electrical products and systems such as lighting, televisions, fans, air conditioning, and stereos. The system is composed of front-controller systems and back-controller panels: the user issues control commands through the front-controller, and the back-controller then powers and operates the devices.
Ultrasound phase rotation beamforming on multi-core DSP.
Ma, Jieming; Karadayi, Kerem; Ali, Murtaza; Kim, Yongmin
2014-01-01
Phase rotation beamforming (PRBF) is a commonly-used digital receive beamforming technique. However, due to its high computational requirement, it has traditionally been supported by hardwired architectures, e.g., application-specific integrated circuits (ASICs) or more recently field-programmable gate arrays (FPGAs). In this study, we investigated the feasibility of supporting software-based PRBF on a multi-core DSP. To alleviate the high computing requirement, the analog front-end (AFE) chips integrating quadrature demodulation in addition to analog-to-digital conversion were defined and used. With these new AFE chips, only delay alignment and phase rotation need to be performed by DSP, substantially reducing the computational load. We implemented the delay alignment and phase rotation modules on a Texas Instruments C6678 DSP with 8 cores. We found it takes 200 μs to beamform 2048 samples from 64 channels using 2 cores. With 4 cores, 20 million samples can be beamformed in one second. Therefore, ADC frequencies up to 40 MHz with 2:1 decimation in AFE chips or up to 20 MHz with no decimation can be supported as long as the ADC-to-DSP I/O requirement can be met. The remaining 4 cores can work on back-end processing tasks and applications, e.g., color Doppler or ultrasound elastography. One DSP being able to handle both beamforming and back-end processing could lead to low-power and low-cost ultrasound machines, benefiting ultrasound imaging in general, particularly portable ultrasound machines. Copyright © 2013 Elsevier B.V. All rights reserved.
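The two DSP stages named above, delay alignment and phase rotation, can be illustrated with a small sketch that operates on quadrature-demodulated (I/Q) channel data. This is a generic illustration, not the TI C6678 implementation; the channel count, sampling rate and demodulation frequency are assumed.

```python
import numpy as np

def phase_rotation_beamform(iq, delays, fs, f_demod):
    """Toy phase-rotation beamformer for one receive focal point.

    iq      : complex baseband samples, shape (n_channels, n_samples),
              already quadrature-demodulated by the analog front-end
    delays  : per-channel focusing delays in seconds
    fs      : baseband sample rate
    f_demod : demodulation (centre) frequency used by the AFE
    """
    n_ch, n_samp = iq.shape
    out = np.zeros(n_samp, dtype=complex)
    for ch in range(n_ch):
        coarse = int(round(delays[ch] * fs))              # whole-sample delay alignment
        aligned = np.roll(iq[ch], coarse)
        residual = delays[ch] - coarse / fs               # sub-sample remainder
        rotator = np.exp(1j * 2 * np.pi * f_demod * residual)  # fine phase rotation
        out += aligned * rotator
    return out

# Example with made-up numbers: 64 channels, 40 MHz sampling, 5 MHz demodulation.
rng = np.random.default_rng(1)
iq = rng.standard_normal((64, 2048)) + 1j * rng.standard_normal((64, 2048))
delays = rng.uniform(0, 1e-6, 64)
beamformed = phase_rotation_beamform(iq, delays, fs=40e6, f_demod=5e6)
```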
2014-01-01
Background Health consumers have moved away from a reliance on medical practitioner advice to more independent decision processes and so their information search processes have subsequently widened. This study examined how persons with back pain searched for alternative treatment types and service providers. That is, what information do they seek and how; what sources do they use and why; and by what means do they search for it? Methods 12 persons with back pain were interviewed. The method used was convergent interviewing. This involved a series of semi-structured questions to obtain open-ended answers. The interviewer analysed the responses and refined the questions after each interview, to converge on the dominant factors influencing decisions about treatment patterns. Results Persons with back pain mainly search their memories and use word of mouth (their doctor and friends) for information about potential treatments and service providers. Their search is generally limited due to personal, provider-related and information-supply reasons. However, they did want in-depth information about the alternative treatments and providers in an attempt to establish a priori their efficacy in treating their specific back problems. They searched different sources depending on the type of information they required. Conclusions The findings differ from previous studies about the types of information health consumers require when searching for information about alternative or mainstream healthcare services. The results have identified for the first time that limited information availability was only one of three categories of reasons why persons with back pain do not search for more information, particularly from external non-personal sources. PMID:24725300
Monitoring the Earth System Grid Federation through the ESGF Dashboard
NASA Astrophysics Data System (ADS)
Fiore, S.; Bell, G. M.; Drach, B.; Williams, D.; Aloisio, G.
2012-12-01
The Climate Model Intercomparison Project, phase 5 (CMIP5) is a global effort coordinated by the World Climate Research Programme (WCRP) involving tens of modeling groups spanning 19 countries. It is expected that the CMIP5 distributed data archive will total upwards of 3.5 petabytes, stored across several ESGF Nodes on four continents (North America, Europe, Asia, and Australia). The Earth System Grid Federation (ESGF) provides the IT infrastructure to support the CMIP5. In this regard, the monitoring of the distributed ESGF infrastructure represents a crucial part carried out by the ESGF Dashboard. The ESGF Dashboard is a software component of the ESGF stack, responsible for collecting key information about the status of the federation in terms of: 1) Network topology (peer-groups composition), 2) Node type (host/services mapping), 3) Registered users (including their Identity Providers), 4) System metrics (e.g., round-trip time, service availability, CPU, memory, disk, processes, etc.), 5) Download metrics (both at the Node and federation level). The last class of information is very important since it provides a strong insight into the CMIP5 experiment: the data usage statistics. In this regard, CMCC and LLNL have developed a data analytics management system for the analysis of both node-level and federation-level data usage statistics. It provides data usage statistics aggregated by project, model, experiment, variable, realm, peer node, time, ensemble, datasetname (including version), etc. The back-end of the system is able to infer the data usage information of the entire federation by carrying out: - at the node level: an 18-step reconciliation process on the peer node databases (i.e. node manager and publisher DB) which provides a 15-dimension datawarehouse with local statistics, and - at the global level: an aggregation process which federates the data usage statistics into a 16-dimension datawarehouse with federation-level data usage statistics. The front-end of the Dashboard system exploits a web desktop approach, which joins the pervasiveness of a web application with the flexibility of a desktop one.
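The kind of roll-up described above (node-level download records aggregated along dimensions such as project, model, experiment and variable, then federated across peer nodes) can be sketched in a few lines of pandas. The column names and numbers below are illustrative only and do not reflect the actual ESGF Dashboard schema.

```python
import pandas as pd

# Hypothetical node-level download log; the schema is invented for illustration.
downloads = pd.DataFrame([
    {"project": "CMIP5", "model": "HadGEM2-ES", "experiment": "rcp85",
     "variable": "tas", "node": "esgf-node-a", "bytes": 2.1e9},
    {"project": "CMIP5", "model": "HadGEM2-ES", "experiment": "rcp85",
     "variable": "pr",  "node": "esgf-node-b", "bytes": 1.4e9},
    {"project": "CMIP5", "model": "CMCC-CM",    "experiment": "historical",
     "variable": "tas", "node": "esgf-node-c", "bytes": 0.8e9},
])

# Node-level statistics rolled up by a couple of the dimensions mentioned above...
by_model = downloads.groupby(["project", "model"])["bytes"].sum()

# ...and a federation-level view obtained by aggregating over the peer nodes.
federation = downloads.groupby(["project", "experiment", "variable"])["bytes"].agg(["sum", "count"])
print(by_model, federation, sep="\n\n")
```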
Space Nuclear Thermal Propulsion Test Facilities Subpanel
NASA Technical Reports Server (NTRS)
Allen, George C.; Warren, John W.; Martinell, John; Clark, John S.; Perkins, David
1993-01-01
On 20 Jul. 1989, in commemoration of the 20th anniversary of the Apollo 11 lunar landing, President George Bush proclaimed his vision for manned space exploration. He stated, 'First for the coming decade, for the 1990's, Space Station Freedom, the next critical step in our space endeavors. And next, for the new century, back to the Moon. Back to the future. And this time, back to stay. And then, a journey into tomorrow, a journey to another planet, a manned mission to Mars.' On 2 Nov. 1989, the President approved a national space policy reaffirming the long range goal of the civil space program: to 'expand human presence and activity beyond Earth orbit into the solar system.' And on 11 May 1990, he specified the goal of landing astronauts on Mars by 2019, the 50th anniversary of man's first steps on the Moon. To safely and even permanently venture beyond the near-Earth environment as charged by the President, mankind must bring to bear extensive new technologies. These include heavy lift launch capability from Earth to low-Earth orbit, automated space rendezvous and docking of large masses, zero gravity countermeasures, and closed loop life support systems. One technology enhancing, and perhaps enabling, the piloted Mars missions is nuclear propulsion, with great benefits over chemical propulsion. Asserting the potential benefits of nuclear propulsion, NASA has sponsored workshops in Nuclear Electric Propulsion and Nuclear Thermal Propulsion and has initiated a tri-agency planning process to ensure that appropriate resources are engaged to meet this exciting technical challenge. At the core of this planning process, NASA, DOE, and DOD established six Nuclear Propulsion Technical Panels in 1991 to provide groundwork for a possible tri-agency Nuclear Propulsion Program and to address the President's vision by advocating an aggressive program in nuclear propulsion. To this end the Nuclear Electric Propulsion Technology Panel has focused its energies; this final report summarizes its endeavor and conclusions.
Flows of engineered nanomaterials through the recycling process in Switzerland
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caballero-Guzman, Alejandro; Sun, Tianyin; Nowack, Bernd, E-mail: nowack@empa.ch
Highlights: • Recycling is one of the likely end-of-life fates of nanoproducts. • We assessed the material flows of four nanomaterials in the Swiss recycling system. • After recycling, most nanomaterials will flow to landfills or incineration plants. • Recycled construction waste, plastics and textiles may contain nanomaterials. - Abstract: The use of engineered nanomaterials (ENMs) in diverse applications has increased during the last years and this will likely continue in the near future. As the number of applications increases, more and more waste with nanomaterials will be generated. A portion of this waste will enter the recycling system, for example, in electronic products, textiles and construction materials. The fate of these materials during and after the waste management and recycling operations is poorly understood. The aim of this work is to model the flows of nano-TiO2, nano-ZnO, nano-Ag and CNT in the recycling system in Switzerland. The basis for this study is published information on the ENMs flows in the Swiss system. We developed a method to assess their flow after recycling. To incorporate the uncertainties inherent to the limited information available, we applied a probabilistic material flow analysis approach. The results show that the recycling process does not result in significant further propagation of nanomaterials into new products. Instead, the largest proportion will flow as waste that can subsequently be properly handled in incineration plants or landfills. Smaller fractions of ENMs will be eliminated or end up in materials that are sent abroad to undergo further recovery processes. Only a reduced amount of ENMs will flow back to the productive process of the economy in a limited number of sectors. Overall, the results suggest that risk assessment during recycling should focus on occupational exposure, release of ENMs in landfills and incineration plants, and toxicity assessment in a small number of recycled inputs.
2. JoAnn Sieburg-Baker, Photographer, September 1977. SECTION SHOWING BACK OF ...
2. JoAnn Sieburg-Baker, Photographer, September 1977. SECTION SHOWING BACK OF ROUNDHOUSE AND END OF BACK SHOP WHERE CRANE WAS LOCATED. - Southern Railway Company, Spencer Shops, Salisbury Avenue between Third and Eight Streets, Spencer, Rowan County, NC
SU-E-T-186: Cloud-Based Quality Assurance Application for Linear Accelerator Commissioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, J
2015-06-15
Purpose: To identify anomalies and safety issues during data collection and modeling for treatment planning systems. Methods: A cloud-based quality assurance system (AQUIRE - Automated QUalIty REassurance) has been developed to allow the uploading and analysis of beam data acquired during the treatment planning system commissioning process. In addition to comparing and aggregating measured data, tools have also been developed to extract dose from the treatment planning system for end-to-end testing. A gamma index is performed on the data to give a dose difference and distance-to-agreement for validation that a beam model is generating plans consistent with the beam data collection. Results: Over 20 linear accelerators have been commissioned using this platform, and a variety of errors and potential safety issues have been caught through the validation process. For example, the gamma index of 2% dose, 2 mm DTA is quite sufficient to see curves not corrected for effective point of measurement. Also, data imported into the database is analyzed against an aggregate of similar linear accelerators to show data points that are outliers. The resulting curves in the database exhibit a very small standard deviation and imply that a preconfigured beam model based on aggregated linear accelerators will be sufficient in most cases. Conclusion: With the use of this new platform for beam data commissioning, errors in beam data collection and treatment planning system modeling are greatly reduced. With the reduction in errors during acquisition, the resulting beam models are quite similar, suggesting that a common beam model may be possible in the future. Development is ongoing to create routine quality assurance tools to compare back to the beam data acquired during commissioning. I am a medical physicist for Alzyen Medical Physics, and perform commissioning services.
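The gamma comparison used above (e.g., 2% dose difference with 2 mm distance-to-agreement) can be illustrated by a simple 1-D implementation. This is a generic sketch of the standard gamma-index definition, not AQUIRE's code; the profiles are synthetic.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_tol=0.02, dta_mm=2.0):
    """1-D global gamma index: for each evaluated point, the minimum combined
    dose-difference / distance-to-agreement metric against the reference curve."""
    d_norm = dose_tol * d_ref.max()          # global dose criterion (e.g., 2% of max)
    gamma = np.empty_like(d_eval)
    for i, (x, d) in enumerate(zip(x_eval, d_eval)):
        dist = (x_ref - x) / dta_mm
        diff = (d_ref - d) / d_norm
        gamma[i] = np.sqrt(dist**2 + diff**2).min()
    return gamma

# Example: a measured profile slightly shifted against the planned one.
x = np.linspace(-50, 50, 201)                       # mm
planned = np.exp(-x**2 / (2 * 15.0**2))             # arbitrary bell-shaped profile
measured = np.exp(-(x - 0.5)**2 / (2 * 15.0**2))    # 0.5 mm effective shift
g = gamma_1d(x, planned, x, measured)
print("pass rate (gamma <= 1): %.1f%%" % (100 * (g <= 1).mean()))
```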
Feed-back between geriatric syndromes: general system theory in geriatrics.
Musso, Carlos G; Núñez, Juan F Macías
2006-01-01
Geriatrics has described three entities: confusional syndrome, incontinence and gait disorders, calling them the geriatric giants. The ageing process also induces changes in renal physiology, such as a reduction in glomerular filtration rate and alterations in water and electrolyte handling. These ageing renal changes have been named the nephrogeriatric giants. These two groups of giants, geriatric and nephrogeriatric, can predispose to and potentiate each other, leading old people to fatal outcomes. This phenomenon of feed-back between these geriatric syndromes has its roots in the loss of complexity that accompanies the ageing process. Complexity means that all the body systems work harmoniously. The process of senescence weakens this coordination among systems, undermining complexity and making the old person frail.
Concepts for thin-film GaAs concentrator cells. [for solar photovoltaic space power systems
NASA Technical Reports Server (NTRS)
Spitzer, M. B.; Gale, R. P.; Mcclelland, R.; King, B.; Dingle, J.
1989-01-01
The development of advanced GaAs concentrator solar cells, and in particular, the use of CLEFT (cleavage of lateral epitaxial films for transfer) processes for formation of thin-film structures is reported. The use of CLEFT has made possible processing of the back surface, and cells with back-surface grids are discussed. Data on patterned junction development are presented; such junctions are expected to be useful in back surface applications requiring point contacts, grating structures, and interdigitated back contacts. CLEFT concentrator solar cells with grids on the front and back surfaces are reported here; these cells are 4 microns thick and are bonded to glass covers for support. Air mass zero efficiency of 18.8 percent has been obtained for a CLEFT concentrator operating at 18.5 suns.
An Autonomous Cryobot Synthetic Aperture Radar for Subsurface Exploration of Europa
NASA Astrophysics Data System (ADS)
Pradhan, O.; Gasiewski, A. J.
2015-12-01
We present the design and field testing of a forward-looking end-fire synthetic aperture radar (SAR) for the 'Very deep Autonomous Laser-powered Kilowatt-class Yo-yoing Robotic Ice Explorer' (VALKYRIE) ice-penetrating cryobot. This design demonstrates critical technologies that will support an eventual landing and ice penetrating mission to Jupiter's icy moon, Europa. Results proving the feasibility of an end-fire SAR system for vehicle guidance and obstacle avoidance in a sub-surface ice environment will be presented. Data collected by the SAR will also be used for constructing sub-surface images of the glacier which can be used for: (i) mapping of englacial features such as crevasses, moulins, and embedded liquid water and (ii) ice-depth and glacier bed analysis to construct digital elevation models (DEM) that can help in the selection of cryobot trajectories and future drill sites for extracting long-term climate records. The project consists of three parts, (i) design of an array of four conformal cavity-backed log-periodic folded slot dipole array (LPFSA) antennas that form agile radiating elements, (ii) design of a radar system that includes RF signal generation, 4x4 transmit-receive antenna switching and isolation and digital SAR data processing and (iii) field testing of the SAR in melt holes. The antennas have been designed, fabricated, and lab tested at the Center for Environmental Technology (CET) at CU-Boulder. The radar system was also designed and integrated at CET utilizing rugged RF components and FPGA based digital processing. Field testing was performed in conjunction with VALKYRIE tests by Stone Aerospace in June, 2015 on Matanuska Glacier, Alaska. The antennas are designed to operate inside ice while being immersed in a thin layer of surrounding low-conductivity melt water. Small holes in the corners of the cavities allow flooding of these cavities with the same melt-water thus allowing for quarter-wavelength cavity-backed reflection. Testing of the antenna array was first carried out by characterizing its operation inside a large ice block at the Stone Aerospace facility in Austin, TX. The complete radar system was then tested on the Matanuska glacier in Alaska, which is an effective Earth analog to Europan sub-surface exploration.
Ergonomic risk factors of work processes in the semiconductor industry in Peninsular Malaysia.
Chee, Heng-Leng; Rampal, Krishna Gopal; Chandrasakaran, Abherhame
2004-07-01
A cross-sectional survey of semiconductor factories was conducted to identify the ergonomic risk factors in the work processes, the prevalence of body pain among workers, and the relationship between body pain and work processes. A total of 906 women semiconductor workers took part in the study. In wafer preparation and polishing, a combination of lifting weights and prolonged standing might have led to high pain prevalences in the low back (35.0% wafer preparation, 41.7% wafer polishing) and lower limbs (90.0% wafer preparation, 66.7% wafer polishing). Semiconductor front of line workers, who mostly walked around to operate machines in clean rooms, had the lowest prevalences of body pain. Semiconductor assembly middle of line workers, especially the molding workers, who did frequent lifting, had high pain prevalences in the neck/shoulders (54.8%) and upper back (43.5 %). In the semiconductor assembly end of line work section, chip inspection workers who were exposed to prolonged sitting without back support had high prevalences of neck/shoulder (62.2%) and upper back pain (50.0%), while chip testing workers who had to climb steps to load units had a high prevalence of lower limb pain (68.0%). Workers in the assembly of electronic components, carrying out repetitive tasks with hands and fingers, and standing in awkward postures had high pain prevalences in the neck/shoulders (61.5%), arms (38.5%), and hands/wrists (30.8%).
1976-03-01
Research in Functionally Distributed Computer Systems Development. P. S. Fisher, F. Maryanski; Virgil Wallentine, Principal Investigator; Kansas State University. Report CS-76-08, March 1976, unclassified. Approved for public release; distribution unlimited. U.S. Army Computer Systems Command, Ft. Belvoir, VA.
Intrusion recognition for optic fiber vibration sensor based on the selective attention mechanism
NASA Astrophysics Data System (ADS)
Xu, Haiyan; Xie, Yingjuan; Li, Min; Zhang, Zhuo; Zhang, Xuewu
2017-11-01
Distributed fiber-optic vibration sensors have received extensive investigation and play a significant role in the sensor panorama. A fiber-optic perimeter detection system based on an all-fiber interferometric sensor is proposed; through back-end analysis, processing and intelligent identification, it can distinguish the effects of different intrusion activities. In this paper, an intrusion recognition method based on the auditory selective attention mechanism is proposed. First, considering the time-frequency content of the vibration, the spectrogram is calculated. Second, imitating the selective attention mechanism, the color, direction and brightness maps of the spectrogram are computed. Based on these maps, the feature matrix is formed after normalization. The system can recognize the intrusion activities that occur along the perimeter sensors. Experimental results show that the proposed method is able to differentiate intrusion signals from ambient noise. Moreover, the recognition rate of the system is improved while the false alarm rate is reduced; the approach is validated by extensive practical experiments and a real project.
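The front of such a recognition chain, computing a spectrogram of the vibration signal and deriving simple normalized maps from it, might look like the sketch below. The "brightness" and "orientation" maps here are generic surrogates, not the paper's actual attention maps, and the sampling rate and test signal are assumed.

```python
import numpy as np
from scipy import signal

fs = 10_000                                   # assumed sampling rate of the vibration sensor
t = np.arange(0, 2.0, 1 / fs)
vib = np.sin(2 * np.pi * 120 * t) + 0.3 * np.random.randn(t.size)   # toy intrusion-like signal

# Time-frequency representation of the vibration.
f, tt, Sxx = signal.spectrogram(vib, fs=fs, nperseg=256, noverlap=128)
S = 10 * np.log10(Sxx + 1e-12)

# Simple surrogate "attention" maps: brightness (intensity) and orientation
# (local gradient magnitude of the spectrogram), both normalized to [0, 1].
brightness = (S - S.min()) / (np.ptp(S) + 1e-12)
gy, gx = np.gradient(S)
orientation = np.hypot(gx, gy)
orientation = (orientation - orientation.min()) / (np.ptp(orientation) + 1e-12)

# Stack the maps into a feature matrix for a downstream classifier.
features = np.stack([brightness, orientation], axis=0)
print(features.shape)
```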
Virus Database and Online Inquiry System Based on Natural Vectors.
Dong, Rui; Zheng, Hui; Tian, Kun; Yau, Shek-Chung; Mao, Weiguang; Yu, Wenping; Yin, Changchuan; Yu, Chenglong; He, Rong Lucy; Yang, Jie; Yau, Stephen St
2017-01-01
We construct a virus database called VirusDB (http://yaulab.math.tsinghua.edu.cn/VirusDB/) and an online inquiry system to serve people who are interested in viral classification and prediction. The database stores all viral genomes, their corresponding natural vectors, and the classification information of the single/multiple-segmented viral reference sequences downloaded from the National Center for Biotechnology Information. The online inquiry system serves the purpose of computing natural vectors and their distances for submitted genomes, providing an online interface for accessing and using the database for viral classification and prediction, and running back-end processes for automatic and manual updating of database content to synchronize with GenBank. Submitted genome data in FASTA format are processed, and the prediction results, with the five closest neighbors and their classifications, are returned by email. Considering the one-to-one correspondence between sequence and natural vector, time efficiency, and high accuracy, the natural vector is a significant advance compared with alignment methods, which makes VirusDB a useful database in further research.
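A 12-dimensional natural vector of the kind used in this line of work (for each base: the count, the mean position, and a normalized second central moment of its positions) can be computed as in the sketch below. This follows the commonly published definition in outline, with one common normalization assumed; it is not VirusDB's own code.

```python
import numpy as np

def natural_vector(seq):
    """12-D natural vector of a DNA sequence: for each base, the count, the
    mean position, and a normalized second central moment of its positions
    (normalization assumed as D2 = sum((p - mu)^2) / (n_k * n))."""
    seq = seq.upper()
    n = len(seq)
    vec = []
    for base in "ACGT":
        pos = np.array([i + 1 for i, c in enumerate(seq) if c == base], dtype=float)
        n_k = len(pos)
        if n_k == 0:
            vec += [0.0, 0.0, 0.0]
            continue
        mu = pos.mean()
        d2 = ((pos - mu) ** 2).sum() / (n_k * n)
        vec += [float(n_k), mu, d2]
    return np.array(vec)

# Classification/prediction then reduces to nearest neighbours in this space.
a = natural_vector("ATGCGCGTATAGCTAGCTAGGATCC")
b = natural_vector("ATGCGCGTATAGCTAGCTAGGATCG")
print(np.linalg.norm(a - b))   # Euclidean distance between the two sequences
```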
On Robust Methodologies for Managing Public Health Care Systems
Nimmagadda, Shastri L.; Dreher, Heinz V.
2014-01-01
The authors focus on ontology-based multidimensional data warehousing and mining methodologies, addressing various issues in organizing, reporting and documenting diabetic cases and their associated ailments, including causalities. Map and other diagnostic data views, depicting similarity and comparison of attributes extracted from warehouses, are used for understanding the ailments based on gender, age, geography, food habits and other hereditary event attributes. In addition to rigor in data mining and visualization, an added focus is on the value of interpreting data views derived from fully processed diagnoses, subsequent prescriptions and appropriate medications. The proposed methodology is a robust back-end application for web-based patient-doctor consultations and e-Health care management systems, through which billions of dollars spent on medical services can be saved, in addition to improving the quality of life and average life span of a person. Government health departments and agencies, and private and government medical practitioners, including social welfare organizations, are typical users of these systems. PMID:24445953
Fully Integrated Optical Spectrometer in Visible and Near-IR in CMOS.
Hong, Lingyu; Sengupta, Kaushik
2017-12-01
Optical spectrometry in the visible and near-infrared range has a wide range of applications in healthcare, sensing, imaging, and diagnostics. This paper presents the first fully integrated optical spectrometer in a standard bulk CMOS process without custom fabrication, postprocessing, or any external passive optical structure such as lenses, gratings, collimators, or mirrors. The architecture exploits metal interconnect layers available in CMOS processes with subwavelength feature sizes to guide, manipulate, control, and diffract light; integrated photodetectors and read-out circuitry to detect the dispersed light; and back-end signal processing for robust spectral estimation. The chip, realized in a bulk 65-nm low-power CMOS process, measures 0.64 mm × 0.56 mm in active area and achieves 1.4 nm peak detection accuracy for continuous-wave excitations between 500 and 830 nm. This paper demonstrates the ability to use these metal-optic nanostructures to miniaturize complex optical instrumentation into a new class of optics-free CMOS-based systems-on-chip in the visible and near-IR for various sensing and imaging applications.
Bilateral Impedance Control For Telemanipulators
NASA Technical Reports Server (NTRS)
Moore, Christopher L.
1993-01-01
Telemanipulator system includes master robot manipulated by human operator, and slave robot performing tasks at remote location. Two robots electronically coupled so slave robot moves in response to commands from master robot. Teleoperation greatly enhanced if forces acting on slave robot fed back to operator, giving operator feeling he or she manipulates remote environment directly. Main advantage of bilateral impedance control: enables arbitrary specification of desired performance characteristics for telemanipulator system. Relationship between force and position modulated at both ends of system to suit requirements of task.
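The bilateral coupling described above can be illustrated with a very small discrete-time simulation: a virtual spring-damper (an impedance) drives the slave toward the master position, and the contact force the slave experiences is reflected back to the operator. Gains, masses and the environment model below are invented for illustration and are not values from the NASA work.

```python
import numpy as np

# Minimal discrete-time sketch of bilateral force reflection with an impedance law.
dt, K, B = 0.001, 400.0, 20.0          # step (s), virtual stiffness (N/m), damping (N*s/m)
k_env, wall = 2000.0, 0.05             # stiff wall located at x = 5 cm
m_slave = 1.0                          # slave effective mass (kg)

x_s = v_s = f_operator = 0.0
for step in range(2000):
    t = step * dt
    x_m = 0.10 * np.sin(2 * np.pi * 0.5 * t)               # operator (master) trajectory
    f_cmd = K * (x_m - x_s) + B * (0.0 - v_s)               # impedance law driving the slave
    f_env = -k_env * (x_s - wall) if x_s > wall else 0.0    # contact force from the wall
    a_s = (f_cmd + f_env) / m_slave
    v_s += a_s * dt
    x_s += v_s * dt
    f_operator = -f_env                                     # force reflected back to the master

print("final slave position: %.3f m, reflected force: %.1f N" % (x_s, f_operator))
```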
Missile signal processing common computer architecture for rapid technology upgrade
NASA Astrophysics Data System (ADS)
Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul
2004-10-01
Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as the sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain is comprised of two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific, and required custom software development. They were developed using non-integrated toolsets and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible three developments; Moore's Law - driven improvement in computational throughput; a newly introduced vector computing capability in general purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application may be programmed under existing real-time operating systems using parallel processing software libraries, resulting in highly portable code that can be rapidly migrated to new platforms as processor technology evolves. Use of standardized development tools and 3rd party software upgrades are enabled as well as rapid upgrade of processing components as improved algorithms are developed. The resulting weapon system will have a superior processing capability over a custom approach at the time of deployment as a result of a shorter development cycles and use of newer technology. The signal processing computer may be upgraded over the lifecycle of the weapon system, and can migrate between weapon system variants enabled by modification simplicity. This paper presents a reference design using the new approach that utilizes an Altivec PowerPC parallel COTS platform. It uses a VxWorks-based real-time operating system (RTOS), and application code developed using an efficient parallel vector library (PVL). A quantification of computing requirements and demonstration of interceptor algorithm operating on this real-time platform are provided.
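One of the front-end video stages named above, non-uniformity correction, is commonly implemented as a per-pixel two-point (gain and offset) calibration against two uniform reference scenes. The sketch below is a generic illustration on synthetic data, not the interceptor's actual NUC algorithm.

```python
import numpy as np

def two_point_nuc(raw, dark, flat):
    """Per-pixel two-point non-uniformity correction for an IR focal plane:
    gain and offset are derived from frames taken against a cold (dark) and a
    uniform warm (flat) reference, then applied to each raw frame."""
    gain = (flat.mean() - dark.mean()) / (flat - dark)     # per-pixel gain correction
    offset = -gain * dark + dark.mean()                    # per-pixel offset correction
    return gain * raw + offset

# Toy 64x64 focal plane with fixed-pattern gain/offset noise.
rng = np.random.default_rng(2)
true_gain = 1.0 + 0.05 * rng.standard_normal((64, 64))
true_off = 10.0 * rng.standard_normal((64, 64))
scene = 500.0 * np.ones((64, 64))
dark = true_gain * 100.0 + true_off        # response to the cold reference
flat = true_gain * 1000.0 + true_off       # response to the warm reference
raw = true_gain * scene + true_off
corrected = two_point_nuc(raw, dark, flat)
print(corrected.std())                     # fixed-pattern noise largely removed
```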
Lyakh, A.; Maulini, R.; Tsekoun, A.; Go, R.; Von der Porten, S.; Pflügl, C.; Diehl, L.; Capasso, Federico; Patel, C. Kumar N.
2010-01-01
A strain-balanced, AlInAs/InGaAs/InP quantum cascade laser structure, designed for light emission at 4.0 μm using nonresonant extraction design approach, was grown by molecular beam epitaxy. Laser devices were processed in buried heterostructure geometry. An air-cooled laser system incorporating a 10-mm × 11.5-μm laser with antireflection-coated front facet and high-reflection-coated back facet delivered over 2 W of single-ended optical power in a collimated beam. Maximum continuous-wave room temperature wall plug efficiency of 5.0% was demonstrated for a high-reflection-coated 3.65-mm × 8.7-μm laser mounted on an aluminum nitride submount.
Using incident response trees as a tool for risk management of online financial services.
Gorton, Dan
2014-09-01
The article introduces the use of probabilistic risk assessment for modeling the incident response process of online financial services. The main contribution is the creation of incident response trees, using event tree analysis, which provides us with a visual tool and a systematic way to estimate the probability of a successful incident response process against the currently known risk landscape, making it possible to measure the balance between front-end and back-end security measures. The model is presented using an illustrative example, and is then applied to the incident response process of a Swedish bank. Access to relevant data is verified and the applicability and usability of the proposed model is verified using one year of historical data. Potential advantages and possible shortcomings are discussed, referring to both the design phase and the operational phase, and future work is presented. © 2014 Society for Risk Analysis.
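An incident response tree of the kind described reduces, for a single initiating event, to walking a chain of barriers where each barrier either stops the incident or lets it propagate. The sketch below shows the arithmetic; the barrier names and probabilities are purely illustrative and are not estimates from the article or the studied bank.

```python
# Event-tree sketch for an online-fraud incident response process.
barriers = [
    ("front-end authentication", 0.90),
    ("back-end transaction monitoring", 0.70),
    ("manual fraud review", 0.50),
]

outcomes = {}
p_reach = 1.0                                   # probability the incident reaches the next barrier
for name, p_stop in barriers:
    outcomes[name] = p_reach * p_stop           # path: stopped at this barrier
    p_reach *= (1 - p_stop)                     # path: barrier fails, incident continues
outcomes["incident succeeds (loss)"] = p_reach  # no barrier stopped it

for where, prob in outcomes.items():
    print(f"{where}: {prob:.3f}")               # probabilities of the tree's end states sum to 1
```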
Delta-Doped Back-Illuminated CMOS Imaging Arrays: Progress and Prospects
NASA Technical Reports Server (NTRS)
Hoenk, Michael E.; Jones, Todd J.; Dickie, Matthew R.; Greer, Frank; Cunningham, Thomas J.; Blazejewski, Edward; Nikzad, Shouleh
2009-01-01
In this paper, we report the latest results on our development of delta-doped, thinned, back-illuminated CMOS imaging arrays. As with charge-coupled devices, thinning and back-illumination are essential to the development of high performance CMOS imaging arrays. Problems with back surface passivation have emerged as critical to the prospects for incorporating CMOS imaging arrays into high performance scientific instruments, just as they did for CCDs over twenty years ago. In the early 1990's, JPL developed delta-doped CCDs, in which low temperature molecular beam epitaxy was used to form an ideal passivation layer on the silicon back surface. Comprising only a few nanometers of highly-doped epitaxial silicon, delta-doping achieves the stability and uniformity that are essential for high performance imaging and spectroscopy. Delta-doped CCDs were shown to have high, stable, and uniform quantum efficiency across the entire spectral range from the extreme ultraviolet through the near infrared. JPL has recently bump-bonded thinned, delta-doped CMOS imaging arrays to a CMOS readout, and demonstrated imaging. Delta-doped CMOS devices exhibit the high quantum efficiency that has become the standard for scientific-grade CCDs. Together with new circuit designs for low-noise readout currently under development, delta-doping expands the potential scientific applications of CMOS imaging arrays, and brings within reach important new capabilities, such as fast, high-sensitivity imaging with parallel readout and real-time signal processing. It remains to demonstrate manufacturability of delta-doped CMOS imaging arrays. To that end, JPL has acquired a new silicon MBE and ancillary equipment for delta-doping wafers up to 200mm in diameter, and is now developing processes for high-throughput, high yield delta-doping of fully-processed wafers with CCD and CMOS imaging devices.
Optimization of digitization procedures in cultural heritage preservation
NASA Astrophysics Data System (ADS)
Martínez, Bea; Mitjà, Carles; Escofet, Jaume
2013-11-01
The digitization of both volumetric and flat objects is nowadays the preferred method for preserving cultural heritage items. High-quality digital files obtained from photographic plates, films and prints, paintings, drawings, gravures, fabrics and sculptures allow not only wider diffusion and online transmission, but also preservation of the original items from future handling. Early digitization procedures used scanners for flat opaque or translucent objects and cameras only for volumetric or highly textured flat materials. The technical obsolescence of high-end scanners and the improvement achieved by professional cameras have resulted in the wide use of cameras with digital backs to digitize any kind of cultural heritage item. Since the lens, the digital back, the software controlling the camera and the digital image processing provide a wide range of possibilities, it is necessary to standardize the methods used in the reproduction work so that the properties of the original item are preserved as faithfully as possible. This work presents an overview of methods used for camera system characterization, as well as the best procedures for identifying and counteracting the effects of residual lens aberrations, sensor aliasing, image illumination, color management and image optimization by means of parametric image processing. As a corollary, the work shows some examples of a reproduction workflow applied to the digitization of valuable art pieces and glass-plate black-and-white photographic negatives.
Volcanism in the Bransfield Strait, Antarctica
NASA Astrophysics Data System (ADS)
Fisk, M. R.
Back-arc and marginal basins make up a significant portion of the earth's crust and they can represent the transition from continental to oceanic crust. The Bransfield Strait is a young marginal basin of the arc-trench system that lies off the northwestern edge of the Antarctic Peninsula. The strait is about 65 km wide and has a maximum water depth of 2000 m. "Active" volcanoes in the Bransfield Strait include two seamounts, which are south of the eastern end of King George Island, and three island volcanoes — Penguin, Deception, and Bridgeman Islands. Alkaline and calc-alkaline suites occur on these islands, and the seamounts are composed of tholeiites and basaltic andesites. This diversity is similar to that found in some back-arc basins, but the Bransfield Strait basalts as a group cannot be classified as back-arc basin or island-arc basalts. The diverse rock types and the chemical similarity of some of the Bransfield Strait basalts to ophiolite basalts suggests that some ophiolites were generated in back-arc basins.
Hydrogen and elemental carbon production from natural gas and other hydrocarbons
Detering, Brent A.; Kong, Peter C.
2002-01-01
Diatomic hydrogen and unsaturated hydrocarbons are produced as reactor gases in a fast quench reactor. During the fast quench, the unsaturated hydrocarbons are further decomposed by reheating the reactor gases. More diatomic hydrogen is produced, along with elemental carbon. Other gas may be added at different stages in the process to form a desired end product and prevent back reactions. The product is a substantially clean-burning hydrogen fuel that leaves no greenhouse gas emissions, and elemental carbon that may be used in powder form as a commodity for several processes.
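Taking methane as the dominant component of natural gas, the net conversion the two-stage quench/reheat process drives is the standard pyrolysis stoichiometry below; this is a textbook simplification added for illustration, not an equation quoted from the patent.

```latex
\mathrm{CH_4 \;\longrightarrow\; C_{(s)} + 2\,H_2}
```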
Simplified Distributed Computing
NASA Astrophysics Data System (ADS)
Li, G. G.
2006-05-01
Distributed computing ranges from high-performance parallel computing and grid computing to environments in which idle CPU cycles and storage space of numerous networked systems are harnessed to work together over the Internet. In this work we focus on building an easy and affordable solution for computationally intensive problems in scientific applications based on existing technology and hardware resources. This system consists of a series of controllers. When a job request is detected by a monitor or initialized by an end user, the job manager launches the specific job handler for this job. The job handler pre-processes the job, partitions it into relatively independent tasks, and places the tasks in the processing queue. The task handler picks up the related tasks, processes them, and puts the results back into the processing queue. The job handler also monitors and examines the tasks and the results, and assembles the task results into the overall solution for the job request when all tasks are finished for each job. A resource manager configures and monitors all participating nodes. A distributed agent is deployed on all participating nodes to manage the software download and report status. The processing queue is the key to the success of this distributed system. We use BEA's WebLogic JMS queue in our implementation. It guarantees message delivery and has message priority and retry features so that tasks never get lost. The entire system is built on J2EE technology and can be deployed on heterogeneous platforms. It can handle algorithms and applications developed in any language on any platform. J2EE adaptors are provided to manage and connect existing applications to the system so that applications and algorithms running on Unix, Linux and Windows can all work together. This system is easy and fast to develop because it is based on the industry's well-adopted technology. It is highly scalable and heterogeneous. It is an open system, and any number and type of machines can join it to provide computational power. This asynchronous, message-based system can achieve response times on the order of a second. For efficiency, communication between distributed tasks is usually done at the start and end of the tasks, but intermediate task status can also be provided.
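The sketch below is a minimal in-process analogue of the job-handler/task-handler/queue pattern described above, using only Python's standard library rather than the WebLogic JMS queue; all class and function names are hypothetical.

```python
import queue
from dataclasses import dataclass

@dataclass
class Task:
    job_id: str
    index: int
    payload: list

def job_handler(job_id, data, n_parts, task_queue):
    """Partition a job into relatively independent tasks and enqueue them."""
    chunk = max(1, len(data) // n_parts)
    for i in range(0, len(data), chunk):
        task_queue.put(Task(job_id, i // chunk, data[i:i + chunk]))

def task_handler(task_queue, result_queue):
    """Pick up tasks, process them, and put the results back on a queue."""
    while not task_queue.empty():
        task = task_queue.get()
        result_queue.put((task.job_id, task.index, sum(task.payload)))

# Assemble the per-task results into the overall answer for the job.
tasks, results = queue.Queue(), queue.Queue()
job_handler("job-1", list(range(1000)), 4, tasks)
task_handler(tasks, results)
partials = [results.get() for _ in range(results.qsize())]
print(sum(p[2] for p in partials))   # 499500
```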
Interactive, process-oriented climate modeling with CLIMLAB
NASA Astrophysics Data System (ADS)
Rose, B. E. J.
2016-12-01
Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each component, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) are invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The Jupyter Notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields.
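A minimal sketch of the kind of interactive use described here is shown below; it assumes the EBM class and the integrate_years method documented for recent climlab releases, so the current climlab documentation should be checked before relying on it.

```python
# Classroom-style example: build a zonal-mean energy balance model from its
# component processes, run it toward equilibrium, and inspect the result.
import climlab

model = climlab.EBM(num_lat=90)     # energy balance model assembled from sub-processes
print(model.subprocess)             # the component process models it contains
model.integrate_years(5)            # step the coupled model forward in time
print(float(model.Ts.mean()))       # global-mean surface temperature (degrees C)
```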
Mendes, André Augusto M T; de Freitas, Sandra Maria Sbeghen Ferreira; Amorin, César Ferreira; Cabral, Cristina Maria Nunes; Padula, Rosimeire Simprini
2018-02-06
This study aimed to evaluate the effect of one workday on pain and perceived exertion, muscular strength, and electromyographic activity of the erector spinae muscles in welders with and without low back pain. This is an observational cohort study. Twenty-two welders from metallurgical shipbuilding were equally divided into 2 groups: low back pain and no low back pain. Outcomes were pain and perceived exertion, muscular strength assessed by maximal voluntary contractions, and electromyographic activity of the right and left erector spinae muscles during maximal voluntary contractions and in the 3 welding positions, measured at 2 periods of the workday (in the morning and at the end of the workday). At the end of the workday, pain increased significantly for the low back pain group (t(22) = 2.448; P = 0.023). Perceived exertion also increased significantly at the end of the workday for both groups (F(1,22) = 8.570, P = 0.000) and across workday periods (F(1,22) = 8.142, P = 0.000). There were no significant differences between groups and workday periods for muscular strength and electromyographic activity during maximal voluntary contractions of the erector spinae. There was also no significant difference in electromyographic activity between groups, workday periods, or the 3 welding positions. Although pain and perceived exertion increased at the end of the workday, these changes did not affect muscular strength or electromyographic activity of the right and left erector spinae muscles. Thus, we can conclude that welders with chronic low back pain had good physical capacity (muscular strength) and that muscle performance was maintained.
Schwartz, Yannick; Barbot, Alexis; Thyreau, Benjamin; Frouin, Vincent; Varoquaux, Gaël; Siram, Aditya; Marcus, Daniel S; Poline, Jean-Baptiste
2012-01-01
As neuroimaging databases grow in size and complexity, the time researchers spend investigating and managing the data increases at the expense of data analysis. As a result, investigators rely more and more heavily on scripting in high-level languages to automate data management and processing tasks. For this, structured and programmatic access to the data store is necessary. Web services are a first step toward this goal. However, they lack functionality and ease of use because they provide only low-level interfaces to databases. We introduce here PyXNAT, a Python module that interacts with The Extensible Neuroimaging Archive Toolkit (XNAT) through native Python calls across multiple operating systems. The choice of Python enables PyXNAT to expose the XNAT Web Services and unify their features with a higher level and more expressive language. PyXNAT gives XNAT users direct access to all the scientific packages in Python. Finally, PyXNAT aims to be efficient and easy to use, both as a back-end library for building XNAT clients and as an alternative front-end from the command line.
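A short example of the kind of scripted access described above is sketched below; the method names follow the PyXNAT documentation, but the server URL and credentials are placeholders and the exact calls should be checked against the installed version.

```python
from pyxnat import Interface

# Connect to an XNAT server (placeholder URL and credentials).
xnat = Interface(server='https://central.xnat.org',
                 user='my_login', password='my_password')

# Walk the data hierarchy (projects -> subjects) with native Python calls.
project_ids = xnat.select.projects().get()
subjects = xnat.select.project(project_ids[0]).subjects().get()
print(len(project_ids), 'projects;', len(subjects), 'subjects in the first project')
```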
Li, Zhao; Li, Jin; Yu, Peng
2018-01-01
Metadata curation has become increasingly important for biological discovery and biomedical research because a large amount of heterogeneous biological data is currently freely available. To facilitate efficient metadata curation, we developed an easy-to-use web-based curation application, GEOMetaCuration, for curating the metadata of Gene Expression Omnibus datasets. It can eliminate mechanical operations that consume precious curation time and can help coordinate curation efforts among multiple curators. It improves the curation process by introducing various features that are critical to metadata curation, such as a back-end curation management system and a curator-friendly front-end. The application is based on the commonly used Python/Django web development framework and is open-sourced under the GNU General Public License V3. GEOMetaCuration is expected to benefit the biocuration community and to contribute to computational generation of biological insights using large-scale biological data. An example use case can be found at the demo website: http://geometacuration.yubiolab.org. Database URL: https://bitbucket.com/yubiolab/GEOMetaCuration PMID:29688376
Schwartz, Yannick; Barbot, Alexis; Thyreau, Benjamin; Frouin, Vincent; Varoquaux, Gaël; Siram, Aditya; Marcus, Daniel S.; Poline, Jean-Baptiste
2012-01-01
As neuroimaging databases grow in size and complexity, the time researchers spend investigating and managing the data increases at the expense of data analysis. As a result, investigators rely more and more heavily on scripting in high-level languages to automate data management and processing tasks. For this, structured and programmatic access to the data store is necessary. Web services are a first step toward this goal. However, they lack functionality and ease of use because they provide only low-level interfaces to databases. We introduce here PyXNAT, a Python module that interacts with The Extensible Neuroimaging Archive Toolkit (XNAT) through native Python calls across multiple operating systems. The choice of Python enables PyXNAT to expose the XNAT Web Services and unify their features with a higher level and more expressive language. PyXNAT gives XNAT users direct access to all the scientific packages in Python. Finally, PyXNAT aims to be efficient and easy to use, both as a back-end library for building XNAT clients and as an alternative front-end from the command line. PMID:22654752
System for inspection of stacked cargo containers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Derenzo, Stephen
The present invention relates to a system for inspection of stacked cargo containers. One embodiment of the invention generally comprises a plurality of stacked cargo containers arranged in rows or tiers, each container having a top, a bottom, a first side, a second side, a front end, and a back end; a plurality of spacers arranged in rows or tiers; and one or more mobile inspection devices for inspecting the cargo containers, wherein the one or more inspection devices are removably disposed within the spacers, the inspection means being configured to move through the spacers to detect radiation within the containers. The invented system can also be configured to inspect the cargo containers for a variety of other potentially hazardous materials, including but not limited to explosive and chemical threats.
Back-Up/ Peak Shaving Fuel Cell System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staudt, Rhonda L.
2008-05-28
This Final Report covers the work executed by Plug Power from 8/11/03 to 10/31/07 under the statement of work for Topic 2: advancing the state of the art of fuel cell technology with the development of a new generation of commercially viable, stationary, back-up/peak-shaving fuel cell systems, the GenCore II. The program cost was $7.2M, with the Department of Energy share being $3.6M and Plug Power's share being $3.6M. The program started in August of 2003 and was scheduled to end in January of 2006; the actual program end date was October of 2007, after a no-cost extension was granted. The Department of Energy barriers addressed as part of this program are:
Technical Barriers for Distributed Generation Systems:
o Durability
o Power Electronics
o Start-up time
Technical Barriers for Fuel Cell Components:
o Stack Material and Manufacturing Cost
o Durability
o Thermal and water management
Background: The next generation GenCore backup fuel cell system to be designed, developed and tested by Plug Power under the program is the first mass-manufacturable design implementation of Plug Power's GenCore architected platform, targeted for battery and small generator replacement applications in the telecommunications, broadband and UPS markets. The next generation GenCore will be a standalone, H2-in/DC-out system. In designing the next generation GenCore specifically for the telecommunications market, Plug Power is teaming with BellSouth Telecommunications, Inc., a leading industry end user. The final next generation GenCore system is expected to represent a market-entry, mass-manufacturable and economically viable design. The technology will incorporate:
• A cost-reduced polymer electrolyte membrane (PEM) fuel cell stack tailored to hydrogen fuel use
• An advanced electrical energy storage system
• A modular, scalable power conditioning system tailored to market requirements
• A scaled-down, cost-reduced balance of plant (BOP)
• Network Equipment Building Standards (NEBS), UL and CE certifications
Caprioglio, Alberto; Beretta, Matteo; Lanteri, Claudio
2011-01-01
To compare the dento-alveolar and skeletal effects produced by two different molar intraoral distalization appliances, Pendulum and Fast-Back, both followed by fixed appliances, in the treatment of Class II malocclusion. 41 patients for Pendulum (18 males and 23 females) and 35 for Fast-Back (14 males and 21 females) were selected, with a mean age at the start of treatment of 12.11 years in the Pendulum group and 13.3 years in the Fast-Back group. The durations of the distalization phase were 8 months in the Pendulum group and 9 months in the Fast-Back group, and the durations of the second phase of treatment with fixed appliances were 19 months in the Pendulum group and 20 months in the Fast-Back group. Lateral cephalograms were analyzed at 3 observation times: before treatment, after distalization and after comprehensive orthodontic treatment. During molar distalization the Pendulum subjects showed greater distal molar movement and less anchorage loss at both the premolars and maxillary incisors than the Fast-Back subjects. Pendulum and Fast-Back produced similar amounts of distal molar movement and overcorrection of molar relationship at the end of distalization, though the Fast-Back induced a more bodily movement. Very little change occurred in the inclination of the mandibular plane at the end of the 2-phase treatment in both groups. At the end of treatment the maxillary first molars were on average 1 mm more distal in the Pendulum group compared to the Fast-Back group, while the total molar correction was 3.2 mm with 3.9° of distal inclination for the Pendulum and 2 mm with 1.1° of mesial inclination for the Fast-Back. Both appliances were equally effective in inducing a satisfactory Class I relationship in 97.2% of the cases. The Pendulum and the Fast-Back induce similar dentoskeletal effects. The use of the two distalization devices, therefore, can be considered clinically equivalent. Copyright © 2011 Società Italiana di Ortodonzia SIDO. Published by Elsevier Srl. All rights reserved.
Analysis of fractionation in corn-to-ethanol plants
NASA Astrophysics Data System (ADS)
Nelson, Camille
As the dry grind ethanol industry has grown, the research and technology surrounding ethanol production and co-product value have increased, including the use of back-end oil extraction and front-end fractionation. Front-end fractionation is the pre-fermentation separation of the corn kernel into 3 fractions: endosperm, bran, and germ. The endosperm fraction enters the existing ethanol plant, and a high-protein DDGS product remains after fermentation. High-value oil is extracted from the germ fraction, leaving corn germ meal and bran as co-products from the other two streams. These 3 co-products have a very different composition than traditional corn DDGS. Installing this technology allows ethanol plants to tap into more diverse markets and ultimately could increase profitability. An ethanol plant model was developed to evaluate both back-end oil extraction and front-end fractionation technology and predict the change in co-products based on the technology installed. The model runs in Microsoft Excel and requires inputs of whole corn composition (proximate analysis), amino acid content, and weight to predict co-product quantity and quality. User inputs include saccharification and fermentation efficiencies, plant capacity, and plant process specifications including front-end fractionation and back-end oil extraction, if applicable. This model provides plants a way to assess and monitor variability in co-product composition due to the variation in whole corn composition. Additionally, the co-products predicted in this model are entered into the US Pork Center of Excellence National Swine Nutrition Guide feed formulation software. This allows the plant user and animal nutritionists to evaluate the value of new co-products in existing animal diets.
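To make concrete the kind of mass balance such a model performs, the toy sketch below splits a corn stream into co-product streams; the split fractions are illustrative placeholders, not values from the thesis or its Excel model.

```python
def coproduct_masses(corn_kg, germ_frac=0.08, bran_frac=0.06,
                     oil_frac=0.035, starch_to_ethanol=0.60):
    """Toy front-end fractionation mass balance (illustrative splits only)."""
    germ = corn_kg * germ_frac
    bran = corn_kg * bran_frac
    endosperm = corn_kg - germ - bran
    oil = corn_kg * oil_frac                       # extracted from the germ stream
    germ_meal = germ - oil
    hp_ddgs = endosperm * (1 - starch_to_ethanol)  # endosperm left after fermentation
    return {"oil": oil, "germ meal": germ_meal, "bran": bran, "high-protein DDGS": hp_ddgs}

print(coproduct_masses(1000.0))   # kg of each co-product per 1000 kg of corn
```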
EB-Family Proteins: Functions and Microtubule Interaction Mechanisms.
Mustyatsa, V V; Boyakhchyan, A V; Ataullakhanov, F I; Gudimchuk, N B
2017-07-01
Microtubules are polymers of tubulin protein, one of the key components of the cytoskeleton. They are polar filaments whose plus-ends, usually oriented toward the cell periphery, are more dynamic than their minus-ends, which face the center of the cell. In cells, microtubules are organized into a network that is constantly rebuilt and renovated due to stochastic switching of its individual filaments from growth to shrinkage and back. Because of these dynamics and their mechanical properties, microtubules take part in various essential processes, from intracellular transport to search and capture of chromosomes during mitosis. Microtubule dynamics are regulated by many proteins that are located on the plus-ends of these filaments. One of the most important and abundant groups of plus-end-interacting proteins is the EB family of proteins, which autonomously recognize structures of the growing microtubule plus-ends, modulate their dynamics, and recruit multiple partner proteins with diverse functions onto the microtubule plus-ends. In this review, we summarize the published data on the properties and functions of EB-proteins, focusing on analysis of their mechanism of interaction with the growing microtubule ends.
Field experiment on CO2 back-production at the Ketzin pilot site
NASA Astrophysics Data System (ADS)
Martens, Sonja; Möller, Fabian; Schmidt-Hattenberger, Cornelia; Streibel, Martin; Szizybalski, Alexandra; Liebscher, Axel
2015-04-01
The operational phase of the Ketzin pilot site for geological CO2 storage in Germany started in June 2008 and ended in August 2013. Over the period of approximately five years, a total amount of 67 kt of CO2 was successfully injected into a saline aquifer (Upper Triassic sandstone) at a depth of 630 m - 650 m. The CO2 used was mainly of food grade quality. In addition, 1.5 kt of CO2 from the pilot capture facility "Schwarze Pumpe" (lignite power plant CO2) was used in 2011. At the end of the injection period, 32 t N2 and 613 t CO2 were co-injected during a four-week field test in July and August 2013. In October 2014, a field experiment was carried out at Ketzin with the aim of back-producing part of the injected CO2 during a two-week period. This experiment addressed two main questions: (i) How do reservoir and wellbore behave during back-production of CO2? and (ii) What is the composition of the CO2 and the co-produced formation fluid? The back-production was carried out through the former injection well. It was conducted continuously over the first week and with an alternating regime including production during day-time and shut-ins during night-time in the second week. During the test, a total amount of 240 t of CO2 and 57 m3 of brine were safely back-produced from the reservoir. Production rates up to 3,200 kg/h - which corresponds to the former highest injection rate - could be tested. Vital monitoring parameters included production rates of CO2 and brine, wellhead and bottomhole pressure and temperature at the production and observation wells, and distributed temperature sensing (DTS) along the production well. A permanently installed geoelectrical array was used for crosshole electrical resistivity tomography (ERT) monitoring of the reservoir. Formation fluid and gas samples were collected and analysed. The measured compositions allow studying the geochemical interactions between CO2, formation fluid and rocks under in-situ conditions. The field experiment indicates that safe back-production of CO2 is generally feasible and can be performed under stable reservoir and wellbore conditions. ERT monitoring shows that the geoelectrical array at the production well was capable of tracking the back-production process, e.g. the back-flow of brine into the parts formerly filled with CO2. Preliminary results also show that the back-produced CO2 at Ketzin has a purity > 97 per cent. The secondary component in the CO2 stream is N2 at < 3 per cent, which probably results from the former injection operations and field tests. The results will help to verify geochemical laboratory experiments, which are typically performed in simplified synthetic systems. The results gained at the Ketzin site refer to the pilot scale. Upscaling of the results to industrial scale is possible but must first be tested and validated at demo projects.
Design and Development of the SMAP Microwave Radiometer Electronics
NASA Technical Reports Server (NTRS)
Piepmeier, Jeffrey R.; Medeiros, James J.; Horgan, Kevin A.; Brambora, Clifford K.; Estep, Robert H.
2014-01-01
The SMAP microwave radiometer will measure land surface brightness temperature at L-band (1413 MHz) in the presence of radio frequency interference (RFI) for soil moisture remote sensing. The radiometer design was driven by the requirements to incorporate internal calibration, to operate synchronously with the SMAP radar, and to mitigate the deleterious effects of RFI. The system design includes a highly linear super-heterodyne microwave receiver with internal reference loads and noise sources for calibration and an innovative digital signal processor and detection system. The front-end comprises a coaxial cable-based feed network, with a pair of diplexers and a coupled noise source, and radiometer front-end (RFE) box. Internal calibration is provided by reference switches and a common noise source inside the RFE. The RF back-end (RBE) downconverts the 1413 MHz channel to an intermediate frequency (IF) of 120 MHz. The IF signals are then sampled and quantized by high-speed analog-to-digital converters in the radiometer digital electronics (RDE) box. The RBE local oscillator and RDE sampling clocks are phase-locked to a common reference to ensure coherency between the signals. The RDE performs additional filtering, sub-band channelization, cross-correlation for measuring third and fourth Stokes parameters, and detection and integration of the first four raw moments of the signals. These data are packetized and sent to the ground for calibration and further processing. Here we discuss the novel features of the radiometer hardware particularly those influenced by the need to mitigate RFI.
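The detection step described above reduces to simple accumulations over the digitized samples. The sketch below shows that arithmetic in NumPy for illustration only; it is not the SMAP flight firmware, and the variable names and the stand-in baseband samples are assumptions.

```python
import numpy as np

def detect_and_integrate(v_if, h_if, v_bb, h_bb):
    """Integrate the first four raw moments of each sampled IF channel and the
    complex V-H cross-correlation of the sub-band (baseband) samples, whose
    real and imaginary parts carry the 3rd and 4th Stokes parameters."""
    moments_v = [np.mean(v_if ** k) for k in (1, 2, 3, 4)]
    moments_h = [np.mean(h_if ** k) for k in (1, 2, 3, 4)]
    vh = np.mean(v_bb * np.conj(h_bb))            # complex cross-correlation
    return moments_v, moments_h, 2 * vh.real, 2 * vh.imag

# Synthetic, partially correlated noise standing in for the two polarizations.
rng = np.random.default_rng(1)
n = 8192
v = rng.standard_normal(n)
h = 0.2 * v + rng.standard_normal(n)
v_bb = v[: n // 2] + 1j * v[n // 2 :]
h_bb = h[: n // 2] + 1j * h[n // 2 :]
print(detect_and_integrate(v, h, v_bb, h_bb)[2:])
```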
Effects of verbal and nonverbal interference on spatial and object visual working memory.
Postle, Bradley R; Desposito, Mark; Corkin, Suzanne
2005-03-01
We tested the hypothesis that a verbal coding mechanism is necessarily engaged by object, but not spatial, visual working memory tasks. We employed a dual-task procedure that paired n-back working memory tasks with domain-specific distractor trials inserted into each interstimulus interval of the n-back tasks. In two experiments, object n-back performance demonstrated greater sensitivity to verbal distraction, whereas spatial n-back performance demonstrated greater sensitivity to motion distraction. Visual object and spatial working memory may differ fundamentally in that the mnemonic representation of featural characteristics of objects incorporates a verbal (perhaps semantic) code, whereas the mnemonic representation of the location of objects does not. Thus, the processes supporting working memory for these two types of information may differ in more ways than those dictated by the "what/where" organization of the visual system, a fact more easily reconciled with a component process than a memory systems account of working memory function.
Effects of verbal and nonverbal interference on spatial and object visual working memory
POSTLE, BRADLEY R.; D’ESPOSITO, MARK; CORKIN, SUZANNE
2005-01-01
We tested the hypothesis that a verbal coding mechanism is necessarily engaged by object, but not spatial, visual working memory tasks. We employed a dual-task procedure that paired n-back working memory tasks with domain-specific distractor trials inserted into each interstimulus interval of the n-back tasks. In two experiments, object n-back performance demonstrated greater sensitivity to verbal distraction, whereas spatial n-back performance demonstrated greater sensitivity to motion distraction. Visual object and spatial working memory may differ fundamentally in that the mnemonic representation of featural characteristics of objects incorporates a verbal (perhaps semantic) code, whereas the mnemonic representation of the location of objects does not. Thus, the processes supporting working memory for these two types of information may differ in more ways than those dictated by the “what/where” organization of the visual system, a fact more easily reconciled with a component process than a memory systems account of working memory function. PMID:16028575
New Decision Support for Landslide and Other Disaster Events
NASA Astrophysics Data System (ADS)
Nair, U. S.; Keiser, K.; Wu, Y.; Kaulfus, A.; Srinivasan, K.; Anderson, E. R.; McEniry, M.
2013-12-01
An Event-Driven Data delivery (ED3) framework has been created that provides reusable services and configurations to support better data preparedness for decision support of disasters and other events by rapidly providing pre-planned access to data, special processing, modeling and other capabilities, all executed in response to criteria-based events. ED3 enables decision makers to plan in advance of disasters and other types of events for the data necessary for decisions and response activities. A layer of services provided in the ED3 framework allows systems to support user definition of subscriptions for data plans that will be triggered when events matching specified criteria occur. Pre-planning for data in response to events lessens the burden on decision makers in the aftermath of an event and allows planners to think through the desired processing for specialized data products. Additionally, the ED3 framework provides support for listening for event alerts and support for multiple workflow managers that provide data and processing functionality in response to events. Landslides are often costly and, at times, deadly disaster events. Whereas intense and/or sustained rainfall is often the primary trigger for landslides, soil type and slope are also important factors in determining the location and timing of slope failure. Accounting for the substantial spatial variability of these factors is one of the major difficulties when predicting the timing and location of slope failures. A wireless sensor network (WSN), developed by NASA SERVIR and USRA, with peer-to-peer communication capability and low power consumption, is ideal for high-spatial-density in situ monitoring in remote locations. In collaboration with the University of Alabama in Huntsville, a WSN equipped with accelerometer, rainfall and soil moisture sensors is being integrated into an end-to-end landslide warning system. The WSN is being tested to ascertain communication capabilities and the density of nodes required depending upon the nature of terrain and land cover. The performance of a water table model, to be utilized in the end-to-end system, is being evaluated by comparing against landslides that occurred on the 6th and 7th of May 2003 and the 20th and 21st of April 2011. The model provides a deterministic assessment of slope stability by evaluating horizontal and vertical transport of underground water and the associated weight-bearing capacity. In the proposed end-to-end system, the model will be coupled to the WSN, and the in situ data collected will be used to drive the model. The output from the model could be communicated back to the WSN, providing the capability of sending warnings of possible events to the ED3 framework to trigger additional data retrieval or the processing of additional models based on decision makers' ED3 preparedness plans. NASA's Applied Science Program has funded a feasibility study of the ED3 technology and as a result the capability is on track to be integrated into existing decision support systems, with an initial reference implementation hosted at the Global Hydrology Resource Center, a NASA distributed active archive center (DAAC).
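The core of the criteria-based subscription idea can be sketched as predicates attached to pre-planned actions; the example below is a minimal, hypothetical illustration in Python and does not describe the actual ED3 services or their interfaces (all names and thresholds are made up).

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Subscription:
    name: str
    criteria: Callable[[Dict], bool]   # predicate over an incoming event alert
    action: Callable[[Dict], None]     # pre-planned data/processing workflow

def on_event(alert: Dict, subscriptions: List[Subscription]) -> None:
    """Dispatch an alert to every pre-planned data plan whose criteria match."""
    for sub in subscriptions:
        if sub.criteria(alert):
            sub.action(alert)

# Hypothetical plan: fetch imagery and run the slope model when rainfall and
# soil-moisture thresholds are exceeded (threshold values are made up).
plans = [Subscription(
    name="landslide-watch",
    criteria=lambda a: a["type"] == "rainfall"
                       and a["mm_24h"] > 100 and a["soil_moisture"] > 0.4,
    action=lambda a: print("Triggering data plan for region:", a["region"]),
)]

on_event({"type": "rainfall", "mm_24h": 130, "soil_moisture": 0.45,
          "region": "test-basin"}, plans)
```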
A MoTe2-based light-emitting diode and photodetector for silicon photonic integrated circuits.
Bie, Ya-Qing; Grosso, Gabriele; Heuck, Mikkel; Furchi, Marco M; Cao, Yuan; Zheng, Jiabao; Bunandar, Darius; Navarro-Moratalla, Efren; Zhou, Lin; Efetov, Dmitri K; Taniguchi, Takashi; Watanabe, Kenji; Kong, Jing; Englund, Dirk; Jarillo-Herrero, Pablo
2017-12-01
One of the current challenges in photonics is developing high-speed, power-efficient, chip-integrated optical communications devices to address the interconnects bottleneck in high-speed computing systems. Silicon photonics has emerged as a leading architecture, in part because of the promise that many components, such as waveguides, couplers, interferometers and modulators, could be directly integrated on silicon-based processors. However, light sources and photodetectors present ongoing challenges. Common approaches for light sources include one or few off-chip or wafer-bonded lasers based on III-V materials, but recent system architecture studies show advantages for the use of many directly modulated light sources positioned at the transmitter location. The most advanced photodetectors in the silicon photonic process are based on germanium, but this requires additional germanium growth, which increases the system cost. The emerging two-dimensional transition-metal dichalcogenides (TMDs) offer a path for optical interconnect components that can be integrated with silicon photonics and complementary metal-oxide-semiconductors (CMOS) processing by back-end-of-the-line steps. Here, we demonstrate a silicon waveguide-integrated light source and photodetector based on a p-n junction of bilayer MoTe 2 , a TMD semiconductor with an infrared bandgap. This state-of-the-art fabrication technology provides new opportunities for integrated optoelectronic systems.
A MoTe2-based light-emitting diode and photodetector for silicon photonic integrated circuits
NASA Astrophysics Data System (ADS)
Bie, Ya-Qing; Grosso, Gabriele; Heuck, Mikkel; Furchi, Marco M.; Cao, Yuan; Zheng, Jiabao; Bunandar, Darius; Navarro-Moratalla, Efren; Zhou, Lin; Efetov, Dmitri K.; Taniguchi, Takashi; Watanabe, Kenji; Kong, Jing; Englund, Dirk; Jarillo-Herrero, Pablo
2017-12-01
One of the current challenges in photonics is developing high-speed, power-efficient, chip-integrated optical communications devices to address the interconnects bottleneck in high-speed computing systems. Silicon photonics has emerged as a leading architecture, in part because of the promise that many components, such as waveguides, couplers, interferometers and modulators, could be directly integrated on silicon-based processors. However, light sources and photodetectors present ongoing challenges. Common approaches for light sources include one or few off-chip or wafer-bonded lasers based on III-V materials, but recent system architecture studies show advantages for the use of many directly modulated light sources positioned at the transmitter location. The most advanced photodetectors in the silicon photonic process are based on germanium, but this requires additional germanium growth, which increases the system cost. The emerging two-dimensional transition-metal dichalcogenides (TMDs) offer a path for optical interconnect components that can be integrated with silicon photonics and complementary metal-oxide-semiconductors (CMOS) processing by back-end-of-the-line steps. Here, we demonstrate a silicon waveguide-integrated light source and photodetector based on a p-n junction of bilayer MoTe2, a TMD semiconductor with an infrared bandgap. This state-of-the-art fabrication technology provides new opportunities for integrated optoelectronic systems.
LDEF Retrieval over the Namib Desert, Namibia, Africa
1990-01-20
STS032-85-029 (12 Jan. 1990) --- (ORIENT PHOTO WITH COLUMBIA'S CARGO BAY IN LOWER CENTER). This 70mm frame was taken during a battery of documentary photographs of the recently recaptured Long Duration Exposure Facility (LDEF). The Atlantic Coast of Namibia serves as a backdrop for the colorful scene. After five and one-half years orbiting Earth, LDEF was retrieved by STS-32 crewmembers and brought back home at the end of the eleven-day mission for scientific observation. The bus-sized spacecraft was held in the grasp of Columbia's remote manipulator system (RMS) end effector during the survey.
Hagman, Ingela; Tegern, Matthias; Broman, Lisbet; Larsson, Helena
2018-01-01
Background: Musculoskeletal complaints and injuries (MSCI) are common in military populations. However, only a limited number of studies have followed soldiers during international deployments and investigated the prevalence of MSCI during and at the end of their deployment. The aim was to describe the prevalence of MSCI in different military occupational specialties and categorise their most common tasks in terms of exposures to physical workloads during a six-month long international deployment in Afghanistan. Methods: Cross-sectional survey, including 325 soldiers (300 men), aged 20–62, participating in an international deployment in Afghanistan during the spring of 2012. Soldiers were clustered into different military occupational specialties: Infantry, Administration, Logistics, Logistics/Camp, Medical and Other. Data were collected through the use of the Musculoskeletal Screening Protocol at the end of the international mission. Results: Forty-seven percent reported MSCI during deployment, with 28% at the end. The most common locations of MSCI during the mission were lower back, knee, shoulders, upper back, neck and foot, while the knee and lower back prevailed at the end of the mission. Almost half of the soldiers who had MSCI reported affected work ability. The most common duties during the mission were vehicle patrolling, staff duties, guard/security duties, foot patrols and transportation. Soldiers reported that vehicle patrolling, staff duties and transportation were demanding with respect to endurance strength, guard/security duties challenged both maximum and endurance strength, while foot patrolling challenged maximum and endurance strength, aerobic and anaerobic endurance and speed. Conclusions: MSCI during international deployment are common among Swedish soldiers. The results indicate the need to further develop strategies focusing on matching the soldiers' capacity to the job requirements, with relevant and fair physical selection tests during the recruitment process and proactive interventions targeting MSCI before and during deployment, in order to enhance soldiers' readiness and promote operational readiness. PMID:29621324
Frank Gilbreth and health care delivery method study driven learning.
Towill, Denis R
2009-01-01
The purpose of this article is to look at method study, as devised by the Gilbreths at the beginning of the twentieth century, which found early application in hospital quality assurance and surgical "best practice". It has since become a core activity in all modern methods, as applied to healthcare delivery improvement programmes. The article traces the origin of what is now currently and variously called "business process re-engineering", "business process improvement" and "lean healthcare" etc., by different management gurus back to the century-old pioneering work of Frank Gilbreth. The outcome is a consistent framework involving "width", "length" and "depth" dimensions within which healthcare delivery systems can be analysed, designed and successfully implemented to achieve better and more consistent performance. Healthcare method (saving time plus saving motion) study is best practised as co-joint action learning activity "owned" by all "players" involved in the re-engineering process. However, although process mapping is a key step forward, in itself it is no guarantee of effective re-engineering. It is not even the beginning of the end of the change challenge, although it should be the end of the beginning. What is needed is innovative exploitation of method study within a healthcare organisational learning culture accelerated via the Gilbreth Knowledge Flywheel. It is shown that effective healthcare delivery pipeline improvement is anchored into a team approach involving all "players" in the system especially physicians. A comprehensive process study, constructive dialogue, proper and highly professional re-engineering plus managed implementation are essential components. Experience suggests "learning" is thereby achieved via "natural groups" actively involved in healthcare processes. The article provides a proven method for exploiting Gilbreths' outputs and their many successors in enabling more productive evidence-based healthcare delivery as summarised in the "learn-do-learn-do" feedback loop in the Gilbreth Knowledge Flywheel.
Acoustic system for material transport
NASA Technical Reports Server (NTRS)
Barmatz, M. B.; Trinh, E. H.; Wang, T. G.; Elleman, D. D.; Jacobi, N. (Inventor)
1983-01-01
An object within a chamber is acoustically moved by applying wavelengths of different modes to the chamber to move the object between pressure wells formed by the modes. In one system, the object is placed in one end of the chamber while a resonant mode, applied along the length of the chamber, produces a pressure well at that location. The frequency is then switched to a second mode that produces a pressure well at the center of the chamber, to draw the object. When the object reaches the second pressure well and is still traveling towards the second end of the chamber, the acoustic frequency is again shifted to a third mode (which may equal the first mode) that has a pressure well in the second end portion of the chamber, to draw the object. A heat source may be located near the second end of the chamber to heat the sample, and after the sample is heated it can be cooled by moving it in a corresponding manner back to the first end of the chamber. The transducers for levitating and moving the object may all be located at the cool first end of the chamber.
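For context, the behavior described above follows from the standing-wave pressure field of each resonant mode; the relations below are standard acoustics for an assumed rigid-walled chamber of length L and sound speed c, not expressions quoted from the patent.

```latex
% Pressure field of the n-th longitudinal mode:
p_n(x,t) = p_0 \cos\!\left(\frac{n\pi x}{L}\right)\cos(\omega_n t),
\qquad \omega_n = \frac{n\pi c}{L}.
% Small dense samples are driven toward the pressure nodes (the "wells"), located at
x_k = \frac{(2k+1)\,L}{2n}, \qquad k = 0,1,\dots,n-1,
% so switching the drive from mode n to mode n' relocates the wells and carries the
% sample from one well position to the next.
```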
Recent advancements in low cost solar cell processing
NASA Technical Reports Server (NTRS)
Ralph, E. L.
1975-01-01
A proof-of-concept solar cell process has been developed that is adaptable to automation. This involved the development of a new contact system, a new antireflection coating system, a drift field cell design and a new contoured surface treatment. All these processes are performed without the use of vacuum chambers and expensive masking techniques, thus providing the possibility of reduced costs by automation using conventional semiconductor processing machinery. The contacts were printed on the cells by conventional silk screen machinery. The P(+) back field was formed by diffusing in aluminum from a printed aluminum back contact. The antireflection coating was formed by spinning on and baking a TiO2-SiO2 glass film. Air-mass-zero efficiencies of over 10% were achieved using this completely vacuum-free process.
Realization of back-side heterogeneous hybrid III-V/Si DBR lasers for silicon photonics
NASA Astrophysics Data System (ADS)
Durel, Jocelyn; Ferrotti, Thomas; Chantre, Alain; Cremer, Sébastien; Harduin, Julie; Bernabé, Stéphane; Kopp, Christophe; Boeuf, Frédéric; Ben Bakir, Badhise; Broquin, Jean-Emmanuel
2016-02-01
In this paper, the simulation, design and fabrication of a back-side coupling (BSC) concept for silicon photonics, which targets heterogeneous hybrid III-V/Si laser integration, are presented. Though various demonstrations of complete SOI integration of passive and active photonic devices have been made, they all feature multi-level planar metal interconnects and lack integrated light sources. This is mainly due to the conflict between the need for planar surfaces for III-V/Si bonding and multiple levels of metallization. The proposed BSC solution to this topographical problem consists in fabricating lasers on the back side of the Si waveguides using a new process sequence. The devices are based on a hybrid structure composed of an InGaAsP MQW active area and a Si-based DBR cavity. The emitted wavelength is tunable over a range of 20 nm around 1.31 μm thanks to thermal heaters, and the laser output is fiber-coupled through a Grating Coupler (GC). From a manufacturing point of view, the BSC approach not only allows the use of a thin-BOX SOI instead of a thick one; it also shifts the laser processing steps, and the associated materials that are unfriendly to the CMOS process, to the far back-end areas of fabrication lines. Moreover, aside from solving technological integration issues, the BSC concept offers several new design opportunities for active and passive devices (heat sink, Bragg gratings, grating couplers enhanced with integrated metallic mirrors, tapers…). These building blocks are explored here theoretically and experimentally.
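The lasing wavelength of such a DBR cavity is set by the Bragg condition; the relations below are the general textbook expressions with an assumed effective index, not values taken from the paper.

```latex
% Bragg condition for the silicon DBR mirrors (grating period \Lambda, effective index n_{eff}):
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda .
% Thermal tuning works through the thermo-optic shift of n_{eff}:
\Delta\lambda_B \approx \lambda_B\, \frac{\Delta n_{\mathrm{eff}}}{n_{\mathrm{eff}}},
% so, for example, a ~1% index change near \lambda_B = 1.31\,\mu\mathrm{m} shifts the
% wavelength by roughly 13 nm, the order of the 20 nm tuning range reported above.
```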
Miniaturized Airborne Imaging Central Server System
NASA Technical Reports Server (NTRS)
Sun, Xiuhong
2011-01-01
In recent years, some remote-sensing applications have required advanced airborne multi-sensor systems that provide high-performance reflective and emissive spectral imaging measurements rapidly over large areas. The key challenge is the black-box back-end system that operates a suite of cutting-edge imaging sensors to collect simultaneously high-throughput reflective and emissive spectral imaging data with precision georeference. This back-end system needs to be portable, easy to use, and reliable, with advanced onboard processing. The innovation of the black-box back-end is a miniaturized airborne imaging central server system (MAICSS). MAICSS integrates a complex embedded system of systems with dedicated power and signal electronic circuits inside to serve a suite of configurable cutting-edge electro-optical (EO), long-wave infrared (LWIR), and medium-wave infrared (MWIR) cameras, a hyperspectral imaging scanner, and a GPS and inertial measurement unit (IMU) for atmospheric and surface remote sensing. Its compatible sensor packages include NASA's 1,024 x 1,024 pixel LWIR quantum well infrared photodetector (QWIP) imager; a 60.5-megapixel BuckEye EO camera; and a fast (e.g., 200+ scanlines/s) and wide swath-width (e.g., 1,920+ pixels) CCD/InGaAs imager-based visible/near infrared reflectance (VNIR) and shortwave infrared (SWIR) imaging spectrometer. MAICSS records continuous, precision-georeferenced, and time-tagged multisensor throughputs to mass storage devices at a high aggregate rate, typically 60 MB/s for its LWIR/EO payload. MAICSS is a complete stand-alone imaging server instrument with an easy-to-use software package for either autonomous data collection or interactive airborne operation. Advanced multisensor data acquisition and onboard processing software features have been implemented for MAICSS. With onboard processing for real-time image development, correction, histogram equalization, compression, georeference, and data organization, fast aerial imaging applications, including a real-time LWIR image mosaic for Google Earth, have been realized for NASA's LWIR QWIP instrument. MAICSS is a significant improvement and miniaturization of current multisensor technologies. Structurally, it has a completely modular and solid-state design. Without rotating hard drives and other moving parts, it is operational at high altitudes and survivable in high-vibration environments. It is assembled from a suite of miniaturized, precision-machined, standardized, and stackable interchangeable embedded instrument modules. These stackable modules can be bolted together with the interconnection wires inside for maximal simplicity and portability. Multiple modules are electronically interconnected as stacked. Alternatively, these dedicated modules can be flexibly distributed to fit the space constraints of a flying vehicle. As a flexibly configurable system, MAICSS can be tailored to interface with a variety of multisensor packages. For example, with a 1,024 x 1,024 pixel LWIR and an 8,984 x 6,732 pixel EO payload, the complete MAICSS volume is approximately 7 x 9 x 11 in. (about 18 x 23 x 28 cm), with a weight of 25 lb (about 11.4 kg).
Advancing the science of forensic data management
NASA Astrophysics Data System (ADS)
Naughton, Timothy S.
2002-07-01
Many individual elements comprise a typical forensics process. Collecting evidence, analyzing it, and using results to draw conclusions are all mutually distinct endeavors. Different physical locations and personnel are involved, juxtaposed against an acute need for security and data integrity. Using digital technologies and the Internet's ubiquity, these diverse elements can be conjoined using digital data as the common element. The result is a new data management process that can be applied to serve all elements of the community. The first step is recognition of a forensics lifecycle. Evidence gathering, analysis, storage, and use in legal proceedings are actually just distinct parts of a single end-to-end process, and thus it is hypothesized that a single data system can accommodate each constituent phase using common network and security protocols. This paper introduces the idea of a web-based Central Data Repository. Its cornerstone is anywhere, anytime Internet upload, viewing, and report distribution. Archives exist indefinitely after being created, and high-strength security and encryption protect data and ensure subsequent case file additions do not violate chain-of-custody or other handling provisions. Several legal precedents have been established for using digital information in courts of law, and in fact, effective prosecution of cyber crimes absolutely relies on its use. An example is a US Department of Agriculture division's use of digital images to back up its inspection process, with pictures and information retained on secure servers to enforce the Perishable Agricultural Commodities Act. Forensics is a cumulative process. Secure, web-based data management solutions, such as the Central Data Repository postulated here, can support each process step. Logically marrying digital technologies with Internet accessibility should help nurture a thought process to explore alternatives that make forensics data accessible to authorized individuals, whenever and wherever they need it.
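One generic way to make later case-file additions verifiable without disturbing earlier evidence is a hash-chained log; the sketch below illustrates that idea with Python's standard library and is not a description of the Central Data Repository's actual mechanism (record fields and names are hypothetical).

```python
import hashlib, json, time

def append_entry(log, evidence_bytes, meta):
    """Append an evidence record to a hash-chained log; altering any earlier
    entry afterwards breaks the chain. Illustrative only."""
    prev = log[-1]["entry_hash"] if log else "0" * 64
    record = {
        "timestamp": time.time(),
        "meta": meta,
        "evidence_sha256": hashlib.sha256(evidence_bytes).hexdigest(),
        "prev_hash": prev,
    }
    record["entry_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    log.append(record)
    return record

def verify(log):
    """Recompute every hash to confirm the chain is intact."""
    prev = "0" * 64
    for rec in log:
        body = {k: v for k, v in rec.items() if k != "entry_hash"}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev_hash"] != prev or rec["entry_hash"] != digest:
            return False
        prev = rec["entry_hash"]
    return True

log = []
append_entry(log, b"scene photo bytes", {"case": "2002-001", "item": "photo-1"})
print(verify(log))   # True unless an earlier record has been altered
```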
[Description of the mental processes occurring during clinical reasoning].
Pottier, P; Planchon, B
2011-06-01
Clinical reasoning is a highly complex system with multiple interdependent mental activities. Gaining a better understanding of these cognitive processes has two practical implications: for physicians, being able to analyse their own reasoning method may prove helpful in diagnostic dead ends; for medical teachers, identifying the problem-solving strategies used by medical students may foster appropriate individual feedback aimed at improving their clinical reasoning skills. On the basis of a detailed literature review, the main diagnostic strategies and their related patterns of mental processes are described and illustrated with a concrete example, going from the patient's complaint to the chosen solution. Inductive, abductive and deductive diagnostic approaches are detailed. Different strategies for collecting data (exhaustive or oriented) and for problem-building are described. The place of problem-solving strategies such as pattern recognition, scheme-inductive process, use of clinical scripts, syndrome grouping and mental hypothesis testing is considered. This work aims at breaking down the mental activities at work within clinical reasoning, recalling that expert reasoning is characterised by the ability to use and structure the whole of these activities in a coherent system, using combined strategies in order to guarantee better diagnostic accuracy. Copyright © 2010 Société nationale française de médecine interne (SNFMI). Published by Elsevier SAS. All rights reserved.
A synthetic polymer system with repeatable chemical recyclability
NASA Astrophysics Data System (ADS)
Zhu, Jian-Bo; Watson, Eli M.; Tang, Jing; Chen, Eugene Y.-X.
2018-04-01
The development of chemically recyclable polymers offers a solution to the end-of-use issue of polymeric materials and provides a closed-loop approach toward a circular materials economy. However, polymers that can be easily and selectively depolymerized back to monomers typically require low-temperature polymerization methods and also lack the physical properties and mechanical strength required for practical uses. We introduce a polymer system based on γ-butyrolactone (GBL) with a trans-ring fusion at the α and β positions. Such trans-ring fusion renders the GBL ring, commonly considered nonpolymerizable, readily polymerizable at room temperature under solvent-free conditions to yield a high-molecular-weight polymer. The polymer has enhanced thermostability and can be repeatedly and quantitatively recycled back to its monomer by thermolysis or chemolysis. Mixing of the two enantiomers of the polymer generates a highly crystalline supramolecular stereocomplex.
Evolution of filmless PACS in Korea
NASA Astrophysics Data System (ADS)
Choi, Hyung-Sik
2002-05-01
The growth of the PACS (Picture Archiving and Communications System) market in Korea over the past 10 years has been a remarkable development. In reaching these achievements, the efforts of the Korean Society of PACS, the government's support of the information technology industry and the efforts of PACS companies in market expansion all served as seeds sown at the right time. By the end of 2001, 21% of all Korean hospitals were in clinical operation using filmless full PACS, believed to be the first such instance in the world. The purpose of this paper is to look back upon the growth of filmless PACS in Korea and analyze the causes of this tremendous growth. I believe that the Korean PACS experience will be helpful to many PACS experts who hope for a wider proliferation of PACS.
Plasma process control with optical emission spectroscopy
NASA Astrophysics Data System (ADS)
Ward, P. P.
Plasma processes for cleaning, etching and desmear of electronic components and printed wiring boards (PWB) are difficult to predict and control. Non-uniformity of most plasma processes and sensitivity to environmental changes make it difficult to maintain process stability from day to day. To assure plasma process performance, weight loss coupons or post-plasma destructive testing must be used. The problem with these techniques is that they are not real-time methods and do not allow for immediate diagnosis and process correction. These methods often require scrapping some fraction of a batch to ensure the integrity of the rest. Since these methods verify a successful cycle with post-plasma diagnostics, poor test results often determine that a batch is substandard and the resulting parts unusable. Both of these methods add significantly to the overall fabrication cost. A more efficient method of testing would allow for constant monitoring of plasma conditions and process control. Process failures should be detected before the parts being treated are damaged. Real-time monitoring would allow for instantaneous corrections. Multiple-site monitoring would allow for process mapping within one system or simultaneous monitoring of multiple systems. Optical emission spectroscopy conducted external to the plasma apparatus would allow for this sort of multifunctional analysis without perturbing the glow discharge. In this paper, optical emission spectroscopy for non-intrusive, in situ process control will be explored. A discussion of this technique as it applies towards process control, failure analysis and endpoint determination will be conducted. Methods for identifying process failures and for tracking the progress and endpoint of etch-back and desmear processes will be discussed.
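Endpoint determination from emission spectra often amounts to watching a process-specific emission line and flagging when its intensity falls away from its plateau; the sketch below illustrates that logic on a synthetic trace (the window, threshold and trace are made-up values, not parameters from this paper).

```python
import numpy as np

def detect_endpoint(intensity, window=15, drop_fraction=0.3):
    """Flag the end of an etch/desmear step when a monitored emission line's
    smoothed intensity falls a set fraction below its early-cycle plateau.
    Illustrative only; thresholds would be tuned per process and wavelength."""
    smoothed = np.convolve(intensity, np.ones(window) / window, mode="valid")
    plateau = smoothed[:window].mean()                 # early-cycle reference level
    below = np.where(smoothed < (1 - drop_fraction) * plateau)[0]
    return int(below[0]) + window - 1 if below.size else None   # sample index

# Synthetic trace: stable emission that decays once the target material is consumed.
t = np.arange(600)
noise = 0.02 * np.random.default_rng(3).standard_normal(600)
trace = np.where(t < 400, 1.0, np.exp(-(t - 400) / 40.0)) + noise
print(detect_endpoint(trace))
```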
Compact Fiber-Parametric Devices for Biophotonics Applications
2012-03-01
coming in the fiber from the pump overlap temporally and spatially with the pulses fed back from a Fabry-Perot cavity (Sharping, 2010). Fiber optical...Some laser systems such as the Nd:YAG system used in this study, uses a Fabry-Perot cavity in which two mirrors are arranged parallel to one another... Fabry-Perot cavity formed between one end of the PCF and a metallic mirror (M3). The output coupler is a short-pass dielectric (SPD) or a long-pass
The use of PDAs to collect baseline survey data: lessons learned from a pilot project in Bolivia.
Escandon, I N; Searing, H; Goldberg, R; Duran, R; Arce, J Monterrey
2008-01-01
We compared the use of personal digital assistants (PDAs) against the use of standard paper questionnaires for collecting survey data. The evaluation consisted of qualitative approaches to document the process of introducing PDAs. Fieldwork was carried out during June-July 2005 at 12 sites in Bolivia. Data collectors reacted positively to the use of the PDAs and noted the advantages and disadvantages of paper and PDA data collection. A number of difficulties encountered in the use of PDA technology serve as a warning for investigators planning its adoption. Problems included incompatible data files (which impeded the ability to interpret data), an inadequate back-up protocol, and lack of a good 'fit' between the technology and the study. Ensuring the existence of a back-end database, developing an appropriate and adequate back-up protocol, and assessing whether a technology 'fits' the project are important factors in weighing the decision to collect data using PDAs.
Fighting back against America's public health enemy number one.
Spickard, W. A.; Dixon, G. L.; Sarver, F. W.
1994-01-01
Fighting Back is a comprehensive substance abuse program operating in 14 communities spread throughout the United States. The Robert Wood Johnson Foundation has committed more than $45 million over a 7-year period to plan and implement innovative, community-wide initiatives in Columbia, SC; Charlotte, NC; Kansas City, Mo; Little Rock, Ark; Northwest New Mexico; Milwaukee, Wis; New Haven, Conn; Newark, NJ; Oakland, Calif; San Antonio, Tex; Santa Barbara, Calif; Vallejo, Calif; Washington, DC; and Worcester, Mass. In this article the work in progress at the end of 18 months of a 5-year implementation program in each site is reported. A Fighting Back National Program Office operates from a base at Vanderbilt University Medical Center in Nashville, Tenn. The senior staff of this office highlights the process that has unfolded to date, describes some of the sources of encouragement, and discusses some of the critical issues and sources of concern. A "Call to Action" on the part of the federal government is included. PMID:8069272
Resource Analysis of Cognitive Process Flow Used to Achieve Autonomy
2016-03-01
to be used as a decision-making aid to guide system designers and program managers not necessarily familiar with cognitive processing, or resource...implementing end-to-end cognitive processing flows multiplies and the impact of these design decisions on efficiency and effectiveness increases [1]. The...end-to-end cognitive systems and alternative computing technologies, then system design and acquisition personnel could make systematic analyses and
NASA Astrophysics Data System (ADS)
Swastika, Windra
2017-03-01
A recognition system for the nominal value of banknotes has been developed using an Artificial Neural Network (ANN). ANN with Back Propagation has one disadvantage: the learning process is very slow (or never reaches the target) when the numbers of iterations, weights, and samples are large. One way to speed up the learning process is the Quickprop method. The Quickprop method is based on Newton's method and is able to speed up learning by assuming that the error E is approximately a parabolic function of each weight adjustment; the goal is to minimize the error gradient (E'). In our system, we use 5 nominal values, i.e. 1,000 IDR, 2,000 IDR, 5,000 IDR, 10,000 IDR and 50,000 IDR. One surface of each denomination was scanned and digitally processed. There are 40 patterns used as the training set in the ANN system. The effectiveness of the Quickprop method in the ANN system was validated by 2 factors: (1) the number of iterations required to reach an error below 0.1; and (2) the accuracy in predicting nominal values based on the input. Our results show that the Quickprop method successfully shortens the learning process compared to the Back Propagation method. For 40 input patterns, the Quickprop method reached an error below 0.1 in only 20 iterations, while the Back Propagation method required 2000 iterations. The prediction accuracy for both methods is higher than 90%.
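A rough illustration of the Quickprop step described above (not the authors' code; the growth-factor limit and fallback learning rate are assumptions):

```python
import numpy as np

def quickprop_step(grad, prev_grad, prev_step, lr=0.1, mu=1.75):
    """One Quickprop weight update under the parabolic error assumption.

    grad, prev_grad : current / previous error gradients dE/dw (arrays)
    prev_step       : previous weight changes
    lr              : learning rate used when no previous step exists
    mu              : maximum growth factor limiting the step size
    """
    # parabola-minimum step; small epsilon avoids division by zero
    quick = prev_step * grad / (prev_grad - grad + 1e-12)
    # limit each step to mu times the previous step's magnitude
    quick = np.clip(quick, -mu * np.abs(prev_step), mu * np.abs(prev_step))
    # fall back to plain gradient descent where there was no previous step
    return np.where(prev_step != 0.0, quick, -lr * grad)
```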
A Note on Knowledge in the Schooled Society: Towards an End to the Crisis in Curriculum Theory
ERIC Educational Resources Information Center
Baker, David P.
2015-01-01
Michael Young's recent paper in this journal is correct; there is a profound crisis in curriculum theory, and to be intellectually viable into the future the field must strive to "bring back" in empirical study of curriculum. Also by ignoring the empirical content of knowledge and access to it in mass education systems throughout the…
Detering, Brent A.; Kong, Peter C.
2001-01-01
Carbon monoxide is produced in a fast quench reactor. The production of carbon monoxide includes injecting carbon dioxide and some air into a reactor chamber that has a high temperature at its inlet and a means for rapidly expanding the reactant stream, such as a restrictive convergent-divergent nozzle, at its outlet end. Carbon dioxide and other reactants such as methane and other low molecular weight hydrocarbons are injected into the reactor chamber. Other gases may be added at different stages in the process to form a desired end product and prevent back reactions. The resulting heated gaseous stream is then rapidly cooled by expansion of the gaseous stream.
NASA Technical Reports Server (NTRS)
Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.
2013-01-01
In this paper we discuss the high throughput end-to-end post fabrication processing of high performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far ultraviolet and ultraviolet quantum efficiency (QE) in a photon-counting detector array. We have improved the QE by nearly an order of magnitude over microchannel plates (MCPs), which are the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering using Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).
Evaluation of Apache Hadoop for parallel data analysis with ROOT
NASA Astrophysics Data System (ADS)
Lehrack, S.; Duckeck, G.; Ebke, J.
2014-06-01
The Apache Hadoop software is a Java based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be processed in arbitrary chunks, and must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives for distributing analysis data were considered: either the data was stored in HDFS and processed with MapReduce, or the data was accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as an execution back-end. The focus of the measurements was, on the one hand, to safely store analysis data on HDFS with reasonable data rates and, on the other hand, to process data quickly and reliably with MapReduce. In the evaluation of HDFS, read/write data rates from the local Hadoop cluster were measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses were used and event rates were compared to PROOF.
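One common workaround for the whole-file (non-splittable) constraint noted above is to stream file paths, rather than file contents, to the mappers; the sketch below assumes a hypothetical analyse() routine standing in for the ROOT event loop and is not the configuration used in the paper.

```python
#!/usr/bin/env python
# Hadoop-streaming mapper: each input line is the path of one ROOT file.
# The file is processed as a whole, and tab-separated (key, value) pairs
# are emitted for the reducer to aggregate (here: selected event counts).
import sys

def analyse(path):
    """Placeholder for the real per-file ROOT analysis (an assumption).
    Would open the file (e.g. with PyROOT or uproot) and loop over events,
    returning the number of events passing the selection."""
    return 0

for line in sys.stdin:
    path = line.strip()
    if not path:
        continue
    selected = analyse(path)
    print(f"{path}\t{selected}")
```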
LWAs computational platform for e-consultation using mobile devices: cases from developing nations.
Olajubu, Emmanuel Ajayi; Odukoya, Oluwatoyin Helen; Akinboro, Solomon Adegbenro
2014-01-01
Mobile devices have been improving the standard of living in developing nations by providing timely and accurate information anywhere and anytime through wireless media. A shortage of experts in medical fields is evident throughout the world but is more pronounced in developing nations. Thus, this study proposes a telemedicine platform for the vulnerable areas of developing nations. The vulnerable areas are the interior regions with little or no medical facilities, where dwellers are highly susceptible to sickness and disease. The framework uses mobile devices that can run LightWeight Agents (LWAs) to send consultation requests from the vulnerable interiors to a remote medical expert in an urban city. The feedback is conveyed to the requester through the same medium. The system architecture, which contains AgenRoller, LWAs, the front-end (mobile devices) and the back-end (the medical server), is presented. The algorithm for the software component of the architecture (AgenRoller) is also presented. The system is modeled as an M/M/1/c queuing system and simulated using SimEvents from the MATLAB Simulink environment. The simulation results presented show the average queue length, the number of entities in the queue, and the number of entities departing from the system. Together these describe the rate of information processing in the system. A full-scale development of this system with proper implementation will help extend the few medical facilities available in the urban cities of developing nations to the interiors, thereby reducing the number of casualties in the vulnerable areas of the developing world, especially in Sub-Saharan Africa.
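For readers unfamiliar with the M/M/1/c abstraction used to model the consultation server, a minimal event-driven simulation is sketched below; it is not the authors' SimEvents model, and the arrival rate, service rate and capacity are arbitrary.

```python
import random

def simulate_mm1c(lam, mu, capacity, n_events=100_000, seed=1):
    """Toy discrete-event simulation of an M/M/1/c queue: Poisson arrivals
    at rate lam, exponential service at rate mu, at most `capacity` requests
    in the system (extra arrivals are lost). Returns the fraction of blocked
    requests and the time-averaged number of requests in the system."""
    random.seed(seed)
    t, next_arrival, next_departure = 0.0, random.expovariate(lam), float("inf")
    in_system, blocked, arrivals, area = 0, 0, 0, 0.0
    for _ in range(n_events):
        t_next = min(next_arrival, next_departure)
        area += in_system * (t_next - t)          # integrate queue length
        t = t_next
        if next_arrival <= next_departure:        # arrival event
            arrivals += 1
            next_arrival = t + random.expovariate(lam)
            if in_system < capacity:
                in_system += 1
                if in_system == 1:
                    next_departure = t + random.expovariate(mu)
            else:
                blocked += 1                      # request lost (system full)
        else:                                     # departure event
            in_system -= 1
            next_departure = (t + random.expovariate(mu)
                              if in_system > 0 else float("inf"))
    return blocked / arrivals, area / t

print(simulate_mm1c(lam=2.0, mu=3.0, capacity=5))
```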
High Speed Observation of Fragment Impact Initiation of Nitromethane Charges
NASA Astrophysics Data System (ADS)
Cook, M. D.; Briggs, R. I.; Haskins, P. J.; Stennett, C.
1999-06-01
Ultra high speed digital photography has been used to record the onset and build-up of reaction in nitromethane charges that have been impacted by steel fragments. The nitromethane charges were housed in perspex cylinders and back-lit using conventional flash bulbs. Flat plates of aluminium of varying thicknesses were glued to one end of the cylinder and perspex plates to the other. The completed charge was positioned to allow normal impact of the projectiles. The events were filmed using an Imacon 468, an ultra high speed digital imaging system capable of framing at up to 100 million pictures per second, with a minimum interframe time of 10 nanoseconds and exposure times between 10 ns and 1 millisecond. Using this system it was possible to record detailed photographic information concerning the onset and growth of reaction due to shock initiation of the nitromethane charges. The implications of these results for the ignition and growth process in nitromethane are discussed.
The atmospheric transparency of Telescope Array experiment from LIDAR
NASA Astrophysics Data System (ADS)
Tomida, T.
2011-09-01
UV fluorescence light generated by an air shower is scattered and lost along the path of transmission to the telescope. The main scattering processes are Rayleigh scattering by molecules and scattering by aerosols in the atmosphere. In the Telescope Array Experiment, we make use of LIDAR (LIght Detection And Ranging), which observes the back-scattered light of a laser. The LIDAR system is operated before the beginning and after the end of an FD observation, twice a night. The typical aerosol transparency on clear nights was obtained from two years of observation beginning in September 2007. The extinction coefficient of aerosols (α_AS) at ground level is 0.040 (+0.036/−0.013) km⁻¹. The typical dependence of aerosol extinction on height h (in km) above ground level (1450 m a.s.l.) can be expressed by two exponential components as follows: α_AS(h) = 0.021 exp(−h/0.2) + 0.019 exp(−h/1.9) km⁻¹. The atmospheric transparency measured with the LIDAR system at the TA site is discussed in this paper.
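The quoted height dependence is straightforward to evaluate numerically; a short sketch using the two-exponential fit above (heights in km, extinction in km⁻¹) follows.

```python
import numpy as np

def alpha_as(h_km):
    """Aerosol extinction coefficient [1/km] vs. height above ground [km],
    using the two-exponential fit quoted in the abstract."""
    return 0.021 * np.exp(-h_km / 0.2) + 0.019 * np.exp(-h_km / 1.9)

for h in (0.0, 0.5, 1.0, 2.0, 5.0):
    print(f"h = {h:4.1f} km  ->  alpha_AS = {alpha_as(h):.4f} 1/km")
# h = 0 reproduces the quoted ground-level value of ~0.040 1/km
```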
Mobile cosmetics advisor: an imaging based mobile service
NASA Astrophysics Data System (ADS)
Bhatti, Nina; Baker, Harlyn; Chao, Hui; Clearwater, Scott; Harville, Mike; Jain, Jhilmil; Lyons, Nic; Marguier, Joanna; Schettino, John; Süsstrunk, Sabine
2010-01-01
Selecting cosmetics requires visual information and often benefits from the assessments of a cosmetics expert. In this paper we present a unique mobile imaging application that enables women to use their cell phones to get immediate expert advice when selecting personal cosmetic products. We derive the visual information from analysis of camera phone images, and provide the judgment of the cosmetics specialist through use of an expert system. The result is a new paradigm for mobile interactions: image-based information services exploiting the ubiquity of camera phones. The application is designed to work with any handset over any cellular carrier using commonly available MMS and SMS features. Targeted at the unsophisticated consumer, it must be quick and easy to use, not requiring download capabilities or preplanning. Thus, all application processing occurs in the back-end system and not on the handset itself. We present the imaging pipeline technology and a comparison of the service's accuracy with respect to human experts.
The PALM-3000 high-order adaptive optics system for Palomar Observatory
NASA Astrophysics Data System (ADS)
Bouchez, Antonin H.; Dekany, Richard G.; Angione, John R.; Baranec, Christoph; Britton, Matthew C.; Bui, Khanh; Burruss, Rick S.; Cromer, John L.; Guiwits, Stephen R.; Henning, John R.; Hickey, Jeff; McKenna, Daniel L.; Moore, Anna M.; Roberts, Jennifer E.; Trinh, Thang Q.; Troy, Mitchell; Truong, Tuan N.; Velur, Viswa
2008-07-01
Deployed as a multi-user shared facility on the 5.1 meter Hale Telescope at Palomar Observatory, the PALM-3000 high-order upgrade to the successful Palomar Adaptive Optics System will deliver extreme AO correction in the near-infrared, and diffraction-limited images down to visible wavelengths, using both natural and sodium laser guide stars. Wavefront control will be provided by two deformable mirrors, a 3368-active-actuator woofer and a 349-active-actuator tweeter, controlled at up to 3 kHz using an innovative wavefront processor based on a cluster of 17 graphics processing units. A Shack-Hartmann wavefront sensor with selectable pupil sampling will provide high-order wavefront sensing, while an infrared tip/tilt sensor and visible truth wavefront sensor will provide low-order LGS control. Four back-end instruments are planned at first light: the PHARO near-infrared camera/spectrograph, the SWIFT visible light integral field spectrograph, Project 1640, a near-infrared coronagraphic integral field spectrograph, and 888Cam, a high-resolution visible light imager.
Jamil, Majid; Sharma, Sanjeev Kumar; Singh, Rajveer
2015-01-01
This paper focuses on the detection and classification of faults on an electrical power transmission line using artificial neural networks. The three phase currents and voltages of one end are taken as inputs in the proposed scheme. A feed-forward neural network with the back-propagation algorithm has been employed for detection and classification of the fault, with each of the three phases involved in the process analysed. A detailed analysis with varying numbers of hidden layers has been performed to validate the choice of the neural network. The simulation results concluded that the present method based on the neural network is efficient in detecting and classifying faults on transmission lines with satisfactory performance. Different faults are simulated with different parameters to check the versatility of the method. The proposed method can be extended to the distribution network of the power system. The various simulations and signal analyses are done in the MATLAB(®) environment.
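A hedged sketch of the kind of classifier described above: a feed-forward network trained by back-propagation on the six one-end measurements (three phase currents and three phase voltages). The use of scikit-learn, the layer sizes, the fault labels, and the synthetic data are illustrative assumptions only.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical training data: rows = [Ia, Ib, Ic, Va, Vb, Vc] at one line end,
# labels = fault type (0: none, 1: L-G, 2: L-L, 3: L-L-G, 4: three-phase).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = rng.integers(0, 5, size=500)

# Feed-forward network; weights are fitted with gradients obtained by
# back-propagation. Hidden-layer sizes are illustrative, not from the paper.
clf = MLPClassifier(hidden_layer_sizes=(20, 10), max_iter=2000, random_state=0)
clf.fit(X, y)
print("training accuracy on synthetic data:", clf.score(X, y))
```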
Electrode configuration for extreme-UV electrical discharge source
Spence, Paul Andrew; Fornaciari, Neal Robert; Chang, Jim Jihchyun
2002-01-01
It has been demonstrated that debris generation within an electric capillary discharge source, for generating extreme ultraviolet and soft x-ray, is dependent on the magnitude and profile of the electric field that is established along the surfaces of the electrodes. An electrode shape that results in uniform electric field strength along its surface has been developed to minimize sputtering and debris generation. The electric discharge plasma source includes: (a) a body that defines a circular capillary bore that has a proximal end and a distal end; (b) a back electrode positioned around and adjacent to the distal end of the capillary bore wherein the back electrode has a channel that is in communication with the distal end and that is defined by a non-uniform inner surface which exhibits a first region which is convex, a second region which is concave, and a third region which is convex wherein the regions are viewed outwardly from the inner surface of the channel that is adjacent the distal end of the capillary bore so that the first region is closest to the distal end; (c) a front electrode positioned around and adjacent to the proximal end of the capillary bore wherein the front electrode has an opening that is in communication with the proximal end and that is defined by a non-uniform inner surface which exhibits a first region which is convex, a second region which is substantially linear, and a third region which is convex wherein the regions are viewed outwardly from the inner surface of the opening that is adjacent the proximal end of the capillary bore so that the first region is closest to the proximal end; and (d) a source of electric potential that is connected across the front and back electrodes.
Geometry-based across wafer process control in a dual damascene scenario
NASA Astrophysics Data System (ADS)
Krause, Gerd; Hofmann, Detlef; Habets, Boris; Buhl, Stefan; Gutsch, Manuela; Lopez-Gomez, Alberto; Thrun, Xaver
2018-03-01
Dual damascene is an established patterning process for back-end-of-line to generate copper interconnects and lines. One of the critical output parameters is the electrical resistance of the metal lines. In our 200 mm line, this is currently being controlled by a feed-forward control from the etch process to the final step in the CMP process. In this paper, we investigate the impact of alternative feed-forward control using a calibrated physical model that estimates the impact on electrical resistance of the metal lines. This is done by simulation on a large set of wafers. Three different approaches are evaluated, one of which uses different feed-forward settings for different radial zones in the CMP process.
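A schematic of the feed-forward idea, not the calibrated physical model used in the study: a post-etch measurement is mapped through an assumed sensitivity to a per-wafer (or per-zone) correction of the post-CMP copper thickness target.

```python
def cmp_thickness_correction(etch_cd_nm, nominal_cd_nm, sensitivity=0.8):
    """Feed-forward correction to the post-CMP copper thickness target (nm).

    A wider-than-nominal trench lowers line resistance, so the copper can be
    polished slightly thinner (and vice versa) to keep resistance on target.
    `sensitivity` (nm of thickness per nm of CD) is an assumed coefficient,
    standing in for the calibrated resistance model.
    """
    return -sensitivity * (etch_cd_nm - nominal_cd_nm)

# Example: a radial zone measuring 3 nm wide of nominal gets a 2.4 nm
# thinner post-CMP copper thickness target (illustrative numbers only).
print(cmp_thickness_correction(103.0, 100.0))
```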
A Case for Application Oblivious Energy-Efficient MPI Runtime
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkatesh, Akshay; Vishnu, Abhinav; Hamidouche, Khaled
Power has become the major impediment in designing large scale high-end systems. Message Passing Interface (MPI) is the de facto communication interface used as the back-end for designing applications, programming models and runtime for these systems. Slack, the time spent by an MPI process in a single MPI call, provides a potential for energy and power savings, if an appropriate power reduction technique such as core-idling/Dynamic Voltage and Frequency Scaling (DVFS) can be applied without perturbing the application's execution time. Existing techniques that exploit slack for power savings assume that application behavior repeats across iterations/executions. However, an increasing use of adaptive, data-dependent workloads combined with system factors (OS noise, congestion) makes this assumption invalid. This paper proposes and implements Energy Aware MPI (EAM), an application-oblivious energy-efficient MPI runtime. EAM uses a combination of communication models of common MPI primitives (point-to-point, collective, progress, blocking/non-blocking) and an online observation of slack for maximizing energy efficiency. Each power lever incurs time overhead, which must be amortized over slack to minimize degradation. When predicted communication time exceeds a lever overhead, the lever is used as soon as possible to maximize energy efficiency. When mis-prediction occurs, the lever(s) are used automatically at specific intervals for amortization. We implement EAM using MVAPICH2 and evaluate it on ten applications using up to 4096 processes. Our performance evaluation on an InfiniBand cluster indicates that EAM can reduce energy consumption by 5-41% in comparison to the default approach, with negligible (less than 4% in all cases) performance loss.
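The core decision described above can be summarised in a few lines (a sketch under stated assumptions, not the EAM implementation): a power lever is applied only when the predicted time inside the MPI call exceeds the lever's overhead, so that the overhead is amortised over slack.

```python
def choose_power_lever(predicted_slack_us, levers):
    """Pick the most aggressive power lever whose overhead can be hidden
    inside the predicted slack (time expected inside the MPI call).

    levers : list of (name, overhead_us, relative_power_saving) tuples,
             assumed sorted by increasing saving. Returns a name or None.
    """
    best = None
    for name, overhead_us, saving in levers:
        if predicted_slack_us > overhead_us:
            best = name  # overhead is amortised over the slack
    return best

# Illustrative lever table (values are assumptions, not measured overheads)
LEVERS = [("dvfs_low", 50.0, 0.15), ("core_idle", 200.0, 0.35)]
print(choose_power_lever(predicted_slack_us=120.0, levers=LEVERS))  # dvfs_low
```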
Portable appliance security apparatus
NASA Technical Reports Server (NTRS)
Kerley, J. J. (Inventor)
1981-01-01
An apparatus for securing a small computer, or other portable appliance, against theft is described. It is comprised of a case having an open back through which the computer is installed or removed. Guide members in the form of slots are formed in a rear portion of opposite walls of the case for receiving a back plate to cover the opening and thereby secure the computer within the case. An opening formed in the top wall of the case exposes the keyboard and display of the computer. The back plate is locked in the closed position by a key-operated plug type lock. The lock is attached to one end of a hold down cable, the opposite end thereof being secured to a desk top or other stationary object. Thus, the lock simultaneously secures the back plate to the case and retains the case to the stationary object.
Budak, Umit; Şengür, Abdulkadir; Guo, Yanhui; Akbulut, Yaman
2017-12-01
Microaneurysms (MAs), which appear as red lesions in color fundus images, are known as early signs of diabetic retinopathy. Detection of MAs in fundus images requires highly skilled physicians or eye angiography. Eye angiography is an invasive and expensive procedure. Therefore, an automatic detection system to identify the MA locations in fundus images is in demand. In this paper, we propose a system to detect MAs in colored fundus images. The proposed method is composed of three stages. In the first stage, a series of pre-processing steps is used to make the input images more suitable for MA detection. To this end, green channel decomposition, Gaussian filtering, median filtering, background determination, and subtraction operations are applied to the input colored fundus images. After pre-processing, a candidate MA extraction procedure is applied to detect potential regions. A five-step procedure is adopted to obtain the potential MA locations. Finally, a deep convolutional neural network (DCNN) with a reinforcement sample learning strategy is used to train the proposed system. The DCNN is trained with color image patches that are collected from ground-truth MA locations and non-MA locations. We conducted extensive experiments on the ROC dataset to evaluate our proposal. The results are encouraging.
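A minimal sketch of the patch-level classification stage: image patches centred on candidate locations are classified as MA versus non-MA. PyTorch, the patch size and the layer sizes are assumptions for illustration; the paper's DCNN architecture is not reproduced here.

```python
import torch
import torch.nn as nn

class PatchCNN(nn.Module):
    """Tiny CNN classifying colour fundus patches as MA / non-MA.
    Input: 3 x 32 x 32 patches centred on candidate locations (assumed size)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # -> 16x16
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # -> 8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 8 * 8, 64), nn.ReLU(), nn.Linear(64, 2)
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = PatchCNN()
patches = torch.randn(8, 3, 32, 32)   # a batch of candidate patches
logits = model(patches)               # MA vs. non-MA scores
print(logits.shape)                   # torch.Size([8, 2])
```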
Masadome, Takashi; Miyanishi, Takaaki; Watanabe, Keita; Ueda, Hiroshi; Hattori, Toshiaki
2011-01-01
A solution of polyhexamethylene biguanide hydrochloride (PHMB-HCl) was titrated with a standard solution of potassium poly(vinyl sulfate) (PVSK) using crystal violet (CV) as a photometric indicator cation. The end point was detected by a sharp absorbance change due to an abrupt decrease in the concentration of CV. A linear relationship between the concentration of PHMB-HCl and the end-point volume of the titrant existed in the concentration range from 2 to 10 × 10⁻⁶ eq mol L⁻¹. Back-titration was based on adding an excess amount of PVSK to a sample solution containing CV, which was then titrated with a standard solution of poly(diallyldimethylammonium chloride) (PDADMAC). The calibration curve of the PHMB-HCl concentration against the end-point volume of the titrant was also linear in the concentration range from 2 to 8 × 10⁻⁶ eq mol L⁻¹. Both photometric titrations were applied to the determination of PHMB-HCl in a few contact-lens detergents. Back-titration showed a clear end point, whereas direct titration showed an unclear end point. The results of the back-titration of PHMB-HCl were compared with the content registered on the product labels. 2011 © The Japan Society for Analytical Chemistry
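The back-titration arithmetic implied above can be written out explicitly; the variable names and the example volumes and concentrations below are assumptions, not values from the study.

```python
def phmb_concentration(c_pvsk, v_pvsk_ml, c_pdadmac, v_endpoint_ml, v_sample_ml):
    """Back-titration estimate of the PHMB-HCl concentration (eq mol/L).

    c_pvsk, c_pdadmac : titrant concentrations in eq mol/L
    v_pvsk_ml         : volume of excess PVSK added to the sample (mL)
    v_endpoint_ml     : PDADMAC volume consumed at the end point (mL)
    v_sample_ml       : sample volume (mL)
    PHMB equivalents = PVSK equivalents added - PDADMAC equivalents used;
    the mL units cancel, leaving eq mol/L.
    """
    phmb_eq = c_pvsk * v_pvsk_ml - c_pdadmac * v_endpoint_ml
    return phmb_eq / v_sample_ml

# Illustrative numbers only (not from the paper): result is 5e-06 eq mol/L,
# inside the linear range quoted above.
print(phmb_concentration(c_pvsk=1e-4, v_pvsk_ml=5.0,
                         c_pdadmac=1e-4, v_endpoint_ml=2.5,
                         v_sample_ml=50.0))
```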
FinFET and UTBB for RF SOI communication systems
NASA Astrophysics Data System (ADS)
Raskin, Jean-Pierre
2016-11-01
Performance of an RF integrated circuit (IC) is directly linked to the analog and high frequency characteristics of the transistors, the quality of the back-end-of-line process, as well as the electromagnetic properties of the substrate. Thanks to the introduction of the trap-rich high-resistivity Silicon-on-Insulator (SOI) substrate on the market, the IC requirements in terms of linearity are fulfilled. Today partially depleted SOI MOSFET is the mainstream technology for RF SOI systems. Future generations of mobile communication systems will require transistors with better high frequency performance at lower power consumption. The advanced MOS transistors in competition are FinFET and Ultra Thin Body and Buried oxide (UTBB) SOI MOSFETs. Both devices have been intensively studied in recent years. Most of the reported data concern their digital performance. In this paper, their analog/RF behavior is described and compared. Both show similar characteristics in terms of transconductance, Early voltage, voltage gain, and self-heating, but UTBB outperforms FinFET in terms of cutoff frequencies thanks to its relatively lower fringing parasitic capacitances.
Toward a reduced-wire readout system for ultrasound imaging.
Lim, Jaemyung; Arkan, Evren F; Degertekin, F Levent; Ghovanloo, Maysam
2014-01-01
We present a system-on-a-chip (SoC) for use in high-frequency capacitive micromachined ultrasonic transducer (CMUT) imaging systems. This SoC consists of trans-impedance amplifiers (TIA), a delay-locked-loop (DLL) based clock multiplier, a quadrature sampler, and a pulse width modulator (PWM). The SoC down-converts the RF echo signal to baseband by quadrature sampling, which facilitates modulation. To send data through a 1.6 m wire in the catheter, which has limited bandwidth and is vulnerable to noise, the SoC creates a pseudo-digital PWM signal which can be used for back telemetry or wireless readout of the RF data. In this implementation, using a 0.35-μm std. CMOS process, the TIA and single-to-differential (STD) converter had 45 MHz bandwidth, the quadrature sampler had 10.1 dB conversion gain, and the PWM had 5-bit ENoB. Preliminary results verified front-end functionality, and the power consumption of a TIA, STD, quadrature sampler, PWM, and clock multiplier was 26 mW from a 3 V supply.
Toward a Reduced-Wire Readout System for Ultrasound Imaging
Lim, Jaemyung; Arkan, Evren F.; Degertekin, F. Levent; Ghovanloo, Maysam
2015-01-01
We present a system-on-a-chip (SoC) for use in high-frequency capacitive micromachined ultrasonic transducer (CMUT) imaging systems. This SoC consists of trans-impedance amplifiers (TIA), a delay-locked-loop (DLL) based clock multiplier, a quadrature sampler, and a pulse width modulator (PWM). The SoC down-converts the RF echo signal to baseband by quadrature sampling, which facilitates modulation. To send data through a 1.6 m wire in the catheter, which has limited bandwidth and is vulnerable to noise, the SoC creates a pseudo-digital PWM signal which can be used for back telemetry or wireless readout of the RF data. In this implementation, using a 0.35-μm std. CMOS process, the TIA and single-to-differential (STD) converter had 45 MHz bandwidth, the quadrature sampler had 10.1 dB conversion gain, and the PWM had 5-bit ENoB. Preliminary results verified front-end functionality, and the power consumption of a TIA, STD, quadrature sampler, PWM, and clock multiplier was 26 mW from a 3 V supply. PMID:25571135
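As a numerical illustration of the quadrature down-conversion performed on-chip (done here in software, not representative of the SoC design; the carrier frequency, bandwidth and filter are arbitrary assumptions):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs, fc = 100e6, 20e6                   # sample rate and carrier (assumed values)
t = np.arange(4096) / fs
# Synthetic RF echo: a Gaussian-windowed tone burst at the carrier frequency
rf = np.cos(2 * np.pi * fc * t) * np.exp(-((t - 2e-5) ** 2) / (2 * (2e-6) ** 2))

# Mix with quadrature local oscillators and low-pass filter: the result is
# the complex baseband (I + jQ) envelope of the echo.
i_mix = rf * np.cos(2 * np.pi * fc * t)
q_mix = -rf * np.sin(2 * np.pi * fc * t)
b, a = butter(4, 5e6 / (fs / 2))       # 5 MHz low-pass (assumed bandwidth)
baseband = filtfilt(b, a, i_mix) + 1j * filtfilt(b, a, q_mix)
print("envelope peak:", np.abs(baseband).max())
```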
SHEATHED TUBE AND APPARATUS AND METHOD OF PRODUCTION THEREOF
Ohlinger, L.A.
1959-08-18
A tubular fuel element covered inside and out by a unitary covering tube originally about twice its length and of small enough diameter to fit snugly inside the fuel tube is described. The covering tube is then reentrantly folded back by a pressure-die mechanism over both ends of the fuel tube and against its outside until the folded-back ends of the covering tube meet, where they are welded in a single seam running circumferentially around the middle of the resulting assembly.
Wistbacka, Greta; Andrade, Pedro Amarante; Simberg, Susanna; Hammarberg, Britta; Södersten, Maria; Švec, Jan G; Granqvist, Svante
2018-01-01
Resonance tube phonation with the tube end in water is a voice therapy method in which the patient phonates through a glass tube, keeping the free end of the tube submerged in water, creating bubbles. The purpose of this experimental study was to determine the flow-pressure relationship, the flow thresholds between bubble types, and the bubble frequency as a function of flow and back volume. A flow-driven vocal tract simulator was used for recording the back pressure produced by resonance tubes with inner diameters of 8 and 9 mm submerged at water depths of 0-7 cm. Visual inspection of bubble types through video recording was also performed. The static back pressure was largely determined by the water depth. The narrower tube provided a slightly higher back pressure for a given flow and depth. The amplitude of the pressure oscillations increased with flow and depth. The bubbles were emitted from the tube in three distinct types with increasing flow: one by one, pairwise, and in a chaotic manner. The bubble frequency was slightly higher for the narrower tube. An increase in back volume led to a decrease in bubble frequency. This study provides data on the physical properties of resonance tube phonation with the tube end in water. This information will be useful in future research when looking into the possible effects of this type of voice training. Copyright © 2018 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
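A back-of-the-envelope check of the dominant relationship reported above (static back pressure set largely by water depth): the hydrostatic pressure at the submerged tube end is ρgh; flow-dependent losses are ignored in this simplifying sketch.

```python
RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2

def static_back_pressure_pa(depth_cm):
    """Hydrostatic back pressure (Pa) at the submerged tube end,
    ignoring flow-dependent losses (a simplifying assumption)."""
    return RHO_WATER * G * depth_cm / 100.0

for d in range(0, 8):   # the 0-7 cm depths used in the study
    print(f"{d} cm water depth -> {static_back_pressure_pa(d):6.0f} Pa"
          f" (= {d} cm H2O)")
```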
Application of Peterson's stray light model to complex optical instruments
NASA Astrophysics Data System (ADS)
Fray, S.; Goepel, M.; Kroneberger, M.
2016-07-01
Gary L. Peterson (Breault Research Organization) presented a simple analytical model for in-field stray light evaluation of axial optical systems. We exploited this idea for more complex optical instruments of the Meteosat Third Generation (MTG) mission. For the Flexible Combined Imager (FCI) we evaluated the in-field stray light of its three-mirror-anastigmat telescope, while for the Infrared Sounder (IRS) we performed an end-to-end analysis including the front telescope, interferometer and back telescope assembly and the cold optics. A comparison to simulations will be presented. The authors acknowledge the support by ESA and Thales Alenia Space through the MTG satellites program.
Program Helps Simulate Neural Networks
NASA Technical Reports Server (NTRS)
Villarreal, James; Mcintire, Gary
1993-01-01
Neural Network Environment on Transputer System (NNETS) computer program provides users high degree of flexibility in creating and manipulating wide variety of neural-network topologies at processing speeds not found in conventional computing environments. Supports back-propagation and back-propagation-related algorithms. Back-propagation algorithm used is implementation of Rumelhart's generalized delta rule. NNETS developed on INMOS Transputer(R). Predefines back-propagation network, Jordan network, and reinforcement network to assist users in learning and defining own networks. Also enables users to configure other neural-network paradigms from NNETS basic architecture. Small portion of software written in OCCAM(R) language.
NASA Astrophysics Data System (ADS)
Koweek, David A.; Dunbar, Robert B.; Monismith, Stephen G.; Mucciarone, David A.; Woodson, C. Brock; Samuel, Lianna
2015-09-01
Shallow back reefs commonly experience greater thermal and biogeochemical variability owing to a combination of coral community metabolism, environmental forcing, flow regime, and water depth. We present results from a high-resolution (sub-hourly to sub-daily) hydrodynamic and biogeochemical study, along with a coupled long-term (several months) hydrodynamic study, conducted on the back reefs of Ofu, American Samoa. During the high-resolution study, mean temperature was 29.0 °C with maximum temperatures near 32 °C. Dissolved oxygen concentrations spanned 32-178 % saturation, and pHT spanned the range from 7.80 to 8.39 with diel ranges reaching 0.58 units. Empirical cumulative distribution functions reveal that pHT was between 8.0 and 8.2 during only 30 % of the observational period, with approximately even distribution of the remaining 70 % of the time between pHT values less than 8.0 and greater than 8.2. Thermal and biogeochemical variability in the back reefs is partially controlled by tidal modulation of wave-driven flow, which isolates the back reefs at low tide and brings offshore water into the back reefs at high tide. The ratio of net community calcification to net community production was 0.15 ± 0.01, indicating that metabolism on the back reef was dominated by primary production and respiration. Similar to other back reef systems, the back reefs of Ofu are carbon sinks during the daytime. Shallow back reefs like those in Ofu may provide insights for how coral communities respond to extreme temperatures and acidification and are deserving of continued attention.
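For readers unfamiliar with the empirical cumulative distribution functions referred to above, a minimal sketch of how the fraction of time spent inside a pH window is computed from a time series follows; the data here are synthetic, not the Ofu observations.

```python
import numpy as np

def fraction_in_window(values, lo, hi):
    """Fraction of observations with lo <= value <= hi
    (equivalent to a difference of ECDF values at hi and lo)."""
    values = np.asarray(values)
    return np.mean((values >= lo) & (values <= hi))

# Synthetic diel pH cycle standing in for a back-reef record
t = np.linspace(0, 30, 3000)                        # 30 days, sub-hourly sampling
ph = 8.05 + 0.25 * np.sin(2 * np.pi * t) + 0.05 * np.random.randn(t.size)
print("fraction of time with 8.0 <= pH <= 8.2:",
      round(fraction_in_window(ph, 8.0, 8.2), 2))
```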
Nuclear reactor fuel assembly duct-tube-to-handling-socket attachment system
Christiansen, David W.; Smith, Bob G.
1982-01-01
A reusable system for removably attaching the upper end 10 of a nuclear reactor duct tube to the lower end 30 of a nuclear reactor fuel assembly handling socket. A transition ring 20, fixed to the duct tube's upper end 10, has an interior-threaded section 22 with a first locking hole segment 24. An adaptor ring 40, fixed to the handling socket's lower end 30, has an outside-threaded section 42 with a second locking hole segment 44. The inside 22 and outside 42 threaded sections match and can be joined so that the first 24 and second 44 locking hole segments can be aligned to form a locking hole. A locking ring 50, with a locking pin 52, slides over the adaptor ring 40 so that the locking pin 52 fits in the locking hole. A swage lock 60 or a cantilever finger lock 70 is formed from the locking cup collar 26 to fit in a matching groove 54 or 56 in the locking ring 50 to prevent the locking ring's locking pin 52 from backing out of the locking hole.
A mobile care system with alert mechanism.
Lee, Ren-Guey; Chen, Kuei-Chien; Hsiao, Chun-Chieh; Tseng, Chwan-Lu
2007-09-01
Hypertension and arrhythmia are chronic diseases, which can be effectively prevented and controlled only if the physiological parameters of the patient are constantly monitored, along with the full support of health education and professional medical care. In this paper, a role-based intelligent mobile care system with an alert mechanism in a chronic care environment is proposed and implemented. The roles in our system include patients, physicians, nurses, and healthcare providers. Each of the roles represents a person that uses a mobile device such as a mobile phone to communicate with the server set up in the care center such that he or she can go around without restrictions. For commercial mobile phones with Bluetooth communication capability attached to chronic patients, we have developed physiological signal recognition algorithms that were implemented and built into the mobile phone without affecting its original communication functions. It is thus possible to integrate several front-end mobile care devices with Bluetooth communication capability to extract patients' various physiological parameters [such as blood pressure, pulse, saturation of haemoglobin (SpO2), and electrocardiogram (ECG)], to monitor multiple physiological signals without space limits, and to upload important or abnormal physiological information to the healthcare center for storage and analysis, or transmit the information to physicians and healthcare providers for further processing. Thus, the physiological signal extraction devices only have to deal with signal extraction and wireless transmission. Since they do not have to do signal processing, their form factor can be further reduced to reach the goal of microminiaturization and power saving. An alert management mechanism has been included in the back-end healthcare center to initiate various strategies for automatic emergency alerts after receiving emergency messages or after automatically recognizing emergency messages. Within the time intervals set in the system, and according to the medical history of a specific patient, our prototype system can notify various healthcare providers in sequence and use their replies to ensure the accuracy of alert information and the completeness of early-warning notification, further improving healthcare quality. In the end, with the testing results and performance evaluation of our implemented system prototype, we conclude that it is possible to set up a complete intelligent health care chain with mobile monitoring and healthcare service via the assistance of our system.
CephFS: a new generation storage platform for Australian high energy physics
NASA Astrophysics Data System (ADS)
Borges, G.; Crosby, S.; Boland, L.
2017-10-01
This paper presents an implementation of a Ceph file system (CephFS) use case at the ARC Centre of Excellence for Particle Physics at the Terascale (CoEPP). CoEPP's CephFS provides a posix-like file system on top of a Ceph RADOS object store, deployed on commodity hardware and without single points of failure. By delivering a unique file system namespace at different CoEPP centres spread across Australia, local HEP researchers can store, process and share data independently of their geographical locations. CephFS is also used as the back-end file system for a WLCG ATLAS user area at the Australian Tier-2. Dedicated SRM and XROOTD services, deployed on top of CoEPP's CephFS, integrate it into ATLAS distributed data operations. This setup, while allowing Australian HEP researchers to trigger data movement via ATLAS grid tools, also enables local posix-like read access, giving scientists greater control over their data flows. In this article we present details of CoEPP's Ceph/CephFS implementation and report I/O performance metrics collected during the testing/tuning phase of the system.
AIRSAR Automated Web-based Data Processing and Distribution System
NASA Technical Reports Server (NTRS)
Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen
2005-01-01
In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.
NASA Astrophysics Data System (ADS)
Pilone, D.; Cechini, M. F.; Mitchell, A.
2011-12-01
Earth Science applications typically deal with large amounts of data and high throughput rates, if not also high transaction rates. While Open Source is frequently used for smaller scientific applications, large scale, highly available systems frequently fall back to "enterprise" class solutions like Oracle RAC or commercial grade JEE Application Servers. NASA's Earth Observing System Data and Information System (EOSDIS) provides end-to-end capabilities for managing NASA's Earth science data from multiple sources - satellites, aircraft, field measurements, and various other programs. A core capability of EOSDIS, the Earth Observing System (EOS) Clearinghouse (ECHO), is a highly available search and order clearinghouse of over 100 million pieces of science data that has evolved from its early R&D days to a fully operational system. Over the course of this maturation ECHO has largely transitioned from commercial frameworks, databases, and operating systems to Open Source solutions...and in some cases, back. In this talk we discuss the progression of our technological solutions and our lessons learned in the areas of: high-performance, large-scale searching solutions; geospatial search capabilities and dealing with multiple coordinate systems; search and storage of variable-format source (science) data; highly available deployment solutions; and scalable (elastic) solutions to visual searching and image handling. Throughout the evolution of the ECHO system we have had to evaluate solutions with respect to performance, cost, developer productivity, reliability, and maintainability in the context of supporting global science users. Open Source solutions have played a significant role in our architecture and development but several critical commercial components remain (or have been reinserted) to meet our operational demands.
A novel flux-switching permanent magnet machine with v-shaped magnets
NASA Astrophysics Data System (ADS)
Zhao, Guishu; Hua, Wei
2017-05-01
In this paper, firstly a novel 6-stator-coil/17-rotor-pole (6/17) flux-switching permanent magnet (FSPM) machine with V-shaped magnets, deduced from conventional 12/17 FSPM machines, is proposed to achieve a more symmetrical phase back-electromotive force (back-EMF) and smaller torque ripple compared with an existing 6/10 V-shaped FSPM machine. Then, to obtain larger electromagnetic torque, less torque ripple, and easier mechanical processing, two improved variants based on the original 6/17 V-shaped topology are proposed. In the first variant, the separate stator-core segments located on the stator yoke are connected into a united stator yoke, while in the second variant the stator core is made a single entity by adding magnetic bridges at the ends of the permanent magnets (PMs). The performances of the three 6/17 V-shaped FSPM machines, namely the original one and the two variants, are then evaluated by finite element analysis (FEA). The results reveal that the first variant exhibits significantly larger torque and considerably improved torque per magnet volume, i.e., magnet utilization ratio, than the original one, and the second variant exhibits the smallest torque ripple, the lowest total harmonic distortion (THD) of the phase back-EMF, and the easiest mechanical processing for manufacturing.
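The THD figure of merit used above for the phase back-EMF can be computed from an FFT over one electrical period; the sketch below uses a synthetic waveform, not FEA output.

```python
import numpy as np

def thd(waveform):
    """Total harmonic distortion of one period of a (back-EMF) waveform:
    rms of harmonics 2..N divided by the rms of the fundamental."""
    spectrum = np.fft.rfft(waveform) / len(waveform)
    mags = np.abs(spectrum)
    fundamental = mags[1]
    harmonics = mags[2:]
    return np.sqrt(np.sum(harmonics ** 2)) / fundamental

# Synthetic back-EMF with small 3rd and 5th harmonics (illustrative only)
theta = np.linspace(0, 2 * np.pi, 1000, endpoint=False)
emf = np.sin(theta) + 0.04 * np.sin(3 * theta) + 0.02 * np.sin(5 * theta)
print(f"THD = {100 * thd(emf):.1f} %")   # ~4.5 % for this waveform
```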
1999-06-19
In the Space Station Processing Facility, STS-99 crew members inspect the Shuttle Radar Topography Mission (SRTM), the payload for their mission. At left is Commander Kevin R. Kregel talking to Mission Specialist Janice Voss (Ph.D.); and Mission Specialists Gerhard Thiele of Germany and Mamoru Mohri of Japan farther back. In the foreground (back to camera) is Mission Specialist Janet Lynn Kavandi (Ph.D.). The final crew member (not shown) is Pilot Dominic L. Pudwill Gorie. Thiele represents the European Space Agency and Mohri represents the National Space Agency of Japan. An international project spearheaded by the National Imagery and Mapping Agency and NASA, with participation of the German Aerospace Center DLR, the SRTM consists of a specially modified radar system that will gather data for the most accurate and complete topographic map of the Earth's surface that has ever been assembled. SRTM will make use of radar interferometry, wherein two radar images are taken from slightly different locations. Differences between these images allow for the calculation of surface elevation, or change. The SRTM hardware will consist of one radar antenna in the shuttle payload bay and a second radar antenna attached to the end of a mast extended 60 meters (195 feet) out from the shuttle. STS-99 is scheduled to launch Sept. 16 at 8:47 a.m. from Launch Pad 39A
LSST camera readout chip ASPIC: test tools
NASA Astrophysics Data System (ADS)
Antilogus, P.; Bailly, Ph; Jeglot, J.; Juramy, C.; Lebbolo, H.; Martin, D.; Moniez, M.; Tocut, V.; Wicek, F.
2012-02-01
The LSST camera will have more than 3000 video-processing channels. The readout of this large focal plane requires a very compact readout chain. The "Correlated Double Sampling" technique, which is generally used for the signal readout of CCDs, is also adopted for this application and implemented with the so-called "Dual Slope Integrator" method. We have designed and implemented an ASIC for LSST: the Analog Signal Processing asIC (ASPIC). The goal is to amplify the signal close to the output, in order to maximize the signal to noise ratio, and to send differential outputs to the digitization. Other requirements are that each chip should process the output of half a CCD, that is, 8 channels, and should operate at 173 K. A specific Back-End board has been designed especially for lab test purposes. It manages the clock signals, digitizes the analog differential outputs of the ASPIC and stores data into a memory. It contains 8 ADCs (18 bits), 512 kwords of memory and a USB interface. An FPGA manages all signals from/to all components on board and generates the timing sequence for the ASPIC. Its firmware is written in the Verilog and VHDL languages. Internal registers permit the definition of various test parameters of the ASPIC. A LabVIEW GUI allows these registers to be loaded or updated and proper operation to be checked. Several series of tests, including linearity, noise and crosstalk, have been performed over the past year to characterize the ASPIC at room and cold temperatures. At present, the ASPIC, Back-End board and CCD detectors are being integrated to perform a characterization of the whole readout chain.
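The dual-slope-integrator form of correlated double sampling amounts to integrating the video signal over the reset level and over the signal level and taking the difference; a schematic numerical sketch (sample counts and noise levels are arbitrary, not ASPIC parameters) is shown below.

```python
import numpy as np

def cds_dual_slope(video, reset_slice, signal_slice):
    """Correlated double sampling by dual-slope integration: integrate the
    video waveform during the reset level and during the signal level, and
    return the difference (proportional to the pixel charge)."""
    return np.mean(video[signal_slice]) - np.mean(video[reset_slice])

# Synthetic CCD output for one pixel: reset level then signal level, with a
# common-mode (correlated) offset that CDS is meant to reject.
rng = np.random.default_rng(0)
common_noise = rng.normal(0, 0.05)
video = np.concatenate([
    np.full(50, 1.00 + common_noise),   # reset/reference level
    np.full(50, 0.70 + common_noise),   # signal level (0.30 V of signal)
]) + rng.normal(0, 0.01, 100)           # uncorrelated noise
print(cds_dual_slope(video, slice(0, 50), slice(50, 100)))   # ~ -0.30
```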
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Conlan
Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money were spent by companies on front-end tools related to system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data management processes that underpin third party financing. Without a tool like Sighten, the solar financing processes involved passing information from the homeowner prospect into separate tools for system design, financing, and then later to reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves upon the accuracy of these calculations and exposes sophisticated new analysis tools resulting in a rigorous, efficient and cost-effective toolset for scaling residential solar. Widely deploying a platform like Sighten's significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production and reporting calculations; 2) representing a true step change in terms of reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools around simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers/installers/finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a rooftop solar financing program. Standardizing and improving all calculations, improving data quality, and exposing new analysis tools previously unavailable affects investment in the residential space in several important ways: 1) lowering the cost of capital for existing capital providers by mitigating uncertainty and de-risking the solar asset class; 2) attracting new, lower cost investors to the solar asset class as reporting and data quality resemble standards of more mature asset classes; 3) increasing the prevalence of liquidity options for investors through back leverage, securitization, or secondary sale by providing the tools necessary for lenders, ratings agencies, etc. to properly understand a portfolio of residential solar assets. During the project period, Sighten successfully built and scaled a commercially ready tool for the residential solar market. The software solution built by Sighten has been deployed with key target customer segments identified in the award deliverables: solar installers, solar developers/channel managers, and solar financiers, including lenders. Each of these segments greatly benefits from the availability of the Sighten toolset.
NASA Technical Reports Server (NTRS)
Laufer, Alexander (Editor); Post, Todd (Editor); Brady, Jody Lannen (Editor)
2004-01-01
The contents include the following: The Journey Back; What GOES Around, Comes Around; Marbles for the Imagination; End-to-End Commitment; Fly Away; Going Up; Say What You Mean; ASK Talks with William Ready.
Assessment of the effectiveness of head only and back-of-the-head electrical stunning of chickens
Gibson, T. J.; Taylor, A. H.; Gregory, N. G.
2016-01-01
The study assesses the effectiveness of reversible head-only and back-of-the-head electrical stunning of chickens using 130–950 mA per bird at 50 Hz AC. Three trials were conducted to compare both stunning systems: (a) behavioural assessment of return of consciousness, (b) insensibility to thermal pain, and (c) assessment of return of brain activity with visually evoked potentials (VEPs). Assessment of behaviour suggested that the period of unconsciousness following head-only electrical stunning was shorter in hens compared to broilers. Stunning across the back-of-the-head delayed the time to return of brainstem function compared to stunning with standard head-only electrodes. Additionally, back-of-the-head stunning produced a more prolonged period of electroanalgesia compared to head-only. Based on examination of return of brain function with VEPs in hens, back-of-the-head stunning produced a shorter-lasting stun than standard head-only. However, even for standard head-only, the stun was notably shorter than previously reported. In some birds, brain function had returned within 9 s after the end of stunning. The results suggest that some birds may recover consciousness prior to or during the neck cut. Based on these findings, back-of-the-head stunning and standard head-only stunning of hens should not be recommended without further development. PMID:27023411
Framework for a clinical information system.
Van De Velde, R; Lansiers, R; Antonissen, G
2002-01-01
The design and implementation of Clinical Information System architecture is presented. This architecture has been developed and implemented based on components following a strong underlying conceptual and technological model. Common Object Request Broker and n-tier technology featuring centralised and departmental clinical information systems as the back-end store for all clinical data are used. Servers located in the "middle" tier apply the clinical (business) model and application rules. The main characteristics are the focus on modelling and reuse of both data and business logic. Scalability as well as adaptability to constantly changing requirements via component driven computing are the main reasons for that approach.
A real-time coherent dedispersion pipeline for the giant metrewave radio telescope
NASA Astrophysics Data System (ADS)
De, Kishalay; Gupta, Yashwant
2016-02-01
A fully real-time coherent dedispersion system has been developed for the pulsar back-end at the Giant Metrewave Radio Telescope (GMRT). The dedispersion pipeline uses the single phased array voltage beam produced by the existing GMRT software back-end (GSB) to produce coherently dedispersed intensity output in real time, for the currently operational bandwidths of 16 MHz and 32 MHz. Provision has also been made to coherently dedisperse voltage beam data from observations recorded on disk. We discuss the design and implementation of the real-time coherent dedispersion system, describing the steps carried out to optimise the performance of the pipeline. Presently functioning on an Intel Xeon X5550 CPU equipped with a NVIDIA Tesla C2075 GPU, the pipeline allows dispersion free, high time resolution data to be obtained in real-time. We illustrate the significant improvements over the existing incoherent dedispersion system at the GMRT, and present some preliminary results obtained from studies of pulsars using this system, demonstrating its potential as a useful tool for low frequency pulsar observations. We describe the salient features of our implementation, comparing it with other recently developed real-time coherent dedispersion systems. This implementation of a real-time coherent dedispersion pipeline for a large, low frequency array instrument like the GMRT, will enable long-term observing programs using coherent dedispersion to be carried out routinely at the observatory. We also outline the possible improvements for such a pipeline, including prospects for the upgraded GMRT which will have bandwidths about ten times larger than at present.
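A schematic of the coherent dedispersion operation at the heart of such a pipeline (a numpy sketch, not the GMRT GPU code; the band parameters and dispersion measure in the example are illustrative): the raw voltage series is Fourier transformed, multiplied by the inverse of the interstellar dispersion transfer function, and transformed back.

```python
import numpy as np

KDM = 4.148808e15          # dispersion constant [Hz^2 s / (pc cm^-3)]

def coherent_dedisperse(voltage, fs_hz, f0_hz, dm):
    """Coherently dedisperse one channel of complex baseband voltages.

    voltage : complex baseband samples
    fs_hz   : sampling rate / channel bandwidth in Hz
    f0_hz   : centre frequency of the channel in Hz
    dm      : dispersion measure in pc cm^-3
    The ISM transfer function is removed by multiplying the spectrum by the
    inverse chirp (the sign convention may need flipping for a given backend).
    """
    n = voltage.size
    f = np.fft.fftfreq(n, d=1.0 / fs_hz)          # offset from centre frequency
    phase = 2.0 * np.pi * KDM * dm * f**2 / (f0_hz**2 * (f0_hz + f))
    chirp = np.exp(-1j * phase)                   # inverse of the ISM response
    return np.fft.ifft(np.fft.fft(voltage) * chirp)

# Illustrative call: a 32 MHz band centred near 326 MHz, DM of 26.8 pc cm^-3
x = np.random.randn(1 << 20) + 1j * np.random.randn(1 << 20)
y = coherent_dedisperse(x, fs_hz=32e6, f0_hz=326e6, dm=26.8)
```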
Optical Data Processing for Missile Guidance.
1984-11-21
and architectures for back-substitution and the solution of triangular systems of LAEs (linear algebraic equations). Most recently, a parallel QR...Calculation of I1 is quite difficult since the o T exact Z matrix is quite ill-conditioned. The two VC choices considered in our system are E - I and E I - 0...shown in fig. 1. It shows the ship in water with a sky and shoreline back-...These operations are most commonly referred to as segmentation and also
NASA Technical Reports Server (NTRS)
Garin, John; Matteo, Joseph; Jennings, Von Ayre
1988-01-01
The capability for a single operator to simultaneously control complex remote multi-degree-of-freedom robotic arms and associated dextrous end effectors is being developed. An optimal solution within the realm of current technology can be achieved by recognizing that: (1) machines/computer systems are more effective than humans when the task is routine and specified, and (2) humans process complex data sets and deal with the unpredictable better than machines. These observations lead naturally to a philosophy in which the human's role becomes a higher level function associated with planning, teaching, initiating, monitoring, and intervening when the machine gets into trouble, while the machine performs the codifiable tasks with deliberate efficiency. This concept forms the basis for the integration of man and telerobotics, i.e., robotics with the operator in the control loop. The concept of integration of the human in the loop and maximizing the feed-forward and feed-back data flow is referred to as telepresence.
Iterative simulated quenching for designing irregular-spot-array generators.
Gillet, J N; Sheng, Y
2000-07-10
We propose a novel, to our knowledge, algorithm of iterative simulated quenching with temperature rescaling for designing diffractive optical elements, based on an analogy between simulated annealing and statistical thermodynamics. The temperature is iteratively rescaled at the end of each quenching process according to ensemble statistics to bring the system back from a frozen imperfect state with a local minimum of energy to a dynamic state in a Boltzmann heat bath in thermal equilibrium at the rescaled temperature. The new algorithm achieves much lower cost function and reconstruction error and higher diffraction efficiency than conventional simulated annealing with a fast exponential cooling schedule and is easy to program. The algorithm is used to design binary-phase generators of large irregular spot arrays. The diffractive phase elements have trapezoidal apertures of varying heights, which fit ideal arbitrary-shaped apertures better than do trapezoidal apertures of fixed heights.
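A generic sketch of the quench-and-rescale loop described above (not the authors' design code; the rescaling rule used here, taking the spread of accepted energies as the new temperature, is an assumption): each inner loop is a fast simulated quench, after which the temperature is rescaled from ensemble statistics and the quench repeated.

```python
import math
import random

def quench(state, energy, neighbour, t0, cooling=0.95, steps=2000):
    """One fast simulated quench with exponential cooling; returns the final
    state, its energy, and the accepted energies (for temperature rescaling)."""
    e, accepted, temp = energy(state), [], t0
    for _ in range(steps):
        cand = neighbour(state)
        e_new = energy(cand)
        if e_new < e or random.random() < math.exp(-(e_new - e) / temp):
            state, e = cand, e_new
            accepted.append(e)
        temp *= cooling
    return state, e, accepted

def iterative_quenching(state, energy, neighbour, t0=1.0, rounds=5):
    """Iterated quenching: after each quench, rescale the temperature from
    the spread of accepted energies (assumed rescaling rule) and requench,
    bringing the system back from a frozen state to a 'hot' Boltzmann bath."""
    temp = t0
    for _ in range(rounds):
        state, e, accepted = quench(state, energy, neighbour, temp)
        if len(accepted) > 1:
            temp = max(max(accepted) - min(accepted), 1e-6)
    return state, e

# Toy usage: minimise a 1-D multi-modal function
f = lambda x: math.sin(5 * x) + 0.1 * x * x
step = lambda x: x + random.uniform(-0.5, 0.5)
print(iterative_quenching(3.0, f, step))
```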
Web-Based Visual Analytics for Social Media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Best, Daniel M.; Bruce, Joseph R.; Dowson, Scott T.
Social media provides a rich source of data that reflects current trends and public opinion on a multitude of topics. The data can be harvested from Twitter, Facebook, blogs, and other social applications. The high rate of adoption of social media has created a domain with an ever expanding volume of data that makes it difficult to use the raw data for analysis. Information visual analytics is key in drawing out features of interest in social media. The Scalable Reasoning System is an application that couples a back-end server performing analysis algorithms with an intuitive front-end visualization to allow for investigation. We provide a componentized system that can be rapidly adapted to customer needs such that the information they are most interested in is brought to their attention through the application. To this end, we have developed a social media application for use by emergency operations for the city of Seattle to show current weather and traffic trends, which is important for their tasks.
NASA Astrophysics Data System (ADS)
Clarke, David W.; Boyle, John F.; Chiverrell, Richard C.; Lario, Javier; Plater, Andrew J.
2014-09-01
At present, limited understanding of mesoscale (years-decades-centuries) back-barrier lagoon, barrier estuary behaviour is a critical shortcoming for resource managers and decision makers. In this paper, high-resolution particle size analysis of a sediment core from an intermittently open and closed barrier estuary is utilised to reconstruct a history of back-barrier environmental change at mesoscale temporal resolution. Sediments from Pescadero Marsh, California, were analysed for their particle size distribution at consecutive 2-mm intervals down-core. Site selection, informed by a time series of maps and aerial photographs coupled with a robust core chronology, ensured that the particle size data primarily reflect changing hydrodynamics of the back-barrier area over the European-American era (1850 to the present). Following more traditional plotting of particle size data and summary statistics, and statistical analysis of particle size end-members, visual analysis and categorisation of particle size distribution curves (PSDCs) provide an effective basis for the identification of recurring modal sizes and subpopulations. These particle size windows (PSWs) are interpreted as reflecting different modes of sediment transport and deposition, i.e., suspension and saltation loads, the varying prominence of which is interpreted as being modified by barrier integrity. When considered together, the down-core mean particle size (MPS) trend and individual PSDCs offer considerable insight into mesoscale system behaviour at subannual resolution over multiple years. This behaviour is expressed in the recurrence of characteristic barrier estuarine environments (closed lagoon, tidal lagoon, tidal marsh, and open estuary) and the overall barrier regime, and their persistence over the last c. 150 years. Subannual and multiannual fluctuations in back-barrier environmental configuration are seen to be superimposed on a longer-term quasi-stable barrier regime, demonstrating the value of the applied methodology with regard to bridging the estuarine evolution (long-term, stratigraphic) and process (short-term, geomorphic) knowledge bases. The documented behaviour suggests a level of innate morphological resilience in the system over the long term despite episodic disturbance by high-energy storms. Such empirical demonstrations of resilient behaviour in coastal environments are rare at the mesoscale.
Performance of the TGT liquid argon calorimeter and trigger system
NASA Astrophysics Data System (ADS)
Braunschweig, W.; Geulig, E.; Schöntag, M.; Siedling, R.; Wlochal, M.; Wotschack, J.; Cheplakov, A.; Feshchenko, A.; Kazarinov, M.; Kukhtin, V.; Ladygin, E.; Obudovskij, V.; Geweniger, C.; Hanke, P.; Kluge, E.-E.; Krause, J.; Putzer, A.; Rensch, B.; Schmidt, M.; Stenzel, H.; Tittel, K.; Wunsch, M.; Zerwas, D.; Ban, J.; Bruncko, D.; Jusko, A.; Kocper, B.; Aderholz, M.; Brettel, H.; Dulny, B.; Dydak, F.; Fent, J.; Huber, J.; Jakobs, K.; Oberlack, H.; Schacht, P.; Bogolyubsky, M. Y.; Chekulaev, S. V.; Kiryunin, A. E.; Kurchaninov, L. L.; Levitsky, M. S.; Maksimov, V. V.; Minaenko, A. A.; Moiseev, A. M.; Semenov, P. A.; Tikhonov, V. V.
1996-02-01
A novel concept of a liquid argon calorimeter, the "Thin Gap Turbine" (TGT) calorimeter, is presented. A TGT test module, equipped with specially developed cold front-end electronics in radiation hard GaAs technology, has been operated in a particle beam. Results on its performance are given. A 40 MHz FADC system with a "circular data store" and standalone readout and play-back capability has been developed to test the properties of the TGT detector for trigger purposes. Results on trigger efficiency, response and energy resolution are given.
Xu, Ping; Dong, Xiao-jun; Lu, Zhou-tong; Wang, Gongjun; Zhang, Han-qing; Chen, Xuan-ning; Li, Dong
2015-09-01
To evaluate the technique and the clinical effect of folding roof and rotary pushing in the treatment of children with "back to back" distal radius and ulna fractures. From January 2012 to February 2014, 38 children with "back to back" distal radius and ulna fractures were treated with the folding roof and rotary pushing technique for reduction, followed by splint fixation, including 23 males and 15 females with an average age of 9.5 years (range, 6 to 14 years). Time from injury ranged from 45 min to 3 days (average 1.3 days). All cases were unilateral closed fractures without symptoms of nerve injury. Anteroposterior and lateral radiographs of the wrist showed double fractures of the radius and ulna, with the broken end of the radius in typical "back to back" displacement. The quality of reduction was assessed according to the Dienst recommendation combined with Aro measurement, and the therapeutic effect was evaluated using the Anderson functional standard. All patients were followed up for 3 to 13 months (average 6 months). There were no iatrogenic nerve injuries. Thirty cases were reduced successfully at the first attempt, and 8 cases were reduced successfully on a second attempt; 28 cases achieved anatomical reduction, 7 near-anatomical reduction, and 3 functional reduction. On the second day, hand and finger swelling appeared in 7 of the patients who required repeat reduction. The quality of reduction was excellent in 33 cases and good in 5 cases. According to the Anderson functional evaluation standard, 35 cases were excellent and 3 were good. All fractures healed without deformity of the wrist. The folding roof and rotary pushing technique for the treatment of children with "back to back" distal radius and ulna fractures is very successful; the patients' limb function recovered well, and the whole procedure is simple.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bracey, William; Bondre, Jayant; Shelton, Catherine
2013-07-01
The current inventory of used nuclear fuel assemblies (UNFAs) from commercial reactor operations in the United States totals approximately 65,000 metric tons, or approximately 232,000 UNFAs, primarily stored at the 104 operational reactors in the US and a small number of decommissioned reactors. This inventory is growing at a rate of roughly 2,000 to 2,400 metric tons each year (approximately 7,000 UNFAs) as a result of ongoing commercial reactor operations. Assuming an average of 10 metric tons per storage/transportation cask, this inventory of commercial UNFAs represents about 6,500 casks, with about 220 casks added every year. In January 2010, the Blue Ribbon Commission (BRC) [1] was directed to conduct a comprehensive review of policies for managing the back end of the nuclear fuel cycle and recommend a new plan. The BRC issued their final recommendations in January 2012. One of the main recommendations is for the United States to proceed promptly to develop one or more consolidated storage facilities (CSF) as part of an integrated, comprehensive plan for safely managing the back end of the nuclear fuel cycle. Based on its extensive experience in storage and transportation cask design, analysis, licensing, fabrication, and operations including transportation logistics, Transnuclear, Inc. (TN), an AREVA subsidiary within the Logistics Business Unit, is engineering an integrated system that will address the complete process of commercial UNFA management. The system will deal with UNFAs in their current storage mode in various configurations; the preparation, including handling and additional packaging where required, and transportation of UNFAs to a CSF site; and subsequent storage, operation and maintenance at the CSF with eventual transportation to a future repository or recycling site. It is essential to proceed by steps to ensure that the system will be the most efficient and best serve its purpose by defining: the problem to be resolved, the criteria to evaluate the solutions, and the alternative solutions. The complexity of the project is increasing with time (more fuel assemblies, new storage systems, deteriorating logistics infrastructure at some sites, etc.), but with the uncertainty on the final disposal path, flexibility and simplicity will be critical. (authors)
A potent approach for the development of FPGA based DAQ system for HEP experiments
NASA Astrophysics Data System (ADS)
Khan, Shuaib Ahmad; Mitra, Jubin; David, Erno; Kiss, Tivadar; Nayak, Tapan Kumar
2017-10-01
With ever increasing particle beam energies and interaction rates in modern High Energy Physics (HEP) experiments in the present and future accelerator facilities, there has always been the demand for robust Data Acquisition (DAQ) schemes which perform in the harsh radiation environment and handle high data volume. The scheme is required to be flexible enough to adapt to the demands of future detector and electronics upgrades, and at the same time keeping the cost factor in mind. To address these challenges, in the present work, we discuss an efficient DAQ scheme for error resilient, high speed data communication on commercially available state-of-the-art FPGA with optical links. The scheme utilises GigaBit Transceiver (GBT) protocol to establish radiation tolerant communication link between on-detector front-end electronics situated in harsh radiation environment to the back-end Data Processing Unit (DPU) placed in a low radiation zone. The acquired data are reconstructed in DPU which reduces the data volume significantly, and then transmitted to the computing farms through high speed optical links using 10 Gigabit Ethernet (10GbE). In this study, we focus on implementation and testing of GBT protocol and 10GbE links on an Intel FPGA. Results of the measurements of resource utilisation, critical path delays, signal integrity, eye diagram and Bit Error Rate (BER) are presented, which are the indicators for efficient system performance.
Deuterated silicon nitride photonic devices for broadband optical frequency comb generation
NASA Astrophysics Data System (ADS)
Chiles, Jeff; Nader, Nima; Hickstein, Daniel D.; Yu, Su Peng; Briles, Travis Crain; Carlson, David; Jung, Hojoong; Shainline, Jeffrey M.; Diddams, Scott; Papp, Scott B.; Nam, Sae Woo; Mirin, Richard P.
2018-04-01
We report and characterize low-temperature, plasma-deposited deuterated silicon nitride thin films for nonlinear integrated photonics. With a peak processing temperature less than 300 °C, it is back-end compatible with pre-processed CMOS substrates. We achieve microresonators with a quality factor of up to 1.6×10^6 at 1552 nm, and >1.2×10^6 throughout λ = 1510-1600 nm, without annealing or stress management. We then demonstrate the immediate utility of this platform in nonlinear photonics by generating a 1 THz free spectral range, 900-nm-bandwidth modulation-instability microresonator Kerr comb and octave-spanning, supercontinuum-broadened spectra.
Theory and simulation of backbombardment in single-cell thermionic-cathode electron guns
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edelen, J. P.; Biedron, S. G.; Harris, J. R.
This paper presents a comparison between simulation results and a first principles analytical model of electron back-bombardment developed at Colorado State University for single-cell, thermionic-cathode rf guns. While most previous work on back-bombardment has been specific to particular accelerator systems, this work is generalized to a wide variety of guns within the applicable parameter space. The merits and limits of the analytic model will be discussed. This paper identifies the three fundamental parameters that drive the back-bombardment process, and demonstrates relative accuracy in calculating the predicted back-bombardment power of a single-cell thermionic gun.
Theory and simulation of backbombardment in single-cell thermionic-cathode electron guns
Edelen, J. P.; Biedron, S. G.; Harris, J. R.; ...
2015-04-01
This paper presents a comparison between simulation results and a first principles analytical model of electron back-bombardment developed at Colorado State University for single-cell, thermionic-cathode rf guns. While most previous work on back-bombardment has been specific to particular accelerator systems, this work is generalized to a wide variety of guns within the applicable parameter space. The merits and limits of the analytic model will be discussed. This paper identifies the three fundamental parameters that drive the back-bombardment process, and demonstrates relative accuracy in calculating the predicted back-bombardment power of a single-cell thermionic gun.
Patton, Gail Y.; Torgerson, Darrel D.
1987-01-01
An alignment reference device provides a collimated laser beam that minimizes angular deviations therein. A laser beam source outputs the beam into a single mode optical fiber. The output end of the optical fiber acts as a source of radiant energy and is positioned at the focal point of a lens system where the focal point is positioned within the lens. The output beam reflects off a mirror back to the lens that produces a collimated beam.
Cloud-Based Distributed Control of Unmanned Systems
2015-04-01
during mission execution. At best, the data is saved onto hard-drives and is accessible only by the local team. Data history in a form available and... following open source technologies: GeoServer, OpenLayers, PostgreSQL, and PostGIS are chosen to implement the back-end database and server. A brief... geospatial map data. 3. PostgreSQL: An SQL-compliant object-relational database that easily scales to accommodate large amounts of data - upwards to
Integrating end-to-end threads of control into object-oriented analysis and design
NASA Technical Reports Server (NTRS)
Mccandlish, Janet E.; Macdonald, James R.; Graves, Sara J.
1993-01-01
Current object-oriented analysis and design methodologies fall short in their use of mechanisms for identifying threads of control for the system being developed. The scenarios which typically describe a system are more global than looking at the individual objects and representing their behavior. Unlike conventional methodologies that use data flow and process-dependency diagrams, object-oriented methodologies do not provide a model for representing these global threads end-to-end. Tracing through threads of control is key to ensuring that a system is complete and timing constraints are addressed. The existence of multiple threads of control in a system necessitates a partitioning of the system into processes. This paper describes the application and representation of end-to-end threads of control to the object-oriented analysis and design process using object-oriented constructs. The issue of representation is viewed as a grouping problem, that is, how to group classes/objects at a higher level of abstraction so that the system may be viewed as a whole with both classes/objects and their associated dynamic behavior. Existing object-oriented development methodology techniques are extended by adding design-level constructs termed logical composite classes and process composite classes. Logical composite classes are design-level classes which group classes/objects both logically and by thread of control information. Process composite classes further refine the logical composite class groupings by using process partitioning criteria to produce optimum concurrent execution results. The goal of these design-level constructs is to ultimately provide the basis for a mechanism that can support the creation of process composite classes in an automated way. Using an automated mechanism makes it easier to partition a system into concurrently executing elements that can be run in parallel on multiple processors.
Periodic venting of MABR lumen allows high removal rates and high gas-transfer efficiencies.
Perez-Calleja, P; Aybar, M; Picioreanu, C; Esteban-Garcia, A L; Martin, K J; Nerenberg, R
2017-09-15
The membrane-aerated biofilm reactor (MABR) is a novel treatment technology that employs gas-supplying membranes to deliver oxygen directly to a biofilm growing on the membrane surface. When operated with closed-end membranes, the MABR provides 100-percent oxygen transfer efficiencies (OTE), resulting in significant energy savings. However, closed-end MABRs are more sensitive to back-diffusion of inert gases, such as nitrogen. Back-diffusion reduces the average oxygen transfer rates (OTR), consequently decreasing the average contaminant removal fluxes (J). We hypothesized that venting the membrane lumen periodically would increase the OTR and J. Using an experimental flow cell and mathematical modeling, we showed that back-diffusion gas profiles developed over relatively long timescales. Thus, very short ventings could re-establish uniform gas profiles for relatively long time periods. Using modeling, we systematically explored the effect of the venting interval (time between ventings). At moderate venting intervals, opening the membrane for 20 s every 30 min, the venting significantly increased the average OTR and J without substantially impacting the OTEs. When the interval was short enough, in this case shorter than 20 min, the OTR was actually higher than for continuous open-end operation. Our results show that periodic venting is a promising strategy to combine the advantages of open-end and closed end operation, maximizing both the OTR and OTE. Copyright © 2017 Elsevier Ltd. All rights reserved.
Comparing 2-nt 3' overhangs against blunt-ended siRNAs: a systems biology based study.
Ghosh, Preetam; Dullea, Robert; Fischer, James E; Turi, Tom G; Sarver, Ronald W; Zhang, Chaoyang; Basu, Kalyan; Das, Sajal K; Poland, Bradley W
2009-07-07
In this study, we formulate a computational reaction model following a chemical kinetic theory approach to predict the binding rate constant for the siRNA-RISC complex formation reaction. The model allowed us to study the potency difference between 2-nt 3' overhangs against blunt-ended siRNA molecules in an RNA interference (RNAi) system. The rate constant predicted by this model was fed into a stochastic simulation of the RNAi system (using the Gillespie stochastic simulator) to study the overall potency effect. We observed that the stochasticity in the transcription/translation machinery has no observable effects in the RNAi pathway. Sustained gene silencing using siRNAs can be achieved only if there is a way to replenish the dsRNA molecules in the cell. Initial findings show about 1.5 times more blunt-ended molecules will be required to keep the mRNA at the same reduced level compared to the 2-nt overhang siRNAs. However, the mRNA levels jump back to saturation after a longer time when blunt-ended siRNAs are used. We found that the siRNA-RISC complex formation reaction rate was 2 times slower when blunt-ended molecules were used pointing to the fact that the presence of the 2-nt overhangs has a greater effect on the reaction in which the bound RISC complex cleaves the mRNA.
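The abstract above couples a predicted binding rate constant with a Gillespie stochastic simulation of the RNAi pathway. The following is a minimal Gillespie SSA sketch for a toy mRNA-silencing system, included only to illustrate the simulation technique; the species, reactions, and rate constants are hypothetical and are not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy RNAi-like system. State vector: [mRNA, siRNA-RISC]; rates are illustrative only.
k_transcribe, k_decay, k_cleave = 1.0, 0.05, 0.002

def propensities(x):
    mrna, risc = x
    return np.array([k_transcribe,             # -> mRNA (transcription)
                     k_decay * mrna,           # mRNA -> (natural decay)
                     k_cleave * mrna * risc])  # mRNA + RISC -> RISC (cleavage)

stoich = np.array([[+1, 0], [-1, 0], [-1, 0]])

def gillespie(x0, t_end):
    """Exact stochastic simulation: exponential waiting times, reactions picked by propensity."""
    t, x, traj = 0.0, np.array(x0, float), []
    while t < t_end:
        a = propensities(x)
        a0 = a.sum()
        if a0 == 0:
            break
        t += rng.exponential(1.0 / a0)
        x = x + stoich[rng.choice(len(a), p=a / a0)]
        traj.append((t, *x))
    return traj

trace = gillespie([200, 50], t_end=500.0)
```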
Comparing 2-nt 3' overhangs against blunt-ended siRNAs: a systems biology based study
Ghosh, Preetam; Dullea, Robert; Fischer, James E; Turi, Tom G; Sarver, Ronald W; Zhang, Chaoyang; Basu, Kalyan; Das, Sajal K; Poland, Bradley W
2009-01-01
In this study, we formulate a computational reaction model following a chemical kinetic theory approach to predict the binding rate constant for the siRNA-RISC complex formation reaction. The model allowed us to study the potency difference between 2-nt 3' overhangs against blunt-ended siRNA molecules in an RNA interference (RNAi) system. The rate constant predicted by this model was fed into a stochastic simulation of the RNAi system (using the Gillespie stochastic simulator) to study the overall potency effect. We observed that the stochasticity in the transcription/translation machinery has no observable effects in the RNAi pathway. Sustained gene silencing using siRNAs can be achieved only if there is a way to replenish the dsRNA molecules in the cell. Initial findings show about 1.5 times more blunt-ended molecules will be required to keep the mRNA at the same reduced level compared to the 2-nt overhang siRNAs. However, the mRNA levels jump back to saturation after a longer time when blunt-ended siRNAs are used. We found that the siRNA-RISC complex formation reaction rate was 2 times slower when blunt-ended molecules were used pointing to the fact that the presence of the 2-nt overhangs has a greater effect on the reaction in which the bound RISC complex cleaves the mRNA. PMID:19594876
A PML-FDTD ALGORITHM FOR SIMULATING PLASMA-COVERED CAVITY-BACKED SLOT ANTENNAS. (R825225)
A three-dimensional frequency-dependent finite-difference time-domain (FDTD) algorithm with perfectly matched layer (PML) absorbing boundary condition (ABC) and recursive convolution approaches is developed to model plasma-covered open-ended waveguide or cavity-backed slot antenn...
Conceptual Design of the ITER ECE Diagnostic - An Update
NASA Astrophysics Data System (ADS)
Austin, M. E.; Pandya, H. K. B.; Beno, J.; Bryant, A. D.; Danani, S.; Ellis, R. F.; Feder, R.; Hubbard, A. E.; Kumar, S.; Ouroua, A.; Phillips, P. E.; Rowan, W. L.
2012-09-01
The ITER ECE diagnostic has recently been through a conceptual design review for the entire system, including front-end optics, transmission line, and back-end instruments. The basic design of two viewing lines, each with a single ellipsoidal mirror focussing into the plasma near the midplane of the typical operating scenarios, is agreed upon. The location and design of the hot calibration source and the design of the shutter that directs its radiation to the transmission line are issues that need further investigation. In light of recent measurements and discussion, the design of the broadband transmission line is being revisited and new options contemplated. For the instruments, current systems for millimeter wave radiometers and broad-band spectrometers will be adequate for ITER, but the option for employing new state-of-the-art techniques will be left open.
Apparatus and method for detecting leaks in piping
Trapp, Donald J.
1994-01-01
A method and device for detecting the location of leaks along a wall or piping system, preferably in double-walled piping. The apparatus comprises a sniffer probe, a rigid cord such as a length of tube attached to the probe on one end and extending out of the piping at the other end, a source of pressurized air and a source of helium. The method comprises guiding the sniffer probe into the inner pipe to its distal end, purging the inner pipe with pressurized air, filling the annulus defined between the inner and outer pipe with helium, and then detecting the presence of helium within the inner pipe with the probe as it is pulled back through the inner pipe. The length of the tube at the point where a leak is detected determines the location of the leak in the pipe.
Bulk and integrated acousto-optic spectrometers for radio astronomy
NASA Technical Reports Server (NTRS)
Chin, G.; Buhl, D.; Florez, J. M.
1981-01-01
The development of sensitive heterodyne receivers (front end) in the centimeter and millimeter range, and the construction of sensitive RF spectrometers (back end) enable the spectral lines of interstellar molecules to be detected and identified. A technique was developed which combines acoustic bending of a collimated coherent light beam by a Bragg cell followed by detection by a sensitive array of photodetectors (thus forming an RF acousto-optic spectrometer (AOS). An AOS has wide bandwidth, large number of channels, and high resolution, and is compact, lightweight, and energy efficient. The thrust of receiver development is towards high frequency heterodyne systems, particularly in the millimeter, submillimeter, far infrared, and 10 micron spectral ranges.
A Reversible Light-Operated Nanovalve on Mesoporous Silica Nanoparticles
Tarn, Derrick; Ferris, Daniel P.; Barnes, Jonathan C.; Ambrogio, Michael W.; Stoddart, J. Fraser
2014-01-01
Two azobenzene α-cyclodextrin based nanovalves are designed, synthesized and assembled on mesoporous silica nanoparticles. Under aqueous conditions, the cyclodextrin cap is tightly bound to the azobenzene moiety and capable of holding back loaded cargo molecules. Upon irradiation with a near-UV laser, trans-to-cis photoisomerization of azobenzene initiates a dethreading process, which causes the cyclodextrin cap to unbind, followed by the release of cargo. The addition of a bulky stopper group to the end of the stalk allows this design to be reversible; complete dethreading of cyclodextrin as a result of unbinding with azobenzene is prevented as a consequence of steric interference. As a result, thermal relaxation of cis- to trans-azobenzene allows for the rebinding of cyclodextrin and resealing of the nanopores, a process which entraps the remaining cargo. Two stalks were designed with different lengths and tested with alizarin red S and propidium iodide. No cargo release was observed prior to light irradiation, and the system was capable of multiuse. On/off control was also demonstrated by monitoring the release of cargo when the light stimulus was applied and removed, respectively. PMID:24519642
Ask-the-Expert: Active Learning Based Knowledge Discovery Using the Expert
NASA Technical Reports Server (NTRS)
Das, Kamalika
2017-01-01
Often the manual review of large data sets, either for purposes of labeling unlabeled instances or for classifying meaningful results from uninteresting (but statistically significant) ones, is extremely resource intensive, especially in terms of subject matter expert (SME) time. Use of active learning has been shown to diminish this review time significantly. However, since active learning is an iterative process of learning a classifier based on a small number of SME-provided labels at each iteration, the lack of an enabling tool can hinder the adoption of these technologies in real life, in spite of their labor-saving potential. In this demo we present ASK-the-Expert, an interactive tool that allows SMEs to review instances from a data set and provide labels within a single framework. ASK-the-Expert is powered by an active learning algorithm for training a classifier in the back end. We demonstrate this system in the context of an aviation safety application, but the tool can be adapted to work as a simple review and labeling tool as well, without the use of active learning.
Effectiveness of back-to-back testing
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.; Eckhardt, David E.; Caglayan, Alper; Kelly, John P. J.
1987-01-01
Three models of back-to-back testing processes are described. Two models treat the case where there is no intercomponent failure dependence. The third model describes the more realistic case where there is correlation among the failure probabilities of the functionally equivalent components. The theory indicates that back-to-back testing can, under the right conditions, provide a considerable gain in software reliability. The models are used to analyze the data obtained in a fault-tolerant software experiment. It is shown that the expected gain is indeed achieved, and exceeded, provided the intercomponent failure dependence is sufficiently small. However, even with the relatively high correlation, the use of several functionally equivalent components coupled with back-to-back testing may provide a considerable reliability gain. Implications of this finding are that multiversion software development is a feasible and cost-effective approach to providing highly reliable software components intended for fault-tolerant software systems, on condition that special attention is directed at early detection and elimination of correlated faults.
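Back-to-back testing, as modelled above, runs functionally equivalent components on the same inputs and counts disagreements. A minimal sketch follows, assuming two hypothetical, independently written implementations of the same sorting function; a real multiversion experiment would compare independently developed components of the application under study.

```python
import random

def version_a(xs):
    """Reference implementation (hypothetical): built-in sort."""
    return sorted(xs)

def version_b(xs):
    """Functionally equivalent implementation (hypothetical): insertion sort."""
    out = []
    for v in xs:
        i = 0
        while i < len(out) and out[i] <= v:
            i += 1
        out.insert(i, v)
    return out

def back_to_back(test_count=10_000, seed=0):
    """Drive both versions with the same random inputs and count output disagreements."""
    rng = random.Random(seed)
    disagreements = 0
    for _ in range(test_count):
        data = [rng.randint(-100, 100) for _ in range(rng.randint(0, 20))]
        if version_a(data) != version_b(data):
            disagreements += 1
    return disagreements

print(back_to_back())   # any nonzero count flags a fault in at least one version
```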
Normative data on the n-back task for children and young adolescents.
Pelegrina, Santiago; Lechuga, M Teresa; García-Madruga, Juan A; Elosúa, M Rosa; Macizo, Pedro; Carreiras, Manuel; Fuentes, Luis J; Bajo, M Teresa
2015-01-01
The n-back task is a frequently used measure of working memory (WM) in cognitive neuroscience research contexts, and it has become widely adopted in other areas over the last decade. This study aimed to obtain normative data for the n-back task from a large sample of children and adolescents. To this end, a computerized verbal n-back task with three levels of WM load (1-back, 2-back, and 3-back) was administered to 3722 Spanish school children aged 7-13 years. Results showed an overall age-related increase in performance for the different levels of difficulty. This trend was less pronounced at 1-back than at 2-back when hits were considered. Gender differences were also observed, with girls outperforming boys although taking more time to respond. The theoretical implications of these results are discussed. Normative data stratified by age and gender for the three WM load levels are provided.
An R-peak detection method that uses an SVD filter and a search back system.
Jung, Woo-Hyuk; Lee, Sang-Goog
2012-12-01
In this paper, we present a method for detecting the R-peak of an ECG signal by using a singular value decomposition (SVD) filter and a search back system. The ECG signal was processed in two phases: the pre-processing phase and the decision phase. The pre-processing phase consisted of the stages for the SVD filter, Butterworth High Pass Filter (HPF), moving average (MA), and squaring, whereas the decision phase consisted of a single stage that detected the R-peak. In the pre-processing phase, the SVD filter removed noise while the Butterworth HPF eliminated baseline wander. The MA removed the remaining noise of the signal that had gone through the SVD filter to make the signal smooth, and squaring played a role in strengthening the signal. In the decision phase, the threshold was used to set the interval before detecting the R-peak. When the latest R-R interval (RRI), as suggested by Hamilton et al., was greater than 150% of the previous RRI, the method of detecting the R-peak in such an interval was modified to use 150% or more of the smaller of the two most recent RRIs. When the modified search back system was used, the error rate of the peak detection decreased to 0.29%, compared to 1.34% when the modified search back system was not used. Consequently, the sensitivity was 99.47%, the positive predictivity was 99.47%, and the detection error was 1.05%. Furthermore, the quality of the signal in data with a substantial amount of noise was improved, and thus the R-peak was detected effectively. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
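The following is a minimal sketch of a pre-processing chain (high-pass filtering, moving average, squaring) and a search-back rule of the kind described above. It omits the SVD denoising stage, and all cutoff frequencies, window lengths, and thresholds are illustrative rather than those of the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(ecg, fs):
    """High-pass to remove baseline wander, smooth with a moving average, then square."""
    b, a = butter(2, 5.0 / (fs / 2), btype="highpass")
    hp = filtfilt(b, a, ecg)
    win = max(1, int(0.05 * fs))
    ma = np.convolve(hp, np.ones(win) / win, mode="same")
    return ma ** 2

def detect_r_peaks(ecg, fs, thresh_frac=0.4):
    """Threshold-based peak picking with a simple search-back for missed beats."""
    sig = preprocess(ecg, fs)
    thr = thresh_frac * sig.max()
    refractory = int(0.2 * fs)
    peaks, last = [], -refractory
    for i in range(1, len(sig) - 1):
        if sig[i] > thr and sig[i] >= sig[i - 1] and sig[i] >= sig[i + 1] and i - last > refractory:
            peaks.append(i)
            last = i
    # Search-back: if an R-R interval exceeds 150% of the previous one,
    # re-scan that gap with a lower threshold for a missed peak.
    final = list(peaks)
    for k in range(2, len(peaks)):
        rri, prev = peaks[k] - peaks[k - 1], peaks[k - 1] - peaks[k - 2]
        if rri > 1.5 * prev:
            seg = sig[peaks[k - 1] + refractory: peaks[k] - refractory]
            if seg.size and seg.max() > 0.5 * thr:
                final.append(peaks[k - 1] + refractory + int(seg.argmax()))
    return sorted(final)
```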
On the self-organized critical state of Vesuvio volcano
NASA Astrophysics Data System (ADS)
Luongo, G.; Mazzarella, A.; Palumbo, A.
1996-01-01
The catalogue of volcanic earthquakes recorded at Vesuvio (1972-1993) is shown to be complete for events with magnitude between 1.8 and 3.0. Such a result is converted into significant fractal laws (power laws) relating the distribution of earthquakes to the distribution of energy release, seismic moment, size of fractured zone and linear dimension of faults. The application of the Cantor dust model to the time sequence of Vesuvio seismic and eruptive events allows the determination of significant time-clustering fractal structures. In particular, the Vesuvio eruptive activity shows a double-regime process with a stronger clustering on short-time scales than on long-time scales. The complexity of the Vesuvio system does not depend on the number of geological, geophysical and geochemical factors that govern it, but mainly on the number of their interconnections, on the intensity of such linkages and on the feed-back processes. So, all the identified fractal features are taken as evidence that the Vesuvio system is in a self-organized critical state, i.e., in a marginally stable state in which a small perturbation can start a chain reaction that can lead to catastrophe. After the catastrophe, the system regulates itself and begins a new cycle, not necessarily periodic, that will end with a successive catastrophe. The variations of the fractal dimension and of the specific scale ranges in which the fractal behaviour is found to hold serve as possible volcanic predictors reflecting changes of the same volcanic process.
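The fractal (power-law) relations cited above are typically estimated as straight lines in log-log space. A minimal sketch of fitting such an exponent over the stated completeness range (magnitudes 1.8 to 3.0) is shown below; the catalogue is synthetic and the binning choices are illustrative, not those of the study.

```python
import numpy as np

def power_law_exponent(magnitudes, m_min=1.8, m_max=3.0, n_bins=12):
    """Fit log10 N(>=m) = a - b*m over the completeness range; returns (a, b)."""
    m = np.asarray(magnitudes)
    m = m[(m >= m_min) & (m <= m_max)]
    edges = np.linspace(m_min, m_max, n_bins)
    counts = np.array([(m >= e).sum() for e in edges], dtype=float)
    keep = counts > 0
    slope, intercept = np.polyfit(edges[keep], np.log10(counts[keep]), 1)
    return intercept, -slope          # slope is negative; report the exponent as positive

rng = np.random.default_rng(1)
synthetic = 1.8 + rng.exponential(scale=0.4, size=2000)   # synthetic catalogue only
print(power_law_exponent(synthetic))
```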
High Temperatures Health Monitoring of the Condensed Water Height in Steam Pipe Systems
NASA Technical Reports Server (NTRS)
Lih, Shyh-Shiuh; Bar-Cohen, Yoseph; Lee, Hyeong Jae; Badescu, Mircea; Bao, Xiaoqi; Sherrit, Stewart; Takano, Nobuyuki; Ostlund, Patrick; Blosiu, Julian
2013-01-01
Ultrasonic probes were designed, fabricated and tested for a high temperature health monitoring system. The goal of this work was to develop a health monitoring system that can determine the height of the condensed water through the pipe wall at high temperatures up to 250 deg while accounting for the effects of surface perturbation. Among different ultrasonic probe designs, 2.25 MHz probes with an air-backed configuration provide satisfactory results in terms of sensitivity, receiving reflections from the target through the pipe wall. A series of tests was performed using the air-backed probes under irregular conditions, such as surface perturbation and surface disturbance at elevated temperature, to qualify the developed ultrasonic system. The results demonstrate that the fabricated air-backed probes combined with advanced signal processing techniques offer the capability of health monitoring of steam pipes under various operating conditions.
Status report of the SRT radiotelescope control software: the DISCOS project
NASA Astrophysics Data System (ADS)
Orlati, A.; Bartolini, M.; Buttu, M.; Fara, A.; Migoni, C.; Poppi, S.; Righini, S.
2016-08-01
The Sardinia Radio Telescope (SRT) is a 64-m fully-steerable radio telescope. It is provided with an active surface to correct for gravitational deformations, allowing observations from 300 MHz to 100 GHz. At present, three receivers are available: a coaxial LP-band receiver (305-410 MHz and 1.5-1.8 GHz), a C-band receiver (5.7-7.7 GHz) and a 7-feed K-band receiver (18-26.5 GHz). Several back-ends are also available in order to perform the different data acquisition and analysis procedures requested by scientific projects. The design and development of the SRT control software started in 2004, and now belongs to a wider project called DISCOS (Development of the Italian Single-dish COntrol System), which provides a common infrastructure to the three Italian radio telescopes (Medicina, Noto and SRT dishes). DISCOS is based on the ALMA Common Software (ACS) framework, and currently consists of more than 500k lines of code. It is organized in a common core and three specific product lines, one for each telescope. Recent developments, carried out after the conclusion of the technical commissioning of the instrument (October 2013), consisted of the addition of several new features in many parts of the observing pipeline, spanning from the motion control to the digital back-ends for data acquisition and data formatting; we briefly describe such improvements. More importantly, in the last two years we have supported the astronomical validation of the SRT radio telescope, leading to the opening of the first public call for proposals in late 2015. During this period, while assisting both the engineering and the scientific staff, we massively employed the control software and were able to test all of its features: in this process we received our first feedback from the users and we could verify how the system performed in a real-life scenario, drawing the first conclusions about the overall system stability and performance. We examine how the system behaves in terms of network load and system load, how it reacts to failures and errors, and what components and services seem to be the most critical parts of our architecture, showing how the ACS framework impacts on these aspects. Moreover, the exposure to public utilization has highlighted the major flaws in our development and software management process, which had to be tuned and improved in order to achieve faster release cycles in response to user feedback, and safer deploy operations. In this regard we show how the introduction of testing practices, along with continuous integration, helped us to meet higher quality standards. Having identified the most critical aspects of our software, we conclude by showing our intentions for the future development of DISCOS, both in terms of software features and software infrastructures.
5. INSTRUMENT ROOM INTERIOR, SHOWING BACKS OF CONSOLE LOCKERS. Looking ...
5. INSTRUMENT ROOM INTERIOR, SHOWING BACKS OF CONSOLE LOCKERS. Looking northeast to firing control room passageway. - Edwards Air Force Base, Air Force Rocket Propulsion Laboratory, Firing Control Building, Test Area 1-100, northeast end of Test Area 1-100 Road, Boron, Kern County, CA
NASA Astrophysics Data System (ADS)
Timoney, Padraig; Kagalwala, Taher; Reis, Edward; Lazkani, Houssam; Hurley, Jonathan; Liu, Haibo; Kang, Charles; Isbester, Paul; Yellai, Naren; Shifrin, Michael; Etzioni, Yoav
2018-03-01
In recent years, the combination of device scaling, complex 3D device architecture and tightening process tolerances has strained the capabilities of optical metrology tools to meet process needs. Two main categories of approaches have been taken to address the evolving process needs. In the first category, new hardware configurations are developed to provide more spectral sensitivity. Most of this category of work will enable next generation optical metrology tools to try to maintain pace with next generation process needs. In the second category, new innovative algorithms have been pursued to increase the value of the existing measurement signal. These algorithms aim to boost sensitivity to the measurement parameter of interest, while reducing the impact of other factors that contribute to signal variability but are not influenced by the process of interest. This paper will evaluate the suitability of machine learning to address high volume manufacturing metrology requirements in both front end of line (FEOL) and back end of line (BEOL) sectors from advanced technology nodes. In the FEOL sector, initial feasibility has been demonstrated for predicting downstream inline fin CD values from an earlier inline measurement using machine learning. In this study, OCD spectra were acquired after an etch process that occurs earlier in the process flow than where the inline CD is measured. The fin hard mask etch process is known to impact the downstream inline CD value. Figure 1 shows the correlation of predicted CD vs downstream inline CD measurement obtained after the training of the machine learning algorithm. For BEOL, machine learning is shown to provide an additional source of information in prediction of electrical resistance from structures that are not compatible with direct copper height measurement. Figure 2 compares the trench height correlation to electrical resistance (Rs) and the correlation of predicted Rs to the e-test Rs value for a far back end of line (FBEOL) metallization level across 3 products. In the case of product C, the predicted Rs correlation to the e-test value is significantly improved by utilizing spectra acquired at the e-test structure. This paper will explore the considerations required to enable use of machine learning derived metrology output for improved process monitoring and control. Further results from the FEOL and BEOL sectors will be presented, together with further discussion on future proliferation of machine learning based metrology solutions in high volume manufacturing.
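As an illustration of the supervised-regression approach described above (predicting a downstream inline CD from OCD spectra), the following sketch trains a generic regressor on hypothetical spectra; the data, feature dimensions, and choice of random-forest model are assumptions, not the production metrology algorithm.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)

# Hypothetical training set: each row is an OCD spectrum (signal vs. wavelength),
# each target the downstream inline fin CD (nm) measured later in the flow.
n_wafers, n_wavelengths = 300, 120
spectra = rng.normal(size=(n_wafers, n_wavelengths))
fin_cd = 12.0 + 0.8 * spectra[:, 40] - 0.3 * spectra[:, 85] + rng.normal(0, 0.1, n_wafers)

X_tr, X_te, y_tr, y_te = train_test_split(spectra, fin_cd, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out wafers:", round(model.score(X_te, y_te), 3))
```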
Magnetic profiling of the San Andreas Fault using a dual magnetometer UAV aerial survey system.
NASA Astrophysics Data System (ADS)
Abbate, J. A.; Angelopoulos, V.; Masongsong, E. V.; Yang, J.; Medina, H. R.; Moon, S.; Davis, P. M.
2017-12-01
Aeromagnetic survey methods using planes are more time-effective than hand-held methods, but can be far more expensive per unit area unless large areas are covered. The availability of low-cost UAVs and low-cost, lightweight fluxgate magnetometers (FGMs) allows, with proper offset determination and stray-field correction, for low-cost magnetic surveys. Towards that end, we have developed a custom multicopter UAV for magnetic mapping using a dual 3-axis fluxgate magnetometer system: the GEOphysical Drone Enhanced Survey Instrument (GEODESI). A high-precision sensor measures the UAV's position and attitude (roll, pitch, and yaw), which are recorded using a custom Arduino data processing system. The two FGMs (in-board and out-board) are placed on the two ends of a vertical 1 m boom attached to the base of the UAV. The in-board FGM is most sensitive to stray fields from the UAV, and its signal is used, after scaling, to clean the signal of the out-board FGM of the vehicle noise. The FGMs record three orthogonal components of the magnetic field in the UAV body coordinates, which are then transformed into a north-east-down coordinate system using a rotation matrix determined from the roll-pitch-yaw attitude data. This ensures knowledge of the direction of all three field components, enabling us to perform inverse modeling of magnetic anomalies with greater accuracy than the total or vertical field measurements used in the past. Field tests were performed at Dragon's Back Pressure Ridge in the Carrizo Plain of California, where there is a known crossing of the San Andreas Fault. Our data and models were compared to previously acquired LiDAR and hand-held magnetometer measurements. Further tests will be carried out to solidify our results and streamline our processing for educational use in the classroom and student field training.
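A minimal sketch of the two corrections described above is given below: an estimate of the vehicle stray field from the scaled difference of the in-board and out-board readings, followed by rotation of the body-frame components into north-east-down using the roll-pitch-yaw attitude. The gradiometric-style correction, the scale factor, and the sample values are assumptions for illustration, not the GEODESI processing chain.

```python
import numpy as np

def body_to_ned(roll, pitch, yaw):
    """Rotation matrix from body axes (x-forward, y-right, z-down) to NED, ZYX convention."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    return np.array([
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ])

def corrected_field_ned(b_out, b_in, k, roll, pitch, yaw):
    """Estimate the vehicle stray field from the sensor difference (the ambient field
    cancels in the difference), scale it by k, remove it from the out-board reading,
    then rotate the corrected body-frame vector into NED."""
    b_out, b_in = np.asarray(b_out, float), np.asarray(b_in, float)
    b_body = b_out - k * (b_in - b_out)
    return body_to_ned(roll, pitch, yaw) @ b_body

# Hypothetical sample: readings in nT, attitude in radians, k from a calibration flight
print(corrected_field_ned([23050., 1210., 42110.], [23400., 1180., 42900.],
                          k=0.35, roll=0.02, pitch=-0.05, yaw=1.1))
```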
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wojtczuk , S.
2011-06-01
Spire Semiconductor made concentrator photovoltaic (CPV) cells using a new bi-facial growth process and met both main program goals: a) 42.5% efficiency at 500X (AM1.5D, 25 C, 100 mW/cm2); and b) readiness to supply at least 3 MW/year of such cells at the end of the program. We explored a unique, simple fabrication process to make an N/P 3-junction InGaP/GaAs/InGaAs tandem cell. First, the InGaAs bottom cell is grown on the back of a GaAs wafer. The wafers are then loaded into a cassette, spin-rinsed to remove particles, dipped in dilute NH4OH and spin-dried. The wafers are then removed from the cassette and loaded into the reactor for GaAs middle and InGaP top cell growth on the opposite wafer face (bi-facial growth). By making the epitaxial growth process a bit more complex, we are able to avoid the more complex processing (such as large area wafer bonding or epitaxial liftoff) used in the inverted metamorphic (IMM) approach to make similar tandem stacks. We believe the yield is improved compared to an IMM process. After bi-facial epigrowth, standard III-V cell steps (back metal, photolithography for front grid, cap etch, AR coat, dice) are used in the remainder of the process.
NASA Astrophysics Data System (ADS)
Tsai, Yi-Pei; Hsieh, Ting-Huan; Lin, Chrong Jung; King, Ya-Chin
2017-09-01
A novel device for monitoring plasma-induced damage in the back-end-of-line (BEOL) process, with charge splitting capability, is proposed and demonstrated for the first time. This novel charge splitting in situ recorder (CSIR) can independently trace the amount and polarity of plasma charging effects during the manufacturing process of advanced fin field-effect transistor (FinFET) circuits. Not only does it reveal the real-time and in situ plasma charging levels on the antennas, but it also separates positive and negative charging effects and provides two independent readings. As CMOS technologies push for finer metal lines in the future, the new charge separation scheme provides a powerful tool for BEOL process optimization and further device reliability improvements.
Flows of engineered nanomaterials through the recycling process in Switzerland.
Caballero-Guzman, Alejandro; Sun, Tianyin; Nowack, Bernd
2015-02-01
The use of engineered nanomaterials (ENMs) in diverse applications has increased during the last years and this will likely continue in the near future. As the number of applications increases, more and more waste containing nanomaterials will be generated. A portion of this waste will enter the recycling system, for example, in electronic products, textiles and construction materials. The fate of these materials during and after the waste management and recycling operations is poorly understood. The aim of this work is to model the flows of nano-TiO2, nano-ZnO, nano-Ag and CNT in the recycling system in Switzerland. The basis for this study is published information on ENM flows in the Swiss system. We developed a method to assess their flows after recycling. To incorporate the uncertainties inherent to the limited information available, we applied a probabilistic material flow analysis approach. The results show that the recycling processes do not result in significant further propagation of nanomaterials into new products. Instead, the largest proportion will flow as waste that can subsequently be properly handled in incineration plants or landfills. Smaller fractions of ENMs will be eliminated or end up in materials that are sent abroad to undergo further recovery processes. Only a small amount of ENMs will flow back to the productive process of the economy in a limited number of sectors. Overall, the results suggest that risk assessment during recycling should focus on occupational exposure, release of ENMs in landfills and incineration plants, and toxicity assessment in a small number of recycled inputs. Copyright © 2014 Elsevier Ltd. All rights reserved.
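Probabilistic material flow analysis of the kind applied above propagates mass through compartments using transfer coefficients drawn from probability distributions. A minimal Monte Carlo sketch follows; the compartments, Dirichlet parameters, and input mass are hypothetical and do not reproduce the study's flows.

```python
import numpy as np

rng = np.random.default_rng(3)
n_runs = 10_000
input_mass = 100.0   # hypothetical tonnes of one ENM entering recycling per year

# Sample transfer coefficients that sum to 1 across destination compartments.
# Hypothetical split: recycling input -> (incineration, landfill, export, back to products)
tc = rng.dirichlet([50, 30, 10, 10], size=n_runs)
flows = input_mass * tc                       # per-run mass to each destination

destinations = ["incineration", "landfill", "export", "recycled products"]
for name, col in zip(destinations, flows.T):
    lo, hi = np.percentile(col, [2.5, 97.5])
    print(f"{name}: mean {col.mean():.1f} t/yr (95% interval {lo:.1f}-{hi:.1f})")
```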
Information adaptive system of NEEDS. [of NASA End to End Data System
NASA Technical Reports Server (NTRS)
Howle, W. M., Jr.; Kelly, W. L.
1979-01-01
The NASA End-to-End Data System (NEEDS) program was initiated by NASA to improve significantly the state of the art in acquisition, processing, and distribution of space-acquired data for the mid-1980s and beyond. The information adaptive system (IAS) is a program element under NEEDS Phase II which addresses sensor specific processing on board the spacecraft. The IAS program is a logical first step toward smart sensors, and IAS developments - particularly the system components and key technology improvements - are applicable to future smart efforts. The paper describes the design goals and functional elements of the IAS. In addition, the schedule for IAS development and demonstration is discussed.
To Hanoi and Back: The United States Air Force and North Vietnam, 1966-1973
2000-01-01
...Thomas J. Biersteker and Col. Herbert Y. Schandler, Argument Without End: In Search of Answers to the Vietnam Tragedy (New York, 1999), pp. 278-83.
A Module Experimental Process System Development Unit (MEPSDU). [flat plate solar arrays
NASA Technical Reports Server (NTRS)
1981-01-01
The development of a cost-effective process sequence that has the potential for the production of flat plate photovoltaic modules which meet the 1986 price goal of 70 cents or less per watt peak is described. The major accomplishments include (1) an improved AR coating technique; (2) the use of sand blast back clean-up to reduce clean-up costs and to allow much of the Al paste to serve as a back conductor; and (3) the development of wave soldering for use with solar cells. Cells were processed to evaluate different process steps, a cell and minimodule test plan was prepared, and data were collected for a preliminary SAMICS cost analysis.
MassCascade: Visual Programming for LC-MS Data Processing in Metabolomics.
Beisken, Stephan; Earll, Mark; Portwood, David; Seymour, Mark; Steinbeck, Christoph
2014-04-01
Liquid chromatography coupled to mass spectrometry (LC-MS) is commonly applied to investigate the small molecule complement of organisms. Several software tools are typically joined in custom pipelines to semi-automatically process and analyse the resulting data. General workflow environments like the Konstanz Information Miner (KNIME) offer the potential of an all-in-one solution to process LC-MS data by allowing easy integration of different tools and scripts. We describe MassCascade and its workflow plug-in for processing LC-MS data. The Java library integrates frequently used algorithms in a modular fashion, thus enabling it to serve as a back-end for graphical front-ends. The functions available in MassCascade have been encapsulated in a plug-in for the workflow environment KNIME, allowing combined use with e.g. statistical workflow nodes from other providers and making the tool intuitive to use without knowledge of programming. The design of the software guarantees a high level of modularity where processing functions can be quickly replaced or concatenated. MassCascade is an open-source library for LC-MS data processing in metabolomics. It embraces the concept of visual programming through its KNIME plug-in, simplifying the process of building complex workflows. The library was validated using open data.
Two Seating Systems' Effects on an Adolescent With Cerebral Palsy and Severe Scoliosis.
Lephart, Kim; Kaplan, Sandra L
2015-01-01
To compare physiological functioning, communication switch activation, and response accuracy in a 19-year-old young man with quadriplegic cerebral palsy and neurological scoliosis using 2 seating systems within the school setting. Prospective single-subject alternating treatment design with 2 conditions: baseline phase with standard planar inserts (A1), custom-molded back with original seat (B), and return to baseline (A2). Measures included oxygen saturation (SaO2), heart rate (HR), respiration rate (RR), body temperature (BT), processing time to activate switches, and response accuracy. SaO2 levels increased from "distressed" to "normal"; variability decreased. HR, RR, and BT fluctuations decreased with the custom-molded back. Processing time decreased with increased variability, affected by subject's motivation; accuracy improved slightly. Reported social approachability and student-initiated communication increased. SaO2 increased and HR, RR, and BT fluctuations decreased with a custom-molded back. Graphing data may help determine seating effect with complex clients.
Air flow quality analysis of modenas engine exhaust system
NASA Astrophysics Data System (ADS)
Shahriman A., B.; Mohamad Syafiq A., K.; Hashim, M. S. M.; Razlan, Zuradzman M.; Khairunizam W. A., N.; Hazry, D.; Afendi, Mohd; Daud, R.; Rahman, M. D. Tasyrif Abdul; Cheng, E. M.; Zaaba, S. K.
2017-09-01
The simulation process was conducted to determine the difference in air flow between the original exhaust system and the modified exhaust system. The simulations were conducted to investigate the flow distribution of exhaust gases, which will affect the performance of the engine. The back-flow pressure in the original exhaust system is predicted through this simulation. Design modifications to the exhaust port, exhaust pipe, and exhaust muffler were made during this simulation to reduce the back-flow effect. The new designs enlarge the diameter of the exhaust port, enlarge the diameter of the exhaust pipe, and introduce a new design for the exhaust muffler. Based on the results obtained, a pulsating flow forms at the original exhaust port, which increases the velocity and causes back pressure to occur. With the new exhaust port design, the velocity at the valve guide in the exhaust port is lower. The new muffler design shows that the streamlines of the exhaust flow move smoothly compared to the original muffler. This shows that, by using the modified exhaust system, the back pressure is reduced and the engine performance can be improved.
Hernandez, Alejandra; Gross, Karlie; Gombatto, Sara
2017-08-01
When functional movements are impaired in people with low back pain, they may be a contributing factor to chronicity and recurrence. The purpose of the current study was to examine lumbar spine, pelvis, and lower extremity kinematics during a step down functional task between people with and without a history of low back pain. A 3-dimensional motion capture system was used to analyze kinematics during a step down task. Total excursion of the lumbar spine, pelvis, and lower extremity segments in each plane were calculated from the start to end of the task. Separate analysis of variance tests (α=0.05) were conducted to determine the effect of independent variables of group and plane on lumbar spine, pelvis, and lower extremity kinematics. An exploratory analysis was conducted to examine kinematic differences among movement-based low back pain subgroups. Subjects with low back pain displayed less lumbar spine movement than controls across all three planes of movement (P-values=0.001-0.043). This group difference was most pronounced in the sagittal plane. For the lower extremity, subjects with low back pain displayed more frontal and axial plane knee movement than controls (P-values=0.001). There were no significant differences in kinematics among movement-based low back pain subgroups. People with low back pain displayed less lumbar region movement in the sagittal plane and more off-plane knee movements than the control group during a step down task. Clinicians can use this information when assessing lumbar spine and lower extremity movement during functional tasks, with the goal of developing movement-based interventions. Copyright © 2017 Elsevier Ltd. All rights reserved.
"Dead End Kids in Dead End Jobs"? Reshaping Debates on Young People in Jobs without Training
ERIC Educational Resources Information Center
Quinn, Jocey; Lawy, Robert; Diment, Kim
2008-01-01
Young people who are in "jobs without training" (JWT) are commonly seen as "dead end kids in dead end jobs". They have been identified as a problem group who need to be encouraged back into formal education and training. Following the Leitch report and the new policy goal to involve all young people in education and training up…
Just, Beth Haenke; Marc, David; Munns, Megan; Sandefer, Ryan
2016-01-01
Patient identification matching problems are a major contributor to data integrity issues within electronic health records. These issues impede the improvement of healthcare quality through health information exchange and care coordination, and contribute to deaths resulting from medical errors. Despite best practices in the area of patient access and medical record management to avoid duplicating patient records, duplicate records continue to be a significant problem in healthcare. This study examined the underlying causes of duplicate records using a multisite data set of 398,939 patient records with confirmed duplicates and analyzed multiple reasons for data discrepancies between those record matches. The field that had the greatest proportion of mismatches (nondefault values) was the middle name, accounting for 58.30 percent of mismatches. The Social Security number was the second most frequent mismatch, occurring in 53.54 percent of the duplicate pairs. The majority of the mismatches in the name fields were the result of misspellings (53.14 percent in first name and 33.62 percent in last name) or swapped last name/first name, first name/middle name, or last name/middle name pairs. The use of more sophisticated technologies is critical to improving patient matching. However, no amount of advanced technology or increased data capture will completely eliminate human errors. Thus, the establishment of policies and procedures (such as standard naming conventions or search routines) for front-end and back-end staff to follow is foundational for the overall data integrity process. Training staff on standard policies and procedures will result in fewer duplicates created on the front end and more accurate duplicate record matching and merging on the back end. Furthermore, monitoring, analyzing trends, and identifying errors that occur are proactive ways to identify data integrity issues. PMID:27134610
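A minimal sketch of how per-field mismatch proportions across confirmed duplicate pairs (as reported above for the middle name and Social Security number) might be tabulated is shown below; the field names, default-value rule, and toy record pair are assumptions.

```python
from collections import Counter

def field_mismatch_rates(duplicate_pairs, fields, defaults=("", None, "UNKNOWN")):
    """For each field, the share of confirmed-duplicate pairs whose non-default values differ."""
    mismatches, comparable = Counter(), Counter()
    for rec_a, rec_b in duplicate_pairs:
        for f in fields:
            a, b = rec_a.get(f), rec_b.get(f)
            if a in defaults or b in defaults:
                continue                       # skip blank/default values
            comparable[f] += 1
            if str(a).strip().lower() != str(b).strip().lower():
                mismatches[f] += 1
    return {f: mismatches[f] / comparable[f] for f in fields if comparable[f]}

# Toy confirmed-duplicate pair (hypothetical fields and values)
pairs = [({"first": "Jon", "middle": "A", "ssn": "123"},
          {"first": "John", "middle": "", "ssn": "123"})]
print(field_mismatch_rates(pairs, ["first", "middle", "ssn"]))
```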
Vandevijvere, Stefanie; Williams, Rachel; Tawfiq, Essa; Swinburn, Boyd
2017-11-14
This study developed a systems-based approach (called FoodBack) to empower citizens and change agents to create healthier community food places. Formative evaluations were held with citizens and change agents in six diverse New Zealand communities, supplemented by semi-structured interviews with 85 change agents in Auckland and Hamilton in 2015-2016. The emerging system was additionally reviewed by public health experts from diverse organizations. A food environments feedback system was constructed to crowdsource key indicators of the healthiness of diverse community food places (i.e. schools, hospitals, supermarkets, fast food outlets, sport centers) and outdoor spaces (i.e. around schools), comments/pictures about barriers and facilitators to healthy eating and exemplar stories on improving the healthiness of food environments. All the information collected is centrally processed and translated into 'short' (immediate) and 'long' (after analyses) feedback loops to stimulate actions to create healthier food places. FoodBack, as a comprehensive food environment feedback system (with evidence databases and feedback and recognition processes), has the potential to increase food sovereignty, and generate a sustainable, fine-grained database of food environments for real-time food policy research. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Towards a Reduced-Wire Interface for CMUT-Based Intravascular Ultrasound Imaging Systems
Lim, Jaemyung; Tekes, Coskun; Degertekin, F. Levent; Ghovanloo, Maysam
2016-01-01
Having intravascular ultrasound (IVUS) imaging capability on guide wires used in cardiovascular interventions may eliminate the need for separate IVUS catheters and expand the use of IVUS in a larger portion of the vasculature. High frequency capacitive micro machined ultrasonic transducer (CMUT) arrays should be integrated with interface electronics and placed on the guide wire for this purpose. Besides small size, this system-on-a-chip (SoC) front-end should connect to the back-end imaging system with a minimum number of wires to preserve the critical mechanical properties of the guide wire. We present a 40 MHz CMUT array interface SoC, which will eventually use only two wires for power delivery and transmits image data using a combination of analog-to-time conversion (ATC) and an impulse radio ultra-wideband (IR-UWB) wireless link. The proof-of-concept prototype ASIC consumes only 52.8 mW and occupies 4.07 mm2 in a 0.35-μm standard CMOS process. A rectifier and regulator power the rest of the SoC at 3.3 V from a 10 MHz power carrier that is supplied through a 2.4 m micro-coax cable with an overall efficiency of 49.1%. Echo signals from an 8-element CMUT array are amplified by a transimpedance amplifier (TIA) array and down-converted to baseband by quadrature sampling using a 40 MHz clock, derived from the power carrier. The ATC generates pulse-width-modulated (PWM) samples at 2 × 10 MS/s with 6 bit resolution, while the entire system achieved 5.1 ENOB. Preliminary images from the prototype system are presented, and alternative data transmission and possible future directions towards practical implementation are discussed. PMID:27662686
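For illustration of the quadrature down-conversion step mentioned above, the following sketch mixes a toy 40 MHz echo with in-phase and quadrature references and low-pass filters the products to recover the baseband envelope; the sample rate, filter, and signal are illustrative, and this models the operation in software rather than the ASIC's sampling circuit.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs, fc = 200e6, 40e6                       # illustrative sample rate and CMUT centre frequency
t = np.arange(0, 5e-6, 1 / fs)
echo = np.exp(-((t - 2e-6) / 0.4e-6) ** 2) * np.cos(2 * np.pi * fc * t)   # toy echo burst

# Mix with quadrature references, then low-pass to keep only the baseband components.
i_mix = echo * np.cos(2 * np.pi * fc * t)
q_mix = -echo * np.sin(2 * np.pi * fc * t)
b, a = butter(4, 10e6 / (fs / 2))
i_bb, q_bb = filtfilt(b, a, i_mix), filtfilt(b, a, q_mix)
envelope = 2 * np.sqrt(i_bb ** 2 + q_bb ** 2)   # factor 2 restores the original amplitude
```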
Towards a Reduced-Wire Interface for CMUT-Based Intravascular Ultrasound Imaging Systems.
Lim, Jaemyung; Tekes, Coskun; Degertekin, F Levent; Ghovanloo, Maysam
2017-04-01
Having intravascular ultrasound (IVUS) imaging capability on guide wires used in cardiovascular interventions may eliminate the need for separate IVUS catheters and expand the use of IVUS in a larger portion of the vasculature. High frequency capacitive micro machined ultrasonic transducer (CMUT) arrays should be integrated with interface electronics and placed on the guide wire for this purpose. Besides small size, this system-on-a-chip (SoC) front-end should connect to the back-end imaging system with a minimum number of wires to preserve the critical mechanical properties of the guide wire. We present a 40 MHz CMUT array interface SoC, which will eventually use only two wires for power delivery and transmits image data using a combination of analog-to-time conversion (ATC) and an impulse radio ultra-wideband (IR-UWB) wireless link. The proof-of-concept prototype ASIC consumes only 52.8 mW and occupies 4.07 mm2 in a 0.35-μm standard CMOS process. A rectifier and regulator power the rest of the SoC at 3.3 V from a 10 MHz power carrier that is supplied through a 2.4 m micro-coax cable with an overall efficiency of 49.1%. Echo signals from an 8-element CMUT array are amplified by a transimpedance amplifier (TIA) array and down-converted to baseband by quadrature sampling using a 40 MHz clock, derived from the power carrier. The ATC generates pulse-width-modulated (PWM) samples at 2 × 10 MS/s with 6 bit resolution, while the entire system achieved 5.1 ENOB. Preliminary images from the prototype system are presented, and alternative data transmission and possible future directions towards practical implementation are discussed.
View of Commander (CDR) Scott Altman working on the Flight Deck
2009-05-21
S125-E-013081 (21 May 2009) --- Occupying the commander's station, astronaut Scott Altman, STS-125 commander, uses the Portable In-Flight Landing Operations Trainer (PILOT) on the flight deck of the Earth-orbiting Space Shuttle Atlantis. PILOT consists of a laptop computer and a joystick system, which helps to maintain a high level of proficiency for the end-of-mission approach and landing tasks required to bring the shuttle safely back to Earth.
View of STS-125 Crew Members working on the Flight Deck
2009-05-21
S125-E-013050 (21 May 2009) --- Occupying the commander's station, astronaut Gregory C. Johnson, STS-125 pilot, uses the Portable In-Flight Landing Operations Trainer (PILOT) on the flight deck of the Earth-orbiting Space Shuttle Atlantis. PILOT consists of a laptop computer and a joystick system, which helps to maintain a high level of proficiency for the end-of-mission approach and landing tasks required to bring the shuttle safely back to Earth.
View of Pilot Gregory Johnson working on the Flight Deck
2009-05-21
S125-E-013040 (21 May 2009) --- Occupying the commander's station, astronaut Gregory C. Johnson, STS-125 pilot, uses the Portable In-Flight Landing Operations Trainer (PILOT) on the flight deck of the Earth-orbiting Space Shuttle Atlantis. PILOT consists of a laptop computer and a joystick system, which helps to maintain a high level of proficiency for the end-of-mission approach and landing tasks required to bring the shuttle safely back to Earth.
A web system of virtual morphometric globes for Mars and the Moon
NASA Astrophysics Data System (ADS)
Florinsky, I. V.; Garov, A. S.; Karachevtseva, I. P.
2018-09-01
We developed a web system of virtual morphometric globes for Mars and the Moon. As the initial data, we used 15-arc-minute gridded global digital elevation models (DEMs) extracted from the Mars Orbiter Laser Altimeter (MOLA) and the Lunar Orbiter Laser Altimeter (LOLA) gridded archives. We derived global digital models of sixteen morphometric variables including horizontal, vertical, minimal, and maximal curvatures, as well as catchment area and topographic index. The morphometric models were integrated into the web system developed as a distributed application consisting of a client front-end and a server back-end. The following main functions are implemented in the system: (1) selection of a morphometric variable; (2) two-dimensional visualization of a calculated global morphometric model; (3) 3D visualization of a calculated global morphometric model on the sphere surface; (4) change of a globe scale; and (5) globe rotation by an arbitrary angle. Free, real-time web access to the system is provided. The web system of virtual morphometric globes can be used for geological and geomorphological studies of Mars and the Moon at the global, continental, and regional scales.
Demonstration of the Potential of Magnetic Tunnel Junctions for a Universal RAM Technology
NASA Astrophysics Data System (ADS)
Gallagher, William J.
2000-03-01
Over the past four years, tunnel junctions with magnetic electrodes have emerged as promising devices for future magnetoresistive sensing and for information storage. This talk will review advances in these devices, focusing particularly on the use of magnetic tunnel junctions for magnetic random access memory (MRAM). Exchange-biased versions of magnetic tunnel junctions (MTJs) in particular will be shown to have useful properties for forming magnetic memory storage elements in a novel cross-point architecture. Exchange-biased MTJ elements have been made with areas as small as 0.1 square microns and have shown magnetoresistance values exceeding 40%. The potential of exchange-biased MTJs for MRAM has been most seriously explored in a demonstration experiment involving the integration of 0.25 micron CMOS technology with a special magnetic tunnel junction "back end." The magnetic back end is based upon multi-layer magnetic tunnel junction growth technology which was developed using research-scale equipment and one-inch size substrates. For the demonstration, the CMOS wafers processed through two metal layers were cut into one-inch squares for depositions of bottom-pinned exchange-biased magnetic tunnel junctions. The samples were then processed through four additional lithographic levels to complete the circuits. The demonstration focused attention on a number of processing and device issues that were addressed successfully enough that key performance aspects of MTJ MRAM were demonstrated in 1 K bit arrays, including reads and writes in less than 10 ns and nonvolatility. While other key issues remain to be addressed, these results suggest that MTJ MRAM might simultaneously provide much of the functionality now provided separately by SRAM, DRAM, and NVRAM.
High stability wavefront reference source
Feldman, M.; Mockler, D.J.
1994-05-03
A thermally and mechanically stable wavefront reference source which produces a collimated output laser beam is disclosed. The output beam comprises substantially planar reference wavefronts which are useful for aligning and testing optical interferometers. The invention receives coherent radiation from an input optical fiber, directs a diverging input beam of the coherent radiation to a beam folding mirror (to produce a reflected diverging beam), and collimates the reflected diverging beam using a collimating lens. In a class of preferred embodiments, the invention includes a thermally and mechanically stable frame comprising rod members connected between a front end plate and a back end plate. The beam folding mirror is mounted on the back end plate, and the collimating lens mounted to the rods between the end plates. The end plates and rods are preferably made of thermally stable metal alloy. Preferably, the input optical fiber is a single mode fiber coupled to an input end of a second single mode optical fiber that is wound around a mandrel fixedly attached to the frame of the apparatus. The output end of the second fiber is cleaved so as to be optically flat, so that the input beam emerging therefrom is a nearly perfect diverging spherical wave. 7 figures.
High stability wavefront reference source
Feldman, Mark; Mockler, Daniel J.
1994-01-01
A thermally and mechanically stable wavefront reference source which produces a collimated output laser beam. The output beam comprises substantially planar reference wavefronts which are useful for aligning and testing optical interferometers. The invention receives coherent radiation from an input optical fiber, directs a diverging input beam of the coherent radiation to a beam folding mirror (to produce a reflected diverging beam), and collimates the reflected diverging beam using a collimating lens. In a class of preferred embodiments, the invention includes a thermally and mechanically stable frame comprising rod members connected between a front end plate and a back end plate. The beam folding mirror is mounted on the back end plate, and the collimating lens mounted to the rods between the end plates. The end plates and rods are preferably made of thermally stable metal alloy. Preferably, the input optical fiber is a single mode fiber coupled to an input end of a second single mode optical fiber that is wound around a mandrel fixedly attached to the frame of the apparatus. The output end of the second fiber is cleaved so as to be optically flat, so that the input beam emerging therefrom is a nearly perfect diverging spherical wave.
Prue-Owens, Kathy; Watkins, Miko; Wolgast, Kelly A
2011-01-01
The Patient CaringTouch System emerged from a comprehensive assessment and gap analysis of clinical nursing capabilities in the Army. The Patient CaringTouch System now provides the framework and set of standards by which we drive excellence in quality nursing care for our patients and excellence in quality of life for our nurses in Army Medicine. As part of this enterprise transformation, we placed particular emphasis on the delivery of nursing care at the bedside as well as the integration of a formal professional peer feedback process in support of individual nurse practice enhancement. The Warrior Care Imperative Action Team was chartered to define and establish the standards for care teams in the clinical settings and the process by which we established formal peer feedback for our professional nurses. This back-to-basics approach is a cornerstone of the Patient CaringTouch System implementation and sustainment.
An end-to-end communications architecture for condition-based maintenance applications
NASA Astrophysics Data System (ADS)
Kroculick, Joseph
2014-06-01
This paper explores challenges in implementing an end-to-end communications architecture for Condition-Based Maintenance Plus (CBM+) data transmission which aligns with the Army's Network Modernization Strategy. The Army's Network Modernization strategy is based on rolling out network capabilities which connect the smallest unit and Soldier level to enterprise systems. CBM+ is a continuous improvement initiative over the life cycle of a weapon system or equipment to improve the reliability and maintenance effectiveness of Department of Defense (DoD) systems. CBM+ depends on the collection, processing and transport of large volumes of data. An important capability that enables CBM+ is an end-to-end network architecture that enables data to be uploaded from the platform at the tactical level to enterprise data analysis tools. To connect end-to-end maintenance processes in the Army's supply chain, a CBM+ network capability can be developed from available network capabilities.
VGOS Operations and Geodetic Results
NASA Astrophysics Data System (ADS)
Niell, Arthur E.; Beaudoin, Christopher J.; Bolotin, Sergei; Cappallo, Roger J.; Corey, Brian E.; Gipson, John; Gordon, David; McWhirter, Russell; Ruszczyk, Chester A.; SooHoo, Jason
2014-12-01
Over the past two years the first VGOS geodetic results were obtained using the GGAO12M and Westford broadband systems that have been developed under NASA sponsorship and funding. These observations demonstrated full broadband operation, from data acquisition through correlation, delay extraction, and baseline estimation. The May 2013 24-hour session proceeded almost without human intervention in anticipation of the goal of unattended operation. A recent test observation successfully demonstrated the use of what is expected to be the operational version of the RDBE digital back end and the Mark 6 system on which the outputs of four RDBEs, each processing one RF band, were recorded on a single module at eight gigabits per second. The complex-sample VDIF data from GGAO12M and Westford were cross-correlated on the Haystack DiFX software correlator, and the instrumental delay was calculated from all of the phase calibration tones in each channel. A minimum redundancy frequency sequence (1, 2, 4, 6, 9, 13, 14, 15) was utilized to minimize the first sidelobes of the multiband delay resolution function.
Just working with the cellular machine: A high school game for teaching molecular biology.
Cardoso, Fernanda Serpa; Dumpel, Renata; da Silva, Luisa B Gomes; Rodrigues, Carlos R; Santos, Dilvani O; Cabral, Lucio Mendes; Castro, Helena C
2008-03-01
Molecular biology is a difficult subject to comprehend due to its high complexity, thus requiring new teaching approaches. Herein, we developed an interdisciplinary board game involving the human immune system response against a bacterial infection for teaching molecular biology at high school. Initially, we created a database with several questions and a game story that invites the students to help the human immune system produce antibodies (IgG) and fight back a second invasion by a pathogenic bacterium. The game involves answering questions while completing the game board, on which the antibodies "are synthesized" through the molecular biology process. At the end, a problem-based learning approach is used, and a last question is raised about proteins. Biology teachers and high school students evaluated the game and considered it an easy and interesting tool for teaching the theme. An increase of about 5-30% in correct answers to molecular biology questions revealed that the game improves learning and induced a more engaged and proactive learning profile in the high school students. Copyright © 2008 International Union of Biochemistry and Molecular Biology, Inc.
Using task analysis to understand the Data System Operations Team
NASA Technical Reports Server (NTRS)
Holder, Barbara E.
1994-01-01
The Data Systems Operations Team (DSOT) currently monitors the Multimission Ground Data System (MGDS) at JPL. The MGDS currently supports five spacecraft and within the next five years, it will support ten spacecraft simultaneously. The ground processing element of the MGDS consists of a distributed UNIX-based system of over 40 nodes and 100 processes. The MGDS system provides operators with little or no information about the system's end-to-end processing status or end-to-end configuration. The lack of system visibility has become a critical issue in the daily operation of the MGDS. A task analysis was conducted to determine what kinds of tools were needed to provide DSOT with useful status information and to prioritize the tool development. The analysis provided the formality and structure needed to get the right information exchange between development and operations. How even a small task analysis can improve developer-operator communications is described, and the challenges associated with conducting a task analysis in a real-time mission operations environment are examined.
Transfer after Dual n-Back Training Depends on Striatal Activation Change.
Salminen, Tiina; Kühn, Simone; Frensch, Peter A; Schubert, Torsten
2016-09-28
The dual n-back working memory (WM) training paradigm (comprising auditory and visual stimuli) has gained much attention since studies have shown widespread transfer effects. By including a multimodal dual-task component, the task is demanding to the human cognitive system. We investigated whether dual n-back training improves general cognitive resources or a task-specific WM updating process in participants. We expected: (1) widespread transfer effects and the recruitment of a common neuronal network by the training and the transfer tasks and (2) narrower transfer results and that a common activation network alone would not produce transfer, but instead an activation focus on the striatum, which is associated with WM updating processes. The training group showed transfer to an untrained dual-modality WM updating task, but not to single-task versions of the training or the transfer task. They also showed diminished neuronal overlap between the training and the transfer task from pretest to posttest and an increase in striatal activation in both tasks. Furthermore, we found an association between the striatal activation increase and behavioral improvement. The control groups showed no transfer and no change in the amount of activation overlap or in striatal activation from pretest to posttest. We conclude that, instead of improving general cognitive resources (which would have required a transfer effect to all transfer tasks and that a frontal activation overlap between the tasks produced transfer), dual n-back training improved a task-specific process: WM updating of stimuli from two modalities. The current study allows for a better understanding of the cognitive and neural effects of working memory (WM) training and transfer. It shows that dual n-back training mainly improves specific processes of WM updating, and this improvement leads to narrow transfer effects to tasks involving the same processes. On a neuronal level this is accompanied by increased neural activation in the striatum that is related to WM updating. The current findings challenge the view that dual n-back training provokes a general boosting of the WM system and of its neural underpinnings located in frontoparietal brain regions. Instead, the findings imply the relevance of task-specific brain regions which are involved in important cognitive processes during training and transfer tasks. Copyright © 2016 the authors 0270-6474/16/3610198-16$15.00/0.
Overview of waste reduction techniques leading to pollution prevention
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunt, G.E.
Liquid, solid, and/or gaseous waste materials are always generated during the manufacture of any product. In addition to creating environmental hazards, these wastes represent losses of valuable materials and energy from the production process and require a significant investment in pollution control. Traditionally, pollution control relies on "end-of-the-pipe" and "out-the-back-door" management approaches that require labor hours, energy, materials, and capital expenditures. Such an approach removes pollutants from one source, such as wastewater, but places them somewhere else, such as in a landfill. More regulations, higher disposal expenses, increased liability costs, and increased public awareness have caused industrial and governmental leaders to begin critical examinations of end-of-the-pipe control technologies. The value of reducing waste during the manufacturing process has become apparent to many industries. These companies are looking at broader environmental management objectives, rather than concentrating solely on pollution control. Waste reduction is very often economically beneficial for an industry, and it also improves the quality of the environment.
CamBAfx: Workflow Design, Implementation and Application for Neuroimaging
Ooi, Cinly; Bullmore, Edward T.; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John
2009-01-01
CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs. PMID:19826470
NASA Technical Reports Server (NTRS)
Powell, E. A.; Zinn, B. T.
1973-01-01
An analytical technique is developed to solve nonlinear three-dimensional, transverse and axial combustion instability problems associated with liquid-propellant rocket motors. The Method of Weighted Residuals is used to determine the nonlinear stability characteristics of a cylindrical combustor with uniform injection of propellants at one end and a conventional DeLaval nozzle at the other end. Crocco's pressure sensitive time-lag model is used to describe the unsteady combustion process. The developed model predicts the transient behavior and nonlinear wave shapes as well as limit-cycle amplitudes and frequencies typical of unstable motor operation. The limit-cycle amplitude increases with increasing sensitivity of the combustion process to pressure oscillations. For transverse instabilities, calculated pressure waveforms exhibit sharp peaks and shallow minima, and the frequency of oscillation is within a few percent of the pure acoustic mode frequency. For axial instabilities, the theory predicts a steep-fronted wave moving back and forth along the combustor.
Uranium nitride fuel fabrication for SP-100 reactors
NASA Technical Reports Server (NTRS)
Mason, Richard E.; Chidester, Kenneth M.; Hoth, Carl W.; Matthews, Bruce R.
1987-01-01
Fuel pins of uranium mononitride clad in Nb-1 percent Zr were fabricated for irradiation tests in EBR-II. Laboratory scale process parameters to synthesize UN powders and fabricate UN pellets were developed. Uranium mononitride was prepared by converting UO2 to UN. Fuel pellets were prepared by comminution of UN briquettes, uniaxial pressing, and high temperature sintering. Techniques for machining, cleaning, and welding Nb-1 percent Zr cladding components were developed. End caps were electron beam welded to the tubing. Helium back-fill holes were sealed with a laser weld.
Uranium nitride fuel fabrication for SP-100 reactors
NASA Astrophysics Data System (ADS)
Mason, Richard E.; Chidester, Kenneth M.; Hoth, Carl W.; Matthews, Bruce R.
Fuel pins of uranium mononitride clad in Nb-1 percent Zr were fabricated for irradiation tests in EBR-II. Laboratory scale process parameters to synthesize UN powders and fabricate UN pellets were developed. Uranium mononitride was prepared by converting UO2 to UN. Fuel pellets were prepared by comminution of UN briquettes, uniaxial pressing, and high temperature sintering. Techniques for machining, cleaning, and welding Nb-1 percent Zr cladding components were developed. End caps were electron beam welded to the tubing. Helium back-fill holes were sealed with a laser weld.
15 pixels digital autocorrelation spectrometer system
NASA Astrophysics Data System (ADS)
Lee, Changhoon; Kim, Hyo-Ryung; Kim, Kwang-Dong; Chung, Mun-Hee; Timoc, C.
2006-06-01
This paper describes the system configuration and some performance test results of the 15-pixel digital autocorrelation spectrometer to be used at the Taeduk Radio Astronomy Observatory (TRAO) of Korea. The autocorrelation spectrometer instrument is enclosed in a 3-slot VXI module and controlled via a USB port by a back-end PC. The spectrometer system consists of a unit of four band-pass filters, the digitizer, the 512-lag correlator, the clock distribution unit, and a USB controller. We also describe the frequency accuracy and the root-mean-square noise characteristics of the spectrometer. After a calibration procedure, this spectrometer can be used as the back-end system at TRAO for the 3x5 focal plane array receivers.
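For readers unfamiliar with lag correlators, the sketch below illustrates, under simplifying assumptions, what a 512-lag correlator computes in principle: an autocorrelation function whose Fourier transform yields a spectral estimate (Wiener-Khinchin). It is not the TRAO hardware design; the test signal and parameters are arbitrary.

```python
# Minimal sketch of what a lag correlator computes: the autocorrelation of a
# digitized signal over a fixed number of lags, whose Fourier transform gives
# a power-spectrum estimate. The 512-lag count mirrors the instrument above.
import numpy as np

def autocorrelation(signal, n_lags=512):
    x = signal - signal.mean()
    return np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(n_lags)])

rng = np.random.default_rng(0)
t = np.arange(65536)
tone = np.sin(2 * np.pi * 0.05 * t) + rng.normal(0, 1.0, t.size)  # tone buried in noise
acf = autocorrelation(tone)
spectrum = np.abs(np.fft.rfft(acf))          # spectral estimate from the lags
print("peak channel:", int(np.argmax(spectrum[1:]) + 1))
```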
Jäger, Jörg M; Schöllhorn, Wolfgang I
2012-04-01
Offensive and defensive systems of play represent important aspects of team sports. They include the players' positions at certain situations during a match, i.e., when players have to be on specific positions on the court. Patterns of play emerge based on the formations of the players on the court. Recognition of these patterns is important to react adequately and to adjust own strategies to the opponent. Furthermore, the ability to apply variable patterns of play seems to be promising since they make it harder for the opponent to adjust. The purpose of this study is to identify different team tactical patterns in volleyball and to analyze differences in variability. Overall 120 standard situations of six national teams in women's volleyball are analyzed during a world championship tournament. Twenty situations from each national team are chosen, including the base defence position (start configuration) and the two players block with middle back deep (end configuration). The shapes of the defence formations at the start and end configurations during the defence of each national team as well as the variability of these defence formations are statistically analyzed. Furthermore these shapes data are used to train multilayer perceptrons in order to test whether artificial neural networks can recognize the teams by their tactical patterns. Results show significant differences between the national teams in both the base defence position at the start and the two players block with middle back deep at the end of the standard defence situation. Furthermore, the national teams show significant differences in variability of the defence systems and start-positions are more variable than the end-positions. Multilayer perceptrons are able to recognize the teams at an average of 98.5%. It is concluded that defence systems in team sports are highly individual at a competitive level and variable even in standard situations. Artificial neural networks can be used to recognize teams by the shapes of the players' configurations. These findings support the concept that tactics and strategy have to be adapted for the team and need to be flexible in order to be successful. Copyright © 2010 Elsevier B.V. All rights reserved.
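As a rough illustration of the recognition step, the sketch below trains a multilayer perceptron on synthetic "formation shape" vectors. Scikit-learn's MLPClassifier and the fabricated data are assumptions standing in for the study's actual digitized player positions and network implementation.

```python
# Hypothetical sketch: recognize teams from defence-formation shape vectors
# with a multilayer perceptron. Data here are synthetic stand-ins.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_teams, situations_per_team, n_features = 6, 20, 12  # e.g. x/y of six players

# Fake "shape" vectors: each team gets its own cluster centre.
centres = rng.normal(size=(n_teams, n_features))
X = np.vstack([c + 0.1 * rng.normal(size=(situations_per_team, n_features))
               for c in centres])
y = np.repeat(np.arange(n_teams), situations_per_team)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    stratify=y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("recognition accuracy:", clf.score(X_test, y_test))
```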
Some fundamental questions about the evolution of the Sea of Japan back-arc
NASA Astrophysics Data System (ADS)
Van Horne, A.; Sato, H.; Ishiyama, T.
2016-12-01
The Japanese island arc separated from Asia through the rifting of an active continental margin, and the opening of the Sea of Japan back-arc, in the middle Miocene. Due to its complex tectonic setting, the Sea of Japan back-arc was affected by multiple external events contemporary with its opening, including a plate reorganization, the opening of at least two other nearby back-arcs (Shikoku Basin and Okhotsk Sea/Kuril Basin), and two separate arc-arc collisions, involving encroachment upon Japan of the Izu-Bonin and Kuril arcs. Recent tectonic inversion has exposed entire sequences of back-arc structure on land, which remain virtually intact because of the short duration of inversion. Japan experiences a high level of seismic activity due to its position on the overriding plate of an active subduction margin. Continuous geophysical monitoring via a dense nationwide seismic/geodetic network, and a program of controlled-source refraction/wide-angle reflection profiling, directed towards earthquake hazard mitigation, have made it the repository of a rich geophysical data set through which to understand the processes that have shaped back-arc development. Timing, structural evolution, and patterns of magmatic activity during back-arc opening in the Sea of Japan were established by earlier investigations, but fundamental questions regarding back-arc development remain outstanding. These include (1) timing of the arrival of the Philippine Sea plate in southwest Japan, (2) the nature of the plate boundary prior to its arrival, (3) the pre-rift location of the Japanese island arc when it was attached to Asia, (4) the mechanism of back-arc opening (pull-apart or trench retreat), (5) the speed of opening, (6) simultaneous or sequential development of the multi-rift system, (7) the origin of the anomalously thick Yamato Basin ocean crust, and (8) the pattern of concentrated deformation in the failed-rift system of the eastern Sea of Japan since tectonic inversion. Resolving uncertainties like those posed here will be necessary for a more complete understanding of the nature of and processes involved in back-arc development in the Sea of Japan.
On I/O Virtualization Management
NASA Astrophysics Data System (ADS)
Danciu, Vitalian A.; Metzker, Martin G.
The quick adoption of virtualization technology in general and the advent of the Cloud business model entail new requirements on the structure and the configuration of back-end I/O systems. Several approaches to virtualization of I/O links are being introduced, which aim at implementing a more flexible I/O channel configuration without compromising performance. While previously the management of I/O devices could be limited to basic technical requirements (e.g. the establishment and termination of fixed-point links), the additional flexibility carries in its wake additional management requirements on the representation and control of I/O sub-systems.
Noninvasive methods for dynamic mapping of microbial populations across the landscape
NASA Astrophysics Data System (ADS)
Meredith, L. K.; Sengupta, A.; Troch, P. A.; Volkmann, T. H. M.
2017-12-01
Soil microorganisms drive key ecosystem processes, and yet characterizing their distribution and activity in soil has been notoriously difficult. This is due, in part, to the heterogeneous nature of their response to changing environmental and nutrient conditions across time and space. These dynamics are challenging to constrain in both natural and experimental systems because of sampling difficulty and constraints. For example, soil microbial sampling at the Landscape Evolution Observatory (LEO) infrastructure in Biosphere 2 is limited in efforts to minimize soil disruption to the long term experiment that aims to characterize the interacting biological, hydrological, and geochemical processes driving soil evolution. In this and other systems, new methods are needed to monitor soil microbial communities and their genetic potential over time. In this study, we take advantage of the well-defined boundary conditions on hydrological flow at LEO to develop a new method to nondestructively characterize in situ microbial populations. In our approach, we sample microbes from the seepage flow at the base of each of three replicate LEO hillslopes and use hydrological models to 'map back' in situ microbial populations. Over the course of a 3-month periodic rainfall experiment we collected samples from the LEO outflow for DNA extraction and microbial community composition analysis. These data will be used to describe changes in microbial community composition over the course of the experiment. In addition, we will use hydrological flow models to identify the changing source region of discharge water over the course of periodic rainfall pulses, thereby mapping back microbial populations onto their geographic origin in the slope. These predictions of in situ microbial populations will be ground-truthed against those derived from destructive soil sampling at the beginning and end of the rainfall experiment. Our results will show the suitability of this method for long-term, non-destructive monitoring of the microbial communities that contribute to soil evolution in this large-scale model system. Furthermore, this method may be useful for other study systems with limitations to destructive sampling, including other model infrastructures and natural landscapes.
Assembly line inspection using neural networks
NASA Astrophysics Data System (ADS)
McAulay, Alastair D.; Danset, Paul; Wicker, Devert W.
1990-09-01
A user-friendly, flexible system for assembly line part inspection which learns good and bad parts is described. The system detects missing rivets and springs in clutch drivers. The system extracts features in a circular region of interest from a video image, processes these using a Fast Fourier Transform for rotation invariance, and uses them as inputs to a neural network trained with back-propagation. The advantage of a learning system is that expensive reprogramming and delays are avoided when a part is modified. Two cases were considered. The first one could use back lighting in that surface effects could be ignored. The second case required front lighting because the part had a cover which prevented light from passing through the parts. 100 percent classification of good and bad parts was achieved for both back-lit and front-lit cases with a limited number of training parts available. 1. BACKGROUND A vision system to inspect clutch drivers for missing rivets and springs at the Harrison Radiator Plant of General Motors (GM) works only on parts without covers (Fig. 1) and is expensive. The system does not work when there are cover plates (Fig. 2) that rule out back light passing through the area of missing rivets and springs. Also, the system, like all such systems, must be reprogrammed at significant time and cost when it needs to classify a different fault or a
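A minimal sketch of the rotation-invariant feature idea described above, assuming the features are taken as the FFT magnitude of an angular intensity profile sampled around the circular region of interest; the sampling scheme, synthetic image, and parameters are illustrative assumptions, not the deployed system.

```python
# Hypothetical sketch: rotation-invariant features from a circular region of
# interest via the FFT magnitude of an angular intensity profile.
import numpy as np

def ring_profile(image, centre, radius, n_samples=64):
    """Sample pixel intensities around a circle of the given radius."""
    cy, cx = centre
    angles = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    ys = np.clip((cy + radius * np.sin(angles)).astype(int), 0, image.shape[0] - 1)
    xs = np.clip((cx + radius * np.cos(angles)).astype(int), 0, image.shape[1] - 1)
    return image[ys, xs].astype(float)

def rotation_invariant_features(image, centre, radius):
    """FFT magnitude of the ring profile: rotating the part circularly shifts
    the profile, which leaves the magnitude spectrum unchanged."""
    profile = ring_profile(image, centre, radius)
    return np.abs(np.fft.rfft(profile))

# Toy demonstration: a synthetic part image with one bright "rivet" on the ring.
img = np.zeros((128, 128))
img[60:68, 90:98] = 1.0
feats = rotation_invariant_features(img, centre=(64, 64), radius=30)
print(feats[:5])
```

These features would then feed a back-propagation-trained network of the kind the abstract describes.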
The ATLAS PanDA Monitoring System and its Evolution
NASA Astrophysics Data System (ADS)
Klimentov, A.; Nevski, P.; Potekhin, M.; Wenaus, T.
2011-12-01
The PanDA (Production and Distributed Analysis) Workload Management System is used for ATLAS distributed production and analysis worldwide. The needs of ATLAS global computing imposed challenging requirements on the design of PanDA in areas such as scalability, robustness, automation, diagnostics, and usability for both production shifters and analysis users. Through a system-wide job database, the PanDA monitor provides a comprehensive and coherent view of the system and job execution, from high level summaries to detailed drill-down job diagnostics. It is (like the rest of PanDA) an Apache-based Python application backed by Oracle. The presentation layer is HTML code generated on the fly in the Python application which is also responsible for managing database queries. However, this approach is lacking in user interface flexibility, simplicity of communication with external systems, and ease of maintenance. A decision was therefore made to migrate the PanDA monitor server to Django Web Application Framework and apply JSON/AJAX technology in the browser front end. This allows us to greatly reduce the amount of application code, separate data preparation from presentation, leverage open source for tools such as authentication and authorization mechanisms, and provide a richer and more dynamic user experience. We describe our approach, design and initial experience with the migration process.
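The Django/JSON pattern described above can be illustrated with a minimal view that returns job-summary data as JSON for an AJAX front end to render. The view name, fields, and hard-coded numbers below are assumptions for illustration, not PanDA's actual code.

```python
# Hypothetical sketch of the described pattern: the Django back end prepares
# job-summary data and returns JSON; presentation is left to the browser
# (AJAX front end). Names and values are illustrative assumptions.
from django.http import JsonResponse

def job_summary(request):
    # In the real system this would query the PanDA job database (Oracle);
    # here we return a hard-coded summary to keep the sketch self-contained.
    summary = {
        "site": request.GET.get("site", "all"),
        "states": {"running": 1200, "finished": 53000, "failed": 800},
    }
    return JsonResponse(summary)

# urls.py (illustrative):
# urlpatterns = [path("jobsummary/", job_summary)]
```

Separating data preparation (the JSON payload) from presentation (browser-side rendering) is exactly the maintainability benefit the migration is credited with.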
Embedded neural recording with TinyOS-based wireless-enabled processor modules.
Farshchi, Shahin; Pesterev, Aleksey; Nuyujukian, Paul; Guenterberg, Eric; Mody, Istvan; Judy, Jack W
2010-04-01
To create a wireless neural recording system that can benefit from the continuous advancements being made in embedded microcontroller and communications technologies, an embedded-system-based architecture for wireless neural recording has been designed, fabricated, and tested. The system consists of commercial-off-the-shelf wireless-enabled processor modules (motes) for communicating the neural signals, and a back-end database server and client application for archiving and browsing the neural signals. A neural-signal-acquisition application has been developed to enable the mote to either acquire neural signals at a rate of 4000 12-bit samples per second, or detect and transmit spike heights and widths sampled at a rate of 16670 12-bit samples per second on a single channel. The motes acquire neural signals via a custom low-noise neural-signal amplifier with adjustable gain and high-pass corner frequency that has been designed and fabricated in a 1.5-μm CMOS process. In addition to browsing acquired neural data, the client application enables the user to remotely toggle modes of operation (real-time or spike-only), as well as amplifier gain and high-pass corner frequency.
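A simplified sketch of the "spike-only" mode described above, assuming a basic amplitude-threshold detector that reports each spike's height and width rather than the raw stream; the threshold and synthetic trace are illustrative assumptions, not the mote firmware.

```python
# Hypothetical sketch of threshold-based spike extraction: report each spike's
# height and width (in samples) instead of streaming the full waveform.
import numpy as np

def extract_spikes(signal, threshold):
    """Return (start_index, height, width) for each above-threshold event."""
    above = signal > threshold
    spikes = []
    i = 0
    while i < len(signal):
        if above[i]:
            j = i
            while j < len(signal) and above[j]:
                j += 1
            segment = signal[i:j]
            spikes.append((i, float(segment.max()), j - i))
            i = j
        else:
            i += 1
    return spikes

# Toy usage: a noisy trace with two injected "spikes".
rng = np.random.default_rng(1)
trace = rng.normal(0, 0.05, 4000)
trace[1000:1004] += [0.4, 0.9, 0.7, 0.3]
trace[2500:2503] += [0.5, 0.8, 0.4]
print(extract_spikes(trace, threshold=0.3))
```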
Cropsey, Karen L.; Lane, Peter S.; Hale, Galen J.; Jackson, Dorothy O.; Clark, C. Brendan; Ingersoll, Karen S.; Islam, M. Aminul; Stitzer, Maxine L.
2011-01-01
Aims: Recent studies have demonstrated the efficacy of both methadone and buprenorphine when used with opioid dependent men transitioning from prison to the community, but no studies have been conducted with women in the criminal justice (CJ) system. The aim of this study was to determine the efficacy of buprenorphine for relapse prevention among opioid dependent women in the CJ system transitioning back to the community. Methods: 36 women under CJ supervision were recruited from an inpatient drug treatment facility that treats CJ individuals returning back to the community. Nine were enrolled in an open label buprenorphine arm then 27 were randomized to buprenorphine (n=15) or placebo (n=12; double-blind). All women completed baseline measures and started study medication prior to release. Participants were followed weekly, provided urine drug screens (UDS), received study medication for 12 weeks, and returned for a 3 month follow-up. Intent-to-treat analyses were performed for all time points through end-of-treatment (EOT). Results: The majority of participants were Caucasian (88.9%), young (M±SD=31.8±8.4 years), divorced/separated (59.2%) women with at least a high school/GED education (M±SD =12±1.7 years). GEE analyses showed that buprenorphine was efficacious in maintaining abstinence across time compared to placebo. At End of Treatment, 92% of placebo and 33% of active medication participants were positive for opiates on urine drug screen (Chi-Square = 10.9, df=1; p<0.001). However, by the three month follow-up point, no differences were found between the two groups, with 83% of participants at follow-up positive for opiates. Conclusions: Women in the CJ system who received buprenorphine prior to release from a treatment facility had fewer opiate positive UDS through the 12-weeks of treatment compared to women receiving placebo. Initiating buprenorphine in a controlled environment prior to release appears to be a viable strategy to reduce opiate use when transitioning back to the community. PMID:21782352
Collections and user tools for utilization of persistent identifiers in cyberinfrastructures
NASA Astrophysics Data System (ADS)
Weigel, T.
2014-12-01
The main use of persistent identifiers (PIDs) for data objects has so far been for formal publication and citation purposes with a focus on long-term availability and trust. This core use case has now evolved and broadened to include basic data management tasks as identifiers are increasingly seen as a possible anchor element in the deluge of data for purposes of large-scale automation of tasks. The European Data Infrastructure (EUDAT) for instance uses PIDs in their back-end services and distinctly so for entities where the identifier may be more persistent than a resource with limited lifetime. Despite breaking with the traditional metaphor, this offers new opportunities for data management and end-user tools, but also requires a clear demonstrated benefit of value-added services because en masse identifier assignment does not come at zero costs. There are several obstacles to overcome when establishing identifiers at large scale. The administration of large numbers of identifiers can be cumbersome if they are treated in an isolated manner. Here, identifier collections can enable automated mass operations on groups of associated objects. Several use cases rely on base information that is rapidly available from the identifier systems without the need to retrieve objects, yet they will not work efficiently if the information is not consistently typed. Tools that span cyberinfrastructures and address scientific end-users unaware of the varying back-ends must overcome such obstacles. The Working Group on PID Information Types of the Research Data Alliance (RDA) has developed an interface specification and prototype to access and manipulate typed base information. Concrete prototypes for identifier collections exist as well. We will present some first data and provenance tracking tools that make extensive use of these recent developments and address different user needs that span from administrative tasks to individual end-user services with particular focus on data available from the Earth System Grid Federation (ESGF). We will compare the tools along their respective use cases with existing approaches and discuss benefits and limitations.
Detering, Brent A.; Kong, Peter C.
2006-08-29
A fast-quench reactor for production of diatomic hydrogen and unsaturated hydrocarbons is provided. During the fast quench in the downstream diverging section of the nozzle, such as in a free expansion chamber, the unsaturated hydrocarbons are further decomposed by reheating the reactor gases. More diatomic hydrogen is produced, along with elemental carbon. Other gas may be added at different stages in the process to form a desired end product and prevent back reactions. The product is a substantially clean-burning hydrogen fuel that leaves no greenhouse gas emissions, and elemental carbon that may be used in powder form as a commodity for several processes.
Kusano, Kristofer D; Gabler, Hampton C
2010-01-01
To mitigate the severity of rear-end and other collisions, Pre-Crash Systems (PCS) are being developed. These active safety systems utilize radar and/or video cameras to determine when a frontal crash, such as a front-to-back rear-end collision, is imminent and can brake autonomously, even with no driver input. Of these PCS features, the effects of autonomous pre-crash braking are estimated. To estimate the maximum potential for injury reduction due to autonomous pre-crash braking in the striking vehicle of rear-end crashes, a methodology is presented for determining 1) the reduction in vehicle crash change in velocity (ΔV) due to PCS braking and 2) the number of injuries that could be prevented due to the reduction in collision severity. Injury reduction was only performed for belted drivers, as unbelted drivers have an unknown risk of being thrown out of position. The study was based on 1,406 rear-end striking vehicles from NASS / CDS years 1993 to 2008. PCS parameters were selected from realistic values and varied to examine the effect on system performance. PCS braking authority was varied from 0.5 G's to 0.8 G's while time to collision (TTC) was held at 0.45 seconds. TTC was then varied from 0.3 seconds to 0.6 seconds while braking authority was held constant at 0.6 G's. A constant braking pulse (step function) and a ramp-up braking pulse were used. The study found that automated PCS braking could reduce the crash ΔV in rear-end striking vehicles by an average of 12% - 50% and avoid 0% - 14% of collisions, depending on PCS parameters. Autonomous PCS braking could potentially reduce the number of injured drivers who are belted by 19% to 57%.
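A back-of-the-envelope sketch of how braking authority and time to collision translate into a ΔV reduction, assuming the speed shaved off before impact carries over directly to crash ΔV. This is a simplification for illustration only, not the paper's NASS/CDS-based reconstruction methodology.

```python
# Simplified kinematic sketch (an assumption, not the paper's method):
# braking at `authority` g for `ttc` seconds before impact lowers the
# striking vehicle's impact speed, and hence the crash delta-V, by about a*t.
G = 9.81  # m/s^2

def reduced_delta_v(delta_v_kmh, authority_g=0.6, ttc_s=0.45):
    """Return the crash delta-V (km/h) after pre-crash braking."""
    delta_v_ms = delta_v_kmh / 3.6
    speed_reduction = authority_g * G * ttc_s       # m/s shaved off before impact
    return max(delta_v_ms - speed_reduction, 0.0) * 3.6

for dv in (20, 40, 60):  # example rear-end crash severities in km/h
    print(dv, "->", round(reduced_delta_v(dv), 1), "km/h")
```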
Wellbore manufacturing processes for in situ heat treatment processes
Davidson, Ian Alexander; Geddes, Cameron James; Rudolf, Randall Lynn; Selby, Bruce Allen; MacDonald, Duncan Charles
2012-12-11
A method includes making coiled tubing at a coiled tubing manufacturing unit coupled to a coiled tubing transportation system. One or more coiled tubing reels are transported from the coiled tubing manufacturing unit to one or more moveable well drilling systems using the coiled tubing transportation system. The coiled tubing transportation system runs from the tubing manufacturing unit to one or more movable well drilling systems, and then back to the coiled tubing manufacturing unit.
Panyard, James; Potter, Timothy; Charron, William; Hopkins, Deborah; Reverdy, Frederic
2010-04-06
A system for ultrasonic profiling of a weld sample includes a carriage movable in opposite first and second directions. An ultrasonic sensor is coupled to the carriage to move over the sample as the carriage moves. An encoder determines the position of the carriage to determine the position of the sensor. A spring is connected at one end of the carriage. Upon the carriage being moved in the first direction toward the spring, such that the carriage and the sensor are at a beginning position and the spring is compressed, the spring decompresses to push the carriage back along the second direction to move the carriage and the sensor from the beginning position to an ending position. The encoder triggers the sensor to take the ultrasonic measurements of the sample when the sensor is at predetermined positions while the sensor moves over the sample between the beginning and ending positions.
Apparatus and method for detecting leaks in piping
Trapp, D.J.
1994-12-27
A method and device are disclosed for detecting the location of leaks along a wall or piping system, preferably in double-walled piping. The apparatus comprises a sniffer probe, a rigid cord such as a length of tube attached to the probe on one end and extending out of the piping at the other end, a source of pressurized air and a source of helium. The method comprises guiding the sniffer probe into the inner pipe to its distal end, purging the inner pipe with pressurized air, filling the annulus defined between the inner and outer pipe with helium, and then detecting the presence of helium within the inner pipe with the probe as it is pulled back through the inner pipe. The length of the tube at the point where a leak is detected determines the location of the leak in the pipe. 2 figures.
Lee, Kyung-Min; Armstrong, Paul R; Thomasson, J Alex; Sui, Ruixiu; Casada, Mark; Herrman, Timothy J
2010-10-27
Tracing grain from the farm to its final processing destination as it moves through multiple grain-handling systems, storage bins, and bulk carriers presents numerous challenges to existing record-keeping systems. This study examines the suitability of coded caplets to trace grain, in particular, to evaluate methodology to test tracers' ability to withstand the rigors of commercial grain handling and storage systems as defined by physical properties using measurement technology commonly applied to assess grain hardness and end-use properties. Three types of tracers to dispense into bulk grains for tracing the grain back to its field of origin were developed using three food-grade substances [processed sugar, pregelatinized starch, and silicified microcrystalline cellulose (SMCC)] as a major component in formulations. Due to a different functionality of formulations, the manufacturing process conditions varied for each tracer type, resulting in unique variations in surface roughness, weight, dimensions, and physical and spectroscopic properties before and after coating. The two types of coating applied [pregelatinized starch and hydroxypropylmethylcellulose (HPMC)], using an aqueous coating system containing appropriate plasticizers, showed uniform coverage and clear coating. Coating appeared to act as a barrier against moisture penetration, to protect against mechanical damage of the surface of the tracers, and to improve the mechanical strength of tracers. The results of analysis of variance (ANOVA) tests showed the type of tracer, coating material, conditioning time, and a theoretical weight gain significantly influenced the morphological and physical properties of tracers. Optimization of these factors needs to be pursued to produce desirable tracers with consistent quality and performance when they flow with bulk grains throughout the grain marketing channels.
Bringing Knowledge Back In: Perspectives from Liberal Education
ERIC Educational Resources Information Center
Deng, Zongyi
2018-01-01
From the vantage point of liberal education, this article attempts to contribute to the conversation initiated by Michael Young and his colleagues on 'bringing knowledge back' into the current global discourse on curriculum policy and practice. The contribution is made through revisiting the knowledge-its-own-end thesis associated with Newman and…
Nematodes ultrastructure: complex systems and processes.
Basyoni, Maha M A; Rizk, Enas M A
2016-12-01
Nematode worms are among the most ubiquitous organisms on earth. They include free-living forms as well as parasites of plants, insects, humans and other animals. Recently, there has been an explosion of interest in nematode biology, including the area of nematode ultrastructure. Nematodes are round with a body cavity. They have one-way guts with a mouth at one end and an anus at the other. They have a pseudocoelom that is lined on one side with mesoderm and on the other side with endoderm. It appears that the cuticle is a very complex and evolutionarily plastic feature with important functions involving protection, body movement and maintaining shape. They have only longitudinal muscles, so they move by thrashing back and forth. While nematodes have digestive, reproductive, nervous and excretory systems, they do not have discrete circulatory or respiratory systems. Nematodes use chemosensory and mechanosensory neurons embedded in the cuticle to orient and respond to a wide range of environmental stimuli. Adults are made up of roughly 1000 somatic cells, and hundreds of those cells are typically associated with the reproductive systems. Studies of nematode ultrastructure enable their use as models for diverse biological processes including human diseases, immunity, host-parasite interactions and the expression of phylogenomics. The latter has, however, not been brought into a single inclusive entity. Consequently, in the current review we tried to provide a comprehensive approach to the current knowledge available on nematode ultrastructure.
Duregger, Katharina; Hayn, Dieter; Nitzlnader, Michael; Kropf, Martin; Falgenhauer, Markus; Ladenstein, Ruth; Schreier, Günter
2016-01-01
Electronic Patient Reported Outcomes (ePRO) gathered using telemonitoring solutions might be a valuable source of information in rare cancer research. The objective of this paper was to develop a concept and implement a prototype for introducing ePRO into the existing neuroblastoma research network by applying Near Field Communication and mobile technology. For physicians, an application was developed for registering patients within the research network and providing patients with an ID card and a PIN for authentication when transmitting telemonitoring data to the Electronic Data Capture system OpenClinica. For patients, a previously developed telemonitoring system was extended by a Simple Object Access Protocol (SOAP) interface for transmitting nine different health parameters and toxicities. The concept was fully implemented on the front-end side. The developed application for physicians was prototypically implemented and the mobile application of the telemonitoring system was successfully connected to OpenClinica. Future work will focus on the implementation of the back-end features.
Normative data on the n-back task for children and young adolescents
Pelegrina, Santiago; Lechuga, M. Teresa; García-Madruga, Juan A.; Elosúa, M. Rosa; Macizo, Pedro; Carreiras, Manuel; Fuentes, Luis J.; Bajo, M. Teresa
2015-01-01
The n-back task is a frequently used measure of working memory (WM) in cognitive neuroscience research contexts, and it has become widely adopted in other areas over the last decade. This study aimed to obtain normative data for the n-back task from a large sample of children and adolescents. To this end, a computerized verbal n-back task with three levels of WM load (1-back, 2-back, and 3-back) was administered to 3722 Spanish school children aged 7–13 years. Results showed an overall age-related increase in performance for the different levels of difficulty. This trend was less pronounced at 1-back than at 2-back when hits were considered. Gender differences were also observed, with girls outperforming boys although taking more time to respond. The theoretical implications of these results are discussed. Normative data stratified by age and gender for the three WM load levels are provided. PMID:26500594
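For readers new to the paradigm, the sketch below scores a single n-back run (targets, hits, false alarms) under the usual definition that a trial is a target when the stimulus matches the one presented n positions earlier; the scoring rules shown are generic assumptions, not the study's exact procedure.

```python
# Minimal sketch of n-back scoring: a trial is a target when the current
# stimulus matches the one n positions back; tally hits and false alarms.
def score_nback(stimuli, responses, n):
    """stimuli: sequence of items; responses: booleans (pressed or not)."""
    hits = false_alarms = targets = 0
    for i in range(n, len(stimuli)):
        is_target = stimuli[i] == stimuli[i - n]
        targets += is_target
        if responses[i] and is_target:
            hits += 1
        elif responses[i] and not is_target:
            false_alarms += 1
    return {"targets": targets, "hits": hits, "false_alarms": false_alarms}

# Example: 2-back over a short letter stream.
stim = list("BKBKGMGT")
resp = [False, False, True, True, False, False, True, False]
print(score_nback(stim, resp, n=2))
```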
High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas
2017-04-01
Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in the recent years combine conventional multi-core CPU with GPU accelerators and provide an opportunity for manifold increase and computing performance. Therefore, these systems represent an efficient platform for implementation of a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide the interested external researchers the regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for collecting, on a periodic basis, raw seismic records from the European seismic networks, (2) high-performance noise source mapping application responsible for generation of source maps using cross-correlation of seismic records, (3) back-end infrastructure for the coordination of various tasks and computations, (4) front-end Web interface providing the service to the end-users and (5) data repository. The noise mapping application is composed of four principal modules: (1) pre-processing of raw data, (2) massive cross-correlation, (3) post-processing of correlation data based on computation of logarithmic energy ratio and (4) generation of source maps from post-processed data. Implementation of the solution posed various challenges, in particular, selection of data sources and transfer protocols, automation and monitoring of daily data downloads, ensuring the required data processing performance, design of a general service oriented architecture for coordination of various sub-systems, and engineering an appropriate data storage solution. The present pilot version of the service implements noise source maps for Switzerland. Extension of the solution to Central Europe is planned for the next project phase.
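The computational core named above is the massive cross-correlation of noise records. The sketch below shows that single operation for one station pair, with filtering, whitening, stacking, and the energy-ratio post-processing omitted, and with synthetic data standing in for real seismograms.

```python
# Minimal sketch: cross-correlate ambient noise records from two stations and
# pick the lag of maximum correlation. Pre-/post-processing is omitted.
import numpy as np

def cross_correlate(trace_a, trace_b, max_lag):
    """Return lags (samples) and the normalized cross-correlation between them."""
    a = (trace_a - trace_a.mean()) / (trace_a.std() + 1e-12)
    b = (trace_b - trace_b.mean()) / (trace_b.std() + 1e-12)
    full = np.correlate(a, b, mode="full") / len(a)
    mid = len(full) // 2
    lags = np.arange(-max_lag, max_lag + 1)
    return lags, full[mid - max_lag: mid + max_lag + 1]

# Toy example: two copies of the same noise offset by 25 samples; peak at lag 25.
rng = np.random.default_rng(0)
noise = rng.normal(size=5000)
lags, cc = cross_correlate(noise[:-25], noise[25:], max_lag=100)
print("best lag:", lags[np.argmax(cc)])
```

In the described service this operation is repeated over all station pairs and time windows, which is why the heterogeneous CPU/GPU system is needed.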
Visualization for Hyper-Heuristics: Back-End Processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simon, Luke
Modern society is faced with increasingly complex problems, many of which can be formulated as generate-and-test optimization problems. Yet, general-purpose optimization algorithms may sometimes require too much computational time. In these instances, hyper-heuristics may be used. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario, finding the solution significantly faster than its predecessor. However, it may be difficult to understand exactly how a design was derived and why it should be trusted. This project aims to address these issues by creating an easy-to-use graphical user interface (GUI) for hyper-heuristics and an easy-to-understand scientific visualization for the produced solutions. To support the development of this GUI, my portion of the research involved developing algorithms that would allow for parsing of the data produced by the hyper-heuristics. This data would then be sent to the front-end, where it would be displayed to the end user.
Coplanar back contacts for thin silicon solar cells
NASA Technical Reports Server (NTRS)
Thornhill, J. W.; Sipperly, W. E.
1980-01-01
A process for fabricating 2 to 3 mil wraparound solar cells was formulated. Sample thin wraparound cells were fabricated using this process. The process used a reinforced perimeter construction to reduce the breakage that occurs during handling of the wafers. A retracting piston post was designed and fabricated to help minimize the breakage that occurs during the screen printing process. Two alternative methods of applying the aluminum back surface field were investigated. In addition to the standard screen-printed back surface field, both spin-on and evaporated aluminum techniques were researched. Neither spin-on nor evaporated aluminum made any noticeable improvement over the screen printing technique. A fine screen mesh was chosen for the application of the aluminum paste back surface field. The optimum time and temperature for firing the aluminum turned out to be thirty seconds at 850 C. The development work on the dielectric included evaluating three dielectrics for the wraparound application: Transene 1000, Thick Film Systems 1126RCB, and an in-house formulation, 61-2-2A. Cells with a pre-dielectric thickness of 3.0 to 3.5 mils, using Transene 1000 as the wraparound dielectric and the procedure outlined above, showed an average efficiency of 10.7 percent. Thinner cells were fabricated, but had an unacceptable yield and efficiency.
2015-08-28
This dramatic view of the Pluto system is as NASA's New Horizons spacecraft saw it in July 2015. The animation, made with real images taken by New Horizons, begins with Pluto flying in for its close-up on July 14; we then pass behind Pluto and see the atmosphere glow in sunlight before the sun passes behind Pluto's largest moon, Charon. The movie ends with New Horizons' departure, looking back on each body as thin crescents. http://photojournal.jpl.nasa.gov/catalog/PIA19873
View of Pilot Gregory Johnson working on the Flight Deck
2009-05-21
S125-E-013042 (21 May 2009) --- Occupying the commander's station, astronaut Gregory C. Johnson, STS-125 pilot, uses the Portable In-Flight Landing Operations Trainer (PILOT) on the flight deck of the Earth-orbiting Space Shuttle Atlantis. PILOT consists of a laptop computer and a joystick system, which helps to maintain a high level of proficiency for the end-of-mission approach and landing tasks required to bring the shuttle safely back to Earth. Astronaut Scott Altman, commander, looks on.
Low power multi-camera system and algorithms for automated threat detection
NASA Astrophysics Data System (ADS)
Huber, David J.; Khosla, Deepak; Chen, Yang; Van Buer, Darrel J.; Martin, Kevin
2013-05-01
A key to any robust automated surveillance system is continuous, wide field-of-view sensor coverage and high-accuracy target detection algorithms. Newer systems typically employ an array of multiple fixed cameras that provide individual data streams, each of which is managed by its own processor. This array can continuously capture the entire field of view, but collecting all of the data and running the back-end detection algorithm consume additional power and increase the size, weight, and power (SWaP) of the package. This is often unacceptable, as many potential surveillance applications have strict system SWaP requirements. This paper describes a wide field-of-view video system that employs multiple fixed cameras and exhibits low SWaP without compromising the target detection rate. We cycle through the sensors, fetch a fixed number of frames, and process them through a modified target detection algorithm. During this time, the other sensors remain powered down, which reduces the required hardware and power consumption of the system. We show that the resulting gaps in coverage and irregular frame rate do not affect the detection accuracy of the underlying algorithms. This reduces the power of an N-camera system by up to approximately N-fold compared to baseline normal operation. This work was applied to Phase 2 of the DARPA Cognitive Technology Threat Warning System (CT2WS) program and used during field testing.
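A minimal sketch of the sensor duty-cycling idea described above (the camera interface, frame count, and detector call are placeholders, not the CT2WS implementation):

```python
class DummyCamera:
    """Stand-in for a real camera driver; the interface below is hypothetical."""
    def power_on(self):  pass
    def power_off(self): pass
    def grab(self):      return b"\x00" * 1024      # one fake frame

def detect_targets(frames):
    """Placeholder for the modified target detection algorithm."""
    return []

def surveillance_cycle(cameras, frames_per_burst=8, n_cycles=2):
    """Visit each fixed camera in turn; all other sensors stay powered down."""
    for _ in range(n_cycles):
        for cam in cameras:
            cam.power_on()
            frames = [cam.grab() for _ in range(frames_per_burst)]
            cam.power_off()                          # ~1/N average duty cycle per camera
            for target in detect_targets(frames):
                print("detection:", target)

surveillance_cycle([DummyCamera() for _ in range(4)])
```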
A miniature bidirectional telemetry system for in vivo gastric slow wave recordings.
Farajidavar, Aydin; O'Grady, Gregory; Rao, Smitha M N; Cheng, Leo K; Abell, Thomas; Chiao, J-C
2012-06-01
Stomach contractions are initiated and coordinated by an underlying electrical activity (slow waves), and electrical dysrhythmias accompany motility diseases. Electrical recordings taken directly from the stomach provide the most valuable data, but face technical constraints. Serosal or mucosal electrodes have cables that traverse the abdominal wall, or a natural orifice, causing discomfort and possible infection, and restricting mobility. These problems motivated the development of a wireless system. The bidirectional telemetric system constitutes a front-end transponder, a back-end receiver and a graphical user interface. The front-end module conditions the analogue signals, then digitizes and loads the data into a radio for transmission. Data receipt at the back-end is acknowledged via a transceiver function. The system was validated in a bench-top study, then validated in vivo using serosal electrodes connected simultaneously to a commercial wired system. The front-end module was 35 × 35 × 27 mm³ and weighed 20 g. Bench-top tests demonstrated reliable communication within a distance range of 30 m, power consumption of 13.5 mW, and 124 h operation when utilizing a 560 mAh, 3 V battery. In vivo, slow wave frequencies were recorded identically with the wireless and wired reference systems (2.4 cycles min⁻¹), automated activation time detection was modestly better for the wireless system (5% versus 14% false-positive rate), and signal amplitudes were modestly higher via the wireless system (462 versus 386 μV; p < 0.001). This telemetric system for slow wave acquisition is reliable, power efficient, readily portable and potentially implantable. The device will enable chronic monitoring and evaluation of slow wave patterns in animals and patients.
UNDERGROUND URANIUM MINING ON COLORADO PLATEAU
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dare, W.L.
1958-10-31
The size and continuity of the Chinle ore bodies in the Big Indian district, Utah, have permitted mine operators to plan a more integrated development and mining system using larger and more specialized equipment. Thick ore and firm backs at the south end of the district have permitted room-and-pillar mining, using large drill jumbos and diesel-powered haulage equipment. The Gismo loader and draw-chute system has proved efficient. Driving the haulageway below the stope level is an advantage when pillars are recovered. To the north, thinner ore with weaker backs favors retreat systems and smaller equipment. Here, the ore bodies are delineated by a grid system of drifts, and the ore is recovered by panel, longwall, or similar mining methods, retreating toward the principal entry. Labor productivity ranges from 8 to 21 tons per man-shift, and direct mining and development costs, excluding initial development, range from 75 to 51 per ton. A unique system of mine development is in the Temple Mountain district, Utah, where the shallow Chinle deposits are mined through 36-inch-diameter calyx drill holes. Using small diesel-powered ore buggies and bucket hoisting, ore is produced from the two largest mines at a rate of 4.1 tons per man-shift, at a direct cost of 15 a ton. Ambrosia Lake deposits range from 5 to 80 feet thick and occur from 350 to 1,000 feet below the surface. These mines are in development stages. Open, retreat, and top-slice stoping is planned. Adequate ventilation is essential in uranium mining, since sufficient air must be coursed through the workings to maintain airborne radioactive concentrations at tolerance levels and dilute exhaust gases where diesel-powered equipment is used. Uranium miners have found that radiometric scanning is a quick and efficient method for checking the grade of the ore produced and in the process of development. (auth)
Back pain is associated with changes in loading pattern throughout forward and backward bending.
Shum, Gary L K; Crosbie, Jack; Lee, Raymond Y W
2010-12-01
Experimental study to determine the kinetics of the lumbar spine (LS) and hips during forward and backward bending. To investigate the effects of back pain, with and without a positive straight leg raise (SLR) sign, on the loading patterns in the LS and hip during forward and backward bending. Forward and backward bending are important components of many functional activities and are part of routine clinical examination. However, there is little information about the loading patterns during forward and backward bending in people with back pain with or without a positive SLR sign. Twenty asymptomatic participants, 20 back pain participants, and 20 participants with back pain and a positive SLR sign performed 3 continuous cycles of forward and backward bending. Electromagnetic sensors were attached to body segments to measure their kinematics while 2 nonconductive force plates gathered ground reaction force data. A biomechanical model was used to determine the loading pattern in the LS and hips. Although the loading on the LS at the end of the range decreased significantly, the loading at the early and middle ranges of forward bending actually increased significantly in people with back pain, especially in those with a positive SLR sign. This suggests that resistance to movement is significantly increased in people with back pain during this movement. This study suggested that it is not sufficient to study the spine at the end of range only, but a complete description of the loading patterns throughout the range is required. Although the maximum range of motion of the spine is reduced in people with back pain, there is a significant increase in the moment acting through the range, particularly in those with a positive SLR sign.
Newman, M; Newman, R; Hughes, T; Vadher, K; Barker, K L
2018-04-01
Timed loaded standing (TLS) is a suggested measure of back muscle endurance for people with vertebral osteoporosis. Surface electromyography revealed back muscles work harder and fatigue during TLS. The test end-point and total time were associated with back fatigue. The findings help demonstrate the concurrent validity of the TLS test. The TLS test is suggested as a measure of back muscle endurance for patients with vertebral osteoporosis. However, to date, no study has demonstrated that TLS does measure back extensor or erector spinae (ES) muscle endurance. We used surface electromyography (sEMG) to investigate the performance of the thoracic ES muscles during TLS. Thirty-six people with vertebral osteoporosis with a mean age of 71.6 (range 45-86) years participated. sEMG recordings were made of the ES at T3 and T12 bilaterally during quiet standing (QS) and TLS. The relative (%) change in sEMG amplitude between conditions was compared. Fatigue was evaluated by analysing the change in median frequency (MF) of the sEMG signal during TLS, and the correlation between maximal TLS time and rate of MF decline was examined. Activity in the ES increased significantly during TLS at all electrode locations. During TLS, the MF declined at a mean rate of -24.2% per minute (95% C.I. -26.5 to -21.9%). The MF slope and test time were strongly correlated (r² = 0.71), and at test end, the final MF dropped to an average 89% (95% C.I. 85 to 93%) of initial MF. Twenty-eight participants (78%) reported fatigue was the main reason for stopping, and for eight (22%), it was pain. This study demonstrates that TLS challenges the ES muscles in the thoracic region and results in ES fatigue. Endurance time and the point at which the TLS test ends are strongly related to ES fatigue.
NASA Technical Reports Server (NTRS)
Kelly, W. L.; Howle, W. M.; Meredith, B. D.
1980-01-01
The Information Adaptive System (IAS) is an element of the NASA End-to-End Data System (NEEDS) Phase II and is focused toward onboard image processing. Since the IAS is a data preprocessing system which is closely coupled to the sensor system, it serves as a first step in providing a 'Smart' imaging sensor. Some of the functions planned for the IAS include sensor response nonuniformity correction, geometric correction, data set selection, data formatting, packetization, and adaptive system control. The inclusion of these sensor data preprocessing functions onboard the spacecraft will significantly improve the extraction of information from the sensor data in a timely and cost effective manner and provide the opportunity to design sensor systems which can be reconfigured in near real time for optimum performance. The purpose of this paper is to present the preliminary design of the IAS and the plans for its development.
Grapple fixture for use with electromagnetic attachment mechanism
NASA Technical Reports Server (NTRS)
Monford, Jr., Leo G. (Inventor)
1995-01-01
An electromagnetic attachment mechanism for use as an end effector of a remote manipulator system. A pair of electromagnets 15A,15B, each with a U-shaped magnetic core with a pull-in coil 34 and two holding coils 35,36 are mounted by a spring suspension system 38,47 on a base plate 25 of the mechanism housing 30 with end pole pieces 21,22 adapted to move through openings in the base plate when the attractive force of the electromagnets is exerted on a strike plate 65 of a grapple fixture 20 affixed to a target object 14. The pole pieces are spaced by an air gap from the strike plate when the mechanism first contacts the grapple fixture. An individual control circuit and power source is provided for the pull-in coil and one holding coil of each electromagnet. A back-up control circuit connected to the two power sources and a third power source is provided for the remaining holding coils. When energized, the pull-in coils overcome the suspension system and air gap and are automatically de-energized when the pole pieces move to grapple and impose a preload force across the grapple interface. A battery back-up 89A,89B is a redundant power source for each electromagnet in each individual control circuit and is automatically connected upon failure of the primary power source. A centerline mounted camera 31 and video monitor 70 are used in cooperation with a target pattern 19 on the reflective surface 67 of the strike plate to effect targeting and alignment.
Zhu, Xinjie; Zhang, Qiang; Ho, Eric Dun; Yu, Ken Hung-On; Liu, Chris; Huang, Tim H; Cheng, Alfred Sze-Lok; Kao, Ben; Lo, Eric; Yip, Kevin Y
2017-09-22
A genomic signal track is a set of genomic intervals associated with values of various types, such as measurements from high-throughput experiments. Analysis of signal tracks requires complex computational methods, which often make the analysts focus too much on the detailed computational steps rather than on their biological questions. Here we propose Signal Track Query Language (STQL) for simple analysis of signal tracks. It is a Structured Query Language (SQL)-like declarative language, which means one only specifies what computations need to be done but not how these computations are to be carried out. STQL provides a rich set of constructs for manipulating genomic intervals and their values. To run STQL queries, we have developed the Signal Track Analytical Research Tool (START, http://yiplab.cse.cuhk.edu.hk/start/ ), a system that includes a Web-based user interface and a back-end execution system. The user interface helps users select data from our database of around 10,000 commonly-used public signal tracks, manage their own tracks, and construct, store and share STQL queries. The back-end system automatically translates STQL queries into optimized low-level programs and runs them on a computer cluster in parallel. We use STQL to perform 14 representative analytical tasks. By repeating these analyses using bedtools, Galaxy and custom Python scripts, we show that the STQL solution is usually the simplest, and the parallel execution achieves significant speed-up with large data files. Finally, we describe how a biologist with minimal formal training in computer programming self-learned STQL to analyze DNA methylation data we produced from 60 pairs of hepatocellular carcinoma (HCC) samples. Overall, STQL and START provide a generic way for analyzing a large number of genomic signal tracks in parallel easily.
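STQL's own syntax is not reproduced in the abstract above; as a purely illustrative sketch of the kind of interval computation such a query expresses (the track representation as (chrom, start, end, value) tuples with 0-based, half-open coordinates is an assumption made for this example), an overlap-and-aggregate step might look like:

```python
from collections import defaultdict

def mean_signal_over_peaks(peaks, signal):
    """For each peak interval, average the values of overlapping signal intervals.

    Both tracks are lists of (chrom, start, end, value); 'value' is ignored for peaks.
    """
    by_chrom = defaultdict(list)
    for chrom, start, end, value in signal:
        by_chrom[chrom].append((start, end, value))

    results = []
    for chrom, p_start, p_end, _ in peaks:
        hits = [v for s, e, v in by_chrom[chrom] if s < p_end and e > p_start]
        results.append((chrom, p_start, p_end, sum(hits) / len(hits) if hits else None))
    return results

peaks  = [("chr1", 100, 200, None), ("chr1", 500, 650, None)]
signal = [("chr1", 90, 150, 2.0), ("chr1", 140, 210, 4.0), ("chr2", 0, 100, 9.0)]
print(mean_signal_over_peaks(peaks, signal))
# [('chr1', 100, 200, 3.0), ('chr1', 500, 650, None)]
```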
Fold-Back: Using Emerging Technologies to Move from Quality Assurance to Quality Enhancement
ERIC Educational Resources Information Center
Leonard, Simon N.; Fitzgerald, Robert N.; Bacon, Matt
2016-01-01
Emerging technologies offer an opportunity for the development, at the institutional level, of quality processes with greater capacity to enhance learning in higher education than available through current quality processes. These systems offer the potential to extend use of learning analytics in institutional-level quality processes in addition…
White Light Optical Information Processing.
1985-05-31
together) incident on the nematic film, after passage through the optical system, was about 0.2 watts. A second beam splitter BSI was placed between... film, a process that is like holography, indeed is often termed image-plane holography, but in fact goes back to Ives. In particular, the use of... slit images became straight, whereupon the system was assumed to be properly adjusted. For the real time, or phase conjugation, process, a thin film
Lavender, Steven A; Lorenz, Eric P; Andersson, Gunnar B J
2007-02-15
A prospective randomized control trial. To determine the degree to which a new behavior-based lift training program (LiftTrainer; Ascension Technology, Burlington, VT) could reduce the incidence of low back disorder in distribution center jobs that require repetitive lifting. Most studies show programs aimed at training lifting techniques to be ineffective in preventing low back disorders, which may be due to their conceptual rather than behavioral learning approach. A total of 2144 employees in 19 distribution centers were randomized into either the LiftTrainer program or a video control group. In the LiftTrainer program, participants were individually trained in up to 5, 30-minute sessions while instrumented with motion capture sensors to quantify the L5/S1 moments. Twelve months following the initial training, injury data were obtained from company records. Survival analyses (Kaplan-Meier) indicated that there was no difference in injury rates between the 2 training groups. Likewise, there was no difference in the turnover rates. However, those with a low (<30 Nm) average twisting moment at the end of the first session experienced a significantly (P < 0.005) lower rate of low back disorder than controls. While overall the LiftTrainer program was not effective, those with twisting moments below 30 Nm reported fewer injuries, suggesting a shift in focus for "safe" lifting programs.
Development of advanced silicon solar cells for Space Station Freedom
NASA Technical Reports Server (NTRS)
Lillington, David R.
1990-01-01
This report describes the development of large area high efficiency wrapthrough solar cells for Space Station Freedom. The goal of this contract was the development and fabrication of 8 x 8 cm coplanar back contact solar cells with a minimum output of 1.039 watts/cell. The first task in this program was a modeling study to determine the optimum configuration of the cell and to study the effects of surface passivation, substrate resistivity, and back surface field on the BOL and EOL performance. In addition, the optical stack, including the cell cover, AR coatings, and Kapton blanket, was modeled to optimize 'on orbit' operation. The second phase was a manufacturing development phase to develop high volume manufacturing processes for the reliable production of low recombination velocity boron back surface fields, techniques to produce smooth, low leakage wrapthrough holes, passivation, photoresist application methods, and metallization schemes. The final portion of this program was a pilot production phase. Seven hundred solar cells were delivered in this phase. At the end of the program, cells with average efficiencies over 13 percent were being produced with power output in excess of 1.139 watts/cell, thus substantially exceeding the program goal.
Accessing Cloud Properties and Satellite Imagery: A tool for visualization and data mining
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Palikonda, R.
2016-12-01
Providing public access to imagery of cloud macro and microphysical properties and the underlying satellite imagery is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a tool and system that allows end users to easily browse cloud information and satellite imagery that is otherwise difficult to acquire and manipulate. The tool has two uses, one to visualize the data and the other to access the data directly. It uses a widely used access protocol, the Open Geospatial Consortium's Web Map and Processing Services, to encourage users to access the data we produce. Internally, we leverage our practical experience with large, scalable application practices to develop a system that has the greatest potential for scalability as well as the ability to be deployed on the cloud. One goal of the tool is to provide a demonstration of the back-end capability to end users so that they can use the dynamically generated imagery and data as an input to their own workflows or to set up data mining constraints. We build upon NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product information and satellite imagery accessible and easily searchable. Increasingly, information is used in a "mash-up" form where multiple sources of information are combined to add value to disparate but related information. In support of NASA strategic goals, our group aims to make as much cutting-edge scientific knowledge, observations, and products as possible available to the citizen science, research, and interested communities for these kinds of "mash-ups", as well as provide a means for automated systems to data mine our information. This tool and access method provides a valuable research tool to a wide audience, both as a standalone research tool and also as an easily accessed data source that can easily be mined or used with existing tools.
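As an illustration of the Web Map Service access pattern the abstract mentions, a client-side GetMap request might be assembled as below; the host, layer name, bounding box, and time stamp are hypothetical, and only the WMS 1.3.0 parameter names are standard:

```python
from urllib.parse import urlencode

# Hypothetical endpoint and layer; the WMS 1.3.0 keys below are standard.
base_url = "https://example.nasa.gov/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "cloud_effective_temperature",   # hypothetical layer name
    "CRS": "EPSG:4326",
    "BBOX": "25,-130,50,-60",                  # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "WIDTH": "1024",
    "HEIGHT": "512",
    "FORMAT": "image/png",
    "TIME": "2016-08-01T12:00:00Z",            # optional dimension for time-varying products
}
print(base_url + "?" + urlencode(params))
```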
Fatigue and workload in short and long-haul train driving.
Kazemi, Zeinab; Mazloumi, Adel; Nasl Saraji, Gabraeil; Barideh, Sedighe
2016-06-08
Little has been investigated regarding the role of shift schedule on fatigue and workload among Iranian train drivers. This study sought to compare train drivers' fatigue and workload between long-haul and short-haul train trips. One hundred train drivers, on two routes, Tehran-Mashhad (long-haul) and Tehran-Semnan (short-haul), were asked to complete the Samn-Perelli Fatigue Scale prior to departure, immediately after ending driving duty on the outbound trip, and immediately after ending driving on the return trip. Moreover, they were asked to complete the NASA-TLX at the end of their shift. Accordingly, the train drivers under study reported relatively similar levels of fatigue and workload for the two trips. Furthermore, fatigue scores were significantly higher at the end of shifts on both routes. Overall, train drivers on long-haul trips had longer rest hours between the outbound and return trips, which seems to compensate for the side effects of longer driving durations.
Osterling, Kathy Lemon; D'Andrade, Amy; Austin, Michael J
2008-01-01
Racial/ethnic disproportionality in the child welfare system is a complicated social problem that is receiving increasing amounts of attention from researchers and practitioners. This review of the literature examines disproportionality in the front-end of the child welfare system and interventions that may address it. While none of the interventions had evidence suggesting that they reduced disproportionality in child welfare front-end processes, some of the interventions may improve child welfare case processes related to disproportionality and outcomes for families of color.
Lean management systems: creating a culture of continuous quality improvement.
Clark, David M; Silvester, Kate; Knowles, Simon
2013-08-01
This is the first in a series of articles describing the application of Lean management systems to Laboratory Medicine. Lean is the term used to describe a principle-based continuous quality improvement (CQI) management system based on the Toyota production system (TPS) that has been evolving for over 70 years. Its origins go back much further and are heavily influenced by the work of W Edwards Deming and the scientific method that forms the basis of most quality management systems. Lean has two fundamental elements--a systematic approach to process improvement by removing waste in order to maximise value for the end-user of the service and a commitment to respect, challenge and develop the people who work within the service to create a culture of continuous improvement. Lean principles have been applied to a growing number of Healthcare systems throughout the world to improve the quality and cost-effectiveness of services for patients and a number of laboratories from all the pathology disciplines have used Lean to shorten turnaround times, improve quality (reduce errors) and improve productivity. Increasingly, models used to plan and implement large scale change in healthcare systems, including the National Health Service (NHS) change model, have evidence-based improvement methodologies (such as Lean CQI) as a core component. Consequently, a working knowledge of improvement methodology will be a core skill for Pathologists involved in leadership and management.
Hathaway, Thomas J.
1979-01-01
This invention provides a housing containing a rotatable coal bucket that is sealed at its ends in the housing with a reciprocal plunger that is sealed in the bucket at one end and has an opposite cone-shaped end that wedges up against a closed end of the bucket, and a method for feeding dry, variable size coal from an ambient atmosphere at low pressure into a high temperature, high pressure reactor between the seals for producing fuel gas substantially without losing any high pressure gas from the reactor or excessively wearing the seals. To this end, the piston biases the plunger back and forth for loading and unloading the bucket with coal along an axis that is separated from the seals, the bucket is rotated to unload the coal into the reactor so as to fill the bucket with trapped high pressure gas from the reactor while preventing the gas from escaping therefrom, and then the cone-shaped plunger end is wedged into mating engagement with the closed end of the bucket to displace this high pressure bucket gas by expelling it back into the reactor whereby the bucket can be re-rotated for filling it with coal again substantially without losing any of the high pressure gas or excessively wearing the seals.
Web-based reactive transport modeling using PFLOTRAN
NASA Astrophysics Data System (ADS)
Zhou, H.; Karra, S.; Lichtner, P. C.; Versteeg, R.; Zhang, Y.
2017-12-01
Actionable understanding of system behavior in the subsurface is required for a wide spectrum of societal and engineering needs by commercial firms, government entities, and academia. These needs include, for example, water resource management, precision agriculture, contaminant remediation, unconventional energy production, CO2 sequestration monitoring, and climate studies. Such understanding requires the ability to numerically model various coupled processes that occur across different temporal and spatial scales as well as multiple physical domains (reservoirs - overburden, surface-subsurface, groundwater-surface water, saturated-unsaturated zone). Currently, this ability is typically met through an in-house approach where computational resources, model expertise, and data for model parameterization are brought together to meet modeling needs. However, such an approach has multiple drawbacks which limit the application of high-end reactive transport codes such as the Department of Energy-funded PFLOTRAN code. In addition, while many end users have a need for the capabilities provided by high-end reactive transport codes, they do not have the expertise - nor the time required to obtain the expertise - to effectively use these codes. We have developed and are actively enhancing a cloud-based software platform through which diverse users are able to easily configure, execute, visualize, share, and interpret PFLOTRAN models. This platform consists of a web application and available on-demand HPC computational infrastructure. The web application consists of (1) a browser-based graphical user interface which allows users to configure models and visualize results interactively, and (2) a central server with back-end relational databases which hold configuration data, modeling results, and Python scripts for model configuration; (3) an HPC environment provides on-demand model execution. We will discuss lessons learned in the development of this platform, the rationale for different interfaces, implementation choices, as well as the planned path forward.
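The platform's actual API is not described in the abstract; as a hedged illustration of the Application-as-a-Service pattern it follows, a client-side job submission to a hypothetical REST endpoint might look like this (the URL, JSON fields, and response keys are all assumptions for the example):

```python
import requests  # third-party HTTP client

API = "https://pflotran-portal.example.org/api/v1"   # hypothetical endpoint

run_request = {
    "model_name": "tracer_demo",
    "pflotran_input": "SIMULATION\n  SIMULATION_TYPE SUBSURFACE\nEND\n",  # input deck text
    "cores": 32,
}

# Submit the model configuration for on-demand execution on the HPC back-end.
resp = requests.post(f"{API}/runs", json=run_request, timeout=30)
resp.raise_for_status()
run_id = resp.json()["run_id"]

# Poll for completion, then fetch results for browser-side visualization.
status = requests.get(f"{API}/runs/{run_id}", timeout=30).json()["status"]
print(run_id, status)
```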
Online data handling and storage at the CMS experiment
NASA Astrophysics Data System (ADS)
Andre, J.-M.; Andronidis, A.; Behrens, U.; Branson, J.; Chaze, O.; Cittolin, S.; Darlea, G.-L.; Deldicque, C.; Demiragli, Z.; Dobson, M.; Dupont, A.; Erhan, S.; Gigi, D.; Glege, F.; Gómez-Ceballos, G.; Hegeman, J.; Holzner, A.; Jimenez-Estupiñán, R.; Masetti, L.; Meijers, F.; Meschi, E.; Mommsen, RK; Morovic, S.; Nuñez-Barranco-Fernández, C.; O'Dell, V.; Orsini, L.; Paus, C.; Petrucci, A.; Pieri, M.; Racz, A.; Roberts, P.; Sakulin, H.; Schwick, C.; Stieger, B.; Sumorok, K.; Veverka, J.; Zaza, S.; Zejdl, P.
2015-12-01
During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ∼62 sources produced with an aggregate rate of ∼2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.
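A schematic sketch of the file-based bookkeeping idea described above (the file layout, JSON keys, and paths are invented for illustration; the real merger service is considerably more involved):

```python
import glob, json, os

def merge_lumisection(input_dir, output_path):
    """Concatenate the data files from all HLT sources for one luminosity section
    and write a small JSON summary document alongside the merged file."""
    total_events = 0
    with open(output_path, "wb") as merged:
        for meta_path in sorted(glob.glob(os.path.join(input_dir, "*.jsn"))):
            with open(meta_path) as f:
                meta = json.load(f)          # e.g. {"data_file": "...", "events": 1234}
            total_events += meta["events"]
            with open(os.path.join(input_dir, meta["data_file"]), "rb") as part:
                merged.write(part.read())
    with open(output_path + ".jsn", "w") as f:
        json.dump({"merged_file": os.path.basename(output_path),
                   "events": total_events}, f)
```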
Online Data Handling and Storage at the CMS Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andre, J. M.; et al.
2015-12-23
During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT from ~62 sources produced with an aggregate rate of ~2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the various technological and implementation choices of the three components of the STS: the distributed file system, the merger service and the transfer system.
The Slow Controls System of the New Muon g-2 Experiment at Fermilab
NASA Astrophysics Data System (ADS)
Eads, Michael; New Muon g-2 Collaboration
2015-04-01
The goal of the new muon g-2 experiment (E-989), currently under construction at Fermi National Accelerator Laboratory, is to measure the anomalous gyromagnetic ratio of the muon with unprecedented precision. The uncertainty goal of the experiment, 0.14 ppm, represents a four-fold improvement over the current best measurement of this value and has the potential to increase the current three standard deviation disagreement with the predicted standard model value to five standard deviations. Measuring the operating conditions of the experiment will be essential to achieving these uncertainty goals. This talk will describe the design and the current status of E-989's slow controls system. This system, based on the MIDAS Slow Control Bus, will be used to measure and record currents, voltages, temperatures, humidities, pressures, flows, and other data that are collected asynchronously with the injection of the muon beam. The system consists of a variety of sensors and front-end electronics which interface to back-end data acquisition, data storage, and data monitoring systems. Parts of the system are already operational and the full system will be completed before beam commissioning begins in 2017.
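A minimal sketch of the asynchronous read-and-record pattern such a slow controls system follows (the sensor channels, cadence, and log format are placeholders, not the MIDAS Slow Control Bus implementation):

```python
import json, time, random

def read_sensors():
    """Placeholder for front-end electronics readout (hypothetical channels)."""
    return {"magnet_temp_C": 23.4 + random.random(),
            "vacuum_mbar": 1e-7 * (1 + 0.1 * random.random()),
            "psu_voltage_V": 5.02}

def slow_control_loop(logfile, period_s=10.0, n_cycles=3):
    """Record slow-control readings on a fixed cadence, independent of beam injection."""
    with open(logfile, "a") as log:
        for _ in range(n_cycles):
            record = {"t": time.time(), **read_sensors()}
            log.write(json.dumps(record) + "\n")
            time.sleep(period_s)

slow_control_loop("slow_controls.log", period_s=0.1)
```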
NASA Astrophysics Data System (ADS)
Manan, Hidayah; Moh, Julia Hwei Zhong; Kasan, Nor Azman; Suratman, Suhaimi; Ikhwanuddin, Mhd
2017-09-01
A study on the microscopic composition of biofloc in a closed hatchery culture system was carried out to determine the interaction between the aggregated flocs in the bioremediation process for the decomposition and degradation of organic matter loaded in the shrimp culture tanks. The study was done over a 105-day culture period with zero water exchange. All of the organic matter loaded in the culture tanks was identified as coming from shrimp feces, uneaten feed, and decomposed macro- and microorganisms that died in the culture tanks. All of the microscopic organisms in the biofloc were identified using advanced Nikon 80i microscopes. In the present study, an abundance and high variety of phytoplankton, zooplankton, protozoa, nematodes, and algae species were identified aggregating together in the floc accumulation. These microscopic organisms took part in symbiotic processes for food supply, acted as algae grazers, acted as natural water stabilizers in regulating the nutrients in the culture tank, and served as decomposers of dead organic matter in the water environment. Heterotrophic bacteria identified from the Pseudomonas and Aeromonas families consumed the organic matter deposited at the bottom of the culture tank and converted it, through chemical processes, into useful protein food to be consumed again by the shrimp. Overall, it can be concluded that the identified biofloc organisms contributed as natural bioremediation agents in the zero-water-exchange culture system, keeping the water quality in optimal condition until the end of the culture period.
Operational Reserve: National Guard Readiness when Current Conflicts End
2010-03-01
toothpaste back in the tube." With probable post-war reduction in DOD funding, it is not realistic to assume that the National Guard will obtain... necessitates that we don't try to put the toothpaste back in the tube. We cannot undo the policies and procedures that have gotten us to the current state
NASA Technical Reports Server (NTRS)
Leete, II, John H. (Inventor); Skinner, William J. (Inventor)
1989-01-01
Structural members for a frame or truss are formed of two substantially identical elongated tubular like members each having a planar extending wall with the tubular members joined together along the planar wall, back to back, and end portions of the joined planar walls form extending lugs. A complementary joint fitting includes a clevis having a slot for receiving the lug.
How smart is your BEOL? productivity improvement through intelligent automation
NASA Astrophysics Data System (ADS)
Schulz, Kristian; Egodage, Kokila; Tabbone, Gilles; Garetto, Anthony
2017-07-01
The back end of line (BEOL) workflow in the mask shop still has crucial issues throughout all standard steps: inspection, disposition, photomask repair, and verification of repair success. All involved tools are typically run by highly trained operators or engineers who set up jobs and recipes, execute tasks, analyze data and make decisions based on the results. No matter how experienced operators are and how good the systems perform, there is one aspect that always limits the productivity and effectiveness of the operation: the human aspect. Human errors can range from seemingly rather harmless slip-ups to mistakes with serious and direct economic impact, including mask rejects, customer returns and line stops in the wafer fab. Even with the introduction of quality control mechanisms that help to reduce these critical but unavoidable faults, they can never be completely eliminated. Therefore the mask shop BEOL cannot run in the most efficient manner, as unnecessary time and money are spent on processes that still remain labor intensive. The best way to address this issue is to automate critical segments of the workflow that are prone to human errors. In fact, manufacturing errors can occur at each BEOL step where operators intervene. These processes comprise image evaluation, setting up tool recipes, data handling and all other tedious but required steps. With the help of smart solutions, operators can work more efficiently and dedicate their time to less mundane tasks. Smart solutions connect tools, taking over the data handling and analysis typically performed by operators and engineers. These solutions not only eliminate the human error factor in the manufacturing process but can provide benefits in terms of shorter cycle times, reduced bottlenecks and prediction of an optimized workflow. In addition, such software solutions consist of building blocks that seamlessly integrate applications and allow the customers to use tailored solutions. To accommodate the variability and complexity in mask shops today, individual workflows can be supported according to the needs of any particular manufacturing line with respect to necessary measurement and production steps. At the same time, the efficiency of assets is increased by avoiding unneeded cycle time and waste of resources on process steps that are not crucial for a given technology. In this paper we present details of which areas of the BEOL can benefit most from intelligent automation, what solutions exist, and the quantification of benefits to a mask shop with full automation, by the use of a back end of line model.
Compact, Miniature MMIC Receiver Modules for an MMIC Array Spectrograph
NASA Technical Reports Server (NTRS)
Kangaslahti, Pekka P.; Gaier, Todd C.; Cooperrider, Joelle T.; Samoska, Lorene A.; Soria, Mary M.; ODwyer, Ian J.; Weinreb, Sander; Custodero, Brian; Owen, Heather; Grainge, Keith;
2009-01-01
A single-pixel prototype of a W-band detector module with a digital back-end was developed to serve as a building block for large focal-plane arrays of monolithic millimeter-wave integrated circuit (MMIC) detectors. The module uses low-noise amplifiers, diode-based mixers, and a WR10 waveguide input with a coaxial local oscillator. State-of-the-art InP HEMT (high electron mobility transistor) MMIC amplifiers at the front end provide approximately 40 dB of gain. The measured noise temperature of the module, at an ambient temperature of 300 K, was found to be as low as 450 K at 95 GHz. The modules will be used to develop multiple instruments for astrophysics radio telescopes, both on the ground and in space. The prototype is being used by Stanford University to characterize noise performance at cryogenic temperatures. The goal is to achieve a 30-50 K noise temperature around 90 GHz when cooled to a 20 K ambient temperature. Further developments include characterization of the IF in-phase (I) and quadrature (Q) signals as a function of frequency to check amplitude and phase; replacing the InP low-noise amplifiers with state-of-the-art 35-nm-gate-length NGC low-noise amplifiers; interfacing the front-end module with a digital back-end spectrometer; and developing a scheme for local oscillator and IF distribution in a future array. While this MMIC is being developed for use in radio astronomy, it has the potential for use in other industries. Applications include automotive radar (both transmitters and receivers), communication links, radar systems for collision avoidance, production monitors, ground-penetrating sensors, and wireless personal networks.
Leveraging Open Standards and Technologies to Search and Display Planetary Image Data
NASA Astrophysics Data System (ADS)
Rose, M.; Schauer, C.; Quinol, M.; Trimble, J.
2011-12-01
Mars and the Moon have both been visited by multiple NASA spacecraft. A large number of images and other data have been gathered by the spacecraft and are publicly available in NASA's Planetary Data System. Through a collaboration with Google, Inc., the User Centered Technologies group at NASA Ames Research Center has developed a tool for searching and browsing among images from multiple Mars and Moon missions. Development of this tool was facilitated by the use of several open technologies and standards. First, an open-source full-text search engine is used to search both place names on the target and to find images matching a geographic region. Second, the published API of the Google Earth browser plugin is used to geolocate the images on a virtual globe and allow the user to navigate on the globe to see related images. The structure of the application also employs standard protocols and services. The back-end is exposed as RESTful APIs, which could be reused by other client systems in the future. Further, the communication between the front- and back-end portions of the system utilizes open data standards including XML and KML (Keyhole Markup Language) for representation of textual and geographic data. The creation of the search index was facilitated by reuse of existing, publicly available metadata, including the Gazetteer of Planetary Nomenclature from the USGS, available in KML format. And the image metadata was reused from standards-compliant archives in the Planetary Data System. The system also supports collaboration with other tools by allowing export of search results in KML, and the ability to display those results in the Google Earth desktop application. We will demonstrate the search and visualization capabilities of the system, with emphasis on how the system facilitates reuse of data and services through the adoption of open standards.
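As an illustration of the KML export path mentioned above (the placemark name, coordinates, and URL are hypothetical; only the KML element names and namespace are standard), search results can be serialized roughly like this:

```python
from xml.sax.saxutils import escape

def results_to_kml(results):
    """Serialize (name, lon_deg, lat_deg, url) search hits as a minimal KML document."""
    placemarks = []
    for name, lon, lat, url in results:
        placemarks.append(
            "  <Placemark>\n"
            f"    <name>{escape(name)}</name>\n"
            f"    <description>{escape(url)}</description>\n"
            f"    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
            "  </Placemark>"
        )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
            + "\n".join(placemarks) + "\n</Document>\n</kml>\n")

print(results_to_kml([("Example crater image", 137.8, -5.4, "https://example.org/pds/image_123")]))
```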
Experiments in fault tolerant software reliability
NASA Technical Reports Server (NTRS)
Mcallister, David F.; Tai, K. C.; Vouk, Mladen A.
1987-01-01
The reliability of voting was evaluated in a fault-tolerant software system for small output spaces. The effectiveness of the back-to-back testing process was investigated. Version 3.0 of the RSDIMU-ATS, a semi-automated test bed for certification testing of RSDIMU software, was prepared and distributed. Software reliability estimation methods based on non-random sampling are being studied. The investigation of existing fault-tolerance models was continued and formulation of new models was initiated.
A New mHealth Communication Framework for Use in Wearable WBANs and Mobile Technologies
Hamida, Sana Tmar-Ben; Hamida, Elyes Ben; Ahmed, Beena
2015-01-01
Driven by the development of biomedical sensors and the availability of high mobile bandwidth, mobile health (mHealth) systems are now offering a wider range of new services. This revolution makes the idea of in-home health monitoring practical and provides the opportunity for assessment in “real-world” environments producing more ecologically valid data. In the field of insomnia diagnosis, for example, it is now possible to offer patients wearable sleep monitoring systems which can be used in the comfort of their homes over long periods of time. The recorded data collected from body sensors can be sent to a remote clinical back-end system for analysis and assessment. Most of the research on sleep reported in the literature mainly looks into how to automate the analysis of the sleep data and does not address the problem of the efficient encoding and secure transmissions of the collected health data. This article reviews the key enabling communication technologies and research challenges for the design of efficient mHealth systems. An end-to-end mHealth system architecture enabling the remote assessment and monitoring of patient's sleep disorders is then proposed and described as a case study. Finally, various mHealth data serialization formats and machine-to-machine (M2M) communication protocols are evaluated and compared under realistic operating conditions. PMID:25654718
A new mHealth communication framework for use in wearable WBANs and mobile technologies.
Hamida, Sana Tmar-Ben; Hamida, Elyes Ben; Ahmed, Beena
2015-02-03
Driven by the development of biomedical sensors and the availability of high mobile bandwidth, mobile health (mHealth) systems are now offering a wider range of new services. This revolution makes the idea of in-home health monitoring practical and provides the opportunity for assessment in "real-world" environments producing more ecologically valid data. In the field of insomnia diagnosis, for example, it is now possible to offer patients wearable sleep monitoring systems which can be used in the comfort of their homes over long periods of time. The recorded data collected from body sensors can be sent to a remote clinical back-end system for analysis and assessment. Most of the research on sleep reported in the literature mainly looks into how to automate the analysis of the sleep data and does not address the problem of the efficient encoding and secure transmissions of the collected health data. This article reviews the key enabling communication technologies and research challenges for the design of efficient mHealth systems. An end-to-end mHealth system architecture enabling the remote assessment and monitoring of patient's sleep disorders is then proposed and described as a case study. Finally, various mHealth data serialization formats and machine-to-machine (M2M) communication protocols are evaluated and compared under realistic operating conditions.
Dhaneshwar, Amrut D; Chaurasiya, Ram Saran; Hebbar, H Umesh
2014-01-01
In the current study, reverse micellar extraction (RME) for the purification of stem bromelain was successfully achieved using the sodium bis(2-ethylhexyl) sulfosuccinate (AOT)/isooctane system. A maximum forward extraction efficiency of 58.0% was obtained at 100 mM AOT concentration, aqueous phase pH of 8.0 and 0.2 M NaCl. Back extraction studies on altering stripping phase pH and KCl concentration, addition of counter-ion and iso-propyl alcohol (IPA) and mechanical agitation with glass beads indicated that IPA addition and agitation with glass beads have significant effects on extraction efficiency. The protein extraction was higher (51.9%) in case of the IPA (10% v/v) added system during back extraction as compared to a cetyltrimethylammonium bromide (100 mM) added system (9.42%). The central composite design technique was used to optimize the back extraction conditions further. Concentration of IPA, amount of glass beads, mixing time, and agitation speed (in rpm) were the variables selected. IPA concentration of 8.5% (v/v), glass bead concentration of 0.6 (w/v), and mixing time of 45 min at 400 rpm resulted in higher back extraction efficiency of 45.6% and activity recovery of 88.8% with purification of 3.04-fold. The study indicated that mechanical agitation using glass beads could be used for destabilizing the reverse micelles and release of bromelain back into the fresh aqueous phase. © 2014 American Institute of Chemical Engineers.
Emerging Technologies: Small Satellite and Associated TPED
NASA Astrophysics Data System (ADS)
Zitz, R.
2014-09-01
The 2010 National Space Policy directs the U.S. space community, comprising the Department of Defense, the Intelligence Community, the Military Services, and NASA, to examine our nation's ability to conduct space-based ISR and communications even during a period of peer-state and near-peer-state attacks intended to deny us the advantages we accrue from our use of space systems. DOD's and the IC's past experience is largely one of building small numbers of extraordinarily capable and expensive (exquisite) satellites for communications and ISR. As potential adversaries continue to develop cyber-attack capabilities and have demonstrated an ability to kinetically attack spacecraft, the vulnerability of our architecture is now a serious concern. In addition, the sluggish U.S. economy, the drawdown and pullback from a decade of combat operations, and other factors have combined to force a significant reduction in DOD and IC spending over the coming decade(s). Simultaneously, DOD and the IC have a growing awareness that the long lead times and long mission duration of the exquisite space assets can lead to fielding technologies that become obsolete and mission limiting. Some DOD and IC leaders are now examining alternative architectures to provide lower cost, flexible, more diverse and rapidly launchable space systems. Government leaders are considering commercially hosted payloads in geosynchronous orbits and smaller, lower cost, free-flying government and commercial satellites in low Earth orbits. Additional changes to the ground tasking, processing, exploitation and dissemination (TPED) systems would ensure small satellites have end-to-end mission capability and meet emerging needs such as ease of tasking, multi-INT processing, and more advanced distribution mechanisms (e.g., to users on the move). Today, a majority of agency leaders and their subordinate program managers remain convinced that only large, expensive systems can truly answer requirements and provide reliable services. Champions for change to smaller, lower cost systems must demonstrate that technology and commercial business practices have evolved to the point that smaller, lower-cost systems with adequate performance are now achievable. This presentation concisely explains both sides of the debate and offers ideas for how to introduce smaller satellites and associated TPED solutions without incurring significant risk to existing missions.
Liu, Guorui; Yang, Lili; Zhan, Jiayu; Zheng, Minghui; Li, Li; Jin, Rong; Zhao, Yuyang; Wang, Mei
2016-12-01
Cement kilns can be used to co-process fly ash from municipal solid waste incinerators. However, this might increase emission of organic pollutants like polychlorinated biphenyls (PCBs). Knowledge of PCB concentrations and homolog and congener patterns at different stages in this process could be used to assess the possibility of simultaneously controlling emissions of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) and "dioxin-like" compounds. To date, emissions from cement kilns co-processing fly ash from municipal solid waste incinerators have not been analyzed for PCBs. In this study, stack gas and particulate samples from two cement kilns co-processing waste incinerator fly ash were analyzed for PCBs. The average total tri- to deca-chlorinated biphenyl (∑3-10PCB) concentration in the stack gas samples was 10.15 ng m⁻³. The ∑3-10PCB concentration ranges in particulate samples from different stages were 0.83-41.79 ng g⁻¹ for cement kiln 1 and 0.13-1.69 ng g⁻¹ for cement kiln 2. The ∑3-10PCB concentrations were much higher in particulate samples from the suspension pre-heater boiler, humidifier tower, and kiln back-end bag filters than in particulate samples from other stages. For these three stages, PCBs contributed to 15-18% of the total PCB, PCDD/F, and polychlorinated naphthalene toxic equivalents in stack gases and particulate matter. The PCB distributions were similar to those found in other studies for PCDD/Fs and polychlorinated naphthalenes, which suggests that it may be possible to simultaneously control emissions of multiple organic pollutants from cement kilns. Homolog patterns in the particulate samples were dominated by the pentachlorobiphenyls. CB-105, CB-118, and CB-123 were the dominant dioxin-like PCB congeners that formed at the back-end of the cement kiln. A mass balance of PCBs in the cement kilns indicated that the total mass of PCBs in the stack gases and clinker was about half the mass of PCBs in the raw materials. Copyright © 2016 Elsevier Ltd. All rights reserved.
Miller, Jordan; Barber, David; Donnelly, Catherine; French, Simon; Green, Michael; Hill, Jonathan; MacDermid, Joy; Marsh, Jacquelyn; Norman, Kathleen; Richardson, Julie; Taljaard, Monica; Wideman, Timothy; Cooper, Lynn; McPhee, Colleen
2017-11-09
Back pain is a leading contributor to disability, healthcare costs, and lost work. Family physicians are the most common first point of contact in the healthcare system for people with back pain, but physiotherapists (PTs) may be able to support the primary care team through evidence-based primary care. A cluster randomized trial is needed to determine the clinical, health system, and societal impact of a primary care model that integrates physiotherapists at the first visit for people with back pain. Prior to conducting a future fully powered cluster randomized trial, we need to demonstrate feasibility of the methods. Therefore, the purpose of this pilot study will be to: 1) Determine feasibility of patient recruitment, assessment procedures, and retention. 2) Determine the feasibility of training and implementation of a new PT-led primary care model for low back pain (LBP) 3) Explore the perspectives of patients and healthcare providers (HCPs) related to their experiences and attitudes towards the new service delivery model, barriers/facilitators to implementation, perceived satisfaction, perceived value, and impact on clinic processes and patient outcomes. This pilot cluster randomized controlled trial will enroll four sites and randomize them to implement a new PT-led primary care model for back pain or a usual physician-led primary care model. All adults booking a primary care visit for back pain will be invited to participate. Feasibility outcomes will include: recruitment and retention rates, completeness of assessment data, PT training participation and confidence after training, and PT treatment fidelity. Secondary outcomes will include the clinical, health system, cost, and process outcomes planned for the future fully powered cluster trial. Results will be analyzed and reported descriptively and qualitatively. To explore perspectives of both HCPs and patients, we will conduct semi-structured qualitative interviews with patients and focus groups with HCPs from participants in the PT-led primary care sites. If this pilot demonstrates feasibility, a fully powered trial will provide evidence that has the potential to transform primary care for back pain. The full trial will inform future service design, whether these models should be more widely implemented, and training agendas. ClinicalTrials.gov, NCT03320148 . Submitted for registration on 17 September 2017.
Analysis of DIRAC's behavior using model checking with process algebra
NASA Astrophysics Data System (ADS)
Remenska, Daniela; Templon, Jeff; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Graciani Diaz, Ricardo; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof
2012-12-01
DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple; the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike conventional testing, it allows full control over the parallel processes' execution, and supports exhaustive state-space exploration. We used the mCRL2 language and toolset to model the behavior of two related DIRAC subsystems: the workload and storage management systems. Based on process algebra, mCRL2 allows defining custom data types as well as functions over these. This makes it suitable for modeling the data manipulations made by DIRAC's agents. By visualizing the state space and replaying scenarios with the toolkit's simulator, we have detected race conditions and deadlocks in these systems, which, in several cases, were confirmed to occur in reality. Several properties of interest were formulated and verified with the tool. Our future direction is automating the translation from DIRAC to a formal model.
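A toy sketch of the class of race the abstract refers to (the job-state table and agent names are hypothetical, not DIRAC code): two agents poll the same shared state and, without an atomic check-and-set, both may act on the same job.

```python
import threading, time

jobs = {"job-42": "Waiting"}          # shared "database" state (illustrative)
lock = threading.Lock()               # what an atomic check-and-set would use

def claim(name):
    if jobs["job-42"] == "Waiting":   # check ...
        time.sleep(0.01)              # another agent can interleave here
        jobs["job-42"] = "Running"    # ... then set: not atomic without the lock
        print(name, "claimed job-42")

def polling_agent(name, use_lock):
    """Poll the shared state and claim waiting jobs, as polling agents do."""
    if use_lock:
        with lock:
            claim(name)
    else:
        claim(name)

threads = [threading.Thread(target=polling_agent, args=(f"agent-{i}", False)) for i in range(2)]
for t in threads: t.start()
for t in threads: t.join()
# Without the lock, both agents may claim the job. Model checking explores exactly
# such interleavings exhaustively instead of hoping a test happens to hit them.
```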
NASA Astrophysics Data System (ADS)
Nath, Nayani Kishore
2017-08-01
The throat back-up liner is used to protect the nozzle structural members from the severe thermal environment in solid rocket nozzles. The throat back-up liner is made with E-glass phenolic prepregs by a tape winding process. The objective of this work is to demonstrate the optimization of the process parameters of the tape winding process to achieve better insulative resistance using Taguchi's robust design methodology. In this method, four control factors (machine speed, roller pressure, tape tension, and tape temperature) were investigated for the tape winding process. The presented work studies the cogency and acceptability of Taguchi's methodology in the manufacturing of throat back-up liners. The quality characteristic identified was the back wall temperature. Experiments were carried out using an L9 (3^4) orthogonal array with three levels of each of the four control factors. The test results were analyzed using the smaller-the-better criterion for the signal-to-noise ratio in order to optimize the process. The experimental results were analyzed, confirmed, and successfully used to achieve the minimum back wall temperature of the throat back-up liners. The enhancement in performance of the throat back-up liners was observed by carrying out oxy-acetylene tests. The influence of back wall temperature on the performance of the throat back-up liners was verified by a ground firing test.
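As a rough illustration of the smaller-the-better analysis mentioned above, the Python sketch below computes the Taguchi signal-to-noise ratio SN = -10 log10((1/n) Σ y_i²) for repeated responses of an experimental run. The back-wall temperature readings are hypothetical placeholders, not data from the paper.

```python
# Minimal sketch of the smaller-the-better Taguchi S/N ratio.
import math

def sn_smaller_is_better(measurements):
    """Return the Taguchi S/N ratio (dB) for a smaller-the-better response."""
    mean_square = sum(y * y for y in measurements) / len(measurements)
    return -10.0 * math.log10(mean_square)

# Two rows of a hypothetical L9 experiment: repeated back-wall temperature
# readings (deg C) for two combinations of the four control factors.
run_1 = [212.0, 205.0, 218.0]
run_2 = [189.0, 194.0, 186.0]

for name, run in [("run 1", run_1), ("run 2", run_2)]:
    print(name, "S/N =", round(sn_smaller_is_better(run), 2), "dB")
# The factor levels giving the highest (least negative) S/N are preferred.
```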
NASA Astrophysics Data System (ADS)
Kulse, P.; Sasai, K.; Schulz, K.; Wietstruck, M.
2017-06-01
In recent decades, semiconductor technology has been driven by Moore's law, leading to high-performance CMOS technologies with feature sizes of less than 10 nm [1]. It has been pointed out that not only scaling but also the integration of novel components and technology modules into CMOS/BiCMOS technologies is becoming more attractive for realizing smart and miniaturized systems [2]. Driven by new applications in the areas of communication, health, and automation, new components and technology modules such as BiCMOS-embedded RF-MEMS, high-Q passives, Si-based microfluidics, and InP-SiGe BiCMOS heterointegration have been demonstrated [3-6]. In contrast to standard VLSI processes fabricated on the front side of the silicon wafer, these new technology modules require additional backside processing of the wafer; thus an accurate alignment between the front and back sides of the wafer is mandatory. In previous work, an advanced back-to-front-side alignment technique and its implementation into IHP's 0.25/0.13 μm high-performance SiGe:C BiCMOS backside process module have been presented [7]. The developed technique enables high-resolution, accurate lithography on the backside of BiCMOS wafers for additional backside processing. In addition to the aforementioned backside process technologies, new applications like Through-Silicon Vias (TSV) for interposers and advanced substrate technologies for 3D heterogeneous integration demand not only single-wafer fabrication but also processing of wafer stacks provided by temporary and permanent wafer bonding [8]. Therefore, the available overlay measurement techniques are not suitable if overlay and alignment marks are realized at the bonding interface of a wafer stack that consists of both a silicon device and a silicon carrier wafer. The previously used EVG 40NT automated overlay measurement system, which uses two oppositely positioned microscopes to inspect the wafer back and front sides simultaneously, is not capable of measuring embedded overlay marks. In this work, the non-contact infrared alignment system of the Nikon i-line Stepper NSR-SF150 is used for both the alignment and the overlay determination of bonded wafer stacks with embedded alignment marks, achieving accurate alignment between the different wafer sides. The embedded field image alignment (FIA) marks of the interface and of the device wafer top layer are measured in a single measurement job. By taking the offsets between all FIA marks into account, after correcting the wafer-rotation-induced FIA position errors, an overlay for the stacked wafers can be determined. The developed approach has been validated using a standard back-to-front-side application. The overlay was measured and determined using both the EVG NT40 automated measurement system with special overlay marks and the measurement of the FIA marks of the front and back side layers. A comparison of both results shows mismatches in x and y translations smaller than 200 nm, which is relatively small compared to the overlay tolerances of +/-500 nm for the back-to-front-side process. After the successful validation of the developed technique, special wafer stacks with FIA alignment marks in the bonding interface were fabricated. Due to the high IR transparency of both double-side-polished wafers, the embedded FIA marks generate a stable and clear signal for accurate x and y wafer coordinate positioning. The FIA marks of the device wafer top layer were measured under standard conditions in a developed photoresist mask without IR illumination. The subsequent overlay calculation shows an overlay of less than 200 nm, which enables very accurate process conditions for highly scaled TSV integration and advanced substrate integration into IHP's 0.25/0.13 μm SiGe:C BiCMOS technology. The presented method can be applied both to standard back-to-front-side process technologies and to new temporary and permanent wafer bonding applications.
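A simplified sketch of how an x/y overlay can be extracted from mark positions after removing the wafer-rotation contribution is given below: a small-angle rigid fit (rotation plus translation) between nominal and measured mark coordinates, with the residual translation reported as the overlay. The fitting model and the mark coordinates are hypothetical illustrations, not the stepper's actual algorithm or the paper's data.

```python
# Simplified overlay extraction: rigid fit, then report the residual translation.
import numpy as np

def overlay_from_marks(nominal, measured):
    """Least-squares fit of measured = R(theta) @ nominal + t (2-D rigid fit)."""
    nominal = np.asarray(nominal, dtype=float)
    measured = np.asarray(measured, dtype=float)
    n0 = nominal - nominal.mean(axis=0)
    m0 = measured - measured.mean(axis=0)
    # Rotation estimate from the cross and dot products of the centred marks.
    theta = np.arctan2((n0[:, 0] * m0[:, 1] - n0[:, 1] * m0[:, 0]).sum(),
                       (n0 * m0).sum())
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    # Translation left over once the rotation about the centroid is removed.
    t = measured.mean(axis=0) - rot @ nominal.mean(axis=0)
    return theta, t

# Hypothetical mark positions (micrometres): device-wafer top layer vs. marks
# embedded at the bonding interface.
nominal_marks  = [(-40000.0, -40000.0), (40000.0, -40000.0),
                  (40000.0,  40000.0), (-40000.0,  40000.0)]
measured_marks = [(-40000.12, -39999.95), (39999.90, -40000.08),
                  (40000.05,  39999.93), (-39999.97,  40000.02)]

theta, (dx, dy) = overlay_from_marks(nominal_marks, measured_marks)
print(f"rotation = {theta * 1e6:.2f} urad, overlay = ({dx * 1000:.0f}, {dy * 1000:.0f}) nm")
```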
Q-switched all-solid-state lasers and application in processing of thin-film solar cell
NASA Astrophysics Data System (ADS)
Liu, Liangqing; Wang, Feng
2009-08-01
Societal pressure for renewable clean energy is increasing; it is expected to be part of an overall strategy to address global warming and the oil crisis. Photovoltaic energy conversion devices are on a rapidly accelerating, government-driven growth path, and their costs and prices continue to fall. Next-generation thin-film devices are expected to be more efficient and to greatly reduce silicon consumption, resulting in dramatically lower per-unit fabrication costs. A key aspect of these devices is patterning large panels to create a monolithic array of series-interconnected cells, forming a low-current, high-voltage module. This patterning is accomplished in three critical scribing processes called P1, P2, and P3. All-solid-state Q-switched lasers are the technology of choice for these processes, due to their advantages of compact configuration, high peak power, high repetition rate, and excellent beam quality and stability, delivering the desired combination of high throughput and narrow, clean scribes. End-pumped all-solid-state lasers can provide 1064 nm IR sources with pulse widths shorter than 20 ns using an acousto-optic Q-switch. The repetition rate is up to 100 kHz and the beam quality is close to the diffraction limit. Based on this, 532 nm green, 355 nm UV, and 266 nm DUV lasers can be generated through nonlinear frequency conversion. Lasers of different wavelengths are chosen to process specific materials. For example, 8-15 W IR lasers are used to scribe the TCO film (P1), while 1-5 W green lasers are suitable for scribing the active semiconductor layers (P2) and the back contact layers (P3). Our company, Wuhan Lingyun Photo-electronic System Co. Ltd, has developed 20 W IR and 5 W green end-pumped Q-switched all-solid-state lasers for the thin-film solar industry. Operating at high repetition rates, the processing speed is up to 2.0 m/s.
NASA Astrophysics Data System (ADS)
Onizawa, Naoya; Tamakoshi, Akira; Hanyu, Takahiro
2017-08-01
In this paper, reinitialization-free nonvolatile computer systems are designed and evaluated for energy-harvesting Internet of Things (IoT) applications. In energy-harvesting applications, power supplies generated from renewable power sources cause frequent power failures, so the data being processed need to be backed up when power failures occur. Unless data are safely backed up before the power supply diminishes, reinitialization is required when power is recovered, which results in low energy efficiency and slow operation. Using nonvolatile devices in processors and memories can realize a faster backup than a conventional volatile computer system, leading to higher energy efficiency. To evaluate the energy efficiency under frequent power failures, typical computer systems including processors and memories are designed using 90 nm CMOS or CMOS/magnetic tunnel junction (MTJ) technologies. Nonvolatile ARM Cortex-M0 processors with 4 kB MRAMs are evaluated using a typical computing benchmark program, Dhrystone, which shows energy reductions of a few orders of magnitude in comparison with a volatile processor with SRAM.
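A back-of-envelope Python sketch of the trade-off described above is given below: with frequent power failures, paying a small nonvolatile-backup cost beats rerunning a full reinitialization after every outage. All numbers are hypothetical placeholders rather than measurements from the paper, and the model ignores recomputation of lost work, which would further favour the nonvolatile system.

```python
# Hypothetical energy comparison: nonvolatile backup vs. reinitialization.
def total_energy(n_failures, work, per_failure_cost):
    """Energy to finish one task interrupted n_failures times."""
    return work + n_failures * per_failure_cost

WORK = 100.0    # useful computation (arbitrary energy units)
BACKUP = 2.0    # hypothetical MTJ-based state backup per power failure
REINIT = 30.0   # hypothetical reinitialization after each power failure

for failures in (0, 5, 20):
    nonvolatile = total_energy(failures, WORK, BACKUP)
    volatile = total_energy(failures, WORK, REINIT)
    print(f"{failures:2d} failures: nonvolatile {nonvolatile:6.1f}, volatile {volatile:6.1f}")
```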
USDA-ARS's Scientific Manuscript database
A large amount of water is used for processing our food supplies, especially in meat processing plants. The resulting wastewater cannot be discharged freely back into natural settings due to regulatory mandates, whether the sinks would be rivers, ponds, or other natural systems. These wa...
Implementing the Gaia Astrometric Global Iterative Solution (AGIS) in Java
NASA Astrophysics Data System (ADS)
O'Mullane, William; Lammers, Uwe; Lindegren, Lennart; Hernandez, Jose; Hobbs, David
2011-10-01
This paper describes the Java software framework that has been constructed to run the Astrometric Global Iterative Solution (AGIS) for the Gaia mission. This is the mathematical framework that provides the rigid reference frame for Gaia observations from the Gaia data itself, making Gaia a self-calibrated, input-catalogue-independent mission. The framework is highly distributed, typically running on a cluster of machines with a database back end. All code is written in the Java language. We describe the overall architecture and some details of the implementation.
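As a deliberately simplified illustration of what a "global iterative solution" does, the sketch below alternates between solving for source positions and for per-frame calibration offsets using nothing but the observations themselves. It is a 1-D toy written in Python for brevity (the real framework is Java) and is far simpler than the actual AGIS formulation; the model and data are hypothetical.

```python
# Toy self-calibrating block iteration: alternate source and calibration solves.
import numpy as np

rng = np.random.default_rng(0)
n_sources, n_frames = 50, 20
true_pos = rng.uniform(0.0, 1.0, n_sources)        # "astrometric" parameters
true_cal = rng.normal(0.0, 0.01, n_frames)         # per-frame offsets
# Every frame observes every source: obs = source position + frame offset + noise.
obs = true_pos[None, :] + true_cal[:, None] + rng.normal(0.0, 1e-4, (n_frames, n_sources))

pos = np.zeros(n_sources)
cal = np.zeros(n_frames)
for _ in range(20):                                 # block (Gauss-Seidel) iteration
    pos = (obs - cal[:, None]).mean(axis=0)         # source block, calibration fixed
    cal = (obs - pos[None, :]).mean(axis=1)         # calibration block, sources fixed
    cal -= cal.mean()                               # pin the otherwise free zero point

shift = (pos - true_pos).mean()                     # remaining global degeneracy
print("rms source-position error:", np.sqrt(((pos - true_pos - shift) ** 2).mean()))
```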
Rainey, R.H.; Moore, J.G.
1962-08-14
A liquid-liquid extraction process was developed for recovering thorium and uranium values from a neutron-irradiated thorium composition. They are separated in a solvent extraction system comprising a first end extraction stage for introducing an aqueous feed containing thorium and uranium into the system, a plurality of intermediate extraction stages, and a second end extraction stage for introducing an aqueous-immiscible selective organic solvent for thorium and uranium in countercurrent contact therein with the aqueous feed. A nitrate ion-deficient aqueous feed solution containing thorium and uranium was introduced into the first end extraction stage in countercurrent contact with the organic solvent entering the system from the second end extraction stage, while introducing an aqueous solution of salting nitric acid into any one of the intermediate extraction stages of the system. The resultant thorium- and uranium-laden organic solvent was removed at a point preceding the first end extraction stage of the system. (AEC)
Integrated mixed signal control IC for 500-kHz switching frequency buck regulator
NASA Astrophysics Data System (ADS)
Chen, Keng; Zhang, Hong
2015-12-01
The main purpose of this work is to study the challenges of designing a digital buck regulator using a pipelined analog-to-digital converter (ADC). Although a pipelined ADC can achieve a high sampling speed, it introduces additional phase lag into the buck circuit. Along with the latency from the processing time of the additional digital circuits, as well as the time delay associated with the switching frequency, the closed loop would be unstable; moreover, raw ADC outputs have a low signal-to-noise ratio, which usually requires back-end calibration. To compensate for this phase lag and make the control loop unconditionally stable, as well as to boost the signal-to-noise ratio of the ADC block with a cost-efficient design, a finite impulse response filter followed by digital proportional-integral-derivative blocks was designed. All of these digital function blocks were optimised for processing speed. In the system simulation, the controller achieved output regulation within 10% of the nominal 5 V output voltage under a 1 A/µs load transient condition; moreover, with the soft-start method, there is no turn-on overshoot. The die size of this controller is kept within 3 mm² by using 180 nm CMOS technology.
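The following Python sketch shows the signal path the abstract describes: ADC samples of the output voltage pass through a short FIR filter and then a discrete PID block that updates the buck converter's duty cycle. The filter taps, PID gains, and the first-order plant stand-in are hypothetical illustrations, not the paper's design.

```python
# Minimal sketch of an FIR-filtered digital PID loop for a buck regulator.
from collections import deque

class FirFilter:
    """Short FIR filter applied to the ADC samples to improve their SNR."""
    def __init__(self, taps):
        self.taps = taps
        self.history = deque([0.0] * len(taps), maxlen=len(taps))
    def step(self, sample):
        self.history.appendleft(sample)
        return sum(t * x for t, x in zip(self.taps, self.history))

class DiscretePid:
    """Textbook discrete PID block producing a duty-cycle command."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_error = 0.0, 0.0
    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

V_REF, DT = 5.0, 2e-6                        # 5 V target, 500 kHz control rate
fir = FirFilter([0.25, 0.25, 0.25, 0.25])    # 4-tap moving average (hypothetical)
pid = DiscretePid(kp=0.05, ki=400.0, kd=1e-6, dt=DT)  # hypothetical gains

v_out = 0.0
duty = 0.0
for _ in range(2000):                        # toy first-order plant stand-in
    filtered = fir.step(v_out)               # ideal ADC sample of the output
    duty = min(max(pid.step(V_REF - filtered), 0.0), 1.0)
    v_out += (duty * 12.0 - v_out) * 0.01    # hypothetical 12 V input stage
print(f"output after 2000 cycles: {v_out:.2f} V (target {V_REF} V), duty {duty:.2f}")
```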
End-use quality of soft kernel durum wheat
USDA-ARS's Scientific Manuscript database
Kernel texture is a major determinant of end-use quality of wheat. Durum wheat has very hard kernels. We developed soft kernel durum wheat via Ph1b-mediated homoeologous recombination. The Hardness locus was transferred from Chinese Spring to Svevo durum wheat via back-crossing. ‘Soft Svevo’ had SKC...
DAS: A Data Management System for Instrument Tests and Operations
NASA Astrophysics Data System (ADS)
Frailis, M.; Sartor, S.; Zacchei, A.; Lodi, M.; Cirami, R.; Pasian, F.; Trifoglio, M.; Bulgarelli, A.; Gianotti, F.; Franceschi, E.; Nicastro, L.; Conforti, V.; Zoli, A.; Smart, R.; Morbidelli, R.; Dadina, M.
2014-05-01
The Data Access System (DAS) is a data management software system providing a reusable solution for the storage of data acquired both from telescopes and from auxiliary data sources during the instrument development phases and operations. It is part of the Customizable Instrument WorkStation system (CIWS-FW), a framework for the storage, processing, and quick-look analysis of data acquired from scientific instruments. The DAS provides a data access layer mainly targeted at software applications: quick-look displays, pre-processing pipelines, and scientific workflows. It is logically organized in three main components: an intuitive and compact Data Definition Language (DAS DDL) in XML format, aimed at user-defined data types; an Application Programming Interface (DAS API), automatically adding classes and methods supporting the DDL data types, and providing an object-oriented query language; and a data management component, which maps the metadata of the DDL data types into a relational Data Base Management System (DBMS) and stores the data in a shared (network) file system. With the DAS DDL, developers define the data model for a particular project, specifying for each data type the metadata attributes, the data format and layout (if applicable), and named references to related or aggregated data types. Together with the DDL user-defined data types, the DAS API acts as the only interface to store, query, and retrieve the metadata and data in the DAS system, providing both an abstract interface and a data-model-specific one in C, C++, and Python. The mapping of metadata in the back-end database is automatic and supports several relational DBMSs, including MySQL, Oracle, and PostgreSQL.
In-Process Atomic-Force Microscopy (AFM) Based Inspection
Mekid, Samir
2017-01-01
A new in-process atomic-force microscopy (AFM) based inspection is presented for nanolithography to compensate for any deviation, such as instantaneous degradation of the lithography probe tip. The traditional method uses the AFM probe for lithography work and then retracts it to inspect the obtained feature, but this practice degrades the probe tip shape and hence affects the measurement quality. This paper suggests a second dedicated lithography probe positioned back-to-back with the AFM probe under two synchronized controllers to correct any deviation in the process compared to specifications. This method leads to improved nanomachining quality, in-process monitoring of probe tip wear, and a better understanding of nanomachining. The system is hosted in a recently developed nanomanipulator for educational and research purposes. PMID:28561747
CubeSat Form Factor Thermal Control Louvers
NASA Technical Reports Server (NTRS)
Evans, Allison L. (Inventor)
2018-01-01
Thermal control louvers for CubeSats or small spacecraft may include a plurality of springs attached to a back panel of the thermal control louvers. The thermal control louvers may also include a front panel, which includes at least two end panels interlocked with one or more middle panels. The front panel may secure the springs, shafts, and flaps to the back panel.
NASA Technical Reports Server (NTRS)
Sinderson, Elias; Magapu, Vish; Mak, Ronald
2004-01-01
We describe the design and deployment of the middleware for the Collaborative Information Portal (CIP), a mission-critical J2EE application developed for NASA's 2003 Mars Exploration Rover mission. CIP enabled mission personnel to access data and images sent back from Mars, staff and event schedules, broadcast messages, and clocks displaying various Earth and Mars time zones. We developed the CIP middleware in less than two years' time using cutting-edge technologies, including EJBs, servlets, JDBC, JNDI and JMS. The middleware was designed as a collection of independent, hot-deployable web services, providing secure access to back-end file systems and databases. Throughout the middleware we enabled crosscutting capabilities such as runtime service configuration, security, logging, and remote monitoring. This paper presents our approach to mitigating the challenges we faced, concluding with a review of the lessons we learned from this project and noting what we would do differently and why.
NASA Technical Reports Server (NTRS)
Brown, J. W.; Cleven, G. C.; Klose, J. C.; Lame, D. B.; Yamarone, C. A.
1979-01-01
The Seasat low-rate data system, an end-to-end data-processing and data-distribution system for the four low-rate sensors (radar altimeter, Seasat-A scatterometer system, scanning multichannel microwave radiometer, and visible and infrared radiometer) carried aboard the satellite, is discussed. The function of the distributed, nonreal-time, magnetic-tape system is to apply necessary calibrations, corrections, and conversions to yield geophysically meaningful products from raw telemetry data. The algorithms developed for processing data from the different sensors are described, together with the data catalogs compiled.
Rosnah, I; Noor Hassim, I; Shafizah, A S
2013-10-01
The Three-Factor Eating Questionnaire was first constructed to measure eating behavior in an English-speaking population in the United States. It has been validated and translated for various populations in different languages. The aim of this article is to describe a systematic process for translating the questionnaire from English to the Malay language. The report of the International Society for Pharmacoeconomics and Outcomes Research (ISPOR) Task Force was used as the basis for the systematic translation process. The process began with preparation; followed by forward translation (2 independent translators), reconciliation, back translation (2 independent translators), back translation review, harmonization, cognitive debriefing, review of cognitive debriefing results and finalization, and proofreading; and ended with the final report. Four independent Malay translators who were fluent in English and reside in Malaysia were involved in the process. A team of health care researchers assisted with the review of the newly translated questionnaires. The majority of the TFEQ-R21 items were conceptually and semantically equivalent between the original English and the back-translated English. However, certain phrases, such as "feels like a bottomless pit", were difficult for the forward translators to translate. Cognitive debriefing was a very helpful process to ensure the TFEQ-R21 Malay version was appropriate in terms of wording and culturally accepted. A total of four comments were noted regarding response scale wording, word confusion, and wording arrangement. The systematic translation process is a way to reduce linguistic discrepancies between the English and Malay languages in order to promote an equivalent and culturally adapted TFEQ-R21 questionnaire.