Sample records for data reduction

  1. Polarimetry Data Reduction at the Joint Astronomy Centre

    NASA Astrophysics Data System (ADS)

    Cavanagh, B.; Jenness, T.; Currie, M. J.

    2005-12-01

ORAC-DR is an automated data-reduction pipeline that has been used for on-line data reduction for infrared imaging, spectroscopy, and integral-field-unit data at UKIRT; sub-millimetre imaging at JCMT; and infrared imaging at AAT. It allows for real-time automated infrared and submillimetre imaging polarimetry and spectropolarimetry data reduction. This paper describes the polarimetry data-reduction pipelines used at the Joint Astronomy Centre, highlighting their flexibility and extensibility.

  2. Development of an expert data reduction assistant

    NASA Technical Reports Server (NTRS)

    Miller, Glenn E.; Johnston, Mark D.; Hanisch, Robert J.

    1992-01-01

We propose the development of an expert system tool for the management and reduction of complex data sets. The proposed work is an extension of a successful prototype system for the calibration of CCD images developed by Dr. Johnston in 1987. The reduction of complex multi-parameter data sets presents severe challenges to a scientist. Not only must a particular data analysis system (e.g., IRAF, SDAS, or MIDAS) be mastered, but large amounts of data can require many days of tedious work and supervision by the scientist for even the most straightforward reductions. The proposed Expert Data Reduction Assistant will help the scientist overcome these obstacles by developing a reduction plan based on the data at hand and producing a script for the reduction of the data in a target common language.

  3. The SCUBA Data Reduction Pipeline: ORAC-DR at the JCMT

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Economou, Frossie

    The ORAC data reduction pipeline, developed for UKIRT, has been designed to be a completely general approach to writing data reduction pipelines. This generality has enabled the JCMT to adapt the system for use with SCUBA with minimal development time using the existing SCUBA data reduction algorithms (Surf).

  4. Automating OSIRIS Data Reduction for the Keck Observatory Archive

    NASA Astrophysics Data System (ADS)

    Holt, J.; Tran, H. D.; Goodrich, R.; Berriman, G. B.; Gelino, C. R.; KOA Team

    2014-05-01

By the end of 2013, the Keck Observatory Archive (KOA) will serve data from all active instruments on the Keck Telescopes. OSIRIS (OH-Suppressing Infra-Red Imaging Spectrograph), the last active instrument to be archived in KOA, has been in use behind the adaptive optics (AO) system at Keck since February 2005. It uses an array of tiny lenslets to simultaneously produce spectra at up to 4096 locations. Due to the complicated nature of the OSIRIS raw data, the OSIRIS team developed a comprehensive data reduction program. This data reduction system has an online mode for quick real-time reductions, which are used primarily for basic data visualization and quality assessment done at the telescope while observing. The offline version of the data reduction system includes an expanded reduction method list, does more iterations for a better construction of the data cubes, and is used to produce publication-quality products. It can also use reconstruction matrices that are developed after the observations were taken and are more refined. The KOA team is currently utilizing the standard offline reduction mode to produce quick-look browse products for the raw data. Users of the offline data reduction system generally use a graphical user interface to manually set up the reduction parameters. However, in order to reduce and serve the 200,000 science files on disk, all of the reduction parameters and steps need to be fully automated. This pipeline will also be used to automatically produce quick-look browse products for future OSIRIS data after each night's observations. Here we discuss the complexities of OSIRIS data, the reduction system in place, methods for automating the system, performance using virtualization, and progress made to date in generating the KOA products.

  5. Automating OSIRIS Data Reduction for the Keck Observatory Archive

    NASA Astrophysics Data System (ADS)

    Tran, Hien D.; Holt, J.; Goodrich, R. W.; Lyke, J. E.; Gelino, C. R.; Berriman, G. B.; KOA Team

    2014-01-01

Since the end of 2013, the Keck Observatory Archive (KOA) has served data from all active instruments on the Keck Telescopes. OSIRIS (OH-Suppressing Infra-Red Imaging Spectrograph), the last active instrument to be archived in KOA, has been in use behind the adaptive optics (AO) system at Keck since February 2005. It uses an array of tiny lenslets to simultaneously produce spectra at up to 4096 locations. Due to the complicated nature of the OSIRIS raw data, the OSIRIS team developed a comprehensive data reduction program. This data reduction system has an online mode for quick real-time reductions, which are used primarily for basic data visualization and quality assessment done at the telescope while observing. The offline version of the data reduction system includes an expanded reduction method list, does more iterations for a better construction of the data cubes, and is used to produce publication-quality products. It can also use reconstruction matrices that are developed after the observations were taken and are more refined. The KOA team is currently utilizing the standard offline reduction mode to produce quick-look browse products for the raw data. Users of the offline data reduction system generally use a graphical user interface to manually set up the reduction parameters. However, in order to reduce and serve the ~200,000 science files on disk, all of the reduction parameters and steps need to be fully automated. This pipeline will also be used to automatically produce quick-look browse products for future OSIRIS data after each night's observations. Here we discuss the complexities of OSIRIS data, the reduction system in place, methods for automating the system, performance using virtualization, and progress made to date in generating the KOA products.

  6. Development of an expert data reduction assistant

    NASA Technical Reports Server (NTRS)

    Miller, Glenn E.; Johnston, Mark D.; Hanisch, Robert J.

    1993-01-01

We propose the development of an expert system tool for the management and reduction of complex data sets. The proposed work is an extension of a successful prototype system for the calibration of CCD (charge coupled device) images developed by Dr. Johnston in 1987 (ref.: Proceedings of the Goddard Conference on Space Applications of Artificial Intelligence). The reduction of complex multi-parameter data sets presents severe challenges to a scientist. Not only must a particular data analysis system (e.g., IRAF, SDAS, or MIDAS) be mastered, but large amounts of data can require many days of tedious work and supervision by the scientist for even the most straightforward reductions. The proposed Expert Data Reduction Assistant will help the scientist overcome these obstacles by developing a reduction plan based on the data at hand and producing a script for the reduction of the data in a target common language.

  7. The X-shooter pipeline

    NASA Astrophysics Data System (ADS)

    Goldoni, P.

    2011-03-01

The X-shooter data reduction pipeline is an integral part of the X-shooter project: it allows the production of reduced data in physical quantities from the raw data produced by the instrument. The pipeline is based on the data reduction library developed by the X-shooter consortium, with contributions from France, the Netherlands, and ESO, and it uses the Common Pipeline Library (CPL) developed at ESO. The pipeline serves two main functions. The first is to monitor the operation of the instrument through the reduction of the acquired data, both at Paranal, for quick-look control, and in Garching, for a more thorough evaluation. The second is to allow optimized data reduction for a scientific user. In the following, I first outline the main steps of data reduction with the pipeline and then briefly show two examples of optimizing the results for science reduction.

  8. Horizontal decomposition of data table for finding one reduct

    NASA Astrophysics Data System (ADS)

    Hońko, Piotr

    2018-04-01

Attribute reduction, being one of the most essential tasks in rough set theory, is a challenge for data that does not fit in the available memory. This paper proposes new definitions of attribute reduction using horizontal data decomposition. Algorithms for computing a superreduct and subsequently exact reducts of a data table are developed and experimentally verified. In the proposed approach, the size of the subtables obtained during the decomposition can be arbitrarily small. Reducts of the subtables are computed independently from one another using any heuristic method for finding one reduct. Compared with standard attribute reduction methods, the proposed approach can produce superreducts that usually differ only slightly from an exact reduct. The approach needs comparable time and much less memory to reduce the attribute set. The method proposed for removing unnecessary attributes from superreducts runs relatively fast on larger databases.
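The decomposition idea can be illustrated with a toy sketch (not the paper's algorithms: `greedy_reduct` is a generic drop-one heuristic, and all names and the table are hypothetical). Reducts are computed independently on small row-chunks of the decision table, and their union gives a superreduct of the whole table:

```python
def is_reduct_sufficient(rows, attrs, decision):
    # attrs suffice if no two rows agree on attrs but differ on the decision
    seen = {}
    for r in rows:
        key = tuple(r[a] for a in attrs)
        if key in seen and seen[key] != r[decision]:
            return False
        seen.setdefault(key, r[decision])
    return True

def greedy_reduct(rows, attrs, decision):
    # drop-one heuristic: remove each attribute that is not needed
    reduct = list(attrs)
    for a in list(attrs):
        trial = [x for x in reduct if x != a]
        if trial and is_reduct_sufficient(rows, trial, decision):
            reduct = trial
    return reduct

def decomposed_superreduct(rows, attrs, decision, chunk=2):
    # horizontal decomposition: reducts of row-chunks, unioned into a superreduct
    union = set()
    for i in range(0, len(rows), chunk):
        union |= set(greedy_reduct(rows[i:i + chunk], attrs, decision))
    return union

# toy decision table: decision "d" is fully determined by attribute "b"
rows = [
    {"a": 0, "b": 0, "c": 0, "d": 0},
    {"a": 0, "b": 1, "c": 0, "d": 1},
    {"a": 1, "b": 0, "c": 1, "d": 0},
    {"a": 1, "b": 1, "c": 1, "d": 1},
]
super_r = decomposed_superreduct(rows, ["a", "b", "c"], "d")
```

On this table the union of the per-chunk reducts is `{"b"}`, which is also sufficient for the full table; in general the union is only guaranteed to be a superreduct, which is why the paper adds a separate pruning step.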

  9. Alternative Fuels Data Center: Heavy-Duty Truck Idle Reduction Technologies

    Science.gov Websites

Both DOE and the U.S. Environmental Protection Agency (EPA) provide information on heavy-duty truck idle reduction technologies.

  10. Application research on big data in energy conservation and emission reduction of transportation industry

    NASA Astrophysics Data System (ADS)

    Bai, Bingdong; Chen, Jing; Wang, Mei; Yao, Jingjing

    2017-06-01

In the era of big data, energy conservation and emission reduction in transportation is naturally a big-data domain: its planning, management, and decision-making should be supported by the analysis and forecasting of large amounts of data. With the development of information technologies such as smart cities and sensor-equipped roads, data collection via the Internet of things is becoming widespread, 3G/4G network transmission technology is developing rapidly, and transportation energy-conservation and emission-reduction data are accumulating through a variety of channels. The government should not only make good use of big data to address energy conservation and emission reduction in transportation, but also explore the value hidden behind these large data sets. Based on an analysis of the basic characteristics and application technologies of such data, this paper carries out application research on big data in energy conservation and emission reduction in the transportation industry, so as to provide a theoretical basis and reference value for low-carbon management.

  11. Delivering data reduction pipelines to science users

    NASA Astrophysics Data System (ADS)

    Freudling, Wolfram; Romaniello, Martino

    2016-07-01

The European Southern Observatory has a long history of providing specialized data processing algorithms, called recipes, for most of its instruments. These recipes are used both for operational purposes at the observatory sites and for data reduction by scientists at their home institutions. The two applications require substantially different environments for running and controlling the recipes. In this paper, we describe the ESOReflex environment that is used for running recipes on the users' desktops. ESOReflex is a workflow-driven data reduction environment. It allows intuitive representation, execution, and modification of the data reduction workflow, and has facilities for inspection of and interaction with the data. It includes fully automatic data organization and visualization, interaction with recipes, and the exploration of the provenance tree of intermediate and final data products. ESOReflex uses a number of innovative concepts that have been described in Ref. 1. In October 2015, the complete system was released to the public. ESOReflex allows highly efficient data reduction, using its internal bookkeeping database to recognize and skip previously completed steps during repeated processing of the same or similar data sets. It has been widely adopted by the science community for the reduction of VLT data.
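The skip-completed-steps mechanism described above can be sketched generically: memoize each processing step on a hash of its name, inputs, and parameters, so repeated runs over the same data execute nothing twice. This is an illustration of the idea only; the class, method names, and the stand-in "recipe" are hypothetical, not ESOReflex's actual bookkeeping schema:

```python
import hashlib
import json

class Bookkeeper:
    """Toy bookkeeping database mapping a step's input signature to its product."""
    def __init__(self):
        self.db = {}
        self.executions = 0   # counts real (non-skipped) executions

    def run_step(self, name, inputs, params, recipe):
        # signature over step name, input data, and parameters
        sig = hashlib.sha256(
            json.dumps([name, inputs, params], sort_keys=True).encode()
        ).hexdigest()
        if sig not in self.db:        # execute only previously unseen work
            self.executions += 1
            self.db[sig] = recipe(inputs, params)
        return self.db[sig]

bk = Bookkeeper()
median = lambda data, p: sorted(data)[len(data) // 2]   # stand-in "recipe"
first = bk.run_step("bias", [3, 1, 2], {"method": "median"}, median)
again = bk.run_step("bias", [3, 1, 2], {"method": "median"}, median)  # skipped
```

The second call returns the cached product without re-running the recipe, which is what makes repeated processing of the same data set cheap.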

  12. Automated Reduction and Calibration of SCUBA Archive Data Using ORAC-DR

    NASA Astrophysics Data System (ADS)

    Jenness, T.; Stevens, J. A.; Archibald, E. N.; Economou, F.; Jessop, N.; Robson, E. I.; Tilanus, R. P. J.; Holland, W. S.

    The Submillimetre Common User Bolometer Array (SCUBA) instrument has been operating on the James Clerk Maxwell Telescope (JCMT) since 1997. The data archive is now sufficiently large that it can be used for investigating instrumental properties and the variability of astronomical sources. This paper describes the automated calibration and reduction scheme used to process the archive data with particular emphasis on the pointing observations. This is made possible by using the ORAC-DR data reduction pipeline, a flexible and extensible data reduction pipeline that is used on UKIRT and the JCMT.

  13. 48 CFR 52.214-27 - Price Reduction for Defective Certified Cost or Pricing Data-Modifications-Sealed Bidding.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... reduction. This right to a price reduction is limited to that resulting from defects in data relating to... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Price Reduction for... PROVISIONS AND CONTRACT CLAUSES Text of Provisions and Clauses 52.214-27 Price Reduction for Defective...

  14. Data reduction and analysis of HELIOS plasma wave data

    NASA Technical Reports Server (NTRS)

    Anderson, Roger R.

    1988-01-01

Reduction of data acquired from the HELIOS Solar Wind Plasma Wave Experiments on HELIOS 1 and 2 was continued. Production of 24-hour survey plots of the HELIOS 1 plasma wave data was continued, and microfilm copies were submitted to the National Space Science Data Center. Much of the effort involved the shock memory data from both HELIOS 1 and 2; these data had to be deconvoluted and time-ordered before they could be displayed and plotted in an organized form. The UNIVAC 418-III computer was replaced by a DEC VAX 11/780 computer, and in order to continue the reduction and analysis of the data set, all data reduction and analysis computer programs had to be rewritten.

  15. Alternative Fuels Data Center: Idle Reduction Laws and Incentives

    Science.gov Websites


  16. Variance Reduction Factor of Nuclear Data for Integral Neutronics Parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiba, G., E-mail: go_chiba@eng.hokudai.ac.jp; Tsuji, M.; Narabayashi, T.

We propose a new quantity, a variance reduction factor, to identify nuclear data for which further improvements are required to reduce uncertainties of target integral neutronics parameters. Important energy ranges can also be identified with this variance reduction factor. Variance reduction factors are calculated for several integral neutronics parameters, and their usefulness is demonstrated.

  17. Intelligent Data Reduction (IDARE)

    NASA Technical Reports Server (NTRS)

    Brady, D. Michael; Ford, Donnie R.

    1990-01-01

A description of the Intelligent Data Reduction (IDARE) expert system and an IDARE user's manual are given. IDARE is a data reduction system with the addition of a user profile infrastructure. The system was tested on a nickel-cadmium battery testbed. Information is given on installing, loading, and maintaining the IDARE system.

  18. A Fourier dimensionality reduction model for big data interferometric imaging

    NASA Astrophysics Data System (ADS)

    Vijay Kartik, S.; Carrillo, Rafael E.; Thiran, Jean-Philippe; Wiaux, Yves

    2017-06-01

    Data dimensionality reduction in radio interferometry can provide savings of computational resources for image reconstruction through reduced memory footprints and lighter computations per iteration, which is important for the scalability of imaging methods to the big data setting of the next-generation telescopes. This article sheds new light on dimensionality reduction from the perspective of the compressed sensing theory and studies its interplay with imaging algorithms designed in the context of convex optimization. We propose a post-gridding linear data embedding to the space spanned by the left singular vectors of the measurement operator, providing a dimensionality reduction below image size. This embedding preserves the null space of the measurement operator and hence its sampling properties are also preserved in light of the compressed sensing theory. We show that this can be approximated by first computing the dirty image and then applying a weighted subsampled discrete Fourier transform to obtain the final reduced data vector. This Fourier dimensionality reduction model ensures a fast implementation of the full measurement operator, essential for any iterative image reconstruction method. The proposed reduction also preserves the independent and identically distributed Gaussian properties of the original measurement noise. For convex optimization-based imaging algorithms, this is key to justify the use of the standard ℓ2-norm as the data fidelity term. Our simulations confirm that this dimensionality reduction approach can be leveraged by convex optimization algorithms with no loss in imaging quality relative to reconstructing the image from the complete visibility data set. Reconstruction results in simulation settings with no direction dependent effects or calibration errors show promising performance of the proposed dimensionality reduction. Further tests on real data are planned as an extension of the current work. 
MATLAB code implementing the proposed reduction method is available on GitHub.
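The two-step reduction described above (dirty image, then a weighted subsampled Fourier transform) can be sketched schematically. The operator below is a random stand-in for a real interferometric measurement operator, and the unit weights and subsampling mask are illustrative choices, not the paper's SVD-derived embedding:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16                  # image side; image size n*n = 256
npix = n * n
m = 4 * npix            # number of visibilities, larger than the image

# random stand-in for the measurement operator Phi (visibilities y = Phi @ x)
Phi = (rng.standard_normal((m, npix))
       + 1j * rng.standard_normal((m, npix))) / np.sqrt(m)
x = rng.standard_normal(npix)
y = Phi @ x                                # full data vector, length m

# step 1: "dirty image" = adjoint operator applied to the data (length npix << m)
dirty = Phi.conj().T @ y

# step 2: subsampled DFT of the dirty image (unit weights here for simplicity)
F = np.fft.fft(np.eye(npix)) / np.sqrt(npix)     # unitary DFT matrix
keep = rng.choice(npix, size=npix // 2, replace=False)
y_reduced = F[keep, :] @ dirty             # reduced data vector, length npix // 2
```

The point of the construction is the shrinking dimensions: the length-`m` visibility vector is collapsed to an image-sized dirty image and then to a vector below image size, while the forward model stays a fast composition of the measurement operator and an FFT.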

  19. 48 CFR 52.215-10 - Price Reduction for Defective Certified Cost or Pricing Data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Price Reduction for... Text of Provisions and Clauses 52.215-10 Price Reduction for Defective Certified Cost or Pricing Data. As prescribed in 15.408(b), insert the following clause: Price Reduction for Defective Certified Cost...

  20. The Gemini Recipe System: a dynamic workflow for automated data reduction

    NASA Astrophysics Data System (ADS)

    Labrie, Kathleen; Allen, Craig; Hirst, Paul; Holt, Jennifer; Allen, River; Dement, Kaniela

    2010-07-01

    Gemini's next generation data reduction software suite aims to offer greater automation of the data reduction process without compromising the flexibility required by science programs using advanced or unusual observing strategies. The Recipe System is central to our new data reduction software. Developed in Python, it facilitates near-real time processing for data quality assessment, and both on- and off-line science quality processing. The Recipe System can be run as a standalone application or as the data processing core of an automatic pipeline. The data reduction process is defined in a Recipe written in a science (as opposed to computer) oriented language, and consists of a sequence of data reduction steps, called Primitives, which are written in Python and can be launched from the PyRAF user interface by users wishing to use them interactively for more hands-on optimization of the data reduction process. The fact that the same processing Primitives can be run within both the pipeline context and interactively in a PyRAF session is an important strength of the Recipe System. The Recipe System offers dynamic flow control allowing for decisions regarding processing and calibration to be made automatically, based on the pixel and the metadata properties of the dataset at the stage in processing where the decision is being made, and the context in which the processing is being carried out. Processing history and provenance recording are provided by the AstroData middleware, which also offers header abstraction and data type recognition to facilitate the development of instrument-agnostic processing routines.
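The Recipe/Primitive structure can be sketched as a hypothetical miniature (not Gemini's actual API): a Recipe is an ordered list of Primitive functions applied in turn to a dataset, with the loop body being the natural place for the dynamic flow control the abstract describes:

```python
def subtract_bias(ds):
    # Primitive: remove the bias level recorded in the dataset's metadata
    ds = dict(ds)
    ds["pixels"] = [p - ds["meta"]["bias"] for p in ds["pixels"]]
    return ds

def flat_correct(ds):
    # Primitive: divide by the flat-field value from the metadata
    ds = dict(ds)
    ds["pixels"] = [p / ds["meta"]["flat"] for p in ds["pixels"]]
    return ds

RECIPE = [subtract_bias, flat_correct]   # a "Recipe": an ordered list of Primitives

def run_recipe(dataset, recipe=RECIPE):
    for primitive in recipe:
        # dynamic flow control would go here: inspect pixel statistics and
        # metadata to decide whether to apply, skip, or swap a step
        dataset = primitive(dataset)
    return dataset

out = run_recipe({"pixels": [10, 12], "meta": {"bias": 2, "flat": 2}})
```

Because each Primitive is an ordinary function of the dataset, the same Primitives can run inside an automatic pipeline loop or be called one at a time interactively, which is the strength the abstract attributes to the real system.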

  1. Observing control and data reduction at the UKIRT

    NASA Astrophysics Data System (ADS)

    Bridger, Alan; Economou, Frossie; Wright, Gillian S.; Currie, Malcolm J.

    1998-07-01

For the past seven years, observing with the major instruments at the United Kingdom IR Telescope (UKIRT) has been semi-automated, using ASCII files to configure the instruments and then sequence a series of exposures and telescope movements to acquire the data. For one instrument, automatic data reduction completes the cycle. The emergence of recent software technologies has suggested an evolution of this successful system to provide a friendlier and more powerful interface to observing at UKIRT. The Observatory Reduction and Acquisition Control (ORAC) project is now underway to construct this system. A key aim of ORAC is to allow a more complete description of the observing program, including the target sources and the recipe that will be used to provide on-line data reduction. Remote observation preparation and submission will also be supported. In parallel, the observatory control system will be upgraded to use these descriptions for more automatic observing, while retaining the 'classical' interactive observing mode. The final component of the project is an improved automatic data reduction system, allowing on-line reduction of data at the telescope while retaining the flexibility to cope with changing observing techniques and instruments. The user will also automatically be provided with the scripts used for the real-time reduction to help provide post-observing data reduction support. The overall project goal is to improve the scientific productivity of the telescope, but it should also reduce the overall ongoing support requirements, and has the eventual goal of supporting the use of queue-scheduled observing.

  2. The DEEP-South: Scheduling and Data Reduction Software System

    NASA Astrophysics Data System (ADS)

    Yim, Hong-Suh; Kim, Myung-Jin; Bae, Youngho; Moon, Hong-Kyu; Choi, Young-Jun; Roh, Dong-Goo; the DEEP-South Team

    2015-08-01

The DEep Ecliptic Patrol of the Southern sky (DEEP-South), started in October 2012, is currently in test runs with the first Korea Microlensing Telescope Network (KMTNet) 1.6 m wide-field telescope located at CTIO in Chile. While the primary objective of the DEEP-South is physical characterization of small bodies in the Solar System, it is expected to discover a large number of such bodies, many of them previously unknown. An automatic observation planning and data reduction software subsystem called "The DEEP-South Scheduling and Data reduction System" (the DEEP-South SDS) is currently being designed and implemented for observation planning, data reduction, and analysis of huge amounts of data with minimum human interaction. The DEEP-South SDS consists of three software subsystems: the DEEP-South Scheduling System (DSS), the Local Data Reduction System (LDR), and the Main Data Reduction System (MDR). The DSS manages observation targets, makes decisions on target priority and observation methods, schedules nightly observations, and archives data using the Database Management System (DBMS). The LDR is designed to detect moving objects in CCD images, while the MDR conducts photometry and reconstructs lightcurves. Based on analyses made at the LDR and the MDR, the DSS schedules follow-up observations to be conducted at other KMTNet stations. By the end of 2015, we expect the DEEP-South SDS to achieve stable operation. We also plan to improve the SDS to accomplish a finely tuned observation strategy and more efficient data reduction in 2016.

  3. Strain Gauge Balance Calibration and Data Reduction at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Ferris, A. T. Judy

    1999-01-01

    This paper will cover the standard force balance calibration and data reduction techniques used at Langley Research Center. It will cover balance axes definition, balance type, calibration instrumentation, traceability of standards to NIST, calibration loading procedures, balance calibration mathematical model, calibration data reduction techniques, balance accuracy reporting, and calibration frequency.

  4. Data reduction expert assistant

    NASA Technical Reports Server (NTRS)

    Miller, Glenn E.; Johnston, Mark D.; Hanisch, Robert J.

    1991-01-01

    Viewgraphs on data reduction expert assistant are presented. Topics covered include: data analysis systems; philosophy of these systems; disadvantages; expert assistant; useful goals; and implementation considerations.

  5. The Gemini Recipe System: A Dynamic Workflow for Automated Data Reduction

    NASA Astrophysics Data System (ADS)

    Labrie, K.; Hirst, P.; Allen, C.

    2011-07-01

    Gemini's next generation data reduction software suite aims to offer greater automation of the data reduction process without compromising the flexibility required by science programs using advanced or unusual observing strategies. The Recipe System is central to our new data reduction software. Developed in Python, it facilitates near-real time processing for data quality assessment, and both on- and off-line science quality processing. The Recipe System can be run as a standalone application or as the data processing core of an automatic pipeline. Building on concepts that originated in ORAC-DR, a data reduction process is defined in a Recipe written in a science (as opposed to computer) oriented language, and consists of a sequence of data reduction steps called Primitives. The Primitives are written in Python and can be launched from the PyRAF user interface by users wishing for more hands-on optimization of the data reduction process. The fact that the same processing Primitives can be run within both the pipeline context and interactively in a PyRAF session is an important strength of the Recipe System. The Recipe System offers dynamic flow control allowing for decisions regarding processing and calibration to be made automatically, based on the pixel and the metadata properties of the dataset at the stage in processing where the decision is being made, and the context in which the processing is being carried out. Processing history and provenance recording are provided by the AstroData middleware, which also offers header abstraction and data type recognition to facilitate the development of instrument-agnostic processing routines. All observatory or instrument specific definitions are isolated from the core of the AstroData system and distributed in external configuration packages that define a lexicon including classifications, uniform metadata elements, and transformations.

  6. Data Centric Sensor Stream Reduction for Real-Time Applications in Wireless Sensor Networks

    PubMed Central

    Aquino, Andre Luiz Lins; Nakamura, Eduardo Freire

    2009-01-01

This work presents a data-centric strategy to meet deadlines in soft real-time applications in wireless sensor networks. This strategy considers three main aspects: (i) the design of real-time applications to obtain the minimum deadlines; (ii) an analytic model to estimate the ideal sample size used by data-reduction algorithms; and (iii) two data-centric stream-based sampling algorithms to perform data reduction whenever necessary. Simulation results show that our data-centric strategies meet deadlines without losing data representativeness. PMID:22303145
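The paper's own sampling algorithms are not reproduced here; as a generic illustration of stream-based data reduction in bounded memory, a standard reservoir sample keeps a fixed-size uniform subset of an unbounded sensor stream:

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Uniform random sample of size k from a stream of unknown length.

    Uses O(k) memory: one way to shrink a sensor stream before transmission.
    """
    rng = rng or random.Random(0)
    sample = []
    for i, reading in enumerate(stream):
        if i < k:
            sample.append(reading)          # fill the reservoir first
        else:
            j = rng.randrange(i + 1)        # keep new item with prob. k/(i+1)
            if j < k:
                sample[j] = reading
    return sample

reduced = reservoir_sample(range(10_000), 50)   # 10,000 readings -> 50
```

Because the reservoir never exceeds `k` items, a sensor node can bound both its memory use and the size of each transmission regardless of how long the stream runs, which is the kind of trade-off the deadline analysis above is sizing.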

  7. D-region blunt probe data analysis using hybrid computer techniques

    NASA Technical Reports Server (NTRS)

    Burkhard, W. J.

    1973-01-01

The feasibility of performing data reduction techniques with a hybrid computer was studied. The data were obtained from the flight of a parachute-borne probe through the D-region of the ionosphere. A presentation of the theory of blunt probe operation is included, with emphasis on the equations necessary to perform the analysis. This is followed by a discussion of computer program development, including a comparison of computer and hand reduction results for the blunt probe launched on 31 January 1972. The comparison showed that it was both feasible and desirable to use the computer for data reduction. The results of computer data reduction performed on flight data acquired from five blunt probes are also presented.

  8. The JCMT Gould Belt Survey: a quantitative comparison between SCUBA-2 data reduction methods

    NASA Astrophysics Data System (ADS)

    Mairs, S.; Johnstone, D.; Kirk, H.; Graves, S.; Buckle, J.; Beaulieu, S. F.; Berry, D. S.; Broekhoven-Fiene, H.; Currie, M. J.; Fich, M.; Hatchell, J.; Jenness, T.; Mottram, J. C.; Nutter, D.; Pattle, K.; Pineda, J. E.; Salji, C.; di Francesco, J.; Hogerheijde, M. R.; Ward-Thompson, D.; JCMT Gould Belt survey Team

    2015-12-01

    Performing ground-based submillimetre observations is a difficult task as the measurements are subject to absorption and emission from water vapour in the Earth's atmosphere and time variation in weather and instrument stability. Removing these features and other artefacts from the data is a vital process which affects the characteristics of the recovered astronomical structure we seek to study. In this paper, we explore two data reduction methods for data taken with the Submillimetre Common-User Bolometer Array-2 (SCUBA-2) at the James Clerk Maxwell Telescope (JCMT). The JCMT Legacy Reduction 1 (JCMT LR1) and The Gould Belt Legacy Survey Legacy Release 1 (GBS LR1) reduction both use the same software (STARLINK) but differ in their choice of data reduction parameters. We find that the JCMT LR1 reduction is suitable for determining whether or not compact emission is present in a given region and the GBS LR1 reduction is tuned in a robust way to uncover more extended emission, which better serves more in-depth physical analyses of star-forming regions. Using the GBS LR1 method, we find that compact sources are recovered well, even at a peak brightness of only three times the noise, whereas the reconstruction of larger objects requires much care when drawing boundaries around the expected astronomical signal in the data reduction process. Incorrect boundaries can lead to false structure identification or it can cause structure to be missed. In the JCMT LR1 reduction, the extent of the true structure of objects larger than a point source is never fully recovered.

  9. Alternative Fuels Data Center: County Fleet Goes Big on Idle Reduction, Ethanol Use, Fuel Efficiency

    Science.gov Websites


  10. Orbiter data reduction complex data processing requirements for the OFT mission evaluation team (level C)

    NASA Technical Reports Server (NTRS)

    1979-01-01

    This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

  11. High-performance data processing using distributed computing on the SOLIS project

    NASA Astrophysics Data System (ADS)

    Wampler, Stephen

    2002-12-01

    The SOLIS solar telescope collects data at a high rate, resulting in 500 GB of raw data each day. The SOLIS Data Handling System (DHS) has been designed to quickly process this data down to 156 GB of reduced data. The DHS design uses pools of distributed reduction processes that are allocated to different observations as needed. A farm of 10 dual-cpu Linux boxes contains the pools of reduction processes. Control is through CORBA and data is stored on a fibre channel storage area network (SAN). Three other Linux boxes are responsible for pulling data from the instruments using SAN-based ringbuffers. Control applications are Java-based while the reduction processes are written in C++. This paper presents the overall design of the SOLIS DHS and provides details on the approach used to control the pooled reduction processes. The various strategies used to manage the high data rates are also covered.
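
    The pooled-worker allocation described above can be sketched with a generic worker pool. This is purely illustrative — the real DHS uses CORBA-controlled C++ reduction processes on a Linux farm, whereas here a thread pool and a toy `reduce_frame` step (both invented names) stand in for that machinery:

```python
from concurrent.futures import ThreadPoolExecutor

def reduce_frame(frame):
    """Toy reduction step: collapse a raw frame to its mean value."""
    return sum(frame) / len(frame)

def run_pool(frames, workers=4):
    # Frames are handed to whichever worker is free, mirroring the way
    # pooled reduction processes are allocated to observations as needed.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(reduce_frame, frames))

frames = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
print(run_pool(frames))   # [2.0, 5.0, 8.0]
```

    `pool.map` preserves input order even though workers finish asynchronously, which is convenient when reduced frames must line up with the observation sequence.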

  12. Reduction of hexavalent chromium by fasted and fed human gastric fluid. II. Ex vivo gastric reduction modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirman, Christopher R., E-mail: ckirman@summittoxi

    To extend previous models of hexavalent chromium [Cr(VI)] reduction by gastric fluid (GF), ex vivo experiments were conducted to address data gaps and limitations identified with respect to (1) GF dilution in the model; (2) reduction of Cr(VI) in fed human GF samples; (3) the number of Cr(VI) reduction pools present in human GF under fed, fasted, and proton pump inhibitor (PPI)-use conditions; and (4) an appropriate form for the pH-dependence of Cr(VI) reduction rate constants. Rates and capacities of Cr(VI) reduction were characterized in gastric contents from fed and fasted volunteers, and from fasted pre-operative patients treated with PPIs. Reduction capacities were first estimated over a 4-h reduction period. Once reduction capacity was established, a dual-spike approach was used in speciated isotope dilution mass spectrometry analyses to characterize the concentration-dependence of the 2nd order reduction rate constants. These data, when combined with previously collected data, were well described by a three-pool model (pool 1 = fast reaction with low capacity; pool 2 = slow reaction with higher capacity; pool 3 = very slow reaction with higher capacity) using pH-dependent rate constants characterized by a piecewise, log-linear relationship. These data indicate that human gastric samples, like those collected from rats and mice, contain multiple pools of reducing agents, and low concentrations of Cr(VI) (< 0.7 mg/L) are reduced more rapidly than high concentrations. The data and revised modeling results herein provide improved characterization of Cr(VI) gastric reduction kinetics, critical for Cr(VI) pharmacokinetic modeling and human health risk assessment.
    Highlights: • SIDMS allows for measurement of Cr(VI) reduction rate in gastric fluid ex vivo • Human gastric fluid has three reducing pools • Cr(VI) in drinking water at < 0.7 mg/L is rapidly reduced in human gastric fluid • Reduction rate is concentration- and pH-dependent • A refined PK model is used to characterize inter-individual variability in Cr(VI) gastric reduction capacity.
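
    A minimal sketch of the three-pool, second-order kinetics described above, using forward-Euler integration. The rate constants and capacities below are invented for illustration; the paper's fitted, pH-dependent parameter values are not reproduced here:

```python
def simulate_reduction(cr0, pools, dt=0.001, t_end=4.0):
    """Euler integration of second-order Cr(VI) reduction by several pools.

    cr0   -- initial Cr(VI) concentration (arbitrary units)
    pools -- list of (rate_constant, capacity) pairs; each pool consumes
             Cr(VI) 1:1 against its remaining reducing capacity.
    Returns final Cr(VI) and the remaining capacity of each pool."""
    cr = cr0
    caps = [c for _, c in pools]
    t = 0.0
    while t < t_end:
        for i, (k, _) in enumerate(pools):
            flux = k * cr * caps[i] * dt          # 2nd-order mass-action step
            flux = min(flux, cr, caps[i])         # never overshoot either reservoir
            cr -= flux
            caps[i] -= flux
        t += dt
    return cr, caps

# Fast/low-capacity, slow/higher-capacity, very-slow/higher-capacity pools
print(simulate_reduction(1.0, [(5.0, 0.5), (0.5, 1.0), (0.05, 1.0)]))
```

    The qualitative behaviour matches the abstract: the fast pool is drawn down first, and low Cr(VI) concentrations disappear quickly while the slow pools dominate at longer times.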

  13. The SCUBA-2 SRO data reduction cookbook

    NASA Astrophysics Data System (ADS)

    Chapin, Edward; Dempsey, Jessica; Jenness, Tim; Scott, Douglas; Thomas, Holly; Tilanus, Remo P. J.

    This cookbook provides a short introduction to Starlink facilities, especially SMURF, the Sub-Millimetre User Reduction Facility, for reducing and displaying SCUBA-2 SRO data. We describe some of the data artefacts present in SCUBA-2 time series and the methods we employ to mitigate them. In particular, we illustrate the various steps required to reduce the data, and the Dynamic Iterative Map-Maker, which carries out all of these steps using a single command. For information on SCUBA-2 data reduction since SRO, please see SC/21.

  14. The ORAC-DR data reduction pipeline

    NASA Astrophysics Data System (ADS)

    Cavanagh, B.; Jenness, T.; Economou, F.; Currie, M. J.

    2008-03-01

    The ORAC-DR data reduction pipeline has been used by the Joint Astronomy Centre since 1998. Originally developed for an infrared spectrometer and a submillimetre bolometer array, it has since expanded to support twenty instruments from nine different telescopes. By using shared code and a common infrastructure, rapid development of an automated data reduction pipeline for nearly any astronomical data is possible. This paper discusses the infrastructure available to developers and estimates the development timescales expected to reduce data for new instruments using ORAC-DR.

  15. Data reductions and data quality for the high resolution spectrograph on the Southern African Large Telescope

    NASA Astrophysics Data System (ADS)

    Crawford, S. M.; Crause, Lisa; Depagne, Éric; Ilkiewicz, Krystian; Schroeder, Anja; Kuhn, Rudolph; Hettlage, Christian; Romero Colmenaro, Encarni; Kniazev, Alexei; Väisänen, Petri

    2016-08-01

    The High Resolution Spectrograph (HRS) on the Southern African Large Telescope (SALT) is a dual beam, fiber-fed echelle spectrograph providing high resolution capabilities to the SALT observing community. We describe the available data reduction tools and the procedures put in place for regular monitoring of the data quality from the spectrograph. Data reductions are carried out through the pyhrs package. The data characteristics and instrument stability are reported as part of the SALT Dashboard to help monitor the performance of the instrument.

  16. Generic Data Pipelining Using ORAC-DR

    NASA Astrophysics Data System (ADS)

    Allan, Alasdair; Jenness, Tim; Economou, Frossie; Currie, Malcolm J.; Bly, Martin J.

    A generic data reduction pipeline is, perhaps, the holy grail for data reduction software. We present work which sets us firmly on the path towards this goal. ORAC-DR is an online data reduction pipeline written by the Joint Astronomy Centre (JAC) and the UK Astronomy Technology Centre (ATC) and distributed as part of the Starlink Software Collection (SSC). It is intended to run with a minimum of observer interaction, and is able to handle data from many different instruments, including SCUBA, CGS4, UFTI, IRCAM and Michelle, with support for IRIS2 and UIST under development. Recent work by Starlink in collaboration with the JAC has resulted in an increase in the pipeline's flexibility, opening up the possibility that it could be used for truly generic data reduction for data from any imaging, and eventually spectroscopic, detector.

  17. Data Reduction of Jittered Infrared Images Using the ORAC Pipeline

    NASA Astrophysics Data System (ADS)

    Currie, Malcolm; Wright, Gillian; Bridger, Alan; Economou, Frossie

    We relate our experiences using the ORAC data reduction pipeline for jittered images of stars and galaxies. The reduction recipes currently combine applications from several Starlink packages with intelligent Perl recipes to cater to UKIRT data. We describe the recipes and some of the algorithms used, and compare the quality of the resultant mosaics and photometry with the existing facilities.

  18. Present status of the 4-m ILMT data reduction pipeline: application to space debris detection and characterization

    NASA Astrophysics Data System (ADS)

    Pradhan, Bikram; Delchambre, Ludovic; Hickson, Paul; Akhunov, Talat; Bartczak, Przemyslaw; Kumar, Brajesh; Surdej, Jean

    2018-04-01

    The 4-m International Liquid Mirror Telescope (ILMT), located at the ARIES Observatory (Devasthal, India) at a latitude of +29° 22' 26", has been designed to scan a band of sky about half a degree wide in Time Delayed Integration (TDI) mode. A dedicated data-reduction and analysis pipeline is therefore required to process, online, the large amount of optical data produced. This requirement has led to the development of the 4-m ILMT data reduction pipeline, a new software package built with Python to simplify the large number of tasks involved in reducing the acquired TDI images. The software provides astronomers with specially designed data-reduction functions and astrometry and photometry calibration tools. In this paper we discuss the various reduction and calibration steps followed to reduce TDI images obtained in May 2015 with the Devasthal 1.3-m telescope. We report the detection and characterization of nine space-debris objects present in the TDI frames.

  19. ORAC-DR -- SCUBA-2 Pipeline Data Reduction

    NASA Astrophysics Data System (ADS)

    Gibb, Andrew G.; Jenness, Tim

    The ORAC-DR data reduction pipeline is designed to reduce data from many different instruments. This document describes how to use ORAC-DR to process data taken with the SCUBA-2 instrument on the James Clerk Maxwell Telescope.

  20. Comparison of automated satellite systems with conventional systems for hydrologic data collection in west-central Florida

    USGS Publications Warehouse

    Woodham, W.M.

    1982-01-01

    This report provides results of reliability and cost-effectiveness studies of the GOES satellite data-collection system used to operate a small hydrologic data network in west-central Florida. The GOES system, in its present state of development, was found to be about as reliable as conventional methods of data collection. Benefits of using the GOES system include some cost and manpower reduction, improved data accuracy, near real-time data availability, and direct computer storage and analysis of data. The GOES system could allow annual manpower reductions of 19 to 23 percent, with reduced costs for some single-parameter sites and increased costs for others, such as streamflow, rainfall, and ground-water monitoring stations. Manpower reductions of 46 percent or more appear possible for multiple-parameter sites. Implementation of expected improvements in instrumentation and data-handling procedures should further reduce costs. (USGS)

  1. Dimension reduction techniques for the integrative analysis of multi-omics data

    PubMed Central

    Zeleznik, Oana A.; Thallinger, Gerhard G.; Kuster, Bernhard; Gholami, Amin M.

    2016-01-01

    State-of-the-art next-generation sequencing, transcriptomics, proteomics and other high-throughput 'omics' technologies enable the efficient generation of large experimental data sets. These data may yield unprecedented knowledge about molecular pathways in cells and their role in disease. Dimension reduction approaches have been widely used in exploratory analysis of single omics data sets. This review will focus on dimension reduction approaches for simultaneous exploratory analyses of multiple data sets. These methods extract the linear relationships that best explain the correlated structure across data sets, the variability both within and between variables (or observations) and may highlight data issues such as batch effects or outliers. We explore dimension reduction techniques as one of the emerging approaches for data integration, and how these can be applied to increase our understanding of biological systems in normal physiological function and disease. PMID:26969681
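
    As a concrete baseline for the single-data-set case mentioned above, a PCA projection can be computed directly from an SVD. The data below are random stand-ins for an omics matrix (20 samples x 50 features); multi-source methods extend this idea by jointly decomposing several such blocks:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples onto the top principal components (SVD-based PCA)."""
    Xc = X - X.mean(axis=0)            # centre each feature
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T    # sample scores in the reduced space

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))          # toy stand-in for an omics data set
scores = pca_scores(X)
print(scores.shape)   # (20, 2)
```

    Plotting the two score columns against each other is the usual way such a reduction exposes batch effects or outlying samples.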

  2. HIPPARCOS - Activities of the data reduction consortia

    NASA Astrophysics Data System (ADS)

    Lindegren, L.; Kovalevsky, J.

    The complete reduction of data from the ESA astrometry satellite Hipparcos, from some 10^12 bits of photon counts and ancillary data to a catalogue of astrometric parameters and magnitudes for the 100,000 programme stars, will be independently undertaken by two scientific consortia, NDAC and FAST. This approach is motivated by the size and complexity of the reductions and by the need to ensure the validity of the results. The end product will be a single, agreed-upon catalogue. This paper briefly describes the principles of reduction and the organisation and status within each consortium.

  3. An investigation of the feasibility of improving oculometer data analysis through application of advanced statistical techniques

    NASA Technical Reports Server (NTRS)

    Rana, D. S.

    1980-01-01

    The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.

  4. A randomized trial of pneumatic reduction versus hydrostatic reduction for intussusception in pediatric patients.

    PubMed

    Xie, Xiaolong; Wu, Yang; Wang, Qi; Zhao, Yiyang; Chen, Guobin; Xiang, Bo

    2017-08-08

    Data from randomized controlled trials comparing hydrostatic and pneumatic reduction as initial therapy for intussusception in pediatric patients are lacking. The aim of this study was to conduct a randomized controlled trial to compare the effectiveness and safety of the two techniques. All intussusception patients who visited West China Hospital of Sichuan University from January 2014 to December 2015 were enrolled in this study, in which they underwent pneumatic or hydrostatic reduction. Patients were randomized into an ultrasound-guided hydrostatic or an X-ray-guided pneumatic reduction group. The data collected included demographic data, symptoms, signs, and investigations. The primary outcome of the study was the success rate of reduction; the secondary outcomes were the rates of intestinal perforation and recurrence. A total of 124 children with intussusception who met the inclusion criteria were enrolled. The overall success rate of this study was 90.32%. Univariable analysis showed that the success rate of hydrostatic reduction with normal saline (96.77%) was significantly higher than that of pneumatic reduction with air (83.87%) (p=0.015). Perforation after reduction was found in only one patient, in the pneumatic reduction group. The recurrence rate of intussusception in the hydrostatic reduction group was 4.84%, compared with 3.23% in the pneumatic reduction group. Our study found that ultrasound-guided hydrostatic reduction is a simple, safe and effective nonoperative treatment for pediatric patients with intussusception, and should be adopted first in the treatment of qualified patients. Type of study: therapeutic study; prospective study. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. The Future of Data Reduction at UKIRT

    NASA Astrophysics Data System (ADS)

    Economou, F.; Bridger, A.; Wright, G. S.; Rees, N. P.; Jenness, T.

    The Observatory Reduction and Acquisition Control (ORAC) project is a comprehensive re-implementation of all existing instrument user interfaces and data handling software involved at the United Kingdom Infrared Telescope (UKIRT). This paper addresses the design of the data reduction part of the system. Our main aim is to provide data reduction facilities for the new generation of UKIRT instruments of a similar standard to our current software packages, which have enjoyed success because of their science-driven approach. Additionally we wish to use modern software techniques in order to produce a system that is portable, flexible and extensible so as to have modest maintenance requirements, both in the medium and the longer term.

  6. User Interface for the ESO Advanced Data Products Image Reduction Pipeline

    NASA Astrophysics Data System (ADS)

    Rité, C.; Delmotte, N.; Retzlaff, J.; Rosati, P.; Slijkhuis, R.; Vandame, B.

    2006-07-01

    The poster presents a friendly user interface for image reduction, written entirely in Python and developed by the Advanced Data Products (ADP) group. The interface is a front-end to the ESO/MVM image reduction package, originally developed in the ESO Imaging Survey (EIS) project and currently used to reduce imaging data from several instruments such as WFI, ISAAC, SOFI and FORS1. As part of its scope, the interface produces high-level, VO-compliant science images from raw data, providing the astronomer with a complete monitoring system during the reduction and also computing statistical image properties for data quality assessment. The interface is meant to be used for VO services; it is free but unmaintained software, and the intention of the authors is to share code and experience. The poster describes the interface architecture and current capabilities and gives a description of the ESO/MVM engine for image reduction. The ESO/MVM engine should be released by the end of this year.

  7. ORAC-DR -- SCUBA Pipeline Data Reduction

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Economou, Frossie

    ORAC-DR is a flexible data reduction pipeline designed to reduce data from many different instruments. This document describes how to use the ORAC-DR pipeline to reduce data taken with the Submillimetre Common-User Bolometer Array (SCUBA) obtained from the James Clerk Maxwell Telescope.

  8. ORAC-DR -- integral field spectroscopy data reduction

    NASA Astrophysics Data System (ADS)

    Todd, Stephen

    ORAC-DR is a general-purpose automatic data-reduction pipeline environment. This document describes its use to reduce integral field unit (IFU) data collected at the United Kingdom Infrared Telescope (UKIRT) with the UIST instrument.

  9. FPGA-based architecture for real-time data reduction of ultrasound signals.

    PubMed

    Soto-Cajiga, J A; Pedraza-Ortega, J C; Rubio-Gonzalez, C; Bandala-Sanchez, M; Romero-Troncoso, R de J

    2012-02-01

    This paper describes a novel method for on-line real-time data reduction of radiofrequency (RF) ultrasound signals. The approach is based on a field programmable gate array (FPGA) system intended mainly for steel thickness measurements. Ultrasound data reduction is desirable when: (1) direct measurements performed by an operator are not accessible; (2) it is required to store a considerable amount of data; (3) the application requires measuring at very high speeds; and (4) the physical space for the embedded hardware is limited. All the aforementioned scenarios can be present in applications such as pipeline inspection where data reduction is traditionally performed on-line using pipeline inspection gauges (PIG). The method proposed in this work consists of identifying and storing in real-time only the time of occurrence (TOO) and the maximum amplitude of each echo present in a given RF ultrasound signal. The method is tested with a dedicated immersion system where a significant data reduction with an average of 96.5% is achieved. Copyright © 2011 Elsevier B.V. All rights reserved.
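
    The reduction scheme described above — keeping only the time of occurrence (TOO) and maximum amplitude of each echo — can be sketched in software. The FPGA implementation itself is hardware, and the threshold-based echo grouping below is an assumption about how echoes are segmented, not the paper's exact logic:

```python
def reduce_ultrasound(signal, fs, threshold):
    """Reduce an RF ultrasound trace to (time_of_occurrence, peak_amplitude)
    pairs: consecutive samples above threshold form one echo, summarised
    by its maximum sample."""
    echoes, in_echo, best_amp, best_i = [], False, 0.0, 0
    for i, a in enumerate(signal):
        amp = abs(a)
        if amp >= threshold:
            if not in_echo or amp > best_amp:
                best_amp, best_i = amp, i
            in_echo = True
        elif in_echo:                       # echo just ended
            echoes.append((best_i / fs, best_amp))
            in_echo = False
    if in_echo:                             # echo still open at end of trace
        echoes.append((best_i / fs, best_amp))
    return echoes

sig = [0, 0, 1, 3, 2, 0, 0, 4, 1, 0]
print(reduce_ultrasound(sig, fs=1.0, threshold=2))   # [(3.0, 3), (7.0, 4)]
```

    Here ten samples collapse to two (time, amplitude) pairs — the same kind of large fractional reduction (96.5% on average in the paper) that makes pipeline-inspection data volumes manageable.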

  10. Conceptual design of a data reduction system

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A telemetry data-processing system for data reduction was defined. Data-reduction activities in support of the developmental flights of the Space Shuttle were used as references against which requirements are assessed in general terms. A conceptual system design believed to offer significant throughput for the anticipated types of data-reduction activities is presented. The design identifies the use of a large intermediate data store as a key element in a complex of high-speed, single-purpose processors, each of which performs predesignated, repetitive operations on either raw or partially processed data. The recommended approach to implementing the design concept is to adopt an established interface standard and rely heavily on mature or promising technologies considered mainstream in the integrated-circuit industry. The system design concept is believed to be implementable without reliance on exotic devices and/or operational procedures. Numerical methods were employed to examine the feasibility of digital discrimination of FDM composite signals and of eliminating line-frequency noise in data measurements.

  11. Reduction and analysis of data collected during the electromagnetic tornado experiment

    NASA Technical Reports Server (NTRS)

    Davisson, L. D.; Bradbury, J.

    1975-01-01

    Progress is reviewed on the reduction and analysis of tornado data collected on analog tape. The strip chart recording of 7 tracks from all available analog data for quick look analysis is emphasized.

  12. Data volume reduction for imaging radar polarimetry

    NASA Technical Reports Server (NTRS)

    Zebker, Howard A. (Inventor); Held, Daniel N. (Inventor); Vanzyl, Jakob J. (Inventor); Dubois, Pascale C. (Inventor); Norikane, Lynne (Inventor)

    1988-01-01

    Two alternative methods are presented for digital reduction of synthetic aperture multipolarized radar data using scattering matrices, or using Stokes matrices, of four consecutive along-track pixels to produce averaged data for generating a synthetic polarization image.

  13. Data volume reduction for imaging radar polarimetry

    NASA Technical Reports Server (NTRS)

    Zebker, Howard A. (Inventor); Held, Daniel N. (Inventor); van Zul, Jakob J. (Inventor); Dubois, Pascale C. (Inventor); Norikane, Lynne (Inventor)

    1989-01-01

    Two alternative methods are disclosed for digital reduction of synthetic aperture multipolarized radar data using scattering matrices, or using Stokes matrices, of four consecutive along-track pixels to produce averaged data for generating a synthetic polarization image.
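
    The averaging step common to both patents — combining four consecutive along-track pixels — can be sketched with NumPy, assuming the data are stored as a per-pixel 4x4 Stokes matrix (the array layout below is an assumption for illustration):

```python
import numpy as np

def average_stokes(stokes, block=4):
    """Average `block` consecutive along-track pixels of 4x4 Stokes matrices,
    reducing the data volume by the same factor (multi-look averaging).

    stokes -- array of shape (n_track, n_range, 4, 4)."""
    n_track, n_range = stokes.shape[:2]
    trimmed = stokes[: n_track - n_track % block]        # drop ragged tail
    return trimmed.reshape(-1, block, n_range, 4, 4).mean(axis=1)

stokes = np.ones((8, 3, 4, 4))          # 8 along-track x 3 range pixels
print(average_stokes(stokes).shape)     # (2, 3, 4, 4)
```

    Because the Stokes matrix is additive over looks, averaging it (rather than the complex scattering matrices) preserves what is needed to synthesise an image at any transmit/receive polarization afterwards.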

  14. The Optimum Dataset method - examples of the application

    NASA Astrophysics Data System (ADS)

    Błaszczak-Bąk, Wioleta; Sobieraj-Żłobińska, Anna; Wieczorek, Beata

    2018-01-01

    Data reduction is a procedure to decrease a dataset in order to make its analysis more effective and easier. Reduction of a dataset is an issue that requires proper planning, so that after reduction it meets all the user's expectations. Evidently, it is better if the result is an optimal solution in terms of adopted criteria. Among reduction methods that provide an optimal solution is the Optimum Dataset method (OptD) proposed by Błaszczak-Bąk (2016). The paper presents the application of this method to different datasets from LiDAR and the possibility of using the method for various purposes of study. The following reduced datasets were presented: (a) measurement of Sielska street in Olsztyn (Airborne Laser Scanning data - ALS data), (b) measurement of the bas-relief on the building in Gdańsk (Terrestrial Laser Scanning data - TLS data), (c) a dataset from the Biebrza River measurement (TLS data).

  15. Infrared Spectroscopy Data Reduction with ORAC-DR

    NASA Astrophysics Data System (ADS)

    Economou, F.; Jenness, T.; Cavanagh, B.; Wright, G. S.; Bridger, A. B.; Kerr, T. H.; Hirst, P.; Adamson, A. J.

    ORAC-DR is a flexible and extensible data reduction pipeline suitable for both on-line and off-line use. Since its development it has been in use on-line at UKIRT for data from the infrared cameras UFTI and IRCAM and at JCMT for data from the sub-millimetre bolometer array SCUBA. We have now added a suite of on-line reduction recipes that produces publication quality (or nearly so) data from the CGS4 near-infrared spectrometer and the MICHELLE mid-infrared Echelle spectrometer. As an example, this paper briefly describes some pipeline features for one of the more commonly used observing modes.

  16. FIEStool: Automated data reduction for FIber-fed Echelle Spectrograph (FIES)

    NASA Astrophysics Data System (ADS)

    Stempels, Eric; Telting, John

    2017-08-01

    FIEStool automatically reduces data obtained with the FIber-fed Echelle Spectrograph (FIES) at the Nordic Optical Telescope, a high-resolution spectrograph available on a stand-by basis, while also allowing the basic properties of the reduction to be controlled in real time by the user. It provides a Graphical User Interface and offers bias subtraction, flat-fielding, scattered-light subtraction, and specialized reduction tasks from the external packages IRAF (ascl:9911.002) and NumArray. The core of FIEStool is instrument-independent; the software, written in Python, could with minor modifications also be used for automatic reduction of data from other instruments.

  17. Prediction With Dimension Reduction of Multiple Molecular Data Sources for Patient Survival.

    PubMed

    Kaplan, Adam; Lock, Eric F

    2017-01-01

    Predictive modeling from high-dimensional genomic data is often preceded by a dimension reduction step, such as principal component analysis (PCA). However, the application of PCA is not straightforward for multisource data, wherein multiple sources of 'omics data measure different but related biological components. In this article, we use recent advances in the dimension reduction of multisource data for predictive modeling. In particular, we apply exploratory results from Joint and Individual Variation Explained (JIVE), an extension of PCA for multisource data, for prediction of differing response types. We conduct simulations to illustrate the practical advantages and interpretability of our approach. As an application example, we consider predicting survival for patients with glioblastoma multiforme from 3 data sources measuring messenger RNA expression, microRNA expression, and DNA methylation. We also introduce a method to estimate JIVE scores for new samples that were not used in the initial dimension reduction and study its theoretical properties; this method is implemented in the R package R.JIVE on CRAN, in the function jive.predict.
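
    A simplified, single-source analogue of the score-then-predict workflow might look as follows — PCA scores stand in for JIVE's joint/individual decomposition, and ordinary least squares stands in for a survival model; all names and choices here are illustrative:

```python
import numpy as np

def fit_pc_regression(X, y, k=2):
    """Reduce X to k principal-component scores, then regress y on the scores."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    V = Vt[:k].T                       # loadings, kept for scoring new samples
    S = (X - mu) @ V                   # training-sample scores
    A = np.column_stack([np.ones(len(S)), S])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return mu, V, beta

def predict_new(mu, V, beta, X_new):
    """Score unseen samples with the stored loadings, then apply the fitted
    model -- the role jive.predict plays for JIVE scores."""
    S = (X_new - mu) @ V
    return np.column_stack([np.ones(len(S)), S]) @ beta
```

    The key point mirrored from the abstract is that new samples are projected with the loadings learned during the initial reduction, rather than re-fitting the decomposition.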

  18. Reducing the Volume of NASA Earth-Science Data

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon; Braverman, Amy J.; Guillaume, Alexandre

    2010-01-01

    A computer program reduces data generated by NASA Earth-science missions into representative clusters characterized by centroids and membership information, thereby reducing the large volume of data to a level more amenable to analysis. The program effects an autonomous data-reduction/clustering process to produce a representative distribution and joint relationships of the data, without assuming a specific type of distribution and relationship and without resorting to domain-specific knowledge about the data. The program implements a combination of a data-reduction algorithm known as the entropy-constrained vector quantization (ECVQ) and an optimization algorithm known as the differential evolution (DE). The combination of algorithms generates the Pareto front of clustering solutions that presents the compromise between the quality of the reduced data and the degree of reduction. Similar prior data-reduction computer programs utilize only a clustering algorithm, the parameters of which are tuned manually by users. In the present program, autonomous optimization of the parameters by means of the DE supplants the manual tuning of the parameters. Thus, the program determines the best set of clustering solutions without human intervention.
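
    As an illustration of reducing a dataset to centroids plus membership, plain Lloyd's k-means can stand in for the program's ECVQ step; the entropy constraint and the DE-driven parameter search that distinguish the actual NASA tool are deliberately omitted from this sketch:

```python
import numpy as np

def kmeans_reduce(X, k=2, iters=20, seed=0):
    """Lloyd's k-means: reduce X to k centroids and a per-point membership.

    A stand-in for entropy-constrained vector quantization (ECVQ); the
    entropy penalty and autonomous parameter tuning are not implemented."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each point to its nearest centroid
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its members
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.0, 5.1]])
centroids, labels = kmeans_reduce(X, k=2)
print(labels)   # two well-separated pairs end up in different clusters
```

    Sweeping k (or, in ECVQ, the entropy weight) and recording distortion at each setting traces out exactly the quality-versus-reduction trade-off the Pareto front in the abstract describes.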

  19. Reductions in indoor black carbon concentrations from improved biomass stoves in rural India.

    PubMed

    Patange, Omkar S; Ramanathan, Nithya; Rehman, I H; Tripathi, Sachi Nand; Misra, Amit; Kar, Abhishek; Graham, Eric; Singh, Lokendra; Bahadur, Ranjit; Ramanathan, V

    2015-04-07

    Deployment of improved biomass burning cookstoves is recognized as a black carbon (BC) mitigation measure that has the potential to achieve health benefits and climate cobenefits. Yet, few field based studies document BC concentration reductions (and resulting human exposure) resulting from improved stove usage. In this paper, data are presented from 277 real-world cooking sessions collected during two field studies to document the impacts on indoor BC concentrations inside village kitchens as a result of switching from traditional stoves to improved forced draft (FD) stoves. Data collection utilized new low-cost cellphone methods to monitor BC, cooking duration, and fuel consumption. A cross sectional study recorded a reduction of 36% in BC during cooking sessions. An independent paired sample study demonstrated a statistically significant reduction of 40% in 24 h BC concentrations when traditional stoves were replaced with FD stoves. Reductions observed in these field studies differ from emission factor reductions (up to 99%) observed under controlled conditions in laboratory studies. Other nonstove sources (e.g., kerosene lamps, ambient concentrations) likely offset the reductions. Health exposure studies should utilize reductions determined by field measurements inside village kitchens, in conjunction with laboratory data, to assess the health impacts of new cooking technologies.

  20. Nonlinear dimensionality reduction methods for synthetic biology biobricks' visualization.

    PubMed

    Yang, Jiaoyun; Wang, Haipeng; Ding, Huitong; An, Ning; Alterovitz, Gil

    2017-01-19

    Visualizing data by dimensionality reduction is an important strategy in Bioinformatics, which could help to discover hidden data properties and detect data quality issues, e.g. data noise, inappropriately labeled data, etc. As crowdsourcing-based synthetic biology databases face similar data quality issues, we propose to visualize biobricks to tackle them. However, existing dimensionality reduction methods could not be directly applied to biobricks datasets. Hereby, we use normalized edit distance to enhance dimensionality reduction methods, including Isomap and Laplacian Eigenmaps. By extracting biobricks from the synthetic biology database Registry of Standard Biological Parts, six combinations of various types of biobricks are tested. The visualization graphs illustrate discriminated biobricks and inappropriately labeled biobricks. The clustering algorithm K-means is adopted to quantify the reduction results. The average clustering accuracies for Isomap and Laplacian Eigenmaps are 0.857 and 0.844, respectively. Besides, Laplacian Eigenmaps is 5 times faster than Isomap, and its visualization graph is more concentrated to discriminate biobricks. By combining normalized edit distance with Isomap and Laplacian Eigenmaps, synthetic biology biobricks are successfully visualized in two-dimensional space. Various types of biobricks could be discriminated and inappropriately labeled biobricks could be determined, which could help to assess crowdsourcing-based synthetic biology databases' quality, and make biobricks selection.
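
    The distance-then-embed approach can be sketched end to end: a normalized edit distance feeds a small Laplacian Eigenmaps implementation. The affinity choice `1 - d` below is an assumption (the abstract does not specify the kernel), and the sequences are toy stand-ins for biobricks:

```python
import numpy as np

def norm_edit(a, b):
    """Levenshtein distance divided by the longer length, in [0, 1]."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cur[j] = min(prev[j] + 1,                    # deletion
                         cur[j - 1] + 1,                 # insertion
                         prev[j - 1] + (a[i - 1] != b[j - 1]))  # substitution
        prev = cur
    return prev[n] / max(m, n, 1)

def laplacian_eigenmap(seqs, dim=2):
    """Embed sequences via Laplacian Eigenmaps on an edit-distance affinity."""
    n = len(seqs)
    D = np.array([[norm_edit(a, b) for b in seqs] for a in seqs])
    W = 1.0 - D                        # similarity from normalized distance
    L = np.diag(W.sum(axis=1)) - W     # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:dim + 1]          # skip the trivial constant eigenvector

seqs = ["atgcatgc", "atgcatga", "ttttcccc", "ttttccca"]
print(laplacian_eigenmap(seqs, dim=1))   # the two similar pairs separate
```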

  1. ANALYSIS AND REDUCTION OF LANDSAT DATA FOR USE IN A HIGH PLAINS GROUND-WATER FLOW MODEL.

    USGS Publications Warehouse

    Thelin, Gail; Gaydas, Leonard; Donovan, Walter; Mladinich, Carol

    1984-01-01

    Data obtained from 59 Landsat scenes were used to estimate the areal extent of irrigated agriculture over the High Plains region of the United States for a ground-water flow model. This model provides information on current trends in the amount and distribution of water used for irrigation. The analysis and reduction process required that each Landsat scene be ratioed, interpreted, and aggregated. Data reduction by aggregation was an efficient technique for handling the volume of data analyzed. This process bypassed problems inherent in geometrically correcting and mosaicking the data at pixel resolution and combined the individual Landsat classification into one comprehensive data set.
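
    The aggregation step can be illustrated as majority-vote downsampling of a classified raster; the block size and voting rule here are assumptions for illustration, not the study's exact procedure:

```python
import numpy as np
from collections import Counter

def aggregate(classified, block=4):
    """Aggregate a classified raster by majority vote in block x block
    windows, cutting the data volume by a factor of block**2."""
    h, w = classified.shape
    h, w = h - h % block, w - w % block          # drop ragged edges
    out = np.empty((h // block, w // block), dtype=classified.dtype)
    for i in range(0, h, block):
        for j in range(0, w, block):
            win = classified[i:i + block, j:j + block].ravel()
            out[i // block, j // block] = Counter(win.tolist()).most_common(1)[0][0]
    return out

scene = np.ones((8, 8), dtype=int)               # 1 = irrigated class
scene[0, 0] = 0                                  # one stray non-irrigated pixel
print(aggregate(scene).shape)   # (2, 2)
```

    Working on aggregated blocks rather than individual pixels is what let the study sidestep pixel-level geometric correction and mosaicking across 59 scenes.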

  2. Parallel, Real-Time and Pipeline Data Reduction for the ROVER Sub-mm Heterodyne Polarimeter on the JCMT with ACSIS and ORAC-DR

    NASA Astrophysics Data System (ADS)

    Leech, J.; Dewitt, S.; Jenness, T.; Greaves, J.; Lightfoot, J. F.

    2005-12-01

    ROVER is a rotating waveplate polarimeter for use with (sub)mm heterodyne instruments, particularly the 16-element focal plane Heterodyne Array Receiver HARP (Smit 2003), due for commissioning on the JCMT in 2004. The ROVER/HARP back-end will be a digital auto-correlation spectrometer, known as ACSIS, designed specifically for the demanding data volumes from the HARP array receiver. ACSIS is being developed by DRAO, Penticton and UKATC. This paper will describe the data reduction of ROVER polarimetry data both in real-time by ACSIS-DR, and through the ORAC-DR data reduction pipeline.

  3. Waste Reduction Model (WARM) Material Descriptions and Data Sources

    EPA Pesticide Factsheets

    This page provides a summary of the materials included in EPA’s Waste Reduction Model (WARM). The page includes a list of materials, a description of the material as defined in the primary data source, and citations for primary data sources.

  4. Towards the automated reduction and calibration of SCUBA data from the James Clerk Maxwell Telescope

    NASA Astrophysics Data System (ADS)

    Jenness, T.; Stevens, J. A.; Archibald, E. N.; Economou, F.; Jessop, N. E.; Robson, E. I.

    2002-10-01

    The Submillimetre Common User Bolometer Array (SCUBA) instrument has been operating on the James Clerk Maxwell Telescope (JCMT) since 1997. The data archive is now sufficiently large that it can be used to investigate instrumental properties and the variability of astronomical sources. This paper describes the automated calibration and reduction scheme used to process the archive data, with particular emphasis on `jiggle-map' observations of compact sources. We demonstrate the validity of our automated approach at both 850 and 450 μm, and apply it to several of the JCMT secondary flux calibrators. We determine light curves for the variable sources IRC +10216 and OH 231.8. This automation is made possible by using the ORAC-DR data reduction pipeline, a flexible and extensible data reduction pipeline that is used on the United Kingdom Infrared Telescope (UKIRT) and the JCMT.

  5. A Hybrid Data Compression Scheme for Power Reduction in Wireless Sensors for IoT.

    PubMed

    Deepu, Chacko John; Heng, Chun-Huat; Lian, Yong

    2017-04-01

    This paper presents a novel data compression and transmission scheme for power reduction in Internet-of-Things (IoT) enabled wireless sensors. In the proposed scheme, data is compressed with both lossy and lossless techniques, so as to enable a hybrid transmission mode, support adaptive data rate selection and save power in wireless transmission. Applying the method to electrocardiogram (ECG) data, the data is first compressed using a lossy compression technique with a high compression ratio (CR). The residual error between the original data and the decompressed lossy data is preserved using entropy coding, enabling a lossless restoration of the original data when required. Average CRs of 2.1× and 7.8× were achieved for lossless and lossy compression, respectively, with the MIT/BIH database. The power reduction is demonstrated using a Bluetooth transceiver: transmission power is reduced to 18% for lossy and 53% for lossless transmission. Options for hybrid transmission mode, adaptive rate selection and system-level power reduction make the proposed scheme attractive for IoT wireless sensors in healthcare applications.
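The lossy-plus-residual idea behind the hybrid mode can be sketched as follows. This is a hedged illustration, not the paper's ECG coder: uniform quantization stands in for the lossy stage, and the residual stream (which entropy coding would compress in the lossless mode) exactly restores the original when added back.

```python
def lossy_compress(samples, step):
    # Stand-in lossy stage: uniform quantization. The paper's actual
    # lossy ECG compressor is more sophisticated.
    return [round(s / step) for s in samples]

def lossy_decompress(codes, step):
    return [c * step for c in codes]

def hybrid_encode(samples, step):
    codes = lossy_compress(samples, step)
    approx = lossy_decompress(codes, step)
    # Residual between original and lossy reconstruction; entropy
    # coding this small-magnitude stream yields the lossless mode.
    residual = [s - a for s, a in zip(samples, approx)]
    return codes, residual

def lossless_decode(codes, residual, step):
    approx = lossy_decompress(codes, step)
    return [a + r for a, r in zip(approx, residual)]

# Toy integer ECG samples.
ecg = [0, 3, 7, 12, 9, 4, 1, -2]
codes, residual = hybrid_encode(ecg, step=4)
restored = lossless_decode(codes, residual, step=4)
```

In lossy mode only `codes` are transmitted; sending `residual` as well makes the restoration exact.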

  6. A Standard for Command, Control, Communications and Computers (C4) Test Data Representation to Integrate with High-Performance Data Reduction

    DTIC Science & Technology

    2015-06-01

    (Only fragments of this report were extracted.) The report concerns data reduction for analysis of Command, Control, Communications, and Computer (C4) network tests; prior handling of events was ad hoc and problematic due to time constraints and changing requirements, and determining errors in context and heuristics required expertise.

  7. Design and Implementation of Data Reduction Pipelines for the Keck Observatory Archive

    NASA Astrophysics Data System (ADS)

    Gelino, C. R.; Berriman, G. B.; Kong, M.; Laity, A. C.; Swain, M. A.; Campbell, R.; Goodrich, R. W.; Holt, J.; Lyke, J.; Mader, J. A.; Tran, H. D.; Barlow, T.

    2015-09-01

    The Keck Observatory Archive (KOA), a collaboration between the NASA Exoplanet Science Institute and the W. M. Keck Observatory, serves science and calibration data for all active and inactive instruments from the twin Keck Telescopes located near the summit of Mauna Kea, Hawaii. In addition to the raw data, we produce and provide quick-look reduced data for four instruments (HIRES, LWS, NIRC2, and OSIRIS) so that KOA users can more easily assess the scientific content and the quality of the data, which can often be difficult with raw data. The reduced products derive from both publicly available data reduction packages (when available) and KOA-created reduction scripts. The automation of publicly available data reduction packages has the benefit of providing a good-quality product without the additional time and expense of creating a new reduction package, and is easily applied to bulk processing needs. The downside is that the pipeline is not always able to create an ideal product, particularly for spectra, because the processing options for one type of target (e.g., point sources) may not be appropriate for other types of targets (e.g., extended galaxies and nebulae). In this poster we present the design and implementation of the current pipelines used at KOA and discuss our strategies for handling data for which the nature of the targets and the observers' scientific goals and data-taking procedures are unknown. We also discuss our plans for implementing automated pipelines for the remaining six instruments.

  8. UniPOPS: Unified data reduction suite

    NASA Astrophysics Data System (ADS)

    Maddalena, Ronald J.; Garwood, Robert W.; Salter, Christopher J.; Stobie, Elizabeth B.; Cram, Thomas R.; Morgan, Lorrie; Vance, Bob; Hudson, Jerome

    2015-03-01

    UniPOPS, a suite of programs and utilities developed at the National Radio Astronomy Observatory (NRAO), reduced data from the observatory's single-dish telescopes: the Tucson 12-m, the Green Bank 140-ft, and archived data from the Green Bank 300-ft. The primary reduction programs, 'line' (for spectral-line reduction) and 'condar' (for continuum reduction), used the People-Oriented Parsing Service (POPS) as the command line interpreter. UniPOPS unified previous analysis packages and provided new capabilities; development of UniPOPS continued within the NRAO until 2004 when the 12-m was turned over to the Arizona Radio Observatory (ARO). The submitted code is version 3.5 from 2004, the last supported by the NRAO.

  9. nanopipe: Calibration and data reduction pipeline for pulsar timing

    NASA Astrophysics Data System (ADS)

    Demorest, Paul B.

    2018-03-01

    nanopipe is a data reduction pipeline for calibration, RFI removal, and pulse time-of-arrival measurement from radio pulsar data. It was developed primarily for use by the NANOGrav project. nanopipe is written in Python, and depends on the PSRCHIVE (ascl:1105.014) library.

  10. XRP -- SMM XRP Data Analysis & Reduction

    NASA Astrophysics Data System (ADS)

    McSherry, M.; Lawden, M. D.

    This manual describes the various programs that are available for the reduction and analysis of XRP data. These programs have been developed under the VAX operating system. The original programs are resident on a VaxStation 3100 at the Solar Data Analysis Center (NASA/GSFC Greenbelt MD).

  11. User's guide for ALEX: uncertainty propagation from raw data to final results for ORELA transmission measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, N.M.

    1984-02-01

    This report describes a computer code (ALEX) developed to assist in AnaLysis of EXperimental data at the Oak Ridge Electron Linear Accelerator (ORELA). Reduction of data from raw numbers (counts per channel) to physically meaningful quantities (such as cross sections) is in itself a complicated procedure; propagation of experimental uncertainties through that reduction procedure has in the past been viewed as even more difficult, if not impossible. The purpose of the code ALEX is to correctly propagate all experimental uncertainties through the entire reduction procedure, yielding the complete covariance matrix for the reduced data, while requiring little additional input from the experimentalist beyond that which is required for the data reduction itself. This report describes ALEX in detail, with special attention given to the case of transmission measurements (the code itself is applicable, with few changes, to any type of data). Application to the natural iron measurements of D.C. Larson et al. is described in some detail.
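The kind of covariance propagation a code like ALEX performs can be illustrated by the first-order "sandwich" formula C_y = J C_x Jᵀ, where J is the Jacobian of the reduction. The example below is a hypothetical reduction (a simple transmission T = (S − B) / M from sample counts S, background B and monitor counts M), not ALEX's actual algorithm:

```python
def mat_mul(A, B):
    # Plain list-of-lists matrix product.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def propagate(J, Cx):
    # First-order (sandwich) uncertainty propagation: C_y = J Cx J^T.
    return mat_mul(mat_mul(J, Cx), transpose(J))

# Hypothetical reduction: T = (S - B) / M, evaluated at the point
# S=1000, B=100, M=2000 counts.
S, B, M = 1000.0, 100.0, 2000.0
J = [[1 / M, -1 / M, -(S - B) / M**2]]   # dT/dS, dT/dB, dT/dM
# Independent counting uncertainties -> diagonal input covariance.
Cx = [[S, 0, 0], [0, B, 0], [0, 0, M]]
Cy = propagate(J, Cx)   # 1x1 covariance of the reduced transmission
```

With more reduction steps the Jacobians chain together, which is why off-diagonal covariance terms appear in the reduced data even when the raw channels are independent.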

  12. A reduction package for cross-dispersed echelle spectrograph data in IDL

    NASA Astrophysics Data System (ADS)

    Hall, Jeffrey C.; Neff, James E.

    1992-12-01

    We have written in IDL a data reduction package that performs reduction and extraction of cross-dispersed echelle spectrograph data. The present package includes a complete set of tools for extracting data from any number of spectral orders with arbitrary tilt and curvature. Essential elements include debiasing and flatfielding of the raw CCD image, removal of scattered light background, either nonoptimal or optimal extraction of data, and wavelength calibration and continuum normalization of the extracted orders. A growing set of support routines permits examination of the frame being processed to provide continuing checks on the statistical properties of the data and on the accuracy of the extraction. We will display some sample reductions and discuss the algorithms used. The inherent simplicity and user-friendliness of the IDL interface make this package a useful tool for spectroscopists. We will provide an email distribution list for those interested in receiving the package, and further documentation will be distributed at the meeting.
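The debias/flatfield/extract sequence named above can be sketched on a toy 2-D frame. This is a simplified stand-in (in Python rather than IDL, with made-up values): real echelle reduction must also trace tilted, curved orders, remove scattered light, and support optimal extraction.

```python
def reduce_order(frame, bias, flat):
    # Debias and flatfield each pixel, then extract the order by
    # summing along the spatial (cross-dispersion) direction.
    corrected = [[(pix - bias) / f for pix, f in zip(row, frow)]
                 for row, frow in zip(frame, flat)]
    ncols = len(frame[0])
    spectrum = [sum(corrected[r][c] for r in range(len(frame)))
                for c in range(ncols)]
    return spectrum

# Toy 3-row spectral order: bias level 10 counts, uniform flat of 2.0.
frame = [[12, 14, 16],
         [20, 24, 28],
         [12, 14, 16]]
flat = [[2.0] * 3 for _ in range(3)]
spec = reduce_order(frame, bias=10, flat=flat)
```

Wavelength calibration and continuum normalization would then operate on the 1-D `spec` array.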

  13. A Practical Platform for Blood Biomarker Study by Using Global Gene Expression Profiling of Peripheral Whole Blood

    PubMed Central

    Schmid, Patrick; Yao, Hui; Galdzicki, Michal; Berger, Bonnie; Wu, Erxi; Kohane, Isaac S.

    2009-01-01

    Background Although microarray technology has become the most common method for studying global gene expression, a plethora of technical factors throughout the experiment contribute to the variability of gene expression profiling using peripheral whole blood. A practical platform needs to be established in order to obtain reliable and reproducible data that meet clinical requirements for biomarker study. Methods and Findings We applied globin reduction to peripheral whole blood samples and performed genome-wide transcriptome analysis using Illumina BeadChips. Real-time PCR was subsequently used to evaluate the quality of the array data and to elucidate the mode in which hemoglobin interferes with gene expression profiling. We demonstrated that, when applied in the context of standard microarray processing procedures, globin reduction results in a consistent and significant increase in the quality of beadarray data. When compared to their pre-globin reduction counterparts, post-globin reduction samples show improved detection statistics, lowered variance and increased sensitivity. More importantly, gender gene separation is remarkably clearer in post-globin reduction samples than in pre-globin reduction samples. Our study suggests that the poor data obtained from pre-globin reduction samples result from the high concentration of hemoglobin derived from red blood cells either interfering with target mRNA binding or contributing a spurious background binding signal. Conclusion We therefore recommend the combination of globin mRNA reduction in peripheral whole blood samples and hybridization on Illumina BeadChips as a practical approach for biomarker study. PMID:19381341

  14. Air Traffic Control Experimentation and Evaluation with the NASA ATS-6 Satellite : Volume 4. Data Reduction and Analysis Software.

    DOT National Transportation Integrated Search

    1976-09-01

    Software used for the reduction and analysis of the multipath prober, modem evaluation (voice, digital data, and ranging), and antenna evaluation data acquired during the ATS-6 field test program is described. Multipath algorithms include reformattin...

  15. Cure-WISE: HETDEX data reduction with Astro-WISE

    NASA Astrophysics Data System (ADS)

    Snigula, J. M.; Cornell, M. E.; Drory, N.; Fabricius, Max.; Landriau, M.; Hill, G. J.; Gebhardt, K.

    2012-09-01

    The Hobby-Eberly Telescope Dark Energy Experiment (HETDEX) is a blind spectroscopic survey to map the evolution of dark energy using Lyman-alpha emitting galaxies at redshifts 1.9 < z < 3.5 as tracers. The survey instrument, VIRUS, consists of 75 IFUs distributed across the 22-arcmin field of the upgraded 9.2-m HET. Each exposure gathers 33,600 spectra. Over the projected five year run of the survey we expect about 170 GB of data per night. For the data reduction we developed the Cure pipeline. Cure is designed to automatically find and calibrate the observed spectra, subtract the sky background, and detect and classify different types of sources. Cure employs rigorous statistical methods and complete pixel-level error propagation throughout the reduction process to ensure Poisson-limited performance and meaningful significance values. To automate the reduction of the whole dataset we implemented the Cure pipeline in the Astro-WISE framework. This integration provides for HETDEX a database backend with complete dependency tracking of the various reduction steps, automated checks, and a searchable interface to the detected sources and user management. It can be used to create various web interfaces for data access and quality control. Astro-WISE allows us to reduce the data from all the IFUs in parallel on a compute cluster. This cluster allows us to reduce the observed data in quasi real time and still have excess capacity for rerunning parts of the reduction. Finally, the Astro-WISE interface will be used to provide access to reduced data products to the general community.

  16. Uncertainty propagation from raw data to final results. [ALEX

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, N.M.

    1985-01-01

    Reduction of data from raw numbers (counts per channel) to physically meaningful quantities (such as cross sections) is in itself a complicated procedure. Propagation of experimental uncertainties through that reduction process has sometimes been perceived as even more difficult, if not impossible. At the Oak Ridge Electron Linear Accelerator, a computer code ALEX has been developed to assist in the propagation process. The purpose of ALEX is to carefully and correctly propagate all experimental uncertainties through the entire reduction procedure, yielding the complete covariance matrix for the reduced data, while requiring little additional input from the experimentalist beyond that which is needed for the data reduction itself. The theoretical method used in ALEX is described, with emphasis on transmission measurements. Application to the natural iron and natural nickel measurements of D.C. Larson is shown.

  17. 48 CFR 52.215-10 - Price Reduction for Defective Certified Cost or Pricing Data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Defective Certified Cost or Pricing Data. 52.215-10 Section 52.215-10 Federal Acquisition Regulations System... Text of Provisions and Clauses 52.215-10 Price Reduction for Defective Certified Cost or Pricing Data... or Pricing Data (OCT 2010) (a) If any price, including profit or fee, negotiated in connection with...

  18. Patterns of shading tolerance determined from experimental ...

    EPA Pesticide Factsheets

    An extensive review of the experimental literature on seagrass shading evaluated the relationship between experimental light reductions, duration of experiment and seagrass response metrics to determine whether there were consistent statistical patterns. There were highly significant linear relationships of both percent biomass and percent shoot density reduction versus percent light reduction (versus controls), although unexplained variation in the data was high. Duration of exposure affected extent of response for both metrics, but was more clearly a factor in biomass response. Both biomass and shoot density showed linear responses to duration of light reduction for treatments 60%. Unexplained variation was again high, and greater for shoot density than biomass. With few exceptions, regressions of both biomass and shoot density on light reduction for individual species and for genera were statistically significant, but also tended to show high degrees of variability in data. Multivariate regressions that included both percent light reduction and duration of reduction as predictor variables increased the percentage of variation explained in almost every case. Analysis of response data by seagrass life history category (Colonizing, Opportunistic, Persistent) did not yield clearly separate response relationships in most cases. Biomass tended to show somewhat less variation in response to light reduction than shoot density, and of the two, may be the preferred metric.

  19. Reduction of Poisson noise in measured time-resolved data for time-domain diffuse optical tomography.

    PubMed

    Okawa, S; Endo, Y; Hoshi, Y; Yamada, Y

    2012-01-01

    A method to reduce noise for time-domain diffuse optical tomography (DOT) is proposed. Poisson noise which contaminates time-resolved photon counting data is reduced by use of maximum a posteriori estimation. The noise-free data are modeled as a Markov random process, and the measured time-resolved data are assumed to be Poisson distributed random variables. The posterior probability of the occurrence of the noise-free data is formulated. By maximizing the probability, the noise-free data are estimated, and the Poisson noise is reduced as a result. The performance of the Poisson noise reduction is demonstrated in experiments on image reconstruction for time-domain DOT. In simulations, the proposed method reduces the relative error between the noise-free and noisy data to about one thirtieth, and the reconstructed DOT image is smoothed by the proposed noise reduction. The variance of the reconstructed absorption coefficients decreased by 22% in a phantom experiment. The quality of DOT, which can be applied to breast cancer screening etc., is improved by the proposed noise reduction.
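The MAP idea can be sketched as a penalized Poisson likelihood. This is a hedged simplification of the paper's approach: the negative Poisson log-likelihood plus a quadratic smoothness penalty (a simple Markov-style prior) is minimized by projected gradient descent, whereas the paper's prior and optimizer differ in detail.

```python
def map_denoise(y, beta=0.5, lr=0.02, iters=5000):
    # Minimize  sum_i (lam_i - y_i * log lam_i)
    #         + beta * sum_i (lam_i - lam_{i-1})^2
    # over positive rates lam, via projected gradient descent.
    lam = [max(v, 1.0) for v in y]
    n = len(y)
    for _ in range(iters):
        grad = []
        for i in range(n):
            g = 1.0 - y[i] / lam[i]          # likelihood term
            if i > 0:                         # smoothness prior terms
                g += 2.0 * beta * (lam[i] - lam[i - 1])
            if i < n - 1:
                g += 2.0 * beta * (lam[i] - lam[i + 1])
            grad.append(g)
        lam = [max(l - lr * g, 1e-6) for l, g in zip(lam, grad)]
    return lam

def roughness(x):
    # Sum of squared first differences: a simple smoothness measure.
    return sum((a - b) ** 2 for a, b in zip(x[1:], x[:-1]))

# Fixed toy photon-count trace with Poisson-like scatter.
counts = [4, 9, 5, 12, 8, 15, 9, 13, 7, 11]
smooth = map_denoise(counts)
```

Raising `beta` strengthens the prior and yields smoother estimated rates, at the cost of fidelity to the counts.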

  20. Data reduction complex analog-to-digital data processing requirements for onsite test facilities

    NASA Technical Reports Server (NTRS)

    Debbrecht, J. D.

    1976-01-01

    The analog to digital processing requirements of onsite test facilities are described. The source and medium of all input data to the Data Reduction Complex (DRC) and the destination and medium of all output products of the analog-to-digital processing are identified. Additionally, preliminary input and output data formats are presented along with the planned use of the output products.

  1. Consequences of data reduction in the FIA database: a case study with southern yellow pine

    Treesearch

    Anita K. Rose; James F. Rosson Jr.; Helen Beresford

    2015-01-01

    The Forest Inventory and Analysis Program strives to make its data publicly available in a format that is easy to use and understand most commonly accessed through online tools such as EVALIDator and Forest Inventory Data Online. This requires a certain amount of data reduction. Using a common data request concerning the resource of southern yellow pine (SYP), we...

  2. Target oriented dimensionality reduction of hyperspectral data by Kernel Fukunaga-Koontz Transform

    NASA Astrophysics Data System (ADS)

    Binol, Hamidullah; Ochilov, Shuhrat; Alam, Mohammad S.; Bal, Abdullah

    2017-02-01

    Principal component analysis (PCA) is a popular technique in remote sensing for dimensionality reduction. While PCA is suitable for data compression, it is not necessarily an optimal technique for feature extraction, particularly when the features are exploited in supervised learning applications (Cheriyadat and Bruce, 2003) [1]. Preserving features belonging to the target is very crucial to the performance of target detection/recognition techniques. The Fukunaga-Koontz Transform (FKT) based supervised band reduction technique can be used to meet this requirement. FKT achieves feature selection by transforming into a new space in which the feature classes have complementary eigenvectors. Analysis of these eigenvectors under two classes, target and background clutter, can be utilized for target-oriented band reduction, since the leading basis functions best represent the target class while carrying the least information about the background class. By selecting the few eigenvectors most relevant to the target class, the dimension of hyperspectral data can be reduced, which presents significant advantages for near real-time target detection applications. The nonlinear properties of the data can be extracted by a kernel approach, which provides better target features. Thus, we propose constructing a kernel FKT (KFKT) for target-oriented band reduction. The performance of the proposed KFKT based target-oriented dimensionality reduction algorithm has been tested on two real-world hyperspectral datasets, and the results are reported.

  3. Spectral Data Reduction via Wavelet Decomposition

    NASA Technical Reports Server (NTRS)

    Kaewpijit, S.; LeMoigne, J.; El-Ghazawi, T.; Rood, Richard (Technical Monitor)

    2002-01-01

    The greatest advantage gained from hyperspectral imagery is that narrow spectral features can be used to give more information about materials than was previously possible with broad-band multispectral imagery. For many applications, however, the new larger data volumes from such hyperspectral sensors present a challenge for traditional processing techniques. For example, the actual identification of each ground surface pixel by its corresponding reflecting spectral signature is still one of the most difficult challenges in the exploitation of this advanced technology, because of the immense volume of data collected. Therefore, conventional classification methods require a preprocessing step of dimension reduction to conquer the so-called "curse of dimensionality." Spectral data reduction using wavelet decomposition could be useful, as it not only reduces the data volume but also preserves the distinctions between spectral signatures. This characteristic is related to the intrinsic property of wavelet transforms that preserves high- and low-frequency features during the signal decomposition, therefore preserving peaks and valleys found in typical spectra. Compared with the most widespread dimension reduction technique, Principal Component Analysis (PCA), at the same compression rate, we show that Wavelet Reduction yields better classification accuracy for hyperspectral data processed with a conventional supervised classification such as a maximum likelihood method.
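One level of the simplest wavelet, the Haar transform, illustrates the reduction idea (a minimal sketch, not the paper's full multi-level scheme): keeping only the approximation coefficients halves the data volume while preserving the broad spectral shape, and the detail coefficients capture narrow features such as absorption lines.

```python
import math

def haar_level(signal):
    # One level of the Haar wavelet transform: pairwise averages
    # (approximation) and differences (detail), scaled to preserve
    # energy. Signal length is assumed even here.
    s = 1 / math.sqrt(2)
    approx = [(a + b) * s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    # Exact inverse of haar_level.
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out

# A toy 8-band "spectrum" with a narrow absorption feature.
spectrum = [10.0, 10.0, 10.0, 2.0, 2.0, 10.0, 10.0, 10.0]
approx, detail = haar_level(spectrum)
restored = haar_inverse(approx, detail)
```

Classifying on `approx` alone works at half the dimensionality; repeating the decomposition on `approx` gives further reduction levels.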

  4. Final Report for Geometric Analysis for Data Reduction and Structure Discovery DE-FG02-10ER25983, STRIPES award # DE-SC0004096

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vixie, Kevin R.

    This is the final report for the project "Geometric Analysis for Data Reduction and Structure Discovery" in which insights and tools from geometric analysis were developed and exploited for their potential to large scale data challenges.

  5. Alternative Fuels Data Center: Idle Reduction

    Science.gov Websites

    Maps and data on idle reduction from the Alternative Fuels Data Center, including Clean Cities annual petroleum savings, incentive and law additions by fuel/technology type, incentive additions by policy type, and case studies (e.g., Massachusetts).

  6. 48 CFR 52.215-11 - Price Reduction for Defective Certified Cost or Pricing Data-Modifications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... accordingly and the contract shall be modified to reflect the reduction. This right to a price reduction is... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Price Reduction for... CONTRACT CLAUSES Text of Provisions and Clauses 52.215-11 Price Reduction for Defective Certified Cost or...

  7. 78 FR 57293 - Medicaid Program; State Disproportionate Share Hospital Allotment Reductions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-18

    ... reductions are prospective, not retrospective. Comment: One commenter requested clarification on how the... establish prospective DSH allotment reductions adjustments that rely on final or completed data from previous years. Response: The final rule establishes prospective DSH allotment reductions based on the most...

  8. Real-time data reduction capabilities at the Langley 7 by 10 foot high speed tunnel

    NASA Technical Reports Server (NTRS)

    Fox, C. H., Jr.

    1980-01-01

    The 7 by 10 foot high speed tunnel performs a wide range of tests employing a variety of model installation methods. To support the reduction of static data from this facility, a generalized wind tunnel data reduction program had been developed for use on the Langley central computer complex. The capabilities of a version of this generalized program adapted for real time use on a dedicated on-site computer are discussed. The input specifications, instructions for the console operator, and full descriptions of the algorithms are included.

  9. Fast Reduction Method in Dominance-Based Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yan; Zhou, Qinghua; Wen, Yongchuan

    2018-01-01

    In real world applications, there are often some data with continuous values or preference-ordered values. Rough sets based on dominance relations can effectively deal with these kinds of data. Attribute reduction can be done in the framework of dominance-relation based approach to better extract decision rules. However, the computational cost of the dominance classes greatly affects the efficiency of attribute reduction and rule extraction. This paper presents an efficient method of computing dominance classes, and further compares it with traditional method with increasing attributes and samples. Experiments on UCI data sets show that the proposed algorithm obviously improves the efficiency of the traditional method, especially for large-scale data.
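The dominance classes whose computation the paper accelerates can be defined in a few lines. This is a naive baseline sketch (not the paper's fast algorithm): object x dominates y when x is at least as good as y on every preference-ordered attribute, here with larger values taken as better.

```python
def dominates(x, y):
    # x dominates y when x is at least as good as y on every
    # preference-ordered attribute (larger value = better here).
    return all(a >= b for a, b in zip(x, y))

def dominating_class(objects, i):
    # D+(i): indices of objects that dominate object i.
    return {j for j, x in enumerate(objects) if dominates(x, objects[i])}

def dominated_class(objects, i):
    # D-(i): indices of objects dominated by object i.
    return {j for j, x in enumerate(objects) if dominates(objects[i], x)}

# Toy decision table: rows are objects, columns are criteria scores.
table = [(3, 2), (2, 2), (1, 1), (3, 3)]
dplus = [dominating_class(table, i) for i in range(len(table))]
```

Attribute reduction then asks which criteria can be dropped without changing these classes; the naive computation above is O(n²) per attribute set, which is what motivates the paper's faster method.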

  10. ORAC-DR: Astronomy data reduction pipeline

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Economou, Frossie; Cavanagh, Brad; Currie, Malcolm J.; Gibb, Andy

    2013-10-01

    ORAC-DR is a generic data reduction pipeline infrastructure; it includes specific data processing recipes for a number of instruments. It is used at the James Clerk Maxwell Telescope, United Kingdom Infrared Telescope, AAT, and LCOGT. This pipeline runs at the JCMT Science Archive hosted by CADC to generate near-publication quality data products; the code has been in use since 1998.

  11. Knowledge-Based Decision Support in Department of Defense Acquisitions

    DTIC Science & Technology

    2010-09-01

    (Fragmented DTIC extraction; recoverable content:) Data analysis followed the framework developed by Miles and Huberman (1994), which describes the major phases of data analysis as data reduction, data display, and conclusion drawing. Survey data were obtained from SAF/ACPO (Air Force Acquisition Chief ...); respondents were of rank O-6/GS-15 or above. For data reduction and content analysis within the Miles and Huberman (1994) framework, the researcher used Microsoft ...

  12. PISCES High Contrast Integral Field Spectrograph Simulations and Data Reduction Pipeline

    NASA Technical Reports Server (NTRS)

    Llop Sayson, Jorge Domingo; Memarsadeghi, Nargess; McElwain, Michael W.; Gong, Qian; Perrin, Marshall; Brandt, Timothy; Grammer, Bryan; Greeley, Bradford; Hilton, George; Marx, Catherine

    2015-01-01

    The PISCES (Prototype Imaging Spectrograph for Coronagraphic Exoplanet Studies) is a lenslet array based integral field spectrograph (IFS) designed to advance the technology readiness of the WFIRST (Wide Field Infrared Survey Telescope)-AFTA (Astrophysics Focused Telescope Assets) high contrast Coronagraph Instrument. We present the end-to-end optical simulator and plans for the data reduction pipeline (DRP). The optical simulator was created with a combination of the IDL (Interactive Data Language)-based PROPER (optical propagation) library and Zemax (a MatLab script), while the data reduction pipeline is a modified version of the Gemini Planet Imager's (GPI) IDL pipeline. The simulations of the propagation of light through the instrument are based on Fourier transform algorithms. The DRP enables transformation of the PISCES IFS data to calibrated spectral data cubes.

  13. Cure-WISE: HETDEX Data Reduction with Astro-WISE

    NASA Astrophysics Data System (ADS)

    Snigula, J. M.; Drory, N.; Fabricius, M.; Landriau, M.; Montesano, F.; Hill, G. J.; Gebhardt, K.; Cornell, M. E.

    2014-05-01

    The Hobby-Eberly Telescope Dark Energy Experiment (HETDEX, Hill et al. 2012b) is a blind spectroscopic survey to map the evolution of dark energy using Lyman-alpha emitting galaxies at redshifts 1.9 < z < 3.5 as tracers. The survey will use an array of 75 integral field spectrographs called the Visible Integral field Replicable Unit (IFU) Spectrograph (VIRUS, Hill et al. 2012c). The 10m HET (Ramsey et al. 1998) is currently receiving a wide-field upgrade (Hill et al. 2012a) to accommodate the spectrographs and to provide the needed field of view. Over the projected five year run of the survey we expect to obtain approximately 170 GB of data each night. For the data reduction we developed the Cure pipeline, to automatically find and calibrate the observed spectra, subtract the sky background, and detect and classify different types of sources. Cure employs rigorous statistical methods and complete pixel-level error propagation throughout the reduction process to ensure Poisson-limited performance and meaningful significance values. To automate the reduction of the whole dataset we implemented the Cure pipeline in the Astro-WISE framework. This integration provides for HETDEX a database backend with complete dependency tracking of the various reduction steps, automated checks, and a searchable interface to the detected sources and user management. It can be used to create various web interfaces for data access and quality control. Astro-WISE allows us to reduce the data from all the IFUs in parallel on a compute cluster. This cluster allows us to reduce the observed data in quasi real time and still have excess capacity for rerunning parts of the reduction. Finally, the Astro-WISE interface will be used to provide access to reduced data products to the general community.

  14. ORAC-DR -- Programmer's Guide

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Economou, Frossie; Cavanagh, Brad

    ORAC-DR is a general purpose automatic data reduction pipeline environment. This document describes how to modify data reduction recipes and how to add new instruments. For a general overview of ORAC-DR see SUN/230. For specific information on how to reduce the data for a particular instrument, please consult the appropriate ORAC-DR instrument guide.

  15. An object-oriented data reduction system in Fortran

    NASA Technical Reports Server (NTRS)

    Bailey, J.

    1992-01-01

A data reduction system for the AAO two-degree field project is being developed using an object-oriented approach. Rather than use an object-oriented language (such as C++), the system is written in Fortran and makes extensive use of existing subroutine libraries provided by the UK Starlink project. Objects are created using the extensible N-dimensional Data Format (NDF), which itself is based on the Hierarchical Data System (HDS). The software consists of a class library, with each class corresponding to a Fortran subroutine with a standard calling sequence. The methods of the classes provide operations on NDF objects at a similar level of functionality to the applications of conventional data reduction systems. However, because they are provided as callable subroutines, they can be used as building blocks for more specialist applications. The class library is not dependent on a particular software environment, though it can be used effectively in ADAM applications. It can also be used from standalone Fortran programs. A graphical user interface will be developed for use with the class library to form the 2dF data reduction system.

  16. The Mechanisms of Oxygen Reduction in the Terminal Reducing Segment of the Chloroplast Photosynthetic Electron Transport Chain.

    PubMed

    Kozuleva, Marina A; Ivanov, Boris N

    2016-07-01

The review is dedicated to ascertainment of the roles of the electron transfer cofactors of the pigment-protein complex of PSI, ferredoxin (Fd) and ferredoxin-NADP reductase in oxygen reduction in the photosynthetic electron transport chain (PETC) in the light. The data regarding oxygen reduction in other segments of the PETC are briefly analyzed, and it is concluded that their participation in the overall process in the PETC under unstressful conditions should be insignificant. Data concerning the contribution of Fd to the oxygen reduction in the PETC are examined. A set of collateral evidence as well as results of direct measurements of the involvement of Fd in this process in the presence of isolated thylakoids led to the inference that this contribution in vivo is negligible. The increase in oxygen reduction rate in the isolated thylakoids in the presence of either Fd or Fd plus NADP+ under increasing light intensity was attributed to the increase in oxygen reduction executed by the membrane-bound oxygen reductants. Data are presented which imply that a main reductant of the O2 molecule in the terminal reducing segment of the PETC is the electron transfer cofactor of PSI, phylloquinone. The physiological significance of characteristic properties of oxygen reductants in this segment of the PETC is discussed. © The Author 2016. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  17. GUIs in the MIDAS environment

    NASA Technical Reports Server (NTRS)

    Ballester, P.

    1992-01-01

MIDAS (Munich Image Data Analysis System) is the image-processing system developed at ESO for astronomical data reduction. MIDAS is used for off-line data reduction at ESO and at many astronomical institutes all over Europe. In addition to a set of general commands for processing and analyzing images, catalogs, graphics, and tables, MIDAS includes specialized packages dedicated to astronomical applications or to specific ESO instruments. Several graphical interfaces are available in the MIDAS environment: XHelp provides an interactive help facility, and XLong and XEchelle enable data reduction of long-slit and echelle spectra. GUI builders facilitate the development of interfaces. All ESO interfaces comply with the ESO User Interfaces Common Conventions, which ensures an identical look and feel for telescope operations, data analysis, and archives.

  18. Effective dimension reduction for sparse functional data

    PubMed Central

    YAO, F.; LEI, E.; WU, Y.

    2015-01-01

    Summary We propose a method of effective dimension reduction for functional data, emphasizing the sparse design where one observes only a few noisy and irregular measurements for some or all of the subjects. The proposed method borrows strength across the entire sample and provides a way to characterize the effective dimension reduction space, via functional cumulative slicing. Our theoretical study reveals a bias-variance trade-off associated with the regularizing truncation and decaying structures of the predictor process and the effective dimension reduction space. A simulation study and an application illustrate the superior finite-sample performance of the method. PMID:26566293

  19. An Air Quality Data Analysis System for Interrelating Effects, Standards and Needed Source Reductions

    ERIC Educational Resources Information Center

    Larsen, Ralph I.

    1973-01-01

Makes recommendations for a single air quality data system (using averaging time) for interrelating air pollution effects, air quality standards, air quality monitoring, diffusion calculations, source-reduction calculations, and emission standards. (JR)

  20. Reduction and analysis of data collected during the electromagnetic tornado experiment

    NASA Technical Reports Server (NTRS)

    Davisson, L. D.

    1976-01-01

Techniques for data processing and analysis are described to support tornado detection by analysis of radio frequency interference in various frequency bands, and sea state determination from short pulse radar measurements. Activities include: strip chart recording of tornado data; the development and implementation of computer programs for digitization and analysis of the data; data reduction techniques for short pulse radar data; and the simulation of radar returns from the sea surface by computer models.

  1. A Preliminary Flight Investigation of Formation Flight for Drag Reduction on the C-17 Aircraft

    NASA Technical Reports Server (NTRS)

    Pahle, Joe; Berger, Dave; Venti, Michael W.; Faber, James J.; Duggan, Chris; Cardinal, Kyle

    2012-01-01

Many theoretical and experimental studies have shown that aircraft flying in formation could experience significant reductions in fuel use compared to solo flight. To date, formation flight for aerodynamic benefit has not been thoroughly explored in flight for large transport-class vehicles. This paper summarizes flight data gathered during several two-ship C-17 formation flights at a single flight condition of 275 knots, at 25,000 ft MSL. Stabilized test points were flown with the trail aircraft at 1,000 and 3,000 ft aft of the lead aircraft at selected crosstrack and vertical offset locations within the estimated area of influence of the vortex generated by the lead aircraft. Flight data recorded at test points within the vortex from the lead aircraft are compared to data recorded at tare flight test points outside of the influence of the vortex. Since drag was not measured directly, reductions in fuel flow and thrust for level flight are used as a proxy for drag reduction. Estimated thrust and measured fuel flow reductions were documented at several trail test point locations within the area of influence of the lead's vortex. The maximum average fuel flow reduction was approximately 7-8%, compared to the tare points flown before and after the test points. Although incomplete, the data suggest that regions with fuel flow and thrust reduction greater than 10% compared to the tare test points exist within the vortex area of influence.

  2. Historic Landslide Data Combined with Sentinel Satellite Data to Improve Modelling for Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Bye, B. L.; Kontoes, C.; Catarino, N.; De Lathouwer, B.; Concalves, P.; Meyer-Arnek, J.; Mueller, A.; Kraft, C.; Grosso, N.; Goor, E.; Voidrot, M. F.; Trypitsidis, A.

    2017-12-01

Landslides are geohazards that can result in disasters. Landslides vary enormously in their distribution in both space and time, and the surface deformation varies considerably from one type of instability to another. Individual ground instabilities may have a common trigger (extreme rainfall, earthquake), and therefore occur alongside many equivalent occurrences over a large area. This means that they can have a significant regional impact, demanding national and international disaster risk reduction strategies. Regional impacts require collaboration across borders, as reflected in the Sendai Framework for Disaster Risk Reduction (2015-2030). The data demands related to the Sustainable Development Goals (SDGs) are unprecedented, another factor that will require coordinated efforts at the global, regional and national levels. Data of good quality are vital for governments, international organizations, civil society, the private sector and the general public in order to make informed decisions, including for disaster risk reduction. The NextGEOSS project evolves the European vision of user-driven GEOSS data exploitation for innovation and business, relying on three main pillars: engaging communities of practice, delivering technological advancements, and advocating the use of GEOSS. These three pillars support the creation and deployment of Earth-observation-based innovative research activities and commercial services. In this presentation we explain how one of the ten NextGEOSS pilots, Disaster Risk Reduction (DRR), plans to provide an enhanced multi-hazard risk assessment framework based on statistical analysis of long time series of data. Landslide event monitoring and landslide susceptibility estimation will be emphasized. Workflows will be based on models developed in the context of the Copernicus Emergency Management Service. The data envisaged to be used are: radar SAR data; yearly ground deformation/velocities; a historic landslide inventory; data related to topographic, geological, hydrological and geomorphological settings; and ground observations from field trips. The development of the NextGEOSS pilots opens up interactions with international communities. Contributions from communities engaged in SDG activities and the implementation of the Sendai Framework for Disaster Risk Reduction are welcome.

  3. Subject order-independent group ICA (SOI-GICA) for functional MRI data analysis.

    PubMed

    Zhang, Han; Zuo, Xi-Nian; Ma, Shuang-Ye; Zang, Yu-Feng; Milham, Michael P; Zhu, Chao-Zhe

    2010-07-15

Independent component analysis (ICA) is a data-driven approach to study functional magnetic resonance imaging (fMRI) data. In particular, for group analysis of multiple subjects, temporal concatenation group ICA (TC-GICA) is intensively used. However, due to the usually limited computational capability, data reduction with principal component analysis (PCA: a standard preprocessing step of ICA decomposition) is difficult to achieve for a large dataset. To overcome this, TC-GICA employs multiple-stage PCA data reduction. Such multiple-stage PCA data reduction, however, leads to variable outputs due to different subject concatenation orders. Consequently, the ICA algorithm uses the variable multiple-stage PCA outputs and generates variable decompositions. In this study, a rigorous theoretical analysis was conducted to prove the existence of such variability. Simulated and real fMRI experiments were used to demonstrate the subject-order-induced variability of TC-GICA results using multiple PCA data reductions. To solve this problem, we propose a new subject order-independent group ICA (SOI-GICA). Both simulated and real fMRI data experiments demonstrated the high robustness and accuracy of the SOI-GICA results compared to those of traditional TC-GICA. Accordingly, we recommend SOI-GICA for group ICA-based fMRI studies, especially those with large data sets. Copyright 2010 Elsevier Inc. All rights reserved.
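The data-reduction stage at issue can be sketched in a few lines (a toy numpy example with invented shapes, not the authors' implementation): each subject's run is a timepoints-by-voxels matrix, subjects are stacked along the time axis, and PCA, computed here via SVD, reduces the concatenated temporal dimension before ICA decomposition.

```python
import numpy as np

rng = np.random.default_rng(0)
# Four toy "subjects", each 50 timepoints x 200 voxels (assumed shapes)
subjects = [rng.standard_normal((50, 200)) for _ in range(4)]

concatenated = np.vstack(subjects)                 # 200 timepoints x 200 voxels
centered = concatenated - concatenated.mean(axis=0)

# Single-stage PCA reduction: keep the top-k temporal components.
# When memory forbids this single stage, TC-GICA reduces subjects in
# stages, and that stage-wise output depends on concatenation order --
# the variability this paper analyzes and SOI-GICA removes.
k = 10
u, s, vt = np.linalg.svd(centered, full_matrices=False)
reduced = np.diag(s[:k]) @ vt[:k]                  # k pseudo-timepoints x voxels
```

The ICA step would then unmix `reduced` into spatial component maps; only the reduction stage is shown because that is where the order dependence arises.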

  4. Moving base Gravity Gradiometer Survey System (GGSS) program

    NASA Astrophysics Data System (ADS)

    Pfohl, Louis; Rusnak, Walter; Jircitano, Albert; Grierson, Andrew

    1988-04-01

    The GGSS program began in early 1983 with the objective of delivering a landmobile and airborne system capable of fast, accurate, and economical gravity gradient surveys of large areas anywhere in the world. The objective included the development and use of post-mission data reduction software to process the survey data into solutions for the gravity disturbance vector components (north, east and vertical). This document describes the GGSS equipment hardware and software, integration and lab test procedures and results, and airborne and land survey procedures and results. Included are discussions on test strategies, post-mission data reduction algorithms, and the data reduction processing experience. Perspectives and conclusions are drawn from the results.

  5. Reduction procedures for accurate analysis of MSX surveillance experiment data

    NASA Technical Reports Server (NTRS)

    Gaposchkin, E. Mike; Lane, Mark T.; Abbot, Rick I.

    1994-01-01

    Technical challenges of the Midcourse Space Experiment (MSX) science instruments require careful characterization and calibration of these sensors for analysis of surveillance experiment data. Procedures for reduction of Resident Space Object (RSO) detections will be presented which include refinement and calibration of the metric and radiometric (and photometric) data and calculation of a precise MSX ephemeris. Examples will be given which support the reduction, and these are taken from ground-test data similar in characteristics to the MSX sensors and from the IRAS satellite RSO detections. Examples to demonstrate the calculation of a precise ephemeris will be provided from satellites in similar orbits which are equipped with S-band transponders.

  6. Waste Reduction Model (WARM) Material Descriptions and ...

    EPA Pesticide Factsheets

    2017-02-14

    This page provides a summary of the materials included in EPA’s Waste Reduction Model (WARM). The page includes a list of materials, a description of the material as defined in the primary data source, and citations for primary data sources.

  7. 48 CFR 1615.407-1 - Rate reduction for defective pricing or defective cost or pricing data.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... defective pricing or defective cost or pricing data. 1615.407-1 Section 1615.407-1 Federal Acquisition... CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 1615.407-1 Rate reduction for defective pricing or defective cost or pricing data. The clause set forth in section 1652.215-70...

  8. 48 CFR 1615.407-1 - Rate reduction for defective pricing or defective cost or pricing data.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... defective pricing or defective cost or pricing data. 1615.407-1 Section 1615.407-1 Federal Acquisition... CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 1615.407-1 Rate reduction for defective pricing or defective cost or pricing data. The clause set forth in section 1652.215-70...

  9. 48 CFR 1615.407-1 - Rate reduction for defective pricing or defective cost or pricing data.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... defective pricing or defective cost or pricing data. 1615.407-1 Section 1615.407-1 Federal Acquisition... CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 1615.407-1 Rate reduction for defective pricing or defective cost or pricing data. The clause set forth in section 1652.215-70...

  10. 48 CFR 1615.407-1 - Rate reduction for defective pricing or defective cost or pricing data.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... defective pricing or defective cost or pricing data. 1615.407-1 Section 1615.407-1 Federal Acquisition... CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 1615.407-1 Rate reduction for defective pricing or defective cost or pricing data. The clause set forth in section 1652.215-70...

  11. Software Products for Temperature Data Reduction of Platinum Resistance Thermometers (PRT)

    NASA Technical Reports Server (NTRS)

    Sherrod, Jerry K.

    1998-01-01

The main objective of this project is to create user-friendly personal computer (PC) software for the reduction and analysis of platinum resistance thermometer (PRT) data. Software products were designed and created to help users of PRT data apply the Callendar-Van Dusen method. Sample runs are illustrated in this report.
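For reference, the Callendar-Van Dusen relation for T ≥ 0 °C is R(T) = R0(1 + AT + BT²), with the standard IEC 60751 coefficients. A minimal sketch of the conversion follows (function names are illustrative, not the report's software; the cubic C-term that applies below 0 °C is omitted):

```python
import math

# Standard IEC 60751 Callendar-Van Dusen coefficients (T >= 0 degC)
A = 3.9083e-3   # 1/degC
B = -5.775e-7   # 1/degC^2

def prt_resistance(t_degc, r0_ohm=100.0):
    """Forward relation R(T) = R0*(1 + A*T + B*T^2), valid for T >= 0 degC."""
    return r0_ohm * (1.0 + A * t_degc + B * t_degc**2)

def prt_temperature(resistance_ohm, r0_ohm=100.0):
    """Invert the quadratic for T in degC (T >= 0 branch)."""
    ratio = resistance_ohm / r0_ohm - 1.0
    return (-A + math.sqrt(A * A + 4.0 * B * ratio)) / (2.0 * B)
```

For a Pt100 (R0 = 100 Ω), `prt_resistance(100.0)` gives 138.5055 Ω, the standard table value at 100 °C, and `prt_temperature` recovers the temperature from it.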

  12. Effectiveness of Commercially Available Home Water Purification Systems for Removing Organic Contaminants.

    DTIC Science & Technology

    1986-09-01

Table-of-contents excerpt: Development of Data Regarding Removal Capabilities of Home Water Treatment Units ... 13; C. The Amway Water Treatment System ... 14; D. Update ... 10; 4. Range of Percentage Reduction for Specific Halogenated Organics ... 115; 5. Amway Data for Water-Soluble Organics ... 17; 6. Amway Data for Water-Insoluble Organics ... 21; 7. Percent Reduction Efficiencies ... 25.

  13. Residual acceleration data on IML-1: Development of a data reduction and dissemination plan

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.

    1993-01-01

    The research performed consisted of three stages: (1) identification of sensitive IML-1 experiments and sensitivity ranges by order of magnitude estimates, numerical modeling, and investigator input; (2) research and development towards reduction, supplementation, and dissemination of residual acceleration data; and (3) implementation of the plan on existing acceleration databases.

  14. 48 CFR 52.214-27 - Price Reduction for Defective Certified Cost or Pricing Data-Modifications-Sealed Bidding.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Price Reduction for... PROVISIONS AND CONTRACT CLAUSES Text of Provisions and Clauses 52.214-27 Price Reduction for Defective... following clause: Price Reduction for Defective Certified Cost or Pricing Data—Modifications—Sealed Bidding...

  15. Moab, Utah: Using Energy Data to Target Carbon Reductions from Building Energy Efficiency (City Energy: From Data to Decisions)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strategic Priorities and Impact Analysis Team, Office of Strategic Programs

    This fact sheet "Moab, Utah: Using Energy Data to Target Carbon Reductions from Building Energy Efficiency" explains how the City of Moab used data from the U.S. Department of Energy's Cities Leading through Energy Analysis and Planning (Cities-LEAP) and the State and Local Energy Data (SLED) programs to inform its city energy planning. It is one of ten fact sheets in the "City Energy: From Data to Decisions" series.

  16. Hypersonic research engine project. Phase 2: Aerothermodynamic Integration Model (AIM) data reduction computer program, data item no. 54.16

    NASA Technical Reports Server (NTRS)

    Gaede, A. E.; Platte, W. (Editor)

    1975-01-01

    The data reduction program used to analyze the performance of the Aerothermodynamic Integration Model is described. Routines to acquire, calibrate, and interpolate the test data, to calculate the axial components of the pressure area integrals and the skin function coefficients, and to report the raw data in engineering units are included along with routines to calculate flow conditions in the wind tunnel, inlet, combustor, and nozzle, and the overall engine performance. Various subroutines were modified and used to obtain species concentrations and transport properties in chemical equilibrium at each of the internal and external engine stations. It is recommended that future test plans include the configuration, calibration, and channel assignment data on a magnetic tape generated at the test site immediately before or after a test, and that the data reduction program be designed to operate in a batch environment.

  17. 48 CFR 52.215-11 - Price Reduction for Defective Certified Cost or Pricing Data-Modifications.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... affirmative action to bring the character of the data to the attention of the Contracting Officer. (iii) The... price reduction, the Contractor shall be liable to and shall pay the United States at the time such...

  18. 48 CFR 52.215-11 - Price Reduction for Defective Certified Cost or Pricing Data-Modifications.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... affirmative action to bring the character of the data to the attention of the Contracting Officer. (iii) The... price reduction, the Contractor shall be liable to and shall pay the United States at the time such...

  19. 48 CFR 52.215-11 - Price Reduction for Defective Certified Cost or Pricing Data-Modifications.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... affirmative action to bring the character of the data to the attention of the Contracting Officer. (iii) The... price reduction, the Contractor shall be liable to and shall pay the United States at the time such...

  20. National Emphysema Treatment Trial redux: accentuating the positive.

    PubMed

    Sanchez, Pablo Gerardo; Kucharczuk, John Charles; Su, Stacey; Kaiser, Larry Robert; Cooper, Joel David

    2010-09-01

    Under the Freedom of Information Act, we obtained the follow-up data of the National Emphysema Treatment Trial (NETT) to determine the long-term outcome for "a heterogeneous distribution of emphysema with upper lobe predominance," postulated by the NETT hypothesis to be optimal candidates for lung volume reduction surgery. Using the NETT database, we identified patients with heterogeneous distribution of emphysema with upper lobe predominance and analyzed for the first time follow-up data for those receiving lung volume reduction surgery and those receiving medical management. Furthermore, we compared the results of the NETT reduction surgery group with a previously reported consecutive case series of 250 patients undergoing bilateral lung volume reduction surgery using similar selection criteria. Of the 1218 patients enrolled, 511 (42%) conformed to the NETT hypothesis selection criteria and received the randomly assigned surgical or medical treatment (surgical = 261; medical = 250). Lung volume reduction surgery resulted in a 5-year survival benefit (70% vs 60%; P = .02). Results at 3 years compared with baseline data favored surgical reduction in terms of residual volume reduction (25% vs 2%; P < .001), University of California San Diego dyspnea score (16 vs 0 points; P < .001), and improved St George Respiratory Questionnaire quality of life score (12 points vs 0 points; P < .001). For the 513 patients with a homogeneous pattern of emphysema randomized to surgical or medical treatment, lung volume reduction surgery produced no survival advantage and very limited functional benefit. Patients most likely to benefit from lung volume reduction surgery have heterogeneously distributed emphysema involving the upper lung zones predominantly. Such patients in the NETT trial had results nearly identical to those previously reported in a nonrandomized series of similar patients undergoing lung volume reduction surgery. 
2010 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.

  1. Use of operating room information system data to predict the impact of reducing turnover times on staffing costs.

    PubMed

    Dexter, Franklin; Abouleish, Amr E; Epstein, Richard H; Whitten, Charles W; Lubarsky, David A

    2003-10-01

Potential benefits of reducing turnover times are both quantitative (e.g., complete more cases and reduce staffing costs) and qualitative (e.g., improve professional satisfaction). Analyses have shown the quantitative arguments to be unsound except for reducing staffing costs. We describe a methodology by which each surgical suite can use its own numbers to calculate its individual potential reduction in staffing costs from reducing its turnover times. Calculations estimate optimal allocated operating room (OR) time (based on maximizing OR efficiency) before and after reducing the maximum and average turnover times. At four academic tertiary hospitals, reductions in average turnover times of 3 to 9 min would result in 0.8% to 1.8% reductions in staffing cost. Reductions in average turnover times of 10 to 19 min would result in 2.5% to 4.0% reductions in staffing costs. These reductions in staffing cost are achieved predominantly by reducing allocated OR time, not by reducing the hours that staff work late. Heads of anesthesiology groups often serve on OR committees that are fixated on turnover times. Rather than having to argue based on scientific studies, this methodology provides the ability to show the specific quantitative effects (small decreases in staffing costs and allocated OR time) of reducing turnover time using a surgical suite's own data. Many anesthesiologists work at hospitals where surgeons and/or operating room (OR) committees focus repeatedly on turnover time reduction. We developed a methodology by which the reductions in staffing cost as a result of turnover time reduction can be calculated for each facility using its own data. Staffing cost reductions are generally very small and would be achieved predominantly by reducing allocated OR time to the surgeons.
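The flavor of the calculation can be conveyed with a toy example (all numbers invented, not the paper's data or its optimization model): allocated OR time is roughly total case time plus turnovers, so shaving a few minutes off the average turnover shrinks allocated time, and hence staffing cost, by only a few percent.

```python
def allocated_minutes(n_cases, avg_case_min, avg_turnover_min):
    """Crude allocated-OR-time estimate: n cases imply n-1 turnovers."""
    return n_cases * avg_case_min + (n_cases - 1) * avg_turnover_min

# Hypothetical suite: five 90-min cases, 40-min average turnover,
# then a 3-min reduction in average turnover time.
before = allocated_minutes(n_cases=5, avg_case_min=90, avg_turnover_min=40)
after = allocated_minutes(n_cases=5, avg_case_min=90, avg_turnover_min=37)

saving_pct = 100.0 * (before - after) / before  # ~2% of allocated time
```

With staffing cost roughly proportional to allocated hours, a 3-minute turnover reduction here saves about 2% — the same order as the 0.8% to 1.8% the authors report for 3 to 9 min reductions.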

  2. A revised model of ex-vivo reduction of hexavalent chromium in human and rodent gastric juices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlosser, Paul M., E-mail: schlosser.paul@epa.gov; Sasso, Alan F.

Chronic oral exposure to hexavalent chromium (Cr-VI) in drinking water has been shown to induce tumors in the mouse gastrointestinal (GI) tract and rat oral cavity. The same is not true for trivalent chromium (Cr-III). Thus reduction of Cr-VI to Cr-III in gastric juices is considered a protective mechanism, and it has been suggested that the difference between the rate of reduction among mice, rats, and humans could explain or predict differences in sensitivity to Cr-VI. We evaluated previously published models of gastric reduction and believe that they do not fully describe the data on reduction as a function of Cr-VI concentration, time, and (in humans) pH. The previous models are parsimonious in assuming only a single reducing agent in rodents and describing pH-dependence using a simple function. We present a revised model that assumes three pools of reducing agents in rats and mice with pH-dependence based on known speciation chemistry. While the revised model uses more fitted parameters than the original model, they are adequately identifiable given the available data, and the fit of the revised model to the full range of data is shown to be significantly improved. Hence the revised model should provide better predictions of Cr-VI reduction when integrated into a corresponding PBPK model. - Highlights: • Hexavalent chromium (Cr-VI) reduction in gastric juices is a key detoxifying step. • pH-dependent Cr-VI reduction rates are explained using known chemical speciation. • Reduction in rodents appears to involve multiple pools of electron donors. • Reduction appears to continue after 60 min, although more slowly than initial rates.
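The multi-pool idea can be sketched generically (invented rate constants and pool sizes, not the authors' fitted model): several reducing-agent pools are each consumed by a second-order reaction with Cr(VI), so reduction continues past 60 min but slows as the fast pools deplete.

```python
def simulate_reduction(cr0, pools, rates, dt=0.01, t_end=60.0):
    """Forward-Euler integration of d[Cr]/dt = -sum_i k_i * P_i * [Cr].

    cr0   : initial Cr(VI) concentration (arbitrary units)
    pools : initial concentrations of each reducing-agent pool
    rates : second-order rate constant for each pool (hypothetical values)
    """
    cr = cr0
    pools = list(pools)
    for _ in range(int(t_end / dt)):
        flux = [k * p * cr for k, p in zip(rates, pools)]
        cr = max(cr - sum(flux) * dt, 0.0)
        # each pool is depleted in proportion to its own reaction flux
        pools = [max(p - f * dt, 0.0) for p, f in zip(pools, flux)]
    return cr, pools

# Three hypothetical pools: fast/small, medium/small, slow/large
cr_left, pools_left = simulate_reduction(
    cr0=1.0, pools=[0.5, 0.5, 2.0], rates=[1.0, 0.2, 0.01])
```

The slow, large pool keeps reducing Cr(VI) long after the fast pool is exhausted, which is the qualitative behavior the Highlights describe.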

  3. Data poverty: A global evaluation for 2009 to 2013 - implications for sustainable development and disaster risk reduction

    NASA Astrophysics Data System (ADS)

    Leidig, Mathias; Teeuw, Richard M.; Gibson, Andrew D.

    2016-08-01

The article presents a time series (2009-2013) analysis for a new version of the "Digital Divide" concept that developed in the 1990s. Digital information technologies, such as the Internet, mobile phones and social media, provide vast amounts of data for decision-making and resource management. The Data Poverty Index (DPI) provides an open-source means of annually evaluating global access to data and information. The DPI can be used to monitor aspects of data and information availability at global and national levels, with potential application at local (district) levels. Access to data and information is a major factor in disaster risk reduction, increased resilience to disaster and improved adaptation to climate change. In that context, the DPI could be a useful tool for monitoring the Sustainable Development Goals of the Sendai Framework for Disaster Risk Reduction (2015-2030). The effects of severe data poverty, particularly limited access to geoinformatic data, free software and online training materials, are discussed in the context of sustainable development and disaster risk reduction. Unlike many other indices, the DPI is underpinned by datasets that are consistently provided annually for almost all the countries of the world and can be downloaded without restriction or cost.

  4. Generation and reduction of the data for the Ulysses gravitational wave experiment

    NASA Technical Reports Server (NTRS)

    Agresti, R.; Bonifazi, P.; Iess, L.; Trager, G. B.

    1987-01-01

    A procedure for the generation and reduction of the radiometric data known as REGRES is described. The software is implemented on a HP-1000F computer and was tested on REGRES data relative to the Voyager I spacecraft. The REGRES data are a current output of NASA's Orbit Determination Program. The software package was developed in view of the data analysis of the gravitational wave experiment planned for the European spacecraft Ulysses.

  5. Energy-gap reduction in heavily doped silicon: Causes and consequences

    NASA Astrophysics Data System (ADS)

    Pantelides, Sokrates T.; Selloni, Annabella; Car, Roberto

    1985-02-01

The authors briefly review the existing theoretical treatments of the various effects that contribute to the reduction of the energy gap in heavily doped Si, namely electron-electron and electron-impurity interactions and the effect of disorder in the impurity distribution. They then turn to the long-standing question of why energy-gap reductions extracted from three different types of experiments have persistently produced values with substantial discrepancies, making it impossible to compare with theoretical values. First, they demonstrate that a meaningful comparison between theory and experiment can indeed be made if theoretical calculations are carried out for the actual quantities that experiments measure, e.g., luminescence spectra, as recently done by Selloni and Pantelides. Then, they demonstrate that, independent of any theoretical calculations, the optical absorption spectra are fully consistent with the luminescence spectra, and that the discrepancies in the energy-gap reductions extracted from the two sets of spectra are caused entirely by the curve-fitting procedures used in analyzing the optical-absorption data. Finally, they show explicitly that, as already believed by many authors, energy-gap reductions extracted from electrical measurements on transistors do not correspond to true gap reductions. They identify two corrections that must be added to the values extracted from the electrical data in order to arrive at the true gap reductions, and show that the resulting values are in good overall agreement with the luminescence and absorption data. They therefore demonstrate that the observed reduction in emitter injection efficiency in bipolar transistors is not strictly due to a gap reduction, as generally believed, but to three very different effects.

  6. Comparative Analysis of Haar and Daubechies Wavelet for Hyper Spectral Image Classification

    NASA Astrophysics Data System (ADS)

    Sharif, I.; Khare, S.

    2014-11-01

With the number of channels in the hundreds rather than in the tens, hyperspectral imagery possesses much richer spectral information than multispectral imagery. The increased dimensionality of such hyperspectral data challenges current analysis techniques, and conventional classification methods may not be useful without dimension-reduction pre-processing; dimension reduction has therefore become a significant part of hyperspectral image processing. This paper presents a comparative analysis of the efficacy of Haar and Daubechies wavelets for dimensionality reduction in image classification. Spectral data reduction using wavelet decomposition is attractive because it preserves the distinctions among spectral signatures. Daubechies wavelets optimally capture polynomial trends, while the Haar wavelet is discontinuous and resembles a step function. The performance of these wavelets is compared in terms of classification accuracy and time complexity. This paper shows that wavelet reduction yields more separable classes and better or comparable classification accuracy. Within the dimensionality-reduction algorithm, the classification performance of Daubechies wavelets is better than that of the Haar wavelet, but Daubechies wavelets take more time than the Haar wavelet. The experimental results demonstrate that the classification system consistently provides over 84% classification accuracy.
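As a minimal sketch of how a wavelet decomposition shortens a per-pixel spectrum (not the authors' implementation; the function names and the orthonormal Haar filter used here are illustrative assumptions), one can keep only the approximation coefficients of repeated Haar steps:

```python
def haar_step(signal):
    """One level of the orthonormal Haar transform: pairwise scaled sums
    (approximation) and differences (detail), halving the length."""
    s = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_reduce(spectrum, levels):
    """Reduce a spectrum by keeping only the approximation coefficients
    after `levels` Haar decompositions (dimension / 2**levels)."""
    out = list(spectrum)
    for _ in range(levels):
        out, _ = haar_step(out)
    return out

# A 256-band spectrum reduced to 32 features by 3 decomposition levels:
spectrum = [float(i % 16) for i in range(256)]
reduced = haar_reduce(spectrum, 3)
```

The detail coefficients discarded at each level carry the fine spectral structure; keeping them as well would allow exact reconstruction, but for classification only the low-dimensional approximation is retained.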

  7. Enema reduction of intussusception: the success rate of hydrostatic and pneumatic reduction.

    PubMed

    Khorana, Jiraporn; Singhavejsakul, Jesda; Ukarapol, Nuthapong; Laohapensang, Mongkol; Wakhanrittee, Junsujee; Patumanond, Jayanton

    2015-01-01

Intussusception is a common surgical emergency in infants and children, with an incidence of one to four per 2,000 infants and children. If there is no peritonitis, no sign of perforation on abdominal radiographs, and no unresponsive shock, nonoperative reduction by pneumatic or hydrostatic enema can be performed. The purpose of this study was to compare the success rates of the two methods. A retrospective cohort study was performed at two institutions. All intussusception patients (ICD-10 code K56.1) who had visited Chiang Mai University Hospital and Siriraj Hospital from January 2006 to December 2012 were included in the study. The data were obtained from chart reviews and electronic databases and included demographic data, symptoms, signs, and investigations. The patients were grouped by reduction method into pneumatic and hydrostatic reduction groups, with the outcome being the success of the reduction technique. One hundred and seventy episodes of intussusception occurring in the patients of Chiang Mai University Hospital and Siriraj Hospital were included in this study. The success rate of pneumatic reduction was 61% and that of hydrostatic reduction was 44% (P=0.036). After multivariable analysis and adjustment for confounding factors by propensity scores, the success rate of pneumatic reduction was 1.48 times that of hydrostatic reduction (P=0.036, 95% confidence interval [CI]=1.03-2.13). Both pneumatic and hydrostatic reduction can be performed safely, depending on the experience of the radiologist or pediatric surgeon and the hospital setting. This study showed that pneumatic reduction had a higher success rate than hydrostatic reduction.
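For illustration only, a crude success-rate comparison of this kind can be expressed as a risk ratio with a log-scale confidence interval. The counts below are hypothetical (the abstract reports only percentages and the 170-episode total), and the study itself used multivariable, propensity-adjusted analysis rather than this crude calculation:

```python
import math

def risk_ratio(success_a, n_a, success_b, n_b):
    """Crude risk ratio of success between two groups, with a 95% CI
    from the standard variance formula for the log risk ratio."""
    p_a, p_b = success_a / n_a, success_b / n_b
    rr = p_a / p_b
    se = math.sqrt((1 - p_a) / success_a + (1 - p_b) / success_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)

# Hypothetical split consistent with the reported crude rates
# (61% pneumatic vs 44% hydrostatic):
rr, ci = risk_ratio(61, 100, 31, 70)
```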

  8. 48 CFR 52.215-11 - Price Reduction for Defective Certified Cost or Pricing Data-Modifications.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Price Reduction for... CONTRACT CLAUSES Text of Provisions and Clauses 52.215-11 Price Reduction for Defective Certified Cost or Pricing Data—Modifications. As prescribed in 15.408(c), insert the following clause: Price Reduction for...

  9. Airborne ballistic camera tracking systems

    NASA Technical Reports Server (NTRS)

    Redish, W. L.

    1976-01-01

    An operational airborne ballistic camera tracking system was tested for operational and data reduction feasibility. The acquisition and data processing requirements of the system are discussed. Suggestions for future improvements are also noted. A description of the data reduction mathematics is outlined. Results from a successful reentry test mission are tabulated. The test mission indicated that airborne ballistic camera tracking systems are feasible.

  10. 40 CFR Table 5 of Subpart Aaaaaaa... - Applicability of General Provisions to Subpart AAAAAAA

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... must be conducted. § 63.7(e)(2)-(4) Conduct of Performance Tests and Data Reduction Yes. § 63.7(f)-(h) Use of Alternative Test Method; Data Analysis, Recordkeeping, and Reporting; and Waiver of Performance... CMS requirements. § 63.8(e)-(f) CMS Performance Evaluation Yes. § 63.8(g)(1)-(4) Data Reduction...

  11. Data analysis of response interruption and redirection as a treatment for vocal stereotypy.

    PubMed

    Wunderlich, Kara L; Vollmer, Timothy R

    2015-12-01

    Vocal stereotypy, or repetitive, noncontextual vocalizations, is a problematic form of behavior exhibited by many individuals with autism spectrum disorder (ASD). Recent research has evaluated the efficacy of response interruption and redirection (RIRD) in the reduction of vocal stereotypy. Research has indicated that RIRD often results in reductions in the level of vocal stereotypy; however, many previous studies have only presented data on vocal stereotypy that occurred outside RIRD implementation. The current study replicated the procedures of previous studies that have evaluated the efficacy of RIRD and compared 2 data-presentation methods: inclusion of only data collected outside RIRD implementation and inclusion of all vocal stereotypy data from the entirety of each session. Subjects were 7 children who had been diagnosed with ASD. Results indicated that RIRD appeared to be effective when we evaluated the level of vocal stereotypy outside RIRD implementation, but either no reductions or more modest reductions in the level of vocal stereotypy during the entirety of sessions were obtained for all subjects. Results suggest that data-analysis methods used in previous research may overestimate the efficacy of RIRD. © Society for the Experimental Analysis of Behavior.

  12. Operational Data Reduction Procedure for Determining Density and Vertical Structure of the Martian Upper Atmosphere from Mars Global Surveyor Accelerometer Measurements

    NASA Technical Reports Server (NTRS)

    Cancro, George J.; Tolson, Robert H.; Keating, Gerald M.

    1998-01-01

The success of aerobraking by the Mars Global Surveyor (MGS) spacecraft was partly due to the analysis of MGS accelerometer data. Accelerometer data were used to determine the effect of the atmosphere on each orbit, to characterize the nature of the atmosphere, and to predict the atmosphere for future orbits. To interpret the accelerometer data, a data reduction procedure was developed to produce density estimates utilizing inputs from the spacecraft, the Navigation Team, and pre-mission aerothermodynamic studies. This data reduction procedure was based on the calculation of aerodynamic forces from the accelerometer data by accounting for acceleration due to gravity gradient, solar pressure, angular motion of the MGS, instrument bias, thruster activity, and a vibration component due to the motion of the damaged solar array. Methods were developed to calculate all of the acceleration components, including a 4-degree-of-freedom dynamics model used to gain a greater understanding of the damaged solar array. The total error inherent in the data reduction procedure was calculated as a function of altitude and density, considering contributions from ephemeris errors, errors in the force coefficients, and instrument errors due to bias and digitization. Comparing the results from this procedure with the data of other MGS teams has demonstrated that the procedure can quickly and accurately describe the density and vertical structure of the Martian upper atmosphere.
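A heavily simplified sketch of the core step, recovering density from the sensed aerodynamic deceleration via the standard drag equation, might look as follows. The function and all numbers are illustrative assumptions, not the mission's actual procedure, which also modeled gravity gradient, solar pressure, bias, thruster activity, and solar-array dynamics:

```python
def density_from_drag(a_drag, mass, c_d, area, speed):
    """Atmospheric density from measured aerodynamic deceleration via the
    drag equation a = 0.5 * rho * v**2 * C_D * A / m, solved for rho."""
    return 2.0 * mass * a_drag / (c_d * area * speed ** 2)

# Illustrative numbers (not MGS values): 700 kg spacecraft, C_D = 2.0,
# 17 m^2 reference area, 4.7 km/s speed, 0.05 m/s^2 sensed deceleration.
rho = density_from_drag(0.05, 700.0, 2.0, 17.0, 4700.0)
```

In practice the sensed acceleration must first be cleaned of all non-drag components; the drag relation above is only the final conversion.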

  13. Development of a residual acceleration data reduction and dissemination plan

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.

    1992-01-01

    A major obstacle in evaluating the residual acceleration environment in an orbiting space laboratory is the amount of data collected during a given mission: gigabytes of data will be available as SAMS units begin to fly regularly. Investigators taking advantage of the reduced gravity conditions of space should not be overwhelmed by the accelerometer data which describe these conditions. We are therefore developing a data reduction and analysis plan that will allow principal investigators of low-g experiments to create experiment-specific residual acceleration data bases for post-flight analysis. The basic aspects of the plan can also be used to characterize the acceleration environment of earth orbiting laboratories. Our development of the reduction plan is based on the following program of research: the identification of experiment sensitivities by order of magnitude estimates and numerical modelling; evaluation of various signal processing techniques appropriate for the reduction, supplementation, and dissemination of residual acceleration data; and testing and implementation of the plan on existing acceleration data bases. The orientation of the residual acceleration vector with respect to some set of coordinate axes is important for experiments with known directional sensitivity. Orientation information can be obtained from the evaluation of direction cosines. Fourier analysis is commonly used to transform time history data into the frequency domain. Common spectral representations are the amplitude spectrum which gives the average of the components of the time series at each frequency and the power spectral density which indicates the power or energy present in the series per unit frequency interval. 
The data reduction and analysis scheme developed involves a two-tiered structure to: (1) identify experiment characteristics and mission events that can be used to limit the amount of accelerometer data an investigator should be interested in; and (2) process the data in a way that will be meaningful to the experiment objectives. A general outline of the plan is given.
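The amplitude spectrum mentioned above can be sketched as follows. This uses a naive DFT for clarity (an FFT would be used on gigabyte-scale SAMS records), and the tone frequency, amplitude, and sampling rate are assumed illustrative values:

```python
import cmath
import math

def amplitude_spectrum(x, dt):
    """Single-sided amplitude spectrum of a real time series via a naive
    DFT (O(n^2); fine for illustration, use an FFT in practice)."""
    n = len(x)
    freqs, amps = [], []
    for k in range(n // 2 + 1):
        coeff = sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n)
                    for j in range(n))
        scale = 1.0 if k in (0, n // 2) else 2.0  # fold negative frequencies
        freqs.append(k / (n * dt))
        amps.append(scale * abs(coeff) / n)
    return freqs, amps

# A 1e-4 g residual-acceleration tone at 10 Hz, sampled at 100 Hz,
# appears at the 10 Hz bin with its full amplitude:
x = [1e-4 * math.sin(2 * math.pi * 10 * t / 100) for t in range(100)]
freqs, amps = amplitude_spectrum(x, 0.01)
```

Squaring the folded coefficients and dividing by the bin width would give the power spectral density also mentioned in the abstract.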

  14. Use of simulation tools to illustrate the effect of data management practices for low and negative plate counts on the estimated parameters of microbial reduction models.

    PubMed

    Garcés-Vega, Francisco; Marks, Bradley P

    2014-08-01

In the last 20 years, the use of microbial reduction models has expanded significantly, including inactivation (linear and nonlinear), survival, and transfer models. However, a major constraint on model development is the impossibility of directly quantifying the number of viable microorganisms below the limit of detection (LOD) for a given study. Different approaches have been used to manage this challenge, including ignoring negative plate counts, using statistical estimations, or applying data transformations. Our objective was to illustrate and quantify the effect of negative-plate-count data management approaches on parameter estimation for microbial reduction models. Because it is impossible to obtain accurate plate counts below the LOD, we performed simulated experiments to generate synthetic data for both log-linear and Weibull-type microbial reductions. We then applied five different, previously reported data management practices and fit log-linear and Weibull models to the resulting data. The results indicated a significant effect (α = 0.05) of the data management practices on the estimated model parameters and performance indicators. For example, when the negative plate counts were replaced by the LOD for log-linear data sets, the slope of the subsequent log-linear model was, on average, 22% smaller than for the original data, the resulting model underpredicted lethality by up to 2.0 log, and the Weibull model was erroneously selected as the most likely correct model for those data. The results demonstrate that it is important to explicitly report LODs and the related data management protocols, which can significantly affect model results, interpretation, and utility. Ultimately, we recommend using only the positive plate counts to estimate model parameters for microbial reduction curves, and avoiding any data value substitutions or transformations when managing negative plate counts, to yield the most accurate model parameters.
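The reported effect can be reproduced qualitatively with a small synthetic experiment: substituting the LOD for censored points flattens the fitted log-linear slope relative to dropping them. The decay rate, LOD, and sampling times below are assumptions for illustration, not the paper's simulation settings:

```python
def fit_loglinear_slope(times, log_counts):
    """Least-squares slope of log10(N) versus time (log-linear model)."""
    n = len(times)
    mt = sum(times) / n
    my = sum(log_counts) / n
    num = sum((t - mt) * (y - my) for t, y in zip(times, log_counts))
    den = sum((t - mt) ** 2 for t in times)
    return num / den

# True log-linear inactivation: log10(N) = 7 - 0.5 * t, with LOD = 1 log10.
times = list(range(16))
true = [7 - 0.5 * t for t in times]
lod = 1.0

# Practice A: use only points at or above the LOD.
kept = [(t, y) for t, y in zip(times, true) if y >= lod]
slope_drop = fit_loglinear_slope([t for t, _ in kept], [y for _, y in kept])

# Practice B: replace censored values with the LOD.
subbed = [max(y, lod) for y in true]
slope_sub = fit_loglinear_slope(times, subbed)
```

With these assumed settings, practice A recovers the true slope of -0.5 exactly, while practice B yields a visibly shallower slope, i.e. an underprediction of lethality, mirroring the paper's conclusion.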

  15. 40 CFR Table 2 to Subpart Hhhh of... - Applicability of General Provisions (40 CFR Part 63, Subpart A) to Subpart HHHH

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Alternative Test Method Yes EPA retains approval authority § 63.7(g) Data Analysis Yes § 63.7(h) Waiver of... monitoring systems (CEMS) § 63.8(g)(1) Data Reduction Yes § 63.8(g)(2) Data Reduction No Subpart HHHH does not require the use of CEMS or continuous opacity monitoring systems (COMS). § 63.8(g)(3)-(5) Data...

  16. Floating Potential Probe Langmuir Probe Data Reduction Results

    NASA Technical Reports Server (NTRS)

    Morton, Thomas L.; Minow, Joseph I.

    2002-01-01

    During its first five months of operations, the Langmuir Probe on the Floating Potential Probe (FPP) obtained data on ionospheric electron densities and temperatures in the ISS orbit. In this paper, the algorithms for data reduction are presented, and comparisons are made of FPP data with ground-based ionosonde and Incoherent Scattering Radar (ISR) results. Implications for ISS operations are detailed, and the need for a permanent FPP on ISS is examined.
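The abstract does not detail the reduction algorithms, but a standard Langmuir-probe step, extracting the electron temperature from the slope of ln(I) versus V in the electron-retardation region of the I-V curve, can be sketched as follows (synthetic sweep with assumed values; not necessarily the FPP's actual algorithm):

```python
import math

def electron_temperature_ev(volts, currents):
    """Electron temperature in eV from the exponential retardation region
    of a Langmuir probe I-V sweep: the slope of ln(I) vs V is 1/Te[eV],
    so Te is the reciprocal of that least-squares slope."""
    logs = [math.log(i) for i in currents]  # currents must be positive
    n = len(volts)
    mv = sum(volts) / n
    ml = sum(logs) / n
    num = sum((v - mv) * (l - ml) for v, l in zip(volts, logs))
    den = sum((v - mv) ** 2 for v in volts)
    return den / num

# Synthetic retardation-region sweep with Te = 0.2 eV (roughly 2300 K,
# a plausible ionospheric value; all numbers are illustrative):
te_true = 0.2
volts = [v / 100.0 for v in range(0, 50, 5)]
currents = [1e-6 * math.exp(v / te_true) for v in volts]
te = electron_temperature_ev(volts, currents)
```

Electron density would then follow from the saturation current and the recovered temperature; that step is omitted here.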

  17. WASTE REDUCTION OF TECHNOLOGY EVALUATIONS OF THE U.S. EPA WRITE PROGRAM

    EPA Science Inventory

The Waste Reduction Innovative Technology Evaluation (WRITE) Program was established in 1989 to provide objective, accurate performance and cost data about waste-reducing technologies for a variety of industrial and commercial applications. EPA's Risk Reduction Engineering Laborato...

  18. Solvepol: A Reduction Pipeline for Imaging Polarimetry Data

    NASA Astrophysics Data System (ADS)

    Ramírez, Edgar A.; Magalhães, Antônio M.; Davidson, James W., Jr.; Pereyra, Antonio; Rubinho, Marcelo

    2017-05-01

We present a new, fully automated data pipeline, Solvepol, designed to reduce and analyze polarimetric data. It has been optimized for imaging data from IAGPOL, the calcite Savart-prism-plate-based polarimeter of the Instituto de Astronomía, Geofísica e Ciências Atmosféricas (IAG) of the University of São Paulo (USP). Solvepol is also the basis of a reduction pipeline for the wide-field optical polarimeter that will execute SOUTH POL, a survey of the polarized southern sky. Solvepol was written in the Interactive Data Language (IDL) and is based on the Image Reduction and Analysis Facility (IRAF) task PCCDPACK, developed by our polarimetry group. We present and discuss reduced data from standard stars and other fields and compare these results with those obtained in the IRAF environment. Our analysis shows that Solvepol, in addition to being a fully automated pipeline, produces results consistent with those reduced by PCCDPACK and reported in the literature.

  19. Clustering and Dimensionality Reduction to Discover Interesting Patterns in Binary Data

    NASA Astrophysics Data System (ADS)

    Palumbo, Francesco; D'Enza, Alfonso Iodice

Attention to binary data coding has increased considerably over the last decade for several reasons. The analysis of binary data characterizes several fields of application, such as market-basket analysis, DNA microarray data, image mining, text mining, and web-clickstream mining. This paper illustrates two different approaches that exploit a profitable combination of clustering and dimensionality reduction to identify non-trivial association structures in binary data. An application in the Association Rules framework supports the theory with empirical evidence.

  20. DKIST visible broadband imager data processing pipeline

    NASA Astrophysics Data System (ADS)

    Beard, Andrew; Cowan, Bruce; Ferayorni, Andrew

    2014-07-01

    The Daniel K. Inouye Solar Telescope (DKIST) Data Handling System (DHS) provides the technical framework and building blocks for developing on-summit instrument quality assurance and data reduction pipelines. The DKIST Visible Broadband Imager (VBI) is a first light instrument that alone will create two data streams with a bandwidth of 960 MB/s each. The high data rate and data volume of the VBI require near-real time processing capability for quality assurance and data reduction, and will be performed on-summit using Graphics Processing Unit (GPU) technology. The VBI data processing pipeline (DPP) is the first designed and developed using the DKIST DHS components, and therefore provides insight into the strengths and weaknesses of the framework. In this paper we lay out the design of the VBI DPP, examine how the underlying DKIST DHS components are utilized, and discuss how integration of the DHS framework with GPUs was accomplished. We present our results of the VBI DPP alpha release implementation of the calibration, frame selection reduction, and quality assurance display processing nodes.

  1. Modelling CEC variations versus structural iron reduction levels in dioctahedral smectites. Existing approaches, new data and model refinements.

    PubMed

    Hadi, Jebril; Tournassat, Christophe; Ignatiadis, Ioannis; Greneche, Jean Marc; Charlet, Laurent

    2013-10-01

A model was developed to describe how the excess negative charge of the 2:1 layer induced by the reduction of Fe(III) to Fe(II) by sodium dithionite buffered with citrate-bicarbonate is balanced, and the model was applied to nontronites. It is based on new experimental data and extends the structural interpretation introduced by an earlier model [36-38]. The increase in the 2:1 layer negative charge due to Fe(III)-to-Fe(II) reduction is balanced by excess adsorption of cations in the clay interlayers and by specific sorption of H(+) from solution. The prevalence of one compensating mechanism over the other is related to the growing lattice distortion induced by structural Fe(III) reduction. At low reduction levels, cation adsorption dominates and some of the incorporated protons react with structural OH groups, leading to dehydroxylation of the structure. Starting from a moderate reduction level, other structural changes occur, leading to a reorganisation of the octahedral and tetrahedral lattice: migration or release of cations, intense dehydroxylation, and bonding of protons to undersaturated oxygen atoms. The experimental data highlight some particular properties of ferruginous smectites with regard to chemical reduction. Contrary to previous assumptions, the negative layer charge of nontronites does not simply increase towards a plateau value upon reduction: a peak is observed within the reduction domain, after which the negative layer charge decreases upon extended reduction (>30%). The decrease is so dramatic that the layer charge of highly reduced nontronites can fall below that of the fully oxidised counterpart. Furthermore, the presence of a large amount of tetrahedral Fe seems to promote intense structural change and Fe reducibility. The newly acquired data clearly show that models currently available in the literature cannot be applied to the whole reduction range of structural Fe in clays. 
Moreover, changes in the model normalising procedure clearly demonstrate that the investigated low-tetrahedral-Fe-bearing nontronites (SWa-1, GAN and NAu-1) all exhibit the same behaviour at low reduction levels. Consequently, we restricted our model to the case of moderate reduction (<30%) in low-tetrahedral-Fe-bearing nontronites. The adapted model provides the relative amounts of Na(+) (p) and H(+) (ni) cations incorporated into the structure as a function of the amount of Fe reduction. Two equations describe the investigated systems: p = m/(1 + Kr·ω·mrel) and ni = Kr·ω·m·mrel/(1 + Kr·ω·mrel), where m is the Fe(II) content, mrel the reduction level (m/mtot), ω the cation exchange capacity (CEC), and Kr an empirical constant specific to the system. Copyright © 2013 Elsevier Inc. All rights reserved.
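The two model equations can be evaluated directly; note that, by construction, they partition the Fe(II) content between the two compensation mechanisms (p + ni = m). The parameter values below are illustrative placeholders, not fitted constants from the study:

```python
def incorporated_cations(m, m_tot, omega, kr):
    """Charge compensation per the two model equations:
        p  = m / (1 + Kr*omega*m_rel)                 (adsorbed Na+)
        ni = Kr*omega*m*m_rel / (1 + Kr*omega*m_rel)  (incorporated H+)
    where m_rel = m / m_tot is the reduction level."""
    m_rel = m / m_tot
    denom = 1.0 + kr * omega * m_rel
    p = m / denom
    ni = kr * omega * m * m_rel / denom
    return p, ni

# Illustrative values (assumed, not the study's fitted constants):
p, ni = incorporated_cations(m=0.2, m_tot=1.0, omega=0.9, kr=5.0)
```

The identity p + ni = m follows immediately from adding the two equations, which is consistent with the model's premise that the Fe(II)-induced excess charge is balanced entirely by these two mechanisms.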

  2. 40 CFR Table 2 to Subpart Pppp of... - Applicability of General Provisions to Subpart PPPP of Part 63

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... of Plastic Parts and Products Pt. 63, Subpt. PPPP, Table 2 Table 2 to Subpart PPPP of Part 63...) Data Reduction No Sections 63.4567 and 63.4568 specify monitoring data reduction. § 63.9(a)-(d...

  3. PCA based feature reduction to improve the accuracy of decision tree c4.5 classification

    NASA Astrophysics Data System (ADS)

    Nasution, M. Z. F.; Sitompul, O. S.; Ramli, M.

    2018-03-01

Splitting on attributes is a major process in Decision Tree C4.5 classification. However, this process does not significantly help the establishment of the decision tree in terms of removing irrelevant features. A major problem in decision tree classification is over-fitting, resulting from noisy data and irrelevant features; in turn, over-fitting creates misclassification and data imbalance. Many algorithms have been proposed to overcome misclassification and over-fitting in Decision Tree C4.5 classification. Feature reduction is one of the important issues in classification modelling; it is intended to remove irrelevant data in order to improve accuracy. A feature-reduction framework is used to simplify high-dimensional data to low-dimensional data with non-correlated attributes. In this research, we propose a framework for selecting relevant, non-correlated feature subsets. We use principal component analysis (PCA) for feature reduction, to obtain non-correlated features, and the Decision Tree C4.5 algorithm for classification. In experiments conducted on the UCI cervical cancer data set (858 instances, 36 attributes), we evaluated the performance of our framework in terms of accuracy, specificity, and precision. Experimental results show that the proposed framework robustly enhances classification accuracy, reaching a 90.70% accuracy rate.
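A minimal sketch of the PCA step, reduced to two correlated features so that the covariance eigendecomposition has a closed form, is shown below. The data are invented and the classifier stage is omitted; the study itself used 36 attributes and a C4.5 decision tree:

```python
import math

def pca_first_component(xs, ys):
    """First principal component of 2-D data via the closed-form
    eigendecomposition of the 2x2 sample covariance matrix."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    cyy = sum((y - my) ** 2 for y in ys) / (n - 1)
    cxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Largest eigenvalue of [[cxx, cxy], [cxy, cyy]]:
    tr, det = cxx + cyy, cxx * cyy - cxy * cxy
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    # Corresponding eigenvector (assumes cxy != 0, i.e. correlated features):
    vx, vy = cxy, lam - cxx
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

def project(xs, ys, v):
    """Project mean-centred samples onto the component: 2 features -> 1."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return [(x - mx) * v[0] + (y - my) * v[1] for x, y in zip(xs, ys)]

# Two strongly correlated features collapse onto one nearly diagonal axis:
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.1, 2.0, 2.9, 4.2, 5.0]
v = pca_first_component(xs, ys)
scores = project(xs, ys, v)
```

The one-dimensional `scores` would then be fed to the classifier in place of the original correlated pair; with 36 attributes one would use a general eigensolver rather than the 2x2 closed form.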

  4. Calibrating a tensor magnetic gradiometer using spin data

    USGS Publications Warehouse

    Bracken, Robert E.; Smith, David V.; Brown, Philip J.

    2005-01-01

Scalar magnetic data are often acquired to discern characteristics of geologic source materials and buried objects. It is evident that a great deal can be done with scalar data, but there are significant advantages to direct measurement of the magnetic gradient tensor in applications with nearby sources, such as unexploded ordnance (UXO). To explore these advantages, we adapted a prototype tensor magnetic gradiometer system (TMGS) and successfully implemented a data-reduction procedure. One of several critical reduction issues is the precise determination of a large group of calibration coefficients for the sensors and sensor array. To resolve these coefficients, we devised a spin calibration method, following similar methods used to calibrate space-based magnetometers (Snare, 2001). The spin calibration procedure consists of three parts: (1) collecting data by slowly revolving the sensor array in the Earth's magnetic field, (2) deriving a comprehensive set of coefficients from the spin data, and (3) applying the coefficients to the survey data. To show that the TMGS functions as a tensor gradiometer, we conducted an experimental survey that verified that the reduction procedure was effective (Bracken and Brown, in press). Therefore, because the spin calibration was an integral part of the reduction, it can be concluded that it was correctly formulated with acceptably small errors.

  5. Relationship between mass-flux reduction and source-zone mass removal: analysis of field data.

    PubMed

    Difilippo, Erica L; Brusseau, Mark L

    2008-05-26

The magnitude of contaminant mass-flux reduction associated with a specific amount of contaminant mass removed is a key consideration for evaluating the effectiveness of a source-zone remediation effort. Thus, there is great interest in characterizing, estimating, and predicting relationships between mass-flux reduction and mass removal. Published data collected from several field studies were examined to evaluate relationships between mass-flux reduction and source-zone mass removal. The studies analyzed herein represent a variety of source-zone architectures, immiscible-liquid compositions, and implemented remediation technologies. There are two general approaches to characterizing the mass-flux-reduction/mass-removal relationship: end-point analysis and time-continuous analysis. End-point analysis, based on comparing masses and mass fluxes measured before and after a source-zone remediation effort, was conducted for 21 remediation projects. Mass removals were greater than 60% for all but three of the studies. Mass-flux reductions ranging from slightly less than to slightly greater than one-to-one were observed for the majority of the sites. However, these single-snapshot characterizations are limited in that the antecedent behavior is indeterminate. Time-continuous analysis, based on continuous monitoring of mass removal and mass flux, was performed for two sites, for both of which data were obtained under water-flushing conditions. The reductions in mass flux were significantly different for the two sites (90% vs. approximately 8%) for similar mass removals (approximately 40%). These results illustrate the dependence of the mass-flux-reduction/mass-removal relationship on source-zone architecture and associated mass-transfer processes. Minimal mass-flux reduction was observed for a system wherein mass removal was relatively efficient (ideal mass-transfer and displacement). 
Conversely, a significant degree of mass-flux reduction was observed for a site wherein mass removal was inefficient (non-ideal mass-transfer and displacement). The mass-flux-reduction/mass-removal relationship for the latter site exhibited a multi-step behavior, which cannot be predicted using some of the available simple estimation functions.

  6. Evaluation of carbon emission reductions promoted by private driving restrictions based on automatic fare collection data in Beijing, China.

    PubMed

    Zhang, Wandi; Chen, Feng; Wang, Zijia; Huang, Jianling; Wang, Bo

    2017-11-01

Public transportation automatic fare collection (AFC) systems continuously record large amounts of passenger travel information, providing massive, low-cost data for research on regulations pertaining to public transport. These data can be used not only to analyze the characteristics of passengers' trips but also to evaluate transport policies that promote a travel mode shift and emission reduction. In this study, models combining card, survey, and geographic information system (GIS) data are established, with a research focus on the private driving restriction policies being implemented in an ever-increasing number of cities. The study aims to evaluate the impact of these policies on the travel mode shift, as well as the associated carbon emission reductions. The private driving restriction policy implemented in Beijing is taken as an example. The impact of the restriction policy on the travel mode shift from cars to subways is analyzed through a model based on metro AFC data. The routing paths of these passengers are also analyzed based on the GIS method and on survey data, and the associated carbon emission reductions are estimated. The analysis method used in this study can serve as a reference for the application of big data in evaluating transport policies. Motor vehicles have become the most prevalent source of emissions, and consequently of air pollution, within Chinese cities, so evaluating the effects of driving restriction policies on the travel mode shift and vehicle emissions will be useful for other cities in the future. Transport big data, which play an important supporting role in estimating travel mode shifts and the resulting emission reductions, can help the relevant departments to estimate the effects of congestion alleviation and environmental improvement before such restriction policies are implemented, and can provide a reference for the relevant decisions.
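A back-of-the-envelope version of such an emission-reduction estimate might look as follows. All inputs (shifted trip counts, mean distance, emission factors) are assumed values for illustration, not figures from the study:

```python
def emission_reduction_kg(shifted_trips, mean_trip_km, car_gco2_per_km,
                          metro_gco2_per_pkm=0.0):
    """CO2 saved when car trips shift to metro: avoided car emissions
    minus emissions attributed to the added metro passenger-kilometres."""
    saved = shifted_trips * mean_trip_km * car_gco2_per_km
    added = shifted_trips * mean_trip_km * metro_gco2_per_pkm
    return (saved - added) / 1000.0  # grams -> kilograms

# Assumed inputs: 50,000 shifted trips/day, 12 km mean trip,
# 180 gCO2/km per avoided car trip, 10 gCO2 per metro passenger-km.
kg_per_day = emission_reduction_kg(50_000, 12.0, 180.0, 10.0)
```

The study's model is considerably richer (AFC-derived trip counts, GIS-based routing for trip distances), but it ultimately multiplies shifted travel volume by emission factors in this same spirit.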

  7. 47 CFR 90.309 - Tables and figures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... Maps with a scale of 1:250,000 or larger (such as 1:24,000) shall be used. Digital Terrain Data Tapes... sec. 90.307(d). Table “F”—Decibel Reduction/Power Equivalents dB reduction below 1 kW ERP permitted... curve draw a horizontal line to the power reduction scale. (6) The power reduction in dB determines the...

  8. Composite Material Testing Data Reduction to Adjust for the Systematic 6-DOF Testing Machine Aberrations

    Treesearch

Athanasios Iliopoulos; John G. Michopoulos; John G. C. Hermanson

    2012-01-01

This paper describes a data reduction methodology for eliminating the systematic aberrations introduced by the unwanted behavior of a multiaxial testing machine into the massive amounts of experimental data collected from testing of composite material coupons. The machine in reference is a custom-made 6-DoF system called NRL66.3, developed at the Naval...

  9. 40 CFR Table 7 of Subpart Yyyy of... - Applicability of General Provisions to Subpart YYYY

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... provisions Yes § 63.7(g) Performance test data analysis, recordkeeping, and reporting Yes § 63.7(h) Waiver of... conducting performance tests Yes § 63.7(e)(2) Conduct of performance tests and reduction of data Yes Subpart... Yes § 63.8(g) Data reduction Yes Except that provisions for COMS are not applicable. Averaging periods...

  10. Data Reduction Functions for the Langley 14- by 22-Foot Subsonic Tunnel

    NASA Technical Reports Server (NTRS)

    Boney, Andy D.

    2014-01-01

The Langley 14- by 22-Foot Subsonic Tunnel's data reduction software utilizes six major functions to process the acquired data. These functions calculate engineering units, tunnel parameters, flowmeters, jet exhaust measurements, balance loads/model attitudes, and model/wall pressures. The input (required) variables, the output (computed) variables, and the equations and/or subfunctions associated with each major function are discussed.

  11. Experimental study of noise reduction for an unstiffened cylindrical model of an airplane fuselage

    NASA Astrophysics Data System (ADS)

    Willis, C. M.; Daniels, E. F.

    1981-12-01

Noise reduction measurements were made for a simplified model of an airplane fuselage consisting of an unstiffened aluminum cylinder 0.5 m in diameter by 1.2 m long with a 1.6-mm-thick wall. Noise reduction was first measured with a reverberant-field pink-noise load on the cylinder exterior. Next, noise reduction was measured by using a propeller to provide a more realistic noise load on the cylinder. Structural resonance frequencies and acoustic reverberation times for the cylinder interior volume were also measured. Comparison of data from the relatively simple test using reverberant-field noise with data from the more complex propeller-noise tests indicates some similarity in both the overall noise reduction and the spectral distribution. However, all of the test parameters investigated (propeller speed, blade pitch, and tip clearance) had some effect on the noise-reduction spectra. Thus, the amount of noise reduction achieved appears to be somewhat dependent upon the spectral and spatial characteristics of the flight conditions. Information is also presented on cylinder resonance frequencies, damping, and the characteristics of propeller-noise loads.

  12. Computer program developed for flowsheet calculations and process data reduction

    NASA Technical Reports Server (NTRS)

    Alfredson, P. G.; Anastasia, L. J.; Knudsen, I. E.; Koppel, L. B.; Vogel, G. J.

    1969-01-01

Computer program PACER-65 is used for flowsheet calculations and is easily adapted to process data reduction. Each unit, vessel, meter, and processing operation in the overall flowsheet is represented by a separate subroutine, which the program calls in the order required to complete an overall flowsheet calculation.
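The subroutine-per-unit organization described above can be sketched in miniature. The unit names, stream variables, and numbers below are hypothetical (the original PACER-65 is a FORTRAN program), but the calling pattern is the same: a driver invokes one routine per flowsheet unit, in order, threading the stream state through.

```python
# Illustrative sketch of PACER-65's structure: one routine per
# flowsheet unit. All names and values here are hypothetical.
def feed(stream):
    stream["flow"] = 100.0              # kg/h entering the flowsheet
    return stream

def reactor(stream):
    stream["flow"] *= 0.95              # 5% consumed in the vessel
    return stream

def meter(stream):
    stream["reading"] = stream["flow"]  # record the measured flow
    return stream

# The driver calls each unit subroutine in flowsheet order, as
# PACER-65 does, to complete an overall flowsheet calculation.
flowsheet = [feed, reactor, meter]
stream = {}
for unit in flowsheet:
    stream = unit(stream)
print(stream)  # {'flow': 95.0, 'reading': 95.0}
```

Adapting such a program to data reduction then amounts to replacing a computed quantity with a measured one inside the relevant unit routine.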

  13. Reduction of spectra exposed by the 700mm CCD camera of the Ondřejov telescope coudé spectrograph

    NASA Astrophysics Data System (ADS)

    Skoda, Petr; Slechta, Miroslav

We present a brief cook-book for the reduction of spectra exposed by the Ondřejov 2-meter telescope coudé spectrograph. For the data reduction, we use standard IRAF packages running on Solaris and Linux. The sequence of commands is given for a typical reduction session, together with a short explanation and a detailed list of parameter settings. The reduction progress is illustrated by example plots.

  14. Magnitudes of biomarker reductions in response to controlled reductions in cigarettes smoked per day: a one-week clinical confinement study.

    PubMed

    Theophilus, Eugenia H; Coggins, Christopher R E; Chen, Peter; Schmidt, Eckhardt; Borgerding, Michael F

    2015-03-01

    Tobacco toxicant-related exposure reduction is an important tool in harm reduction. Cigarette per day reduction (CPDR) occurs as smokers migrate from smoking cigarettes to using alternative tobacco/nicotine products, or quit smoking. Few reports characterize the dose-response relationships between CPDR and effects on exposure biomarkers, especially at the low end of CPD exposure (e.g., 5 CPD). We present data on CPDR by characterizing magnitudes of biomarker reductions. We present data from a well-controlled, one-week clinical confinement study in healthy smokers who were switched from smoking 19-25 CPD to smoking 20, 10, 5 or 0 CPD. Biomarkers were measured in blood, plasma, urine, and breath, and included smoke-related toxicants, urine mutagenicity, smoked cigarette filter analyses (mouth level exposure), and vital signs. Many of the biomarkers (e.g., plasma nicotine) showed strong CPDR dose-response reductions, while others (e.g., plasma thiocyanate) showed weaker dose-response reductions. Factors that lead to lower biomarker reductions include non-CPD related contributors to the measured response (e.g., other exposure sources from environment, life style, occupation; inter-individual variability). This study confirms CPDR dose-responsive biomarkers and suggests that a one-week design is appropriate for characterizing exposure reductions when smokers switch from cigarettes to new tobacco products. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Assimilation of IASI and AIRS Data: Information Content and Quality Control

    NASA Technical Reports Server (NTRS)

    Joiner, J.; Einaudi, Franco (Technical Monitor)

    2000-01-01

The Infrared Atmospheric Sounding Interferometer (IASI) and Atmospheric Infrared Sounder (AIRS) instruments have two orders of magnitude more channels than the current operational infrared sounder (High Resolution Infra-Red Sounder (HIRS)). This data volume presents a technological challenge for using the data in a data assimilation system. Data reduction will be necessary for assimilation. It is important to understand the information content of the radiance measurements for data reduction purposes. In this talk, I will discuss issues relating to information content and quality control for assimilation of the AIRS and IASI data.

  16. HI data reduction for the Arecibo Pisces-Perseus Supercluster Survey

    NASA Astrophysics Data System (ADS)

    Davis, Cory; Johnson, Cory; Craig, David W.; Haynes, Martha P.; Jones, Michael G.; Koopmann, Rebecca A.; Hallenbeck, Gregory L.; Undergraduate ALFALFA Team

    2017-01-01

The Undergraduate ALFALFA team is currently focusing on the analysis of the Pisces-Perseus Supercluster to test current supercluster formation models. The primary goal of our research is to reduce L-band HI data from the Arecibo telescope. We use IDL programs written by our collaborators to reduce the data and find potential sources whose mass can be estimated by the baryonic Tully-Fisher relation, which relates the luminosity to the rotational velocity profile of spiral galaxies. Thus far we have reduced data and estimated HI masses for several galaxies in the supercluster region. We will give examples of data reduction and preliminary results for both the fall 2015 and 2016 observing seasons. We will also describe the data reduction process, the process of learning the associated software, and the use of virtual observatory tools such as the SDSS databases, Aladin, TOPCAT, and others. This research was supported by NSF grant AST-1211005.
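As a rough illustration of the mass estimates mentioned above, the sketch below uses the standard HI mass-flux relation and a baryonic Tully-Fisher law of the form M_b = A·v⁴. The input values are hypothetical, and A ≈ 47 M☉ km⁻⁴ s⁴ is one published normalisation, not a value taken from this record.

```python
# Hypothetical inputs; 2.356e5 is the standard coefficient of the
# HI mass-flux relation, A = 47 is one published BTFR normalisation.
def hi_mass(dist_mpc, flux_jy_kms):
    """HI mass (solar masses) from integrated 21-cm line flux."""
    return 2.356e5 * dist_mpc**2 * flux_jy_kms

def btfr_mass(v_rot_kms, a=47.0):
    """Baryonic mass (solar masses) from rotation velocity via the
    baryonic Tully-Fisher relation M_b = A * v**4."""
    return a * v_rot_kms**4

print(f"{hi_mass(70.0, 2.5):.2e}")   # roughly 2.9e9 Msun
print(f"{btfr_mass(150.0):.2e}")     # roughly 2.4e10 Msun
```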

  17. SELECTIVE CATALYTIC REDUCTION MERCURY FIELD SAMPLING PROJECT

    EPA Science Inventory

    A lack of data still exists as to the effect of selective catalytic reduction (SCR), selective noncatalytic reduction (SNCR), and flue gas conditioning on the speciation and removal of mercury (Hg) at power plants. This project investigates the impact that SCR, SNCR, and flue gas...

  18. Evaluation of nonpoint-source contamination, Wisconsin: water year 1999

    USGS Publications Warehouse

    Walker, John F.; Graczyk, D.J.; Corsi, Steven R.; Wierl, J.A.; Owens, D.W.

    2001-01-01

For two of the eight rural streams (Rattlesnake and Kuenster Creeks), minimal BMP implementation has occurred; hence a comparison of pre-BMP data and data collected after BMP implementation began is not warranted. For two other rural streams (Brewery and Garfoot Creeks), BMP implementation is complete. For the four remaining rural streams (Bower, Otter, Eagle, and Joos Valley Creeks), the pre-BMP load data were compared to the transitional data to determine if significant reductions in the loads have occurred as a result of the BMP implementation to date. For all sites, the actual constituent loads for suspended solids and total phosphorus exhibit no statistically significant reductions after BMP installation. Multiple regressions were used to remove some of the natural variability in the data. Based on the residual analysis, for Otter Creek, there is a significant difference in the suspended-solids regression residuals between the pre-BMP and transitional periods, indicating a potential reduction as a result of the BMP implementation after accounting for natural variability. For Joos Valley Creek, the residuals for suspended solids and total phosphorus both show a significant reduction after accounting for natural variability. It is possible that the other sites will also show statistically significant reductions in suspended solids and total phosphorus if additional BMPs are implemented.

  19. A data reduction technique and associated computer program for obtaining vehicle attitudes with a single onboard camera

    NASA Technical Reports Server (NTRS)

    Bendura, R. J.; Renfroe, P. G.

    1974-01-01

A detailed discussion of the application of a previously developed method to determine vehicle flight attitude using a single camera onboard the vehicle is presented, with emphasis on the digital computer program format and data reduction techniques. Application requirements include film and earth-related coordinates of at least two landmarks (or features), location of the flight vehicle with respect to the earth, and camera characteristics. Included in this report are a detailed discussion of the program input and output format, a computer program listing, a discussion of modifications made to the initial method, a step-by-step basic data reduction procedure, and several example applications. The computer program is written in FORTRAN 4 language for the Control Data 6000 series digital computer.

  20. An interprovincial cooperative game model for air pollution control in China.

    PubMed

    Xue, Jian; Zhao, Laijun; Fan, Longzhen; Qian, Ying

    2015-07-01

The noncooperative air pollution reduction model (NCRM) that is currently adopted in China to manage air pollution reduction of each individual province has inherent drawbacks. In this paper, we propose a cooperative air pollution reduction game model (CRM) that consists of two parts: (1) an optimization model that calculates the optimal pollution reduction quantity for each participating province to meet the joint pollution reduction goal; and (2) a model that distributes the economic benefit of the cooperation (i.e., pollution reduction cost saving) among the provinces in the cooperation based on the Shapley value method. We applied the CRM to the case of SO2 reduction in the Beijing-Tianjin-Hebei region in China. The results, based on the data from 2003-2009, show that cooperation helps lower the overall SO2 pollution reduction cost by 4.58% to 11.29%. Distributed across the participating provinces, such a cost saving from interprovincial cooperation brings significant benefits to each local government and stimulates them for further cooperation in pollution reduction. Finally, sensitivity analysis is performed using the year 2009 data to test the parameters' effects on the pollution reduction cost savings. China is increasingly facing unprecedented pressure for immediate air pollution control, and the current air pollution reduction policy does not allow cooperation and is less efficient. The empirical case shows that the proposed model can help improve efficiency in air pollution reduction, and its results can serve as a reference for Chinese government pollution reduction policy design.
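The Shapley-value distribution of cooperation gains described in part (2) can be sketched as follows; the coalition costs below are invented for illustration and are not the paper's data. Each province's share is its marginal contribution to the cost saving, averaged over all orders in which provinces could join the coalition.

```python
from itertools import permutations

# Hypothetical stand-alone and coalition SO2 reduction costs
# (arbitrary units); illustrative only, not the paper's data.
cost = {
    frozenset(): 0.0,
    frozenset({"Beijing"}): 1.0,
    frozenset({"Tianjin"}): 1.5,
    frozenset({"Hebei"}): 3.0,
    frozenset({"Beijing", "Tianjin"}): 2.2,
    frozenset({"Beijing", "Hebei"}): 3.6,
    frozenset({"Tianjin", "Hebei"}): 4.1,
    frozenset({"Beijing", "Tianjin", "Hebei"}): 4.8,
}

def savings(coalition):
    """Characteristic function: cost saved by cooperating vs acting alone."""
    return sum(cost[frozenset({p})] for p in coalition) - cost[frozenset(coalition)]

def shapley(players):
    """Average marginal contribution over all join orders."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        current = set()
        for p in order:
            phi[p] += savings(current | {p}) - savings(current)
            current.add(p)
    return {p: v / len(orders) for p, v in phi.items()}

alloc = shapley(["Beijing", "Tianjin", "Hebei"])
print(alloc)  # the shares sum to the grand-coalition saving of 0.7
```

By construction the allocations are efficient (they exhaust the total saving), which is what makes the scheme attractive for motivating each local government to cooperate.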

  1. Differential maneuvering simulator data reduction and analysis software

    NASA Technical Reports Server (NTRS)

    Beasley, G. P.; Sigman, R. S.

    1972-01-01

    A multielement data reduction and analysis software package has been developed for use with the Langley differential maneuvering simulator (DMS). This package, which has several independent elements, was developed to support all phases of DMS aircraft simulation studies with a variety of both graphical and tabular information. The overall software package is considered unique because of the number, diversity, and sophistication of the element programs available for use in a single study. The purpose of this paper is to discuss the overall DMS data reduction and analysis package by reviewing the development of the various elements of the software, showing typical results that can be obtained, and discussing how each element can be used.

  2. A data reduction package for multiple object spectroscopy

    NASA Technical Reports Server (NTRS)

    Hill, J. M.; Eisenhamer, J. D.; Silva, D. R.

    1986-01-01

    Experience with fiber-optic spectrometers has demonstrated improvements in observing efficiency for clusters of 30 or more objects that must in turn be matched by data reduction capability increases. The Medusa Automatic Reduction System reduces data generated by multiobject spectrometers in the form of two-dimensional images containing 44 to 66 individual spectra, using both software and hardware improvements to efficiently extract the one-dimensional spectra. Attention is given to the ridge-finding algorithm for automatic location of the spectra in the CCD frame. A simultaneous extraction of calibration frames allows an automatic wavelength calibration routine to determine dispersion curves, and both line measurements and cross-correlation techniques are used to determine galaxy redshifts.

  3. Micro-Arcsec mission: implications of the monitoring, diagnostic and calibration of the instrument response in the data reduction chain.

    NASA Astrophysics Data System (ADS)

    Busonero, D.; Gai, M.

The goals of 21st-century high angular precision experiments rely on the limiting performance associated with the selected instrumental configuration and observational strategy. Both global and narrow-angle micro-arcsec space astrometry require that the instrument contributions to the overall error budget be less than the desired micro-arcsec level precision. Appropriate modelling of the astrometric response is required for optimal definition of the data reduction and calibration algorithms, in order to ensure high sensitivity to the astrophysical source parameters and, in general, high accuracy. We will refer to the framework of the SIM-Lite and Gaia missions, the most challenging space missions of the next decade in the narrow-angle and global astrometry fields, respectively. We will focus our discussion on the Gaia data reduction issues and instrument calibration implications. We describe selected topics in the framework of the Astrometric Instrument Modelling for the Gaia mission, evidencing their role in the data reduction chain, and we give a brief overview of the Astrometric Instrument Model Data Analysis Software System, a Java-based pipeline under development by our team.

  4. Evaluation of SSME test data reduction methods

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1994-01-01

Accurate prediction of hardware and flow characteristics within the Space Shuttle Main Engine (SSME) during transient and main-stage operation requires a significant integration of ground test data, flight experience, and computational models. The process of integrating SSME test measurements with physical model predictions is commonly referred to as data reduction. Uncertainties within both test measurements and simplified models of the SSME flow environment compound the data integration problem. The first objective of this effort was to establish an acceptability criterion for data reduction solutions. The second objective of this effort was to investigate the data reduction potential of the ROCETS (Rocket Engine Transient Simulation) simulation platform. A simplified ROCETS model of the SSME was obtained from the MSFC Performance Analysis Branch. This model was examined and tested for physical consistency. Two modules were constructed and added to the ROCETS library to independently check the mass and energy balances of selected engine subsystems including the low pressure fuel turbopump, the high pressure fuel turbopump, the low pressure oxidizer turbopump, the high pressure oxidizer turbopump, the fuel preburner, the oxidizer preburner, the main combustion chamber coolant circuit, and the nozzle coolant circuit. A sensitivity study was then conducted to determine the individual influences of forty-two hardware characteristics on fourteen high pressure region prediction variables as returned by the SSME ROCETS model.

  5. Dimension Reduction of Multivariable Optical Emission Spectrometer Datasets for Industrial Plasma Processes

    PubMed Central

    Yang, Jie; McArdle, Conor; Daniels, Stephen

    2014-01-01

A new data dimension-reduction method, called Internal Information Redundancy Reduction (IIRR), is proposed for application to Optical Emission Spectroscopy (OES) datasets obtained from industrial plasma processes. For example, in a semiconductor manufacturing environment, real-time spectral emission data is potentially very useful for inferring information about critical process parameters such as wafer etch rates; however, the relationship between the spectral sensor data gathered over the duration of an etching process step and the target process output parameters is complex. OES sensor data has high dimensionality (fine wavelength resolution is required in spectral emission measurements in order to capture data on all chemical species involved in plasma reactions) and full spectrum samples are taken at frequent time points, so that dynamic process changes can be captured. To maximise the utility of the gathered dataset, it is essential that information redundancy is minimised, but with the important requirement that the resulting reduced dataset remains in a form that is amenable to direct interpretation of the physical process. To meet this requirement and to achieve a high reduction in dimension with little information loss, the IIRR method proposed in this paper operates directly in the original variable space, identifying peak wavelength emissions and the correlative relationships between them. A new statistic, Mean Determination Ratio (MDR), is proposed to quantify the information loss after dimension reduction, and the effectiveness of IIRR is demonstrated using an actual semiconductor manufacturing dataset. As an example of the application of IIRR in process monitoring/control, we also show how etch rates can be accurately predicted from IIRR dimension-reduced spectral data. PMID:24451453
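As a toy illustration of redundancy reduction performed directly in the original variable space (this is not the published IIRR algorithm; the threshold, data, and greedy rule are invented), one can keep a wavelength channel only when it is weakly correlated with every channel already kept:

```python
import numpy as np

def reduce_redundancy(spectra, threshold=0.95):
    """Greedy correlation filter over the original channels: visit
    channels in decreasing variance order and keep one only if its
    |correlation| with every kept channel is below the threshold.
    Illustrative only; not the published IIRR method."""
    corr = np.abs(np.corrcoef(spectra.T))
    order = np.argsort(spectra.var(axis=0))[::-1]  # strongest emissions first
    kept = []
    for ch in order:
        if all(corr[ch, k] < threshold for k in kept):
            kept.append(ch)
    return sorted(kept)

rng = np.random.default_rng(1)
base = rng.standard_normal((50, 1))
# Three channels: two near-duplicates of one emission line plus noise.
spectra = np.hstack([base,
                     base + 0.01 * rng.standard_normal((50, 1)),
                     rng.standard_normal((50, 1))])
print(reduce_redundancy(spectra))  # one of the duplicates is dropped
```

The appeal of working in the original variable space, as the abstract stresses, is that each retained column is still a physical wavelength channel rather than an abstract component.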

  6. Medication errors in paediatric care: a systematic review of epidemiology and an evaluation of evidence supporting reduction strategy recommendations

    PubMed Central

    Miller, Marlene R; Robinson, Karen A; Lubomski, Lisa H; Rinke, Michael L; Pronovost, Peter J

    2007-01-01

    Background Although children are at the greatest risk for medication errors, little is known about the overall epidemiology of these errors, where the gaps are in our knowledge, and to what extent national medication error reduction strategies focus on children. Objective To synthesise peer reviewed knowledge on children's medication errors and on recommendations to improve paediatric medication safety by a systematic literature review. Data sources PubMed, Embase and Cinahl from 1 January 2000 to 30 April 2005, and 11 national entities that have disseminated recommendations to improve medication safety. Study selection Inclusion criteria were peer reviewed original data in English language. Studies that did not separately report paediatric data were excluded. Data extraction Two reviewers screened articles for eligibility and for data extraction, and screened all national medication error reduction strategies for relevance to children. Data synthesis From 358 articles identified, 31 were included for data extraction. The definition of medication error was non‐uniform across the studies. Dispensing and administering errors were the most poorly and non‐uniformly evaluated. Overall, the distributional epidemiological estimates of the relative percentages of paediatric error types were: prescribing 3–37%, dispensing 5–58%, administering 72–75%, and documentation 17–21%. 26 unique recommendations for strategies to reduce medication errors were identified; none were based on paediatric evidence. Conclusions Medication errors occur across the entire spectrum of prescribing, dispensing, and administering, are common, and have a myriad of non‐evidence based potential reduction strategies. Further research in this area needs a firmer standardisation for items such as dose ranges and definitions of medication errors, broader scope beyond inpatient prescribing errors, and prioritisation of implementation of medication error reduction strategies. PMID:17403758

  7. Langley 14- by 22-foot subsonic tunnel test engineer's data acquisition and reduction manual

    NASA Technical Reports Server (NTRS)

    Quinto, P. Frank; Orie, Nettie M.

    1994-01-01

    The Langley 14- by 22-Foot Subsonic Tunnel is used to test a large variety of aircraft and nonaircraft models. To support these investigations, a data acquisition system has been developed that has both static and dynamic capabilities. The static data acquisition and reduction system is described; the hardware and software of this system are explained. The theory and equations used to reduce the data obtained in the wind tunnel are presented; the computer code is not included.

  8. Crossed hot-wire data acquisition and reduction system

    NASA Technical Reports Server (NTRS)

    Westphal, R. V.; Mehta, R. D.

    1984-01-01

A system for rapid computerized calibration, acquisition, and processing of data from a crossed hot-wire anemometer is described. Advantages of the system are its speed, minimal use of analog electronics, and improved accuracy of the resulting data. The data reduction provides two components of mean velocity and turbulence statistics up to third order. Details of the hardware, calibration procedures, response equations, software, and sample results from measurements in a turbulent plane mixing layer are presented.

  9. The SCUBA-2 Data Reduction Cookbook

    NASA Astrophysics Data System (ADS)

    Thomas, Holly S.; Currie, Malcolm J.

    This cookbook provides a short introduction to Starlink facilities, especially SMURF, the Sub-Millimetre User Reduction Facility, for reducing, displaying, and calibrating SCUBA-2 data. It describes some of the data artefacts present in SCUBA-2 time-series and methods to mitigate them. In particular, this cookbook illustrates the various steps required to reduce the data; and gives an overview of the Dynamic Iterative Map-Maker, which carries out all of these steps using a single command controlled by a configuration file. Specialised configuration files are presented.

  10. Investigation of Structure-Property Relationships in Systematic Series of Novel Polymers. [low frequency thermomechanical spectrometry of polymeric materials - computerized torsional braid experiments

    NASA Technical Reports Server (NTRS)

    Gillham, J. K.

    1974-01-01

The results of interfacing the Torsional Braid Analysis experiment on-line to a Hierarchical Computer System for data acquisition, data reduction, and control of experimental variables are discussed. Some experimental results are demonstrated and the data reduction procedures are outlined. Several modes of presentation of the final computer-reduced data are discussed in an attempt to elucidate possible interrelations between the thermal variation of the rigidity and loss parameters.

  11. 40 CFR Table 5 to Subpart Uuuu of... - Continuous Compliance With Emission Limits and Work Practice Standards

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... material balance that includes the pertinent data used to determine the percent reduction of total sulfide... material balance; and (3) complying with the continuous compliance requirements for closed-vent systems. 2... material balance that includes the pertinent data used to determine the percent reduction of toluene...

  12. 40 CFR Table 5 to Subpart Uuuu of... - Continuous Compliance With Emission Limits and Work Practice Standards

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... material balance that includes the pertinent data used to determine the percent reduction of total sulfide... material balance; and (3) complying with the continuous compliance requirements for closed-vent systems. 2... material balance that includes the pertinent data used to determine the percent reduction of toluene...

  13. Presentation of Evidence in Continuing Medical Education Programs: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Allen, Michael; MacLeod, Tanya; Handfield-Jones, Richard; Sinclair, Douglas; Fleming, Michael

    2010-01-01

    Introduction: Clinical trial data can be presented in ways that exaggerate treatment effectiveness. Physicians consider therapy more effective, and may be more likely to make inappropriate practice changes, when data are presented in relative terms such as relative risk reduction rather than in absolute terms such as absolute risk reduction and…
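The relative-versus-absolute distinction noted above is easy to make concrete. With hypothetical event rates of 2% in the control group and 1% in the treatment group, the same trial yields a 50% relative risk reduction but only a 1-percentage-point absolute risk reduction:

```python
# Hypothetical control and experimental event rates (CER, EER).
cer, eer = 0.02, 0.01

arr = cer - eer            # absolute risk reduction: 0.01 (1 point)
rrr = (cer - eer) / cer    # relative risk reduction: 0.5 (50%)
nnt = 1 / arr              # number needed to treat: 100

print(f"ARR = {arr:.0%}, RRR = {rrr:.0%}, NNT = {nnt:.0f}")
```

Quoting the 50% figure alone, without the absolute baseline, is exactly the kind of presentation the study identifies as exaggerating treatment effectiveness.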

  14. 42 CFR 495.316 - State monitoring and reporting regarding activities required to receive an incentive payment.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... quality measures data; and (v) A description and quantitative data on how its incentive payment program... conditions to use for quality improvement, reduction of disparities, research or outreach. (B) Capability to... specific conditions to use for quality improvement, reduction of disparities, research, or outreach. (B...

  15. 42 CFR 495.316 - State monitoring and reporting regarding activities required to receive an incentive payment.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... quality measures data; and (v) A description and quantitative data on how its incentive payment program... conditions to use for quality improvement, reduction of disparities, research or outreach. (B) Capability to... specific conditions to use for quality improvement, reduction of disparities, research, or outreach. (B...

  16. 42 CFR 495.316 - State monitoring and reporting regarding activities required to receive an incentive payment.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... quality measures data; and (v) A description and quantitative data on how its incentive payment program... conditions to use for quality improvement, reduction of disparities, research or outreach. (B) Capability to... specific conditions to use for quality improvement, reduction of disparities, research, or outreach. (B...

  17. Dimension Reduction With Extreme Learning Machine.

    PubMed

    Kasun, Liyanaarachchi Lekamalage Chamara; Yang, Yan; Huang, Guang-Bin; Zhang, Zhengyou

    2016-08-01

Data may often contain noise or irrelevant information, which negatively affects the generalization capability of machine learning algorithms. The objective of dimension reduction algorithms, such as principal component analysis (PCA), non-negative matrix factorization (NMF), random projection (RP), and auto-encoders (AE), is to reduce the noise or irrelevant information in the data. The features of PCA (eigenvectors) and linear AE are not able to represent data as parts (e.g., a nose in a face image). On the other hand, NMF and non-linear AE are hampered by slow learning speed, and RP only represents a subspace of the original data. This paper introduces a dimension reduction framework which to some extent represents data as parts, has fast learning speed, and learns the between-class scatter subspace. To this end, this paper investigates a linear and non-linear dimension reduction framework referred to as extreme learning machine AE (ELM-AE) and sparse ELM-AE (SELM-AE). In contrast to tied-weight AE, the hidden neurons in ELM-AE and SELM-AE need not be tuned, and their parameters (e.g., input weights in additive neurons) are initialized using orthogonal and sparse random weights, respectively. Experimental results on the USPS handwritten digit recognition, CIFAR-10 object recognition, and NORB object recognition data sets show the efficacy of linear and non-linear ELM-AE and SELM-AE in terms of discriminative capability, sparsity, training time, and normalized mean square error.
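A minimal sketch of the linear ELM-AE idea, assuming random orthogonal input weights that are never tuned and output weights solved in closed form by least squares (the data, sizes, and helper name are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_ae(X, n_hidden):
    """Linear ELM autoencoder sketch: random orthogonal input
    weights (untuned hidden neurons), output weights beta solved
    in closed form so that H @ beta approximates X."""
    n_features = X.shape[1]
    # Orthogonalised random input weights; never trained.
    W = np.linalg.qr(rng.standard_normal((n_features, n_hidden)))[0]
    H = X @ W                                     # hidden representation
    beta, *_ = np.linalg.lstsq(H, X, rcond=None)  # least-squares solve
    return beta                                   # spans the learned subspace

X = rng.standard_normal((200, 10))
beta = elm_ae(X, n_hidden=3)
X_reduced = X @ beta.T        # project data down to 3 dimensions
print(X_reduced.shape)        # (200, 3)
```

The closed-form solve is what gives the family its fast training speed relative to gradient-trained autoencoders.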

  18. Determination of selection criteria for spray drift reduction from atomization data

    USDA-ARS?s Scientific Manuscript database

    When testing and evaluating drift reduction technologies (DRT), there are different metrics that can be used to determine if the technology reduces drift as compared to a reference system. These metrics can include reduction in percent of fine drops, measured spray drift from a field trial, or comp...

  19. SGLT2 inhibitors: their potential reduction in blood pressure.

    PubMed

    Maliha, George; Townsend, Raymond R

    2015-01-01

    The sodium glucose co-transporter 2 (SGLT2) inhibitors represent a promising treatment option for diabetes and its common comorbidity, hypertension. Emerging data suggests that the SGLT2 inhibitors provide a meaningful reduction in blood pressure, although the precise mechanism of the blood pressure drop remains incompletely elucidated. Based on current data, the blood pressure reduction is partially due to a combination of diuresis, nephron remodeling, reduction in arterial stiffness, and weight loss. While current trials are underway focusing on cardiovascular endpoints, the SGLT2 inhibitors present a novel treatment modality for diabetes and its associated hypertension as well as an opportunity to elucidate the pathophysiology of hypertension in diabetes. Copyright © 2015 American Society of Hypertension. Published by Elsevier Inc. All rights reserved.

  20. Reconciling Scratch Space Consumption, Exposure, and Volatility to Achieve Timely Staging of Job Input Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Monti, Henri; Butt, Ali R; Vazhkudai, Sudharshan S

    2010-04-01

Innovative scientific applications and emerging dense data sources are creating a data deluge for high-end computing systems. Processing such large input data typically involves copying (or staging) it onto the supercomputer's specialized high-speed storage, scratch space, for sustained high I/O throughput. The current practice of conservatively staging data as early as possible makes the data vulnerable to storage failures, which may entail re-staging and consequently reduced job throughput. To address this, we present a timely staging framework that uses a combination of job startup time predictions, user-specified intermediate nodes, and decentralized data delivery to coincide input data staging with job start-up. By delaying staging until it is necessary, the exposure to failures and their effects can be reduced. Evaluation using both PlanetLab and simulations based on three years of Jaguar (No. 1 in Top500) job logs shows as much as an 85.9% reduction in staging times compared to direct transfers, a 75.2% reduction in wait time on scratch, and a 2.4% reduction in usage/hour.

  1. Data reduction software for LORAN-C flight test evaluation

    NASA Technical Reports Server (NTRS)

    Fischer, J. P.

    1979-01-01

A set of programs designed to be run on an IBM 370/158 computer to read the recorded time differences from the tape produced by the LORAN data collection system, convert them to latitude/longitude, and produce various plotting input files is described. The programs were written so they may be tailored easily to meet the demands of a particular data reduction job. The tape reader program is written in 370 assembler language and the remaining programs are written in standard IBM FORTRAN-IV language. The tape reader program is dependent upon the recording format used by the data collection system and on the I/O macros used at the computing facility. The other programs are generally device-independent, although the plotting routines are dependent upon the plotting method used. The data reduction programs convert the recorded data to a more readily usable form: they convert the time difference (TD) numbers to latitude/longitude (lat/long), format a printed listing of the TDs, lat/long, reference times, and other information derived from the data, and produce data files which may be used for subsequent plotting.

  2. Dissolution and reduction of magnetite by bacteria.

    PubMed

    Kostka, J E; Nealson, K H

    1995-10-01

Magnetite (Fe3O4) is an iron oxide of mixed oxidation state [Fe(II), Fe(III)] that contributes largely to geomagnetism and plays a significant role in diagenesis in marine and freshwater sediments. Magnetic data are the primary evidence for ocean floor spreading, and accurate interpretation of the sedimentary magnetic record depends on an understanding of the conditions under which magnetite is stable. Though chemical reduction of magnetite by dissolved sulfide is well known, biological reduction has not been considered likely based upon thermodynamic considerations. This study shows that marine and freshwater strains of the bacterium Shewanella putrefaciens are capable of the rapid dissolution and reduction of magnetite, converting millimolar amounts to soluble Fe(II) in a few days at room temperature. Conditions under which magnetite reduction is optimal (pH 5-6, 22-37 degrees C) are consistent with an enzymatic process and not with simple chemical reduction. Magnetite reduction requires viable cells and cell contact, and it appears to be coupled to electron transport and growth. In a minimal medium with formate or lactate as the electron donor, more than 10 times the amount of magnetite was reduced over no-carbon controls. These data suggest that magnetite reduction is coupled to carbon metabolism in S. putrefaciens. Bacterial reduction rates of magnetite are of the same order of magnitude as those estimated for reduction by sulfide. If such remobilization of magnetite occurs in nature, it could have a major impact on sediment magnetism and diagenesis.

  3. Dissolution and reduction of magnetite by bacteria

    NASA Technical Reports Server (NTRS)

    Kostka, J. E.; Nealson, K. H.

    1995-01-01

    Magnetite (Fe3O4) is an iron oxide of mixed oxidation state [Fe(II), Fe(III)] that contributes largely to geomagnetism and plays a significant role in diagenesis in marine and freshwater sediments. Magnetic data are the primary evidence for ocean floor spreading, and accurate interpretation of the sedimentary magnetic record depends on an understanding of the conditions under which magnetite is stable. Though chemical reduction of magnetite by dissolved sulfide is well known, biological reduction has not been considered likely on thermodynamic grounds. This study shows that marine and freshwater strains of the bacterium Shewanella putrefaciens are capable of the rapid dissolution and reduction of magnetite, converting millimolar amounts to soluble Fe(II) in a few days at room temperature. Conditions under which magnetite reduction is optimal (pH 5-6, 22-37 degrees C) are consistent with an enzymatic process and not with simple chemical reduction. Magnetite reduction requires viable cells and cell contact, and it appears to be coupled to electron transport and growth. In a minimal medium with formate or lactate as the electron donor, more than 10 times as much magnetite was reduced as in no-carbon controls. These data suggest that magnetite reduction is coupled to carbon metabolism in S. putrefaciens. Bacterial reduction rates of magnetite are of the same order of magnitude as those estimated for reduction by sulfide. If such remobilization of magnetite occurs in nature, it could have a major impact on sediment magnetism and diagenesis.

  4. ORBS: A reduction software for SITELLE and SpiOMM data

    NASA Astrophysics Data System (ADS)

    Martin, Thomas

    2014-09-01

    ORBS merges, corrects, transforms and calibrates interferometric data cubes and produces a spectral cube of the observed region for analysis. It is fully automatic data-reduction software for use with SITELLE (installed at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont Mégantic); these imaging Fourier transform spectrometers obtain a hyperspectral data cube that samples a 12-arcminute field of view into four million visible spectra. ORBS is highly parallelized; its core classes (ORB) have been designed for use in a suite of software for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS).

  5. Strain expansion-reduction approach

    NASA Astrophysics Data System (ADS)

    Baqersad, Javad; Bharadwaj, Kedar

    2018-02-01

    Validating numerical models is one of the main aspects of engineering design. However, correlating the millions of degrees of freedom of numerical models with the few degrees of freedom of test models is challenging. Reduction/expansion approaches have traditionally been used to match these degrees of freedom. However, conventional reduction/expansion approaches are limited to displacement, velocity or acceleration data. While in many cases only strain data are accessible (e.g. when a structure is monitored using strain gages), the conventional approaches cannot expand strain data. To bridge this gap, the current paper outlines a reduction/expansion technique to reduce/expand strain data. In the proposed approach, strain mode shapes of a structure are extracted using the finite element method or the digital image correlation technique. The strain mode shapes are used to generate a transformation matrix that can expand the limited set of measurement data. The proposed approach can be used to correlate experimental and analytical strain data. Furthermore, the proposed technique can be used to expand real-time operating data for structural health monitoring (SHM). To verify the accuracy of the approach, the proposed technique was used to expand a limited set of real-time operating data in a numerical model of a cantilever beam subjected to various types of excitation. The proposed technique was also applied to expand real-time operating data measured using a few strain gages mounted on an aluminum beam. It was shown that the proposed approach can effectively expand the strain data at limited locations to accurately predict the strain at locations where no sensors were placed.
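The expansion step described above can be sketched numerically: a transformation matrix built from the strain mode shapes maps the few measured strains to all model locations. The sketch below is illustrative only (random mode shapes and hypothetical gage locations, not the paper's model), assuming the standard pseudo-inverse construction T = Φ_full Φ_meas⁺.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical strain mode-shape matrix for a model with 100 DOFs and 5 modes
# (in practice these would come from FEM or digital image correlation).
n_full, n_modes = 100, 5
phi_full = rng.standard_normal((n_full, n_modes))

# Rows of phi_full at the few gage locations actually measured (hypothetical).
gage_rows = np.array([3, 17, 29, 41, 56, 68, 77, 91])
phi_meas = phi_full[gage_rows]

# Transformation matrix: expand measured strains to all DOFs via the
# pseudo-inverse of the measured partition of the mode-shape matrix.
T = phi_full @ np.linalg.pinv(phi_meas)

# Simulated "true" response lying in the modal subspace.
q = rng.standard_normal(n_modes)        # modal coordinates
eps_full_true = phi_full @ q            # full-field strain
eps_meas = eps_full_true[gage_rows]     # what the gages see

eps_full_est = T @ eps_meas             # expanded strain field
print(np.allclose(eps_full_est, eps_full_true))
```

With at least as many gages as retained modes and a full-column-rank measured partition, the noise-free field is recovered exactly; with noisy data the pseudo-inverse gives a least-squares fit within the modal subspace.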

  6. The U.S. geological survey rass-statpac system for management and statistical reduction of geochemical data

    USGS Publications Warehouse

    VanTrump, G.; Miesch, A.T.

    1977-01-01

    RASS is an acronym for Rock Analysis Storage System and STATPAC, for Statistical Package. The RASS and STATPAC computer programs are integrated into the RASS-STATPAC system for the management and statistical reduction of geochemical data. The system, in its present form, has been in use for more than 9 yr by scores of U.S. Geological Survey geologists, geochemists, and other scientists engaged in a broad range of geologic and geochemical investigations. The principal advantage of the system is the flexibility afforded the user both in data searches and retrievals and in the manner of statistical treatment of data. The statistical programs provide for most types of statistical reduction normally used in geochemistry and petrology, but also contain bridges to other program systems for statistical processing and automatic plotting.

  7. Tensor sufficient dimension reduction

    PubMed Central

    Zhong, Wenxuan; Xing, Xin; Suslick, Kenneth

    2015-01-01

    A tensor is a multiway array. With the rapid development of science and technology in the past decades, large amounts of tensor observations are now routinely collected, processed, and stored in scientific research and commercial activities. Colorimetric sensor array (CSA) data are one such example. Driven by the need to address data-analysis challenges that arise in CSA data, we propose a tensor dimension reduction model, a model assuming a nonlinear dependence between a response and a projection of all the tensor predictors. The tensor dimension reduction models are estimated in a sequential iterative fashion. The proposed method is applied to CSA data collected for 150 pathogenic bacteria from 10 bacterial species and 14 bacteria from one control species. Empirical performance demonstrates that the proposed method can greatly improve the sensitivity and specificity of the CSA technique. PMID:26594304
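As a minimal illustration of the model structure (not the authors' estimator), the sketch below simulates matrix-valued predictors whose response depends nonlinearly on a single rank-1 projection; the dimensions, the directions b1 and b2, and the sine link are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Matrix-valued predictors X_i of size p x q; response depends nonlinearly
# on the scalar rank-1 projection <X_i, b1 b2^T> = b1^T X_i b2.
n, p, q = 200, 6, 4
X = rng.standard_normal((n, p, q))
b1 = np.array([1.0, 0.0, 0.0, 0.0, 0.0, 0.0])   # row direction (hypothetical)
b2 = np.array([0.0, 1.0, 0.0, 0.0])             # column direction (hypothetical)

# One scalar projection per observation.
proj = np.einsum('ipq,p,q->i', X, b1, b2)
y = np.sin(proj) + 0.1 * rng.standard_normal(n)  # nonlinear dependence + noise

# Estimation proceeds sequentially: with b2 held fixed, each X_i reduces to
# the vector X_i b2 and b1 can be updated by a vector dimension-reduction
# step; the roles are then swapped and the procedure iterates.
Z = np.einsum('ipq,q->ip', X, b2)                # reduced predictors given b2
print(X.shape, proj.shape, Z.shape)
```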

  8. The CHARIS Integral Field Spectrograph with SCExAO: Data Reduction and Performance

    NASA Astrophysics Data System (ADS)

    Kasdin, N. Jeremy; Groff, Tyler; Brandt, Timothy; Currie, Thayne; Rizzo, Maxime; Chilcote, Jeffrey K.; Guyon, Olivier; Jovanovic, Nemanja; Lozi, Julien; Norris, Barnaby; Tamura, Motohide

    2018-01-01

    We summarize the data reduction pipeline and on-sky performance of the CHARIS Integral Field Spectrograph behind the SCExAO Adaptive Optics system on the Subaru Telescope. The open-source pipeline produces data cubes from raw detector reads using a χ²-based spectral extraction technique. It implements a number of advances, including a fit to the full nonlinear pixel response, suppression of up to a factor of ~2 in read noise, and deconvolution of the spectra with the line-spread function. The CHARIS team is currently developing the calibration and postprocessing software that will comprise the second component of the data reduction pipeline. Here, we show a range of CHARIS images, spectra, and contrast curves produced using provisional routines. CHARIS is now characterizing exoplanets simultaneously across the J, H, and K bands.
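A χ²-based extraction can be illustrated in simplified form: model the detector pixels as a linear combination of per-wavelength templates and solve the weighted normal equations. Everything below (template matrix, noise model, dimensions) is a hypothetical toy, not the CHARIS pipeline.

```python
import numpy as np

rng = np.random.default_rng(2)

# Detector pixels d modeled as a linear combination of per-wavelength
# templates A (columns = "PSFlet" profiles from a wavelength calibration)
# plus noise of known per-pixel variance sig2.  The extracted spectrum s
# minimizes chi^2 = (d - A s)^T C^-1 (d - A s).
n_pix, n_wav = 400, 20
A = np.abs(rng.standard_normal((n_pix, n_wav)))   # assumed-known templates
s_true = 1.0 + rng.random(n_wav)                  # underlying spectrum
sig2 = np.full(n_pix, 0.01)                       # pixel noise variances
d = A @ s_true + rng.standard_normal(n_pix) * np.sqrt(sig2)

# Weighted normal equations: (A^T C^-1 A) s = A^T C^-1 d
AW = A / sig2[:, None]
s_hat = np.linalg.solve(AW.T @ A, AW.T @ d)
print(float(np.max(np.abs(s_hat - s_true))))       # small extraction error
```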

  9. ORAC-DR -- spectroscopy data reduction

    NASA Astrophysics Data System (ADS)

    Hirst, Paul; Cavanagh, Brad

    ORAC-DR is a general-purpose automatic data-reduction pipeline environment. This document describes its use to reduce spectroscopy data collected at the United Kingdom Infrared Telescope (UKIRT) with the CGS4, UIST and Michelle instruments, at the Anglo-Australian Telescope (AAT) with the IRIS2 instrument, and from the Very Large Telescope with ISAAC. It outlines the algorithms used and how to make minor modifications of them, and how to correct for errors made at the telescope.

  10. A systematic comparison of the closed shoulder reduction techniques.

    PubMed

    Alkaduhimi, H; van der Linde, J A; Willigenburg, N W; van Deurzen, D F P; van den Bekerom, M P J

    2017-05-01

    To identify the optimal technique for closed reduction for shoulder instability, based on success rates, reduction time, complication risks, and pain level. A PubMed and EMBASE query was performed, screening all relevant literature of closed reduction techniques mentioning the success rate written in English, Dutch, German, and Arabic. Studies with a fracture dislocation or lacking information on success rates for closed reduction techniques were excluded. We used the modified Coleman Methodology Score (CMS) to assess the quality of included studies and excluded studies with a poor methodological quality (CMS < 50). Finally, a meta-analysis was performed on the data from all studies combined. 2099 studies were screened for their title and abstract, of which 217 studies were screened full-text and finally 13 studies were included. These studies included 9 randomized controlled trials, 2 retrospective comparative studies, and 2 prospective non-randomized comparative studies. A combined analysis revealed that scapular manipulation is the most successful (97%), fastest (1.75 min), and least painful reduction technique (VAS 1.47); the "Fast, Reliable, and Safe" (FARES) method also scores high in terms of successful reduction (92%), reduction time (2.24 min), and intra-reduction pain (VAS 1.59); the traction-countertraction technique is highly successful (95%), but slower (6.05 min) and more painful (VAS 4.75). For closed reduction of anterior shoulder dislocations, the combined data from the selected studies indicate that scapular manipulation is the most successful and fastest technique, with the shortest mean hospital stay and least pain during reduction. The FARES method seems the best alternative.

  11. Water in the Balance: A Parking Lot Story

    NASA Astrophysics Data System (ADS)

    Haas, N. A.; Vitousek, S.

    2017-12-01

    The greater Chicagoland region has seen a high degree of urbanization since 1970. For example, between 1970 and 1990 the region experienced 4% population growth and a 35% increase in urban land use, and approximately 454 square miles of agricultural land were converted mostly to urban uses. Transformation of land to urban uses in the Chicagoland region has altered the stream and catchment response to rainfall events, specifically increasing stream flashiness and urban flooding. Chicago has begun to address these changes through green infrastructure. To understand the impact of green infrastructure at local, city-wide, and watershed scales, individual projects need to be accurately and sufficiently modeled. The conversion of a traditional parking lot into a porous parking lot at the University of Illinois at Chicago was modeled using SWMM and scrutinized using field data to examine stormwater runoff and the water balance before and after reconstruction. SWMM modeling suggested an 87% reduction in peak flow and a 100% reduction in flooding for a 24-hour, 1.72-inch storm. For the same storm, field data suggest an 89% reduction in peak flow and a 100% reduction in flooding. Modeling suggested 100% reductions in flooding for longer-duration storms (24 hours or more) and a smaller reduction in peak flow (~66%). The highly parameterized SWMM model agrees well with the collected data and analysis. Further effort is being made to use data mining to create correlations within the collected datasets that can be integrated into a model that follows a standardized formation process and reduces parameterization.
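The peak-flow reduction metric quoted above is simply the relative drop in hydrograph maxima; a toy computation with made-up Gaussian-shaped hydrographs (not the study's SWMM output or field data) reproduces the arithmetic.

```python
import numpy as np

# Hypothetical pre- and post-retrofit runoff hydrographs for one storm;
# the shapes and magnitudes are illustrative placeholders only.
t = np.arange(0.0, 24.0, 0.5)                           # hours
pre = 10.0 * np.exp(-0.5 * ((t - 6.0) / 1.5) ** 2)      # pre-retrofit runoff
post = 1.3 * np.exp(-0.5 * ((t - 8.0) / 3.0) ** 2)      # post-retrofit runoff

# Relative drop in hydrograph maxima, as a percentage.
peak_reduction = 100.0 * (pre.max() - post.max()) / pre.max()
print(f"peak flow reduction: {peak_reduction:.0f}%")    # -> 87% for these numbers
```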

  12. Decreasing boron concentrations in UK rivers: insights into reductions in detergent formulations since the 1990s and within-catchment storage issues.

    PubMed

    Neal, Colin; Williams, Richard J; Bowes, Michael J; Harrass, Michael C; Neal, Margaret; Rowland, Philip; Wickham, Heather; Thacker, Sarah; Harman, Sarah; Vincent, Colin; Jarvie, Helen P

    2010-02-15

    The changing patterns of riverine boron concentration are examined for the Thames catchment in southern/southeastern England using data from 1997 to 2007. Boron concentrations are related to an independent marker for sewage effluent, sodium. The results show that boron concentrations in the main river channels have declined with time especially under baseflow conditions when sewage effluent dilution potential is at its lowest. While boron concentrations have reduced, especially under low-flow conditions, this does not fully translate to a corresponding reduction in boron flux and it seems that the "within-catchment" supplies of boron to the river are contaminated by urban sources. The estimated boron reduction in the effluent input to the river based on the changes in river chemistry is typically around 60% and this figure matches with an initial survey of more limited data for the industrial north of England. Data for effluent concentrations at eight sewage treatment works within the Kennet also indicate substantial reductions in boron concentrations: 80% reduction occurred between 2001 and 2008. For the more contaminated rivers there are issues of localised rather than catchment-wide sources and uncertainties over the extent and nature of water/boron stores. Atmospheric sources average around 32 to 61% for the cleaner and 4 to 14% for the more polluted parts. The substantial decreases in the boron concentrations correspond extremely well with the timing and extent of European wide trends for reductions in the industrial and domestic usage of boron-bearing compounds. It clearly indicates that such reductions have translated into lower average and peak concentrations of boron in the river although the full extent of these reductions has probably not yet occurred due to localised stores that are still to deplete.

  13. Cheetah: software for high-throughput reduction and analysis of serial femtosecond X-ray diffraction data

    PubMed Central

    Barty, Anton; Kirian, Richard A.; Maia, Filipe R. N. C.; Hantke, Max; Yoon, Chun Hong; White, Thomas A.; Chapman, Henry

    2014-01-01

    The emerging technique of serial X-ray diffraction, in which diffraction data are collected from samples flowing across a pulsed X-ray source at repetition rates of 100 Hz or higher, has necessitated the development of new software in order to handle the large data volumes produced. Sorting of data according to different criteria and rapid filtering of events to retain only diffraction patterns of interest results in significant reductions in data volume, thereby simplifying subsequent data analysis and management tasks. Meanwhile the generation of reduced data in the form of virtual powder patterns, radial stacks, histograms and other meta data creates data set summaries for analysis and overall experiment evaluation. Rapid data reduction early in the analysis pipeline is proving to be an essential first step in serial imaging experiments, prompting the authors to make the tool described in this article available to the general community. Originally developed for experiments at X-ray free-electron lasers, the software is based on a modular facility-independent library to promote portability between different experiments and is available under version 3 or later of the GNU General Public License. PMID:24904246
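The hit-finding and virtual-powder reduction described above can be sketched as follows; the thresholds, frame sizes, and synthetic spot injection are hypothetical stand-ins for a facility-specific configuration, not Cheetah's actual API.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic frame stack: mostly background, with diffraction-like spots
# injected into a known subset of "hit" frames.
n_frames, ny, nx = 500, 64, 64
frames = rng.poisson(1.0, size=(n_frames, ny, nx)).astype(float)
hits_idx = rng.choice(n_frames, size=50, replace=False)
frames[hits_idx, 20:24, 30:34] += 40.0        # inject synthetic Bragg-like spots

# Hit finding: keep only frames whose count of bright pixels exceeds a
# threshold (both thresholds hypothetical).
adc_thresh, min_bright = 20.0, 10
bright = (frames > adc_thresh).sum(axis=(1, 2))
is_hit = bright >= min_bright

# Reduced summary data: a virtual powder pattern accumulated over the hits.
virtual_powder = frames[is_hit].sum(axis=0)
print(int(is_hit.sum()), "of", n_frames, "frames retained")
```

Discarding the blank frames up front is what produces the large reduction in data volume the abstract describes; the virtual powder pattern then serves as a compact experiment-wide summary.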

  14. [Panel data analysis of health status in Northeast Brazil].

    PubMed

    Sousa, Tanara Rosângela Vieira; Leite Filho, Paulo Amilton Maia

    2008-10-01

    To assess health status determinants in Brazil's Northeast states. The study was carried out based on panel data analysis of aggregated information for municipalities. Data were obtained from the United Nations Development Program Atlas of Human Development and the Brazilian National Treasury Department for the years 1991 and 2000. The health status indicator was the infant mortality rate, and health determinants were the following variables: per capita health and sanitation expenditure; number of physicians per inhabitant; access to drinking water; fertility rate; illiteracy rate; percentage of adolescent mothers; per capita income; and Gini coefficient. Infant mortality rates in Northeast Brazil were reduced by 31.8% during the period studied, slightly above the national average. However, in some states, such as Rio Grande do Norte, Bahia, Ceará and Alagoas, the reduction was more significant. This can be attributed to improvement in some indicators that are main determinants of infant mortality rate reduction: greater access to education, reduction of fertility rates, increased income, and access to drinking water. The Brazilian states that showed greater gains in access to drinking water, education and income, and greater reduction of fertility rates, were also the ones that achieved major reductions in mortality of children under one year of age.

  15. Constant temperature hot wire anemometry data reduction procedure

    NASA Technical Reports Server (NTRS)

    Klopfer, G. H.

    1974-01-01

    The theory and data reduction procedure for constant temperature hot wire anemometry are presented. The procedure is valid for all Mach and Prandtl numbers, but limited to Reynolds numbers based on wire diameter between 0.1 and 300. The fluids are limited to gases which approximate ideal gas behavior. Losses due to radiation, free convection and conduction are included.
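As one concrete, simplified instance of such a reduction, a King's-law-type calibration E² = A + B·Uⁿ is commonly used to recover velocity from bridge voltage; the report's own correlation, with its Mach-, Prandtl- and Reynolds-number dependence, is more general. The constants below are hypothetical values that would normally be fit at known velocities.

```python
import numpy as np

# King's-law-type calibration: E^2 = A + B * U^n, with E the bridge voltage
# and U the flow velocity.  A, B, n are hypothetical calibration constants.
A, B, n = 1.2, 0.9, 0.45

def velocity_from_voltage(E):
    """Invert the calibration to recover velocity from bridge voltage."""
    return ((E ** 2 - A) / B) ** (1.0 / n)

E = np.array([1.5, 1.8, 2.2])        # bridge voltages (V), illustrative
U = velocity_from_voltage(E)
print(np.round(U, 2))                # recovered velocities
```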

  16. Development of a data reduction expert assistant

    NASA Technical Reports Server (NTRS)

    Miller, Glenn E.

    1994-01-01

    This report documents the development and deployment of the Data Reduction Expert Assistant (DRACO). The system was successfully applied to two astronomical research projects. The first was the removal of cosmic ray artifacts from Hubble Space Telescope (HST) Wide Field Planetary Camera data. The second was the reduction and calibration of low-dispersion CCD spectra taken from a ground-based telescope. This has validated our basic approach and demonstrated the applicability of this technology. This work has been made available to the scientific community in two ways. First, we have published the work in the scientific literature and presented papers at relevant conferences. Secondly, we have made the entire system (including documentation and source code) available to the community via the World Wide Web.

  17. The significant reduction of precipitation in Southern China during the Chinese Spring Festival

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Gong, D.

    2016-12-01

    Long-term observational data from 2001 to 2012 for 339 stations were used to analyze precipitation in southern China during the Chinese Spring Festival (CSF). The data reveal that both precipitation frequency and precipitation intensity show a significant reduction around the CSF holiday. From the second to the sixth day after the Lunar New Year's Day, the daily mean precipitation frequency anomaly is -9%; at the same time, more than 90% of the stations in the study area have negative anomalies. The precipitation intensity shows a continuous reduction from day 2 to day 4, reaching 2 mm on day 3. Other relevant variables, such as relative humidity and sunshine duration, show corresponding changes during the CSF. Changes in the atmospheric water vapor field lead to the reduction. We analyzed the circulation configuration using ERA-Interim reanalysis data; it shows that an anomalous north wind decreases the water vapor supply and thereby suppresses precipitation during the CSF period. Pollutant concentrations decrease around CSF, which may influence the meteorological field and induce the anomalous north wind. Based on data from the S2S (sub-seasonal to seasonal) prediction project, we calculated the difference in circulation forecasts for the CSF period between clean and polluted days. The result confirms the north wind anomaly and suggests that the aerosol decrease caused by reduced human activity may be partly responsible for the precipitation reduction during CSF.

  18. AST Critical Propulsion and Noise Reduction Technologies for Future Commercial Subsonic Engines: Separate-Flow Exhaust System Noise Reduction Concept Evaluation

    NASA Technical Reports Server (NTRS)

    Janardan, B. A.; Hoff, G. E.; Barter, J. W.; Martens, S.; Gliebe, P. R.; Mengle, V.; Dalton, W. N.; Saiyed, Naseem (Technical Monitor)

    2000-01-01

    This report describes the work performed by General Electric Aircraft Engines (GEAE) and Allison Engine Company (AEC) on NASA Contract NAS3-27720 AoI 14.3. The objective of this contract was to generate quality jet noise acoustic data for separate-flow nozzle models and to design and verify new jet-noise-reduction concepts over a range of simulated engine cycles and flight conditions. Five baseline axisymmetric separate-flow nozzle models having bypass ratios of five and eight with internal and external plugs and 11 different mixing-enhancer model nozzles (including chevrons, vortex-generator doublets, and a tongue mixer) were designed and tested in model scale. Using available core and fan nozzle hardware in various combinations, 28 GEAE/AEC separate-flow nozzle/mixing-enhancer configurations were acoustically evaluated in the NASA Glenn Research Center Aeroacoustic and Propulsion Laboratory. This report describes model nozzle features, facility and data acquisition/reduction procedures, the test matrix, and measured acoustic data analyses. A number of tested core and fan mixing enhancer devices and combinations of devices gave significant jet noise reduction relative to separate-flow baseline nozzles. Inward-flip and alternating-flip core chevrons combined with a straight-chevron fan nozzle exceeded the NASA stretch goal of 3 EPNdB jet noise reduction at typical sideline certification conditions.

  19. Comparison of photogrammetric and astrometric data reduction results for the wild BC-4 camera

    NASA Technical Reports Server (NTRS)

    Hornbarger, D. H.; Mueller, I. I.

    1971-01-01

    The results of astrometric and photogrammetric plate reduction techniques for a short focal length camera are compared. Several astrometric models are tested on entire and limited plate areas to analyze their ability to remove systematic errors from interpolated satellite directions using a rigorous photogrammetric reduction as a standard. Residual plots are employed to graphically illustrate the analysis. Conclusions are made as to what conditions will permit the astrometric reduction to achieve comparable accuracies to those of photogrammetric reduction when applied for short focal length ballistic cameras.

  20. Noise Reduction by Signal Accumulation

    ERIC Educational Resources Information Center

    Kraftmakher, Yaakov

    2006-01-01

    The aim of this paper is to show how the noise reduction by signal accumulation can be accomplished with a data acquisition system. This topic can be used for student projects. In many cases, the noise reduction is an unavoidable part of experimentation. Several techniques are known for this purpose, and among them the signal accumulation is the…
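The idea can be demonstrated with a data-acquisition-style simulation: averaging N repeated records of a periodic signal leaves the signal unchanged while uncorrelated noise shrinks by roughly a factor of √N. The waveform and N below are arbitrary illustrative choices, not the paper's experiment.

```python
import numpy as np

rng = np.random.default_rng(4)

# One period of a clean test signal, sampled 1000 times.
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t)

# Accumulate N noisy records of the same signal and average them.
N = 400
records = signal + rng.standard_normal((N, t.size))   # one noisy record per row
accumulated = records.mean(axis=0)

noise_single = np.std(records[0] - signal)            # ~1 for a single record
noise_accum = np.std(accumulated - signal)            # ~1/sqrt(N) after averaging
print(noise_single / noise_accum)                     # close to sqrt(400) = 20
```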

  1. The impact of vessel speed reduction on port accidents.

    PubMed

    Chang, Young-Tae; Park, Hyosoo

    2016-03-19

    Reduced-speed zones (RSZs) have been designated across the world to control emissions from ships and prevent mammal strikes. While some studies have examined the effectiveness of speed reduction on emissions and mammal preservation, few have analyzed the effects of reduced ship speed on vessel safety. Those few studies have not yet measured the relationship between vessel speed and accidents by using real accident data. To fill this gap in the literature, this study estimates the impact of vessel speed reduction on vessel damages, casualties and frequency of vessel accidents. Accidents in RSZ ports were compared to non-RSZ ports by using U.S. Coast Guard data to capture the speed reduction effects. The results show that speed reduction influenced accident frequency as a result of two factors, the fuel price and the RSZ designation. Every $10 increase in the fuel price led to a 10.3% decrease in the number of accidents, and the RSZ designation reduced vessel accidents by 47.9%. However, the results do not clarify the exact impact of speed reduction on accident casualty.

  2. STS-1 Pogo analysis

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Some of the pogo related data from STS-1 are documented. The measurements and data reduction are described. In the data analysis reference is made to FRF and single engine test results. The measurements are classified under major project elements of the space shuttle main engine, the external tank, and the orbiter. The subsystems are structural dynamics and main propulsion. Data were recorded onboard the orbiter with a minimum response rate of 1.5 to 50 Hz. The wideband, 14 track recorder was used, and the data required demultiplexing before reduction. The flight phase of interest was from liftoff through main engine cutoff.

  3. The SPHERE Data Center: a reference for high contrast imaging processing

    NASA Astrophysics Data System (ADS)

    Delorme, P.; Meunier, N.; Albert, D.; Lagadec, E.; Le Coroller, H.; Galicher, R.; Mouillet, D.; Boccaletti, A.; Mesa, D.; Meunier, J.-C.; Beuzit, J.-L.; Lagrange, A.-M.; Chauvin, G.; Sapone, A.; Langlois, M.; Maire, A.-L.; Montargès, M.; Gratton, R.; Vigan, A.; Surace, C.

    2017-12-01

    The objective of the SPHERE Data Center is to optimize the scientific return of SPHERE at the VLT by providing optimized reduction procedures, services to users, and publicly available reduced data. This paper describes our motivation, the implementation of the service (partners, infrastructure and developments), the services offered, a description of the on-line data, and future developments. The SPHERE Data Center is operational and has already provided reduced data promptly to many observers. The first public reduced data were made available in 2017. The SPHERE Data Center has gathered strong expertise on SPHERE data and is in a very good position to propose new reduced data in the future, as well as improved reduction procedures.

  4. Development of the EarthChem Geochronology and Thermochronology database: Collaboration of the EarthChem and EARTHTIME efforts

    NASA Astrophysics Data System (ADS)

    Walker, J. D.; Ash, J. M.; Bowring, J.; Bowring, S. A.; Deino, A. L.; Kislitsyn, R.; Koppers, A. A.

    2009-12-01

    One of the most onerous tasks in rigorous development of data reporting and databases for geochronological and thermochronological studies is to fully capture all of the metadata needed to completely document both the analytical work and the interpretation effort. This information is available in the data reduction programs used by researchers, but has proven difficult to harvest into either publications or databases. For this reason, the EarthChem and EARTHTIME efforts are collaborating to foster the next generation of data management and discovery for age information by integrating data reporting with data reduction. EarthChem is a community-driven effort to facilitate the discovery, access, and preservation of geochemical data of all types and to support research and enable new and better science. EARTHTIME is also a community-initiated project whose aim is to foster the next generation of high-precision geochronology and thermochronology. In addition, collaboration with the CRONUS effort for cosmogenic radionuclides is in progress. EarthChem workers have met with groups working on the Ar-Ar, U-Pb, and (U-Th)/He systems to establish data reporting requirements as well as XML schemas to be used for transferring data from reduction programs to database. At present, we have prototype systems working for the U-Pb_Redux, ArArCalc, MassSpec, and Helios programs. In each program, the user can select to upload data and metadata to the GEOCHRON system hosted at EarthChem. There are two additional requirements for upload. The first is having a unique identifier (IGSN), obtained either manually or via web services contained within the reduction program, from the SESAR system. The second is that the user selects whether the sample is to be available for discovery (public) or remain hidden (private). Searches for data at the GEOCHRON portal can be done using age, method, mineral, or location parameters. Data can be downloaded in the full XML format for ingestion back into the reduction program or as abbreviated tables.

  5. Reduction of variable-truncation artifacts from beam occlusion during in situ x-ray tomography

    NASA Astrophysics Data System (ADS)

    Borg, Leise; Jørgensen, Jakob S.; Frikel, Jürgen; Sporring, Jon

    2017-12-01

    Many in situ x-ray tomography studies require experimental rigs which may partially occlude the beam and cause parts of the projection data to be missing. In a study of fluid flow in porous chalk using a percolation cell with four metal bars, drastic streak artifacts arise in the filtered backprojection (FBP) reconstruction at certain orientations. Projections with non-trivial variable truncation caused by the metal bars are the source of these variable-truncation artifacts. To understand the artifacts, a mathematical model of variable-truncation data as a function of metal bar radius and distance to sample is derived and verified numerically and with experimental data. The model accurately describes the arising variable-truncation artifacts across simulated variations of the experimental setup. Three variable-truncation artifact-reduction methods are proposed, all aimed at addressing the sinogram discontinuities that are shown to be the source of the streaks. The 'reduction to limited angle' (RLA) method simply keeps only non-truncated projections; the 'detector-directed smoothing' (DDS) method smooths the discontinuities; and the 'reflexive boundary condition' (RBC) method enforces a zero derivative at the discontinuities. Experimental results using both simulated and real data show that the proposed methods effectively reduce variable-truncation artifacts. The RBC method is found to provide the best artifact reduction and preservation of image features under both visual and quantitative assessment. The analysis and artifact-reduction methods are designed in the context of FBP reconstruction, motivated by the computational efficiency required for large, real synchrotron data. While a specific variable-truncation case is considered, the proposed methods can be applied to general data cut-offs arising in different in situ x-ray tomography experiments.
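Of the three methods, the 'reduction to limited angle' (RLA) step is the simplest to sketch: discard the projections flagged as truncated before reconstructing. The sinogram, bar orientations, and flagging rule below are synthetic placeholders, not the authors' implementation.

```python
import numpy as np

# Synthetic sinogram: one row per projection angle, one column per detector bin.
n_angles, n_det = 180, 128
sinogram = np.ones((n_angles, n_det))
angles = np.arange(n_angles)  # projection angles in degrees (1-degree steps)

# Hypothetical occlusion flag: projections within 15 degrees of the bar
# orientations (here taken to be near 45 and 135 degrees) are truncated.
truncated = (np.abs(angles - 45) < 15) | (np.abs(angles - 135) < 15)

# RLA: keep only the complete (non-truncated) projections, yielding a
# limited-angle data set that FBP can reconstruct without the streak-causing
# sinogram discontinuities.
sinogram_rla = sinogram[~truncated]
angles_rla = angles[~truncated]
print(sinogram_rla.shape[0], "of", n_angles, "projections kept")
```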

  6. Aerodynamic investigations into various low speed L/D improvement devices on the 140A/B space shuttle orbiter configuration in the Rockwell International low speed wind tunnel (OA86)

    NASA Technical Reports Server (NTRS)

    Mennell, R. C.

    1974-01-01

    Tests were conducted to investigate various base drag reduction techniques in an attempt to improve Orbiter lift-to-drag ratios and to calculate sting interference effects on the Orbiter aerodynamic characteristics. Test conditions and facilities, and model dimensional data, are presented along with the data reduction guidelines and data set/run number collation used for the studies. Aerodynamic force and moment data and the results of stability and control tests are also given.

  7. The thirteenth data release of the Sloan Digital Sky Survey: First spectroscopic data from the SDSS-IV survey mapping nearby galaxies at Apache Point Observatory

    DOE PAGES

    Franco D. Albareti

    2017-12-08

    The fourth generation of the Sloan Digital Sky Survey (SDSS-IV) began observations in July 2014. It pursues three core programs: APOGEE-2, MaNGA, and eBOSS. In addition, eBOSS contains two major subprograms: TDSS and SPIDERS. This paper describes the first data release from SDSS-IV, Data Release 13 (DR13), which contains new data, reanalysis of existing data sets and, like all SDSS data releases, is inclusive of previously released data. DR13 makes publicly available 1390 spatially resolved integral field unit observations of nearby galaxies from MaNGA, the first data released from this survey. It includes new observations from eBOSS, completing SEQUELS. In addition to targeting galaxies and quasars, SEQUELS also targeted variability-selected objects from TDSS and X-ray selected objects from SPIDERS. DR13 includes new reductions of the SDSS-III BOSS data, improving the spectrophotometric calibration and redshift classification. DR13 releases new reductions of the APOGEE-1 data from SDSS-III, with abundances of elements not previously included and improved stellar parameters for dwarf stars and cooler stars. For the SDSS imaging data, DR13 provides new, more robust and precise photometric calibrations. Several value-added catalogs are being released in tandem with DR13, in particular target catalogs relevant for eBOSS, TDSS, and SPIDERS, and an updated red-clump catalog for APOGEE. This paper describes the location and format of the data now publicly available, as well as providing references to the important technical papers that describe the targeting, observing, and data reduction. The SDSS website, this http URL, provides links to the data, tutorials and examples of data access, and extensive documentation of the reduction and analysis procedures. DR13 is the first of a scheduled set that will contain new data and analyses from the planned ~6-year operations of SDSS-IV.

  8. The thirteenth data release of the Sloan Digital Sky Survey: First spectroscopic data from the SDSS-IV survey mapping nearby galaxies at Apache Point Observatory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franco D. Albareti

    The fourth generation of the Sloan Digital Sky Survey (SDSS-IV) began observations in July 2014. It pursues three core programs: APOGEE-2, MaNGA, and eBOSS. In addition, eBOSS contains two major subprograms: TDSS and SPIDERS. This paper describes the first data release from SDSS-IV, Data Release 13 (DR13), which contains new data, reanalysis of existing data sets and, like all SDSS data releases, is inclusive of previously released data. DR13 makes publicly available 1390 spatially resolved integral field unit observations of nearby galaxies from MaNGA, the first data released from this survey. It includes new observations from eBOSS, completing SEQUELS. In addition to targeting galaxies and quasars, SEQUELS also targeted variability-selected objects from TDSS and X-ray selected objects from SPIDERS. DR13 includes new reductions of the SDSS-III BOSS data, improving the spectrophotometric calibration and redshift classification. DR13 releases new reductions of the APOGEE-1 data from SDSS-III, with abundances of elements not previously included and improved stellar parameters for dwarf stars and cooler stars. For the SDSS imaging data, DR13 provides new, more robust and precise photometric calibrations. Several value-added catalogs are being released in tandem with DR13, in particular target catalogs relevant for eBOSS, TDSS, and SPIDERS, and an updated red-clump catalog for APOGEE. This paper describes the location and format of the data now publicly available, as well as providing references to the important technical papers that describe the targeting, observing, and data reduction. The SDSS website, this http URL, provides links to the data, tutorials and examples of data access, and extensive documentation of the reduction and analysis procedures. DR13 is the first of a scheduled set that will contain new data and analyses from the planned ~6-year operations of SDSS-IV.

  9. ESO Advanced Data Products for the Virtual Observatory

    NASA Astrophysics Data System (ADS)

    Retzlaff, J.; Delmotte, N.; Rite, C.; Rosati, P.; Slijkhuis, R.; Vandame, B.

    2006-07-01

    Advanced Data Products, that is, completely reduced, fully characterized science-ready data sets, play a crucial role for the success of the Virtual Observatory as a whole. We report on ongoing work at ESO towards the creation and publication of Advanced Data Products in compliance with present VO standards on resource metadata. The new deep NIR multi-color mosaic of the GOODS/CDF-S region is used to showcase different aspects of the entire process: data reduction employing our MVM-based reduction pipeline, calibration and data-characterization procedures, standardization of metadata content, and, finally, an outlook on the scientific potential illustrated by new results on deep galaxy number counts.

  10. Off-line data reduction

    NASA Astrophysics Data System (ADS)

    Gutowski, Marek W.

    1992-12-01

    Presented is a novel, heuristic algorithm, based on fuzzy set theory, allowing for significant off-line data reduction. Given equidistant data, the algorithm discards some points while retaining others with their original values. The fraction of original data points retained is typically 1/6 of the initial number. The reduced data set preserves all the essential features of the input curve. It is possible to reconstruct the original information to a high degree of precision by means of natural cubic splines, rational cubic splines or even linear interpolation. The main fields of application should be non-linear data fitting (substantial savings in CPU time) and graphics (storage space savings).
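    The abstract's algorithm is fuzzy-set based; the general idea of keeping only informative points and reconstructing the rest can be sketched with a much simpler second-difference (curvature) criterion. This is a hypothetical stand-in for illustration, not the published method:

    ```python
    def reduce_points(xs, ys, tol):
        """Keep endpoints plus interior points whose second difference
        (a discrete curvature measure) exceeds tol; others are discarded."""
        keep = [0]
        for i in range(1, len(ys) - 1):
            if abs(ys[i - 1] - 2 * ys[i] + ys[i + 1]) > tol:
                keep.append(i)
        keep.append(len(ys) - 1)
        return [(xs[i], ys[i]) for i in keep]

    def interp(points, x):
        """Reconstruct a value by linear interpolation between retained points
        (the abstract notes splines give higher precision)."""
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            if x0 <= x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        raise ValueError("x outside retained range")
    ```

    On a straight-line segment every interior point is discarded and linear interpolation recovers the original values exactly, which is why smooth regions compress so well.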

  11. Content Abstract Classification Using Naive Bayes

    NASA Astrophysics Data System (ADS)

    Latif, Syukriyanto; Suwardoyo, Untung; Aldrin Wihelmus Sanadi, Edwin

    2018-03-01

    This study aims to classify abstract content according to the words occurring most frequently in the abstracts of English-language journals. The research uses text-mining technology, which extracts text data to retrieve information from a set of documents. A total of 120 abstracts were downloaded from www.computer.org and grouped into three categories: DM (Data Mining), ITS (Intelligent Transport System) and MM (Multimedia). The system uses the naive Bayes algorithm to classify the abstracts, with a feature-selection process that applies term weighting to assign a weight to each word. Dimensionality-reduction techniques discard words that rarely appear in each document, with reduction parameters tested from 10% to 90% of the 5,344 words. Classification performance was evaluated with a confusion matrix comparing predictions against held-out test data. The best results were obtained with a 75% training / 25% test split of the total data. Accuracy rates for the DM, ITS and MM categories were 100%, 100% and 86%, respectively, with a dimension-reduction parameter of 30% and a learning rate between 0.1 and 0.5.
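    The core classifier is standard multinomial naive Bayes with Laplace smoothing. A minimal sketch in plain Python; the toy documents and the omission of term weighting and dimensionality reduction are simplifications of the study's pipeline:

    ```python
    import math
    from collections import Counter, defaultdict

    def train_nb(docs):
        """docs: list of (label, list-of-words). Returns priors, per-label
        word counts, and the vocabulary."""
        counts = defaultdict(Counter)   # label -> word frequencies
        prior = Counter()               # label -> number of documents
        vocab = set()
        for label, words in docs:
            prior[label] += 1
            counts[label].update(words)
            vocab.update(words)
        return prior, counts, vocab

    def classify(model, words):
        """Pick the label maximizing log P(label) + sum log P(word|label),
        with add-one (Laplace) smoothing over the vocabulary."""
        prior, counts, vocab = model
        total_docs = sum(prior.values())
        best, best_lp = None, float("-inf")
        for label in prior:
            n = sum(counts[label].values())
            lp = math.log(prior[label] / total_docs)
            for w in words:
                lp += math.log((counts[label][w] + 1) / (n + len(vocab)))
            if lp > best_lp:
                best, best_lp = label, lp
        return best
    ```

    In the study, raw counts would be replaced by term weights and the vocabulary pruned by the dimension-reduction parameter before training.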

  12. The GONG Data Reduction and Analysis System. [solar oscillations

    NASA Technical Reports Server (NTRS)

    Pintar, James A.; Andersen, Bo Nyborg; Andersen, Edwin R.; Armet, David B.; Brown, Timothy M.; Hathaway, David H.; Hill, Frank; Jones, Harrison P.

    1988-01-01

    Each of the six GONG observing stations will produce three, 16-bit, 256X256 images of the Sun every 60 sec of sunlight. These data will be transferred from the observing sites to the GONG Data Management and Analysis Center (DMAC), in Tucson, on high-density tapes at a combined rate of over 1 gigabyte per day. The contemporaneous processing of these data will produce several standard data products and will require a sustained throughput in excess of 7 megaflops. Peak rates may exceed 50 megaflops. Archives will accumulate at the rate of approximately 1 terabyte per year, reaching nearly 3 terabytes in 3 yr of observing. Researchers will access the data products with a machine-independent GONG Reduction and Analysis Software Package (GRASP). Based on the Image Reduction and Analysis Facility, this package will include database facilities and helioseismic analysis tools. Users may access the data as visitors in Tucson, or may access DMAC remotely through networks, or may process subsets of the data at their local institutions using GRASP or other systems of their choice. Elements of the system will reach the prototype stage by the end of 1988. Full operation is expected in 1992 when data acquisition begins.

  13. VESUVIO Data Analysis Goes MANTID

    NASA Astrophysics Data System (ADS)

    Jackson, S.; Krzystyniak, M.; Seel, A. G.; Gigg, M.; Richards, S. E.; Fernandez-Alonso, F.

    2014-12-01

    This paper describes ongoing efforts to implement the reduction and analysis of neutron Compton scattering data within the MANTID framework. Recently, extensive work has been carried out to integrate the bespoke data reduction and analysis routines written for VESUVIO with the MANTID framework. While the programs described in this document are designed to replicate the functionality of the Fortran and Genie routines already in use, most of them have been written from scratch and are not based on the original code base.

  14. Data Reduction and Analysis from the SOHO Spacecraft

    NASA Technical Reports Server (NTRS)

    Ipavich, F. M.

    1999-01-01

    This paper presents a final report on Data Reduction and Analysis from the SOHO Spacecraft, covering November 1, 1996 through October 31, 1999. The topics include: 1) Instrumentation; 2) Health of Instrument; 3) Solar Wind Web Page; 4) Data Analysis; and 5) Science. This paper also includes appendices describing routine SOHO (Solar and Heliospheric Observatory) tasks, SOHO Science Procedures in the UMTOF (University Mass Determining Time-of-Flight) System, SOHO Programs on UMTOF and a list of publications.

  15. Impact of hydrogeological data on measures of uncertainty, site characterization and environmental performance metrics

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe P. J.; Ezzedine, Souheil; Rubin, Yoram

    2012-02-01

    The significance of conditioning predictions of environmental performance metrics (EPMs) on hydrogeological data in heterogeneous porous media is addressed. Conditioning EPMs on available data reduces uncertainty and increases the reliability of model predictions. We present a rational and concise approach to investigate the impact of conditioning EPMs on data as a function of the location of the environmentally sensitive target receptor, data types and spacing between measurements. We illustrate how the concept of comparative information yield curves introduced in de Barros et al. [de Barros FPJ, Rubin Y, Maxwell R. The concept of comparative information yield curves and its application to risk-based site characterization. Water Resour Res 2009;45:W06401. doi:10.1029/2008WR007324] could be used to assess site characterization needs as a function of flow and transport dimensionality and EPMs. For a given EPM, we show how alternative uncertainty reduction metrics yield distinct gains of information from a variety of sampling schemes. Our results show that uncertainty reduction is EPM dependent (e.g., travel times) and does not necessarily indicate uncertainty reduction in an alternative EPM (e.g., human health risk). The results show how the position of the environmental target, flow dimensionality and the choice of the uncertainty reduction metric can be used to assist in field sampling campaigns.

  16. Alternative Fuels Data Center: Active Transit

    Science.gov Websites


  17. The Cost-Effectiveness of Surgical Fixation of Distal Radial Fractures: A Computer Model-Based Evaluation of Three Operative Modalities.

    PubMed

    Rajan, Prashant V; Qudsi, Rameez A; Dyer, George S M; Losina, Elena

    2018-02-07

    There is no consensus on the optimal fixation method for patients who require a surgical procedure for distal radial fractures. We used cost-effectiveness analyses to determine which of 3 modalities offers the best value: closed reduction and percutaneous pinning, open reduction and internal fixation, or external fixation. We developed a Markov model that projected short-term and long-term health benefits and costs in patients undergoing a surgical procedure for a distal radial fracture. Simulations began at the patient age of 50 years and were run over the patient's lifetime. The analysis was conducted from health-care payer and societal perspectives. We estimated transition probabilities and quality-of-life values from the literature and determined costs from Medicare reimbursement schedules in 2016 U.S. dollars. Suboptimal postoperative outcomes were determined by rates of reduction loss (4% for closed reduction and percutaneous pinning, 1% for open reduction and internal fixation, and 11% for external fixation) and rates of orthopaedic complications. Procedural costs were $7,638 for closed reduction and percutaneous pinning, $10,170 for open reduction and internal fixation, and $9,886 for external fixation. Outputs were total costs and quality-adjusted life-years (QALYs), discounted at 3% per year. We considered willingness-to-pay thresholds of $50,000 and $100,000. We conducted deterministic and probabilistic sensitivity analyses to evaluate the impact of data uncertainty. From the health-care payer perspective, closed reduction and percutaneous pinning dominated (i.e., produced greater QALYs at lower costs than) open reduction and internal fixation and dominated external fixation. From the societal perspective, the incremental cost-effectiveness ratio for closed reduction and percutaneous pinning compared with open reduction and internal fixation was $21,058 per QALY and external fixation was dominated. 
In probabilistic sensitivity analysis, open reduction and internal fixation was cost-effective roughly 50% of the time compared with roughly 45% for closed reduction and percutaneous pinning. When considering data uncertainty, there is only a 5% to 10% difference in the frequency of probability combinations that find open reduction and internal fixation to be more cost-effective. The current degree of uncertainty in the data produces difficulty in distinguishing either strategy as being more cost-effective overall and thus it may be left to surgeon and patient shared decision-making. Economic Level III. See Instructions for Authors for a complete description of levels of evidence.
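    The two quantities driving this analysis, discounting at 3% per year and the incremental cost-effectiveness ratio (ICER), reduce to a few lines of arithmetic. A sketch with hypothetical QALY values, since the abstract reports the procedural costs but not the QALY outputs:

    ```python
    def discounted(stream, rate=0.03):
        """Present value of a yearly stream of costs or QALYs,
        discounted at `rate` per year (3% as in the study)."""
        return sum(v / (1.0 + rate) ** t for t, v in enumerate(stream))

    def icer(cost_new, qaly_new, cost_ref, qaly_ref):
        """Incremental cost-effectiveness ratio of 'new' vs 'ref'.
        Returns None when 'new' dominates (no costlier, at least as effective)."""
        dc = cost_new - cost_ref
        dq = qaly_new - qaly_ref
        if dc <= 0 and dq >= 0:
            return None  # dominance: cheaper (or equal) and no less effective
        if dq == 0:
            return float("inf")  # more costly, equally effective
        return dc / dq

    # Hypothetical comparison in the spirit of the study: the ICER is then
    # judged against a willingness-to-pay threshold ($50,000 or $100,000/QALY).
    # e.g. icer(10170.0, q_orif, 7638.0, q_crpp) <= 50000
    ```

    A strategy is "dominated" exactly when `icer` returns None for its competitor, which is how closed reduction and percutaneous pinning beats external fixation in the payer-perspective analysis.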

  18. Youpi: YOUr processing PIpeline

    NASA Astrophysics Data System (ADS)

    Monnerville, Mathias; Sémah, Gregory

    2012-03-01

    Youpi is a portable, easy to use web application providing high level functionalities to perform data reduction on scientific FITS images. Built on top of various open source reduction tools released to the community by TERAPIX (http://terapix.iap.fr), Youpi can help organize data, manage processing jobs on a computer cluster in real time (using Condor) and facilitate teamwork by allowing fine-grain sharing of results and data. Youpi is modular and comes with plugins which perform, from within a browser, various processing tasks such as evaluating the quality of incoming images (using the QualityFITS software package), computing astrometric and photometric solutions (using SCAMP), resampling and co-adding FITS images (using SWarp) and extracting sources and building source catalogues from astronomical images (using SExtractor). Youpi is useful for small to medium-sized data reduction projects; it is free and is published under the GNU General Public License.

  19. Sample size and power calculations for detecting changes in malaria transmission using antibody seroconversion rate.

    PubMed

    Sepúlveda, Nuno; Paulino, Carlos Daniel; Drakeley, Chris

    2015-12-30

    Several studies have highlighted the use of serological data in detecting a reduction in malaria transmission intensity. These studies have typically used serology as an adjunct measure and no formal examination of sample size calculations for this approach has been conducted. A sample size calculator is proposed for cross-sectional surveys using data simulation from a reverse catalytic model assuming a reduction in seroconversion rate (SCR) at a given change point before sampling. This calculator is based on logistic approximations for the underlying power curves to detect a reduction in SCR in relation to the hypothesis of a stable SCR for the same data. Sample sizes are illustrated for a hypothetical cross-sectional survey from an African population assuming a known or unknown change point. Overall, data simulation demonstrates that power is strongly affected by assuming a known or unknown change point. Small sample sizes are sufficient to detect strong reductions in SCR, but invariantly lead to poor precision of estimates for current SCR. In this situation, sample size is better determined by controlling the precision of SCR estimates. Conversely larger sample sizes are required for detecting more subtle reductions in malaria transmission but those invariantly increase precision whilst reducing putative estimation bias. The proposed sample size calculator, although based on data simulation, shows promise of being easily applicable to a range of populations and survey types. Since the change point is a major source of uncertainty, obtaining or assuming prior information about this parameter might reduce both the sample size and the chance of generating biased SCR estimates.
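    The reverse catalytic model underlying the data simulation can be written down directly. A minimal sketch assuming the common two-parameter form with a constant seroreversion rate; the parameter values and survey ages are illustrative, and the likelihood-ratio fitting step that turns repeated simulations into a power estimate is omitted:

    ```python
    import math
    import random

    def seroprev(age, scr, srr):
        """Reversible catalytic model: seroprevalence at a given age for
        seroconversion rate scr and seroreversion rate srr."""
        k = scr + srr
        return (scr / k) * (1.0 - math.exp(-k * age))

    def simulate_survey(ages, scr, srr, rng):
        """Draw one cross-sectional serological survey:
        1 = seropositive, 0 = seronegative."""
        return [1 if rng.random() < seroprev(a, scr, srr) else 0 for a in ages]
    ```

    A power calculation then repeats `simulate_survey` many times under a reduced-SCR scenario and records how often the stable-SCR hypothesis is rejected; the logistic approximation in the paper summarizes those curves.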

  20. The automated data processing architecture for the GPI Exoplanet Survey

    NASA Astrophysics Data System (ADS)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce

    2017-09-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.

  1. Alternative Fuels Data Center

    Science.gov Websites

    pounds to compensate for the additional weight of the idle reduction technology. Upon request, vehicle operators must provide proof that the idle reduction technology is fully functional. (Reference Alaska

  2. Insurance, Public Assistance, and Household Flood Risk Reduction: A Comparative Study of Austria, England, and Romania.

    PubMed

    Hanger, Susanne; Linnerooth-Bayer, Joanne; Surminski, Swenja; Nenciu-Posner, Cristina; Lorant, Anna; Ionescu, Radu; Patt, Anthony

    2018-04-01

    In light of increasing losses from floods, many researchers and policymakers are looking for ways to encourage flood risk reduction among communities, business, and households. In this study, we investigate risk-reduction behavior at the household level in three European Union Member States with fundamentally different insurance and compensation schemes. We try to understand if and how insurance and public assistance influence private risk-reduction behavior. Data were collected using a telephone survey (n = 1,849) of household decisionmakers in flood-prone areas. We show that insurance overall is positively associated with private risk-reduction behavior. Warranties, premium discounts, and information provision with respect to risk reduction may be an explanation for this positive relationship in the case of structural measures. Public incentives for risk-reduction measures by means of financial and in-kind support, and particularly through the provision of information, are also associated with enhancing risk reduction. In this study, public compensation is not negatively associated with private risk-reduction behavior. This does not disprove such a relationship, but the negative effect may be mitigated by factors related to respondents' capacity to implement measures or social norms that were not included in the analysis. The data suggest that large-scale flood protection infrastructure creates a sense of security that is associated with a lower level of preparedness. Across the board there is ample room to improve both public and private policies to provide effective incentives for household-level risk reduction. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  3. Demonstration of short haul aircraft aft noise reduction techniques on a twenty inch (50.8 cm) diameter fan, volume 3

    NASA Technical Reports Server (NTRS)

    Stimpert, D. L.

    1975-01-01

    Tests of a twenty-inch diameter, low tip speed, low pressure ratio fan which investigated aft fan noise reduction techniques are reported. The 1/3-octave-band sound data are presented for all configurations tested. The model data are presented on a 17-foot arc and extrapolated to a 200-foot sideline.

  4. Trauma-Informed Guilt Reduction (TrIGR) Intervention

    DTIC Science & Technology

    2017-10-01

    Award Number: W81XWH-15-1-0331. Title: Trauma-Informed Guilt Reduction (TrIGR) Intervention. Principal Investigator: Christy Capone, PhD.

  5. Are There Hidden Supernovae?

    NASA Technical Reports Server (NTRS)

    Bregman, Jesse; Harker, David; Dunham, E.; Rank, David; Temi, Pasquale

    1997-01-01

    Ames Research Center and UCSC have been working on the development of a mid-IR camera for the KAO in order to search for extragalactic supernovae. The development of the camera and its associated data-reduction software has been successfully completed. Spectral imaging of the Orion Bar at 6.2 and 7.8 microns demonstrates the derotation and data-reduction software that was developed.

  6. NASA Office of Aeronautics and Space Technology Summer Workshop. Volume 1: Data processing and transfer panel

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The data processing and transfer technology areas that need to be developed and that could benefit from space flight experiments are identified. Factors considered include: user requirements, concepts in 'Outlook for Space', and cost reduction. Major program thrusts formulated are an increase in end-to-end information handling and a reduction in life cycle costs.

  7. Improving Prediction Accuracy for WSN Data Reduction by Applying Multivariate Spatio-Temporal Correlation

    PubMed Central

    Carvalho, Carlos; Gomes, Danielo G.; Agoulmine, Nazim; de Souza, José Neuman

    2011-01-01

    This paper proposes a method based on multivariate spatial and temporal correlation to improve prediction accuracy in data reduction for Wireless Sensor Networks (WSN). Prediction of data not sent to the sink node is a technique used to save energy in WSNs by reducing the amount of data traffic. However, it may not be very accurate. Simulations were made involving simple linear regression and multiple linear regression functions to assess the performance of the proposed method. The results show a higher correlation between gathered inputs when compared to time, which is an independent variable widely used for prediction and forecasting. Prediction accuracy is lower when simple linear regression is used, whereas multiple linear regression is the most accurate. In addition, our proposal outperforms some current solutions by about 50% in humidity prediction and 21% in light prediction. To the best of our knowledge, this is the first work to address prediction based on multivariate correlation for WSN data reduction. PMID:22346626
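    The univariate-versus-multivariate comparison rests on ordinary least squares. A sketch of multiple linear regression via the normal equations in plain Python; the design-matrix layout (leading 1 for the intercept) and the fitted values are illustrative, not the paper's simulation setup:

    ```python
    def ols(X, y):
        """Ordinary least squares: solve (X'X) beta = X'y with Gaussian
        elimination and partial pivoting. Each row of X starts with 1
        for the intercept term."""
        n, p = len(X), len(X[0])
        A = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(p)]
             for j in range(p)]
        b = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
        for col in range(p):
            piv = max(range(col, p), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            b[col], b[piv] = b[piv], b[col]
            for r in range(col + 1, p):
                f = A[r][col] / A[col][col]
                for c in range(col, p):
                    A[r][c] -= f * A[col][c]
                b[r] -= f * b[col]
        beta = [0.0] * p
        for r in range(p - 1, -1, -1):
            beta[r] = (b[r] - sum(A[r][c] * beta[c]
                                  for c in range(r + 1, p))) / A[r][r]
        return beta

    def predict(beta, x):
        """Predict one reading from fitted coefficients; with several
        correlated sensor inputs this is the multivariate case."""
        return sum(w * v for w, v in zip(beta, x))
    ```

    Dropping all predictors except time recovers the simple-regression baseline the paper finds less accurate.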

  8. A sparse grid based method for generative dimensionality reduction of high-dimensional data

    NASA Astrophysics Data System (ADS)

    Bohn, Bastian; Garcke, Jochen; Griebel, Michael

    2016-03-01

    Generative dimensionality reduction methods play an important role in machine learning applications because they construct an explicit mapping from a low-dimensional space to the high-dimensional data space. We discuss a general framework to describe generative dimensionality reduction methods, where the main focus lies on a regularized principal manifold learning variant. Since most generative dimensionality reduction algorithms exploit the representer theorem for reproducing kernel Hilbert spaces, their computational costs grow at least quadratically in the number n of data. Instead, we introduce a grid-based discretization approach which automatically scales just linearly in n. To circumvent the curse of dimensionality of full tensor product grids, we use the concept of sparse grids. Furthermore, in real-world applications, some embedding directions are usually more important than others and it is reasonable to refine the underlying discretization space only in these directions. To this end, we employ a dimension-adaptive algorithm which is based on the ANOVA (analysis of variance) decomposition of a function. In particular, the reconstruction error is used to measure the quality of an embedding. As an application, the study of large simulation data from an engineering application in the automotive industry (car crash simulation) is performed.
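    The regular (non-adaptive) sparse grid construction this approach builds on can be sketched directly. This is the standard Smolyak level-sum selection over interior equidistant points, not the dimension-adaptive ANOVA refinement described in the abstract:

    ```python
    from itertools import product

    def hier_points(l):
        """Interior hierarchical points of an equidistant 1-D grid on level l."""
        return [i / 2 ** l for i in range(1, 2 ** l, 2)]

    def sparse_grid(dim, level):
        """Regular sparse grid: union of small tensor grids whose level
        vector satisfies the level-sum condition |l|_1 <= level + dim - 1,
        avoiding the n^d growth of the full tensor product grid."""
        pts = set()
        for levels in product(range(1, level + 1), repeat=dim):
            if sum(levels) <= level + dim - 1:
                for p in product(*(hier_points(l) for l in levels)):
                    pts.add(p)
        return sorted(pts)
    ```

    For dim = 2 and level = 2 this keeps 5 of the 9 interior points of the corresponding full tensor grid, and the gap widens rapidly with dimension; a dimension-adaptive variant would relax the level-sum condition only in important directions.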

  9. Smoking in pregnancy in West Virginia: does cessation/reduction improve perinatal outcomes?

    PubMed

    Seybold, Dara J; Broce, Mike; Siegel, Eric; Findley, Joseph; Calhoun, Byron C

    2012-01-01

    To determine if pregnant women decreasing/quitting tobacco use will have improved fetal outcomes. Retrospective analysis of pregnant smokers from 6/1/2006-12/31/2007 who received prenatal care and delivered at a tertiary medical care center in West Virginia. Variables analyzed included birth certificate data linked to intervention program survey data. Patients were divided into four study groups: <8 cigarettes/day-no reduction, <8 cigarettes/day-reduction, ≥8 cigarettes/day-no reduction, and ≥8 cigarettes/day-reduction. Analysis was performed using the one-way ANOVA test for continuous variables and Chi-square for categorical variables. Inclusion criteria were met by 250 patients. Twelve women (4.8%) quit smoking; 150 (60%) reduced; 27 (10.8%) increased; and 61 (24.4%) had no change. Comparing the four study groups for pre-term births (<37 weeks), 25% occurred in the ≥8 no-reduction group while 10% occurred in the ≥8 with-reduction group (P = 0.026). The high rate of preterm birth (25%) in the non-reducing group depended on 2 factors: (1) ≥8 cigarettes/day at the beginning and (2) no reduction by the end of prenatal care. Finally, there was a statistically significant difference in birth weights between the two groups: ≥8 cigarettes/day with no reduction (2,872.6 g) versus <8 cigarettes/day with reduction (3,212.4 g) (P = 0.028). Smoking reduction/cessation lowered the risk of pre-term delivery (<37 weeks) twofold. Encouraging patients who smoke ≥8 cigarettes/day during pregnancy to decrease/quit prior to delivery provides significant clinical benefit by decreasing the likelihood of preterm birth. These findings support tobacco cessation efforts as a means to improve birth outcomes.

  10. Automated data processing and radioassays.

    PubMed

    Samols, E; Barrows, G H

    1978-04-01

    Radioassays include (1) radioimmunoassays, (2) competitive protein-binding assays based on competition for limited antibody or specific binding protein, (3) immunoradiometric assays, based on competition for excess labeled antibody, and (4) radioreceptor assays. Most mathematical models describing the relationship between labeled ligand binding and unlabeled ligand concentration have been based on the law of mass action or the isotope dilution principle. These models provide useful data reduction programs, but are theoretically unsatisfactory because competitive radioassay usually is not based on classical dilution principles, labeled and unlabeled ligand do not have to be identical, antibodies (or receptors) are frequently heterogeneous, equilibrium usually is not reached, and there is probably steric and cooperative influence on binding. An alternative, more flexible mathematical model, based on the probability of binding collisions being restricted by the surface area of reactive divalent sites on antibody and on univalent antigen, has been derived. Application of these models to automated data reduction allows standard curves to be fitted by a mathematical expression, and unknown values are calculated from binding data. The virtues and pitfalls of point-to-point data reduction, linear transformations, and curvilinear fitting approaches are presented. A third-order polynomial using the square root of concentration closely approximates the mathematical model based on probability, and in our experience this method provides the most acceptable results with all varieties of radioassays. With this curvilinear system, linear point connection should be used between the zero standard and the beginning of significant dose response, and also towards saturation. The importance is stressed of limiting the range of reported automated assay results to that portion of the standard curve that delivers optimal sensitivity.
Published methods for automated data reduction of Scatchard plots for radioreceptor assay are limited by calculation of a single mean K value. The quality of the input data is generally the limiting factor in achieving good precision with automated as it is with manual data reduction. The major advantages of computerized curve fitting include: (1) handling large amounts of data rapidly and without computational error; (2) providing useful quality-control data; (3) indicating within-batch variance of the test results; (4) providing ongoing quality-control charts and between assay variance.
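    The recommended curve shape, a third-order polynomial in the square root of concentration, is straightforward to reproduce. A sketch that fits the cubic exactly through four calibration standards (with more standards one would switch to least squares); the calibration values in the test are synthetic, not assay data:

    ```python
    import math

    def fit_sqrt_cubic(concs, responses):
        """Fit y = a0 + a1*u + a2*u^2 + a3*u^3 with u = sqrt(concentration),
        exactly through four standards by solving the 4x4 Vandermonde
        system with Gaussian elimination and partial pivoting."""
        A = [[math.sqrt(c) ** k for k in range(4)] for c in concs]
        y = list(responses)
        n = 4
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(A[r][col]))
            A[col], A[piv] = A[piv], A[col]
            y[col], y[piv] = y[piv], y[col]
            for r in range(col + 1, n):
                f = A[r][col] / A[col][col]
                for c2 in range(col, n):
                    A[r][c2] -= f * A[col][c2]
                y[r] -= f * y[col]
        a = [0.0] * n
        for r in range(n - 1, -1, -1):
            a[r] = (y[r] - sum(A[r][c2] * a[c2]
                               for c2 in range(r + 1, n))) / A[r][r]
        return a

    def response(a, conc):
        """Evaluate the fitted standard curve at a concentration."""
        u = math.sqrt(conc)
        return a[0] + a[1] * u + a[2] * u ** 2 + a[3] * u ** 3
    ```

    Unknown doses are then read off by inverting `response` numerically (e.g. bisection) over the monotone portion of the standard curve, consistent with the advice to report results only where the curve is sensitive.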

  11. The economic cost of using restraint and the value added by restraint reduction or elimination.

    PubMed

    Lebel, Janice; Goldstein, Robert

    2005-09-01

    The purpose of this study was to calculate the economic cost of using restraint on one adolescent inpatient service and to examine the effect of an initiative to reduce or eliminate the use of restraint after it was implemented. A detailed process-task analysis of mechanical, physical, and medication-based restraint was conducted in accordance with state and federal restraint requirements. Facility restraint data were collected, verified, and analyzed. A model was developed to determine the cost and duration of an average episode for each type of restraint. Staff time allocated to restraint activities and medication costs were computed. Calculation of the cost of restraint was restricted to staff and medication costs. Aggregate costs of restraint use and staff-related costs for one full year before the restraint reduction initiative (FY 2000) and one full year after the initiative (FY 2003) were calculated. Outcome, discharge, and recidivism data were analyzed. A comparison of the FY 2000 data with the FY 2003 data showed that the adolescent inpatient service's aggregate use of restraint decreased from 3,991 episodes to 373 episodes (91 percent), which was associated with a reduction in the cost of restraint from $1,446,740 to $117,036 (a 92 percent reduction). In addition, sick time, staff turnover and replacement costs, workers' compensation, injuries to adolescents and staff, and recidivism decreased. Adolescent Global Assessment of Functioning scores at discharge significantly improved. Implementation of a restraint reduction initiative was associated with a reduction in the use of restraint, staff time devoted to restraint, and staff-related costs. This shift appears to have contributed to better outcomes for adolescents, fewer injuries to adolescents and staff, and lower staff turnover. The initiative may have enhanced adolescent treatment and work conditions for staff.

  12. Aircraft Piston Engine Exhaust Emission Symposium

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A 2-day symposium on the reduction of exhaust emissions from aircraft piston engines was held on September 14 and 15, 1976, at the Lewis Research Center in Cleveland, Ohio. Papers were presented by both government organizations and the general aviation industry on the status of government contracts, emission measurement problems, data reduction procedures, flight testing, and emission reduction techniques.

  13. Kinetics of chromate reduction during naphthalene degradation in a mixed culture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shen, H.; Sewell, G.W.; Pritchard, P.H.

    A mixed culture of Bacillus sp. K1 and Sphingomonas paucimobilis EPA 505 was exposed to chromate and naphthalene. Batch experiments showed that chromate was reduced and naphthalene was degraded by the mixed culture. Chromate reduction occurred initially at a high rate, followed by a decline in rate until chromate reduction ceased. Chromate reduction decreased in the mixed culture when a lower ratio of S. paucimobilis EPA 505 to Bacillus sp. K1 was utilized. A kinetic model incorporating a term for the cell density ratio is proposed to describe chromate reduction in the mixed culture under both chromate-limited and electron-donor-limited conditions. The validity of the model, and its parameter values, was verified by experimental data generated under a variety of initial population compositions and a broad range of chromate concentrations. The consistency of the experimental data with model predictions implies that the model is useful for evaluating the interactions and the use of mixed cultures for chromate removal.
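
    A kinetic model of the general kind described — reduction rate scaled by the cell-density ratio — can be sketched with a simple Monod-type rate law and Euler integration. The functional form and every constant below are assumptions for illustration, not the model or parameters from the paper.

```python
# Illustrative chromate-reduction time course: dC/dt = -k * R * C / (K + C),
# where R is the (assumed) cell-density ratio of S. paucimobilis EPA 505 to
# Bacillus sp. K1. All parameter values are invented.
def chromate_timecourse(c0, ratio, k=0.5, K=2.0, dt=0.01, t_end=24.0):
    c, t, series = c0, 0.0, [c0]
    while t < t_end:
        rate = k * ratio * c / (K + c)   # mg/(L*h), Monod-type dependence on C
        c = max(c - rate * dt, 0.0)      # forward-Euler step, clipped at zero
        series.append(c)
        t += dt
    return series

high = chromate_timecourse(10.0, ratio=1.0)   # higher EPA 505 : K1 ratio
low = chromate_timecourse(10.0, ratio=0.2)    # lower ratio -> slower reduction
```

    The sketch reproduces the qualitative behavior reported: an initially fast reduction that slows as chromate is depleted, and slower reduction at a lower cell-density ratio.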

  14. Gene Selection and Cancer Classification: A Rough Sets Based Approach

    NASA Astrophysics Data System (ADS)

    Sun, Lijun; Miao, Duoqian; Zhang, Hongyun

    Identification of informative gene subsets responsible for discerning between available samples of gene expression data is an important task in bioinformatics. Reducts, from rough sets theory, corresponding to minimal sets of essential genes for discerning samples, are an efficient tool for gene selection. Because of the computational complexity of the existing reduct algorithms, feature ranking is usually used as a first step to narrow down the gene space, and top-ranked genes are selected. In this paper, we define a novel criterion for scoring genes, based on the between-class difference in expression level and the gene's contribution to classification, and present an algorithm for generating all possible reducts from informative genes. The algorithm takes the whole attribute set into account and finds short reducts with a significant reduction in computational complexity. An exploration of this approach on benchmark gene expression data sets demonstrates that it is successful in selecting highly discriminative genes, and the classification accuracy is impressive.
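
    The reduct idea itself — a minimal attribute subset that discerns every pair of samples with different labels as well as the full set does — can be shown on a toy example. This brute-force search is only a definition-level sketch; the paper's algorithm is designed precisely to avoid this exponential enumeration.

```python
from itertools import combinations

# A subset of attributes "discerns" the data if no two samples with different
# class labels agree on all attributes in the subset.
def discerns(data, labels, attrs):
    for i in range(len(data)):
        for j in range(i + 1, len(data)):
            if labels[i] != labels[j]:
                if all(data[i][a] == data[j][a] for a in attrs):
                    return False
    return True

# Brute-force search for a minimal discerning subset (a reduct).
def find_reduct(data, labels):
    n_attrs = len(data[0])
    for size in range(1, n_attrs + 1):
        for attrs in combinations(range(n_attrs), size):
            if discerns(data, labels, attrs):
                return attrs
    return None

# Toy discretized expression matrix: rows = samples, columns = genes.
data = [(0, 1, 0), (0, 0, 1), (1, 1, 0), (1, 0, 1)]
labels = [0, 0, 1, 1]
reduct = find_reduct(data, labels)
```

    On this toy matrix the first gene alone separates the two classes, so the reduct is a single attribute; real expression data requires the scoring and pruning machinery the paper describes.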

  15. Summary of transformation equations and equations of motion used in free flight and wind tunnel data reduction and analysis

    NASA Technical Reports Server (NTRS)

    Gainer, T. G.; Hoffman, S.

    1972-01-01

    Basic formulations for developing coordinate transformations and motion equations used with free-flight and wind-tunnel data reduction are presented. The general forms presented include axes transformations that enable transfer back and forth between any of the five axes systems that are encountered in aerodynamic analysis. Equations of motion are presented that enable calculation of motions anywhere in the vicinity of the earth. A bibliography of publications on methods of analyzing flight data is included.
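
    One of the standard transformations such a summary covers is the Euler-angle rotation between body axes and earth axes. The sketch below uses the conventional 3-2-1 (yaw-pitch-roll) direction-cosine matrix; it illustrates the general form only and is not transcribed from the report.

```python
import math

# Rotate a vector from body axes to earth axes using the usual yaw (psi),
# pitch (theta), roll (phi) Euler-angle sequence (3-2-1 convention).
def body_to_earth(v, phi, theta, psi):
    x, y, z = v
    cph, sph = math.cos(phi), math.sin(phi)
    cth, sth = math.cos(theta), math.sin(theta)
    cps, sps = math.cos(psi), math.sin(psi)
    # Rows of the earth-from-body direction cosine matrix applied to v.
    return (
        cth * cps * x + (sph * sth * cps - cph * sps) * y + (cph * sth * cps + sph * sps) * z,
        cth * sps * x + (sph * sth * sps + cph * cps) * y + (cph * sth * sps - sph * cps) * z,
        -sth * x + sph * cth * y + cph * cth * z,
    )
```

    A quick sanity check: with all angles zero the transform is the identity, and a 90-degree yaw carries the body x-axis onto the earth y-axis.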

  16. Data Reduction and Control Software for Meteor Observing Stations Based on CCD Video Systems

    NASA Technical Reports Server (NTRS)

    Madiedo, J. M.; Trigo-Rodriguez, J. M.; Lyytinen, E.

    2011-01-01

    The SPanish Meteor Network (SPMN) is performing a continuous monitoring of meteor activity over Spain and neighbouring countries. The huge amount of data obtained by the 25 video observing stations that this network is currently operating made it necessary to develop new software packages to accomplish some tasks, such as data reduction and remote operation of autonomous systems based on high-sensitivity CCD video devices. The main characteristics of this software are described here.

  17. User's guide to the UTIL-ODRC tape processing program. [for the Orbital Data Reduction Center

    NASA Technical Reports Server (NTRS)

    Juba, S. M. (Principal Investigator)

    1981-01-01

    The UTIL-ODRC computer compatible tape processing program, its input/output requirements, and its interface with the EXEC 8 operating system are described. It is a multipurpose orbital data reduction center (ODRC) tape processing program enabling the user to create either exact duplicate tapes and/or tapes in SINDA/HISTRY format. Input data elements for PRAMPT/FLOPLT and/or BATCH PLOT programs, a temperature summary, and a printed summary can also be produced.

  18. Weight reduction for non-alcoholic fatty liver disease.

    PubMed

    Peng, Lijun; Wang, Jiyao; Li, Feng

    2011-06-15

    Non-alcoholic fatty liver disease (NAFLD) is becoming a widespread liver disease. The present recommendations for treatment are not evidence-based. Among them are various weight-reduction measures involving diet, exercise, drug, or surgical therapy. To assess the benefits and harms of intended weight reduction for patients with NAFLD, we searched The Cochrane Hepato-Biliary Group Controlled Trials Register, The Cochrane Central Register of Controlled Trials (CENTRAL) in The Cochrane Library, PubMed, EMBASE, Science Citation Index Expanded, Chinese Biomedicine Database, and ClinicalTrials.gov until February 2011. We included randomised clinical trials evaluating weight reduction with different measures versus no intervention or placebo in NAFLD patients. We extracted data independently. We calculated the odds ratio (OR) for dichotomous data and the mean difference (MD) for continuous data, both with 95% confidence intervals (CI). The review includes seven trials: five on aspects of lifestyle changes (e.g., diet, physical exercise) and two on treatment with the weight-reduction drug orlistat. In total, 373 participants were enrolled, and the duration of the trials ranged from 1 month to 1 year. Only one trial, on a lifestyle programme, was judged to be at low risk of bias. We could not perform meta-analyses for the main outcomes as they were either not reported or there was an insufficient number of trials for each outcome to be meta-analysed. We could meta-analyse the available data for body weight and body mass index only. Adverse events were poorly reported. The sparse data and high risk of bias preclude us from drawing any definite conclusion on lifestyle programmes or orlistat for treatment of NAFLD. Further randomised clinical trials with low risk of bias are needed to test the beneficial and harmful effects of weight reduction for NAFLD patients. The long-term prognosis of development of fibrosis, mortality, and quality of life should be studied.
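
    The two effect measures the review computes — an odds ratio with 95% CI for dichotomous outcomes and a mean difference with 95% CI for continuous outcomes — follow standard formulas, sketched below. The input counts and means are invented illustrations, not data from the included trials.

```python
import math

# Odds ratio with a 95% CI from a 2x2 table: a/b = events/non-events in the
# treatment group, c/d = events/non-events in the control group.
def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    return or_, math.exp(math.log(or_) - z * se), math.exp(math.log(or_) + z * se)

# Mean difference with a 95% CI from group means, SDs, and sizes.
def mean_difference_ci(m1, sd1, n1, m2, sd2, n2, z=1.96):
    md = m1 - m2
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return md, md - z * se, md + z * se
```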

  19. Characterizing Long-term Contaminant Mass Discharge and the Relationship Between Reductions in Discharge and Reductions in Mass for DNAPL Source Areas

    PubMed Central

    Matthieu, D.E.; Carroll, K.C.; Mainhagu, J.; Morrison, C.; McMillan, A.; Russo, A.; Plaschke, M.

    2013-01-01

    The objective of this study was to characterize the temporal behavior of contaminant mass discharge, and the relationship between reductions in contaminant mass discharge and reductions in contaminant mass, for a very heterogeneous, highly contaminated source-zone field site. Trichloroethene is the primary contaminant of concern, and several lines of evidence indicate the presence of organic liquid in the subsurface. The site is undergoing groundwater extraction for source control, and contaminant mass discharge has been monitored since system startup. The results show a significant reduction in contaminant mass discharge with time, decreasing from approximately 1 to 0.15 kg/d. Two methods were used to estimate the mass of contaminant present in the source area at the initiation of the remediation project. One was based on a comparison of two sets of core data, collected 3.5 years apart, which suggests that a significant (~80%) reduction in aggregate sediment-phase TCE concentrations occurred between sampling events. The second method was based on fitting the temporal contaminant mass discharge data with a simple exponential source-depletion function. Relatively similar estimates, 784 and 993 kg, respectively, were obtained with the two methods. These data were used to characterize the relationship between reductions in contaminant mass discharge (CMDR) and reductions in contaminant mass (MR). The observed curvilinear relationship exhibits a reduction in contaminant mass discharge essentially immediately upon initiation of mass reduction. This behavior is consistent with a system wherein significant quantities of mass are present in hydraulically poorly accessible domains for which mass removal is influenced by rate-limited mass transfer. The results obtained from the present study are compared to those obtained from other field studies to evaluate the impact of system properties and conditions on mass-discharge and mass-removal behavior. 
The results indicated that factors such as domain scale, hydraulic-gradient status (induced or natural), and flushing-solution composition had insignificant impact on the CMDR-MR profiles and thus on underlying mass-removal behavior. Conversely, source-zone age, through its impact on contaminant distribution and accessibility, was implicated as a critical factor influencing the nature of the CMDR-MR relationship. PMID:23528743
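
    The second mass-estimation method the abstract describes — fitting a simple exponential source-depletion function to the discharge record and integrating it — can be sketched directly. The synthetic discharge record below is invented (loosely echoing the reported ~1 to 0.15 kg/d decline); it is not the site data, and the resulting mass is illustrative only.

```python
import math

# Hypothetical contaminant-mass-discharge record: CMD(t) = CMD0 * exp(-k t).
times = [0.0, 365.0, 730.0, 1095.0]   # days since extraction startup
cmd = [1.0, 0.55, 0.30, 0.16]         # kg/day (made-up values)

# Log-linear least squares for ln CMD = ln CMD0 - k t.
n = len(times)
sx = sum(times)
sy = sum(math.log(c) for c in cmd)
sxx = sum(t * t for t in times)
sxy = sum(t * math.log(c) for t, c in zip(times, cmd))
k = -(n * sxy - sx * sy) / (n * sxx - sx * sx)   # depletion rate, 1/day
cmd0 = math.exp((sy + k * sx) / n)               # initial discharge, kg/day

# Integrating CMD0 * exp(-k t) from 0 to infinity gives the initial mass.
initial_mass = cmd0 / k                          # kg
```

    With these invented numbers the fit recovers an initial discharge near 1 kg/d and an initial source mass of roughly 600 kg; the study's actual estimates (784 and 993 kg) came from real core and discharge data.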

  20. Program documentation for the space environment test division post-test data reduction program (GNFLEX)

    NASA Technical Reports Server (NTRS)

    Jones, L. D.

    1979-01-01

    The Space Environment Test Division Post-Test Data Reduction Program processes data from test history tapes generated on the Flexible Data System in the Space Environment Simulation Laboratory at the National Aeronautics and Space Administration/Lyndon B. Johnson Space Center. The program reads the tape's data base records to retrieve the item directory conversion file, the item capture file and the process link file to determine the active parameters. The desired parameter names are read in by lead cards after which the periodic data records are read to determine parameter data level changes. The data is considered to be compressed rather than full sample rate. Tabulations and/or a tape for generating plots may be output.

  1. Comparison of Amount of Primary Tooth Reduction Required for Anterior and Posterior Zirconia and Stainless Steel Crowns.

    PubMed

    Clark, Larkin; Wells, Martha H; Harris, Edward F; Lou, Jennifer

    2016-01-01

    To determine if aggressiveness of primary tooth preparation varied among different brands of zirconia and stainless steel (SSC) crowns. One hundred primary typodont teeth were divided into five groups (10 posterior and 10 anterior) and assigned to: Cheng Crowns (CC); EZ Pedo (EZP); Kinder Krowns (KKZ); NuSmile (NSZ); and SSC. Teeth were prepared, and assigned crowns were fitted. Teeth were weighed prior to and after preparation. Weight changes served as a surrogate measure of tooth reduction. Analysis of variance showed a significant difference in tooth reduction among brand/type for both the anterior and posterior. Tukey's honest significant difference test (HSD), when applied to anterior data, revealed that SSCs required significantly less tooth removal compared to the composite of the four zirconia brands, which showed no significant difference among them. Tukey's HSD test, applied to posterior data, revealed that CC required significantly greater removal of crown structure, while EZP, KKZ, and NSZ were statistically equivalent, and SSCs required significantly less removal. Zirconia crowns required more tooth reduction than stainless steel crowns for primary anterior and posterior teeth. Tooth reduction for anterior zirconia crowns was equivalent among brands. For posterior teeth, reduction for three brands (EZ Pedo, Kinder Krowns, NuSmile) did not differ, while Cheng Crowns required more reduction.

  2. Boiler Briquette Coal versus Raw Coal: Part II-Energy, Greenhouse Gas, and Air Quality Implications.

    PubMed

    Zhang, Junfeng; Ge, Su; Bai, Zhipeng

    2001-04-01

    The objective of this paper is to conduct an integrated analysis of the energy, greenhouse gas, and air quality impacts of a new type of boiler briquette coal (BB-coal) in contrast to those of the raw coal from which the BB-coal was formulated (R-coal). The analysis is based on the source emissions data and other relevant data collected in the present study and employs approaches including the construction of carbon, energy, and sulfur balances. The results show that replacing R-coal with BB-coal as the fuel for boilers such as the one tested would have multiple benefits, including a 37% increase in boiler thermal efficiency, a 25% reduction in fuel demand, a 26% reduction in CO2 emission, a 17% reduction in CO emission, a 63% reduction in SO2 emission, a 97% reduction in fly ash and fly ash carbon emission, a 22% reduction in PM2.5 mass emission, and a 30% reduction in total emission of five toxic hazardous air pollutant (HAP) metals contained in PM10. These benefits can be achieved with no changes in boiler hardware and with a relatively small amount of tradeoffs: a 30% increase in PM10 mass emission and a 9-16% increase in fuel cost.

  3. Boiler briquette coal versus raw coal: Part II--Energy, greenhouse gas, and air quality implications.

    PubMed

    Zhang, J; Ge, S; Bai, Z

    2001-04-01

    The objective of this paper is to conduct an integrated analysis of the energy, greenhouse gas, and air quality impacts of a new type of boiler briquette coal (BB-coal) in contrast to those of the raw coal from which the BB-coal was formulated (R-coal). The analysis is based on the source emissions data and other relevant data collected in the present study and employs approaches including the construction of carbon, energy, and sulfur balances. The results show that replacing R-coal with BB-coal as the fuel for boilers such as the one tested would have multiple benefits, including a 37% increase in boiler thermal efficiency, a 25% reduction in fuel demand, a 26% reduction in CO2 emission, a 17% reduction in CO emission, a 63% reduction in SO2 emission, a 97% reduction in fly ash and fly ash carbon emission, a 22% reduction in PM2.5 mass emission, and a 30% reduction in total emission of five toxic hazardous air pollutant (HAP) metals contained in PM10. These benefits can be achieved with no changes in boiler hardware and with a relatively small amount of tradeoffs: a 30% increase in PM10 mass emission and a 9-16% increase in fuel cost.

  4. Participation in the Infrared Space Observatory (ISO) Mission

    NASA Technical Reports Server (NTRS)

    Joseph, Robert D.

    2002-01-01

    All the Infrared Space Observatory (ISO) data have been transmitted from the ISO Data Centre, reduced, and calibrated. This has been rather labor-intensive as new calibrations for both the ISOPHOT and ISOCAM data have been released and the algorithms for data reduction have improved. We actually discovered errors in the calibration in earlier versions of the software. However the data reduction improvements have now converged and we have a self-consistent, well-calibrated database. It has also been a major effort to obtain the ground-based JHK imaging, 450 micrometer and 850 micrometer imaging and the 1-2.5 micrometer near-infrared spectroscopy for most of the sample galaxies.

  5. Observations of flat-spectrum radio sources at λ850 μm from the James Clerk Maxwell Telescope - I. 1997 April to 2000 April

    NASA Astrophysics Data System (ADS)

    Robson, E. I.; Stevens, J. A.; Jenness, T.

    2001-11-01

    Calibrated data for 65 flat-spectrum extragalactic radio sources are presented at a wavelength of 850μm, covering a three-year period from 1997 April. The data, obtained from the James Clerk Maxwell Telescope using the SCUBA camera in pointing mode, were analysed using an automated pipeline process based on the Observatory Reduction and Acquisition Control-Data Reduction (orac-dr) system. This paper describes the techniques used to analyse and calibrate the data, and presents the database of results along with a representative sample of the better-sampled light curves.

  6. No compelling evidence that sibutramine prolongs life in rodents despite providing a dose-dependent reduction in body weight

    PubMed Central

    Smith, Daniel L.; Robertson, Henry; Desmond, Renee; Nagy, Tim R.; Allison, David B.

    2010-01-01

    Objective: The health and longevity effects of body weight reduction resulting from exercise and caloric restriction in rodents are well known, but less is known about whether similar effects occur with weight reduction from the use of a pharmaceutical agent such as sibutramine, a serotonin-norepinephrine reuptake inhibitor. Results and Conclusion: Using data from a two-year toxicology study of sibutramine in CD rats and CD-1 mice, despite a dose-dependent reduction in food intake and body weight in rats compared to controls, and a body weight reduction in mice at the highest dose, there was no compelling evidence for reductions in mortality rate. PMID:21079617

  7. Meta-analysis of free-response studies, 1992-2008: assessing the noise reduction model in parapsychology.

    PubMed

    Storm, Lance; Tressoldi, Patrizio E; Di Risio, Lorenzo

    2010-07-01

    We report the results of meta-analyses on 3 types of free-response study: (a) ganzfeld (a technique that enhances a communication anomaly referred to as "psi"); (b) nonganzfeld noise reduction using alleged psi-enhancing techniques such as dream psi, meditation, relaxation, or hypnosis; and (c) standard free response (nonganzfeld, no noise reduction). For the period 1997-2008, a homogeneous data set of 29 ganzfeld studies yielded a mean effect size of 0.142 (Stouffer Z = 5.48, p = 2.13 x 10(-8)). A homogeneous nonganzfeld noise reduction data set of 16 studies yielded a mean effect size of 0.110 (Stouffer Z = 3.35, p = 2.08 x 10(-4)), and a homogeneous data set of 14 standard free-response studies produced a weak negative mean effect size of -0.029 (Stouffer Z = -2.29, p = .989). The mean effect size value of the ganzfeld database was significantly higher than the mean effect size of the standard free-response database but was not higher than the effect size of the nonganzfeld noise reduction database [corrected]. We also found that selected participants (believers in the paranormal, meditators, etc.) had a performance advantage over unselected participants, but only if they were in the ganzfeld condition.
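
    The Stouffer Z statistic used throughout this abstract combines per-study z scores as Z = sum(z_i) / sqrt(N). The per-study values below are invented to show the arithmetic; they are not the parapsychology data.

```python
import math

# Stouffer's method: combine independent per-study z scores into one Z.
def stouffer_z(zs):
    return sum(zs) / math.sqrt(len(zs))

zs = [1.2, 0.8, 1.5, 0.4]        # hypothetical per-study z scores
z_combined = stouffer_z(zs)      # (1.2 + 0.8 + 1.5 + 0.4) / sqrt(4)
```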

  8. Waste reduction through consumer education. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, E.Z.

    The Waste Reduction through Consumer Education research project was conducted to determine how environmental educational strategies influence purchasing behavior in the supermarket. The objectives were to develop, demonstrate, and evaluate consumer education strategies for waste reduction. The amount of waste generated by packaging size and form, with an adjustment for local recyclability of waste, was determined for 14 product categories identified as having more-waste-generating and less-waste-generating product choices (a total of 484 products). Using supermarket scan data and shopper identification numbers, the research tracked the purchases of shoppers in groups receiving different education treatments for 9 months. Statistical tests applied to the purchase data assessed patterns of change between the groups by treatment period. Analysis of the data revealed few meaningful statistical differences between study groups or changes in behavior over time. Findings suggest that broad-brush consumer education about waste reduction is not effective in changing purchasing behaviors in the short term. However, it may help create a general awareness of the issues surrounding excess packaging and consumer responsibility. The study concludes that the answer to waste reduction in the future may be a combination of voluntary initiatives by manufacturers and retailers, governmental intervention, and better-informed consumers.

  9. Special raster scanning for reduction of charging effects in scanning electron microscopy.

    PubMed

    Suzuki, Kazuhiko; Oho, Eisaku

    2014-01-01

    A special raster scanning (SRS) method for the reduction of charging effects is developed for the field of SEM. Both a conventional fast scan (horizontal direction) and an unusual scan (vertical direction) are adopted for acquiring raw data consisting of many sub-images. These data are converted to a proper SEM image using digital image processing techniques. In terms of image sharpness and the reduction of charging effects, SRS is compared with the conventional fast scan (with frame-averaging) and the conventional slow scan. Experimental results show the effectiveness of SRS images. Through a successful combination of the proposed scanning method and low-accelerating-voltage (LV) SEMs, it is expected that higher-quality SEM images can be acquired more easily owing to the considerable reduction of charging effects, while maintaining resolution. © 2013 Wiley Periodicals, Inc.

  10. Predicting Reduction Rates of Energetic Nitroaromatic Compounds Using Calculated One-Electron Reduction Potentials

    DOE PAGES

    Salter-Blanc, Alexandra; Bylaska, Eric J.; Johnston, Hayley; ...

    2015-02-11

    The evaluation of new energetic nitroaromatic compounds (NACs) for use in green munitions formulations requires models that can predict their environmental fate. The susceptibility of energetic NACs to nitro reduction might be predicted from correlations between rate constants (k) for this reaction and one-electron reduction potentials (E1NAC/0.059 V), but the mechanistic implications of such correlations are inconsistent with evidence from other methods. To address this inconsistency, we have reevaluated existing kinetic data using a (non-linear) free-energy relationship (FER) based on the Marcus theory of outer-sphere electron transfer. For most reductants, the results are inconsistent with rate limitation by an initial, outer-sphere electron transfer, suggesting that the strong correlation between k and E1NAC is justified only as an empirical model. This empirical correlation was used to calibrate a new quantitative structure-activity relationship (QSAR) using previously reported values of k for non-energetic NAC reduction by Fe(II) porphyrin and newly reported values of E1NAC determined using density functional theory at the B3LYP/6-311++G(2d,2p) level with the COSMO solvation model. The QSAR was then validated for energetic NACs using newly measured kinetic data for 2,4,6-trinitrotoluene (TNT), 2,4-dinitrotoluene (2,4-DNT), and 2,4-dinitroanisole (DNAN). The data show close agreement with the QSAR, supporting its applicability to energetic NACs.
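
    The Marcus-theory free-energy relationship underlying the reevaluation can be sketched numerically: the activation energy for outer-sphere electron transfer is dG_act = (lambda/4)(1 + dG0/lambda)^2, so log k is a curved function of driving force, unlike the straight line an empirical log k vs. E1 correlation assumes. The reorganization energy and rate ceiling below are illustrative placeholders, not fitted values from the paper.

```python
import math

# Marcus-type free-energy relationship (parameter values are assumptions).
def marcus_log_k(dG0_eV, lam_eV=0.8, log_k_max=10.0, T=298.15):
    kT_eV = 8.617e-5 * T                                  # kB in eV/K
    dG_act = (lam_eV / 4.0) * (1.0 + dG0_eV / lam_eV) ** 2
    return log_k_max - dG_act / (kT_eV * math.log(10))

# Stronger driving force (more negative dG0) -> faster electron transfer,
# with the curve flattening as dG0 approaches -lambda.
rates = [marcus_log_k(g) for g in (-0.1, -0.4, -0.8)]
```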

  11. Wavelet data compression for archiving high-resolution icosahedral model data

    NASA Astrophysics Data System (ADS)

    Wang, N.; Bao, J.; Lee, J.

    2011-12-01

    With the increase of the resolution of global circulation models, it becomes ever more important to develop highly effective solutions for archiving the huge datasets produced by those models. While lossless data compression guarantees the accuracy of the restored data, it can achieve only a limited reduction in data size. Wavelet-transform-based data compression offers significant potential for data size reduction, and it has been shown to be very effective for transmitting data for remote visualizations. However, for data archive purposes, a detailed study has to be conducted to evaluate its impact on the datasets that will be used in further numerical computations. In this study, we carried out two sets of experiments, for both summer and winter seasons. An icosahedral grid weather model and a highly efficient wavelet data compression software were used for this study. Initial conditions were compressed and input to the model, which was run to 10 days. The forecast results were then compared to forecast results from the model run with the original uncompressed initial conditions. Several visual comparisons, as well as statistics of the numerical comparisons, are presented. These results indicate that, with specified minimum accuracy losses, wavelet data compression achieves significant data size reduction while maintaining minimal numerical impact on the datasets. In addition, some issues are discussed to increase archive efficiency while retaining a complete set of metadata for each archived file.
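
    The lossy-compression mechanism the study evaluates can be illustrated with a one-level Haar transform: small detail coefficients are zeroed (these compress away), and the inverse transform restores the field within a known error bound. This toy sketch is not the study's software, which uses deeper multi-level transforms and quantization; the synthetic field and threshold are invented.

```python
import numpy as np

# One level of the orthonormal Haar wavelet transform and its exact inverse.
def haar_forward(x):
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail coefficients
    return a, d

def haar_inverse(a, d):
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

# Synthetic 1-D "model field": smooth signal plus small-amplitude noise.
rng = np.random.default_rng(0)
field = np.sin(np.linspace(0, 4 * np.pi, 256)) + 0.01 * rng.standard_normal(256)

a, d = haar_forward(field)
d_thresh = np.where(np.abs(d) > 0.05, d, 0.0)   # drop small details (lossy)
restored = haar_inverse(a, d_thresh)
max_err = np.max(np.abs(restored - field))      # bounded by threshold/sqrt(2)
```

    Zeroed coefficients are what an entropy coder then stores compactly, giving the size reduction; the threshold sets the guaranteed maximum accuracy loss.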

  12. Trauma Informed Guilt Reduction (TrIGR) Intervention

    DTIC Science & Technology

    2017-10-01

    AWARD NUMBER: W81XWH-15-1-0330. TITLE: Trauma-Informed Guilt Reduction (TrIGR) Intervention. PRINCIPAL INVESTIGATOR: Sonya Norman, PhD.

  13. COED Transactions, Vol. X, No. 6, June 1978. Concentric-Tube Heat Exchanger Analysis and Data Reduction.

    ERIC Educational Resources Information Center

    Marcovitz, Alan B., Ed.

    Four computer programs written in FORTRAN and BASIC develop theoretical predictions and data reduction for a junior-senior level heat exchanger experiment. Programs may be used at the terminal in the laboratory to check progress of the experiment or may be used in the batch mode for interpretation of final information for a formal report. Several…

  14. 45 CFR 261.44 - When must a State report the required data on the caseload reduction credit?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    45 Public Welfare 2, 2013-10-01. When must a State report the required data on the caseload reduction credit? Section 261.44, Public Welfare, Regulations Relating to Public Welfare, OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES, ENSURING THAT...

  15. Automated Reduction of Data from Images and Holograms

    NASA Technical Reports Server (NTRS)

    Lee, G. (Editor); Trolinger, James D. (Editor); Yu, Y. H. (Editor)

    1987-01-01

    Laser techniques are widely used for the diagnostics of aerodynamic flow and particle fields. The storage capability of holograms has made this technique even more powerful. Over 60 researchers in the fields of holography, particle sizing, and image processing convened to discuss these topics. The research programs of ten government laboratories, several universities, industry, and foreign countries were presented. A number of papers on holographic interferometry with applications to fluid mechanics were given, as were several papers on combustion and particle sizing, speckle velocimetry, and speckle interferometry. A session was held on image processing, automated fringe data reduction techniques, and the types of facilities for fringe reduction.

  16. Source Identification and Location Techniques

    NASA Technical Reports Server (NTRS)

    Weir, Donald; Bridges, James; Agboola, Femi; Dougherty, Robert

    2001-01-01

    Mr. Weir presented source location results obtained from an engine test as part of the Engine Validation of Noise Reduction Concepts program. Two types of microphone arrays were used in this program to determine the jet noise source distribution for the exhaust from a 4.3-bypass-ratio turbofan engine. One was a linear array of 16 microphones located on a 25 ft. sideline and the other was a 103-microphone 3-D "cage" array in the near field of the jet. Data were obtained from a baseline nozzle and from numerous nozzle configurations using chevrons and/or tabs to reduce the jet noise. Mr. Weir presented data from two configurations: the baseline nozzle and a nozzle configuration with chevrons on both the core and bypass nozzles. This chevron configuration had achieved a jet noise reduction of 4 EPNdB in small-scale tests conducted at the Glenn Research Center. IR imaging showed that the chevrons produced significant improvements in mixing and greatly reduced the length of the jet potential core. Comparison of source location data from the 1-D phased array showed a shift of the noise sources towards the nozzle and clear reductions of the sources due to the noise reduction devices. Data from the 3-D array showed a single source at a frequency of 125 Hz, located several diameters downstream from the nozzle exit. At 250 and 400 Hz, multiple, periodically spaced sources appeared to exist downstream of the nozzle. The trend of source location moving toward the nozzle exit with increasing frequency was also observed. The 3-D array data also showed a reduction in source strength with the addition of chevrons. The overall trend of source location with frequency was compared for the two arrays and with classical experience, and similar trends were observed. Although overall trends with frequency and with the addition of suppression devices were consistent between the data from the 1-D and the 3-D arrays, a comparison of the details of the inferred source locations did show differences.
A flight test is planned to determine if the hardware tested statically will achieve similar reductions in flight.

  17. Towards tracer dose reduction in PET studies: Simulation of dose reduction by retrospective randomized undersampling of list-mode data.

    PubMed

    Gatidis, Sergios; Würslin, Christian; Seith, Ferdinand; Schäfer, Jürgen F; la Fougère, Christian; Nikolaou, Konstantin; Schwenzer, Nina F; Schmidt, Holger

    2016-01-01

    Optimization of tracer dose regimes in positron emission tomography (PET) imaging is a trade-off between diagnostic image quality and radiation exposure. The challenge lies in defining minimal tracer doses that still result in sufficient diagnostic image quality. In order to find such minimal doses, it would be useful to simulate tracer dose reduction, as this would make it possible to study the effects of tracer dose reduction on image quality in single patients without repeated injections of different amounts of tracer. The aim of our study was to introduce and validate a method for simulation of low-dose PET images enabling direct comparison of different tracer doses in single patients and under constant influencing factors. (18)F-fluoride PET data were acquired on a combined PET/magnetic resonance imaging (MRI) scanner. PET data were stored together with the temporal information of the occurrence of single events (list-mode format). A predefined proportion of PET events were then randomly deleted, resulting in undersampled PET data. These data sets were subsequently reconstructed, resulting in simulated low-dose PET images (retrospective undersampling of list-mode data). This approach was validated in phantom experiments by visual inspection and by comparison of the PET quality metrics contrast recovery coefficient (CRC), background variability (BV) and signal-to-noise ratio (SNR) of measured and simulated PET images for different activity concentrations. In addition, reduced-dose PET images of a clinical (18)F-FDG PET dataset were simulated using the proposed approach. (18)F-PET image quality degraded with decreasing activity concentrations, with comparable visual image characteristics in measured and in corresponding simulated PET images. This result was confirmed by quantification of image quality metrics. CRC, SNR and BV showed concordant behavior with decreasing activity concentrations for measured and for corresponding simulated PET images.
Simulation of dose-reduced datasets based on clinical (18)F-FDG PET data demonstrated the clinical applicability of the proposed method. Simulation of PET tracer dose reduction is possible with retrospective undersampling of list-mode data. The resulting simulated low-dose images have characteristics equivalent to PET images actually measured at lower doses and can be used to derive optimal tracer dose regimes.
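The retrospective undersampling step described above, randomly deleting a fixed proportion of list-mode events before reconstruction, can be sketched as follows. The event layout (timestamp, detector-pair index) is an illustrative assumption; real list-mode formats are vendor-specific:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative list-mode stream: one row per coincidence event
# (timestamp in seconds, detector-pair index).
n_events = 100_000
events = np.column_stack([
    np.sort(rng.uniform(0.0, 60.0, n_events)),
    rng.integers(0, 10_000, n_events),
])

def undersample(events, keep_fraction, rng):
    """Randomly delete events to emulate a proportionally lower tracer dose."""
    mask = rng.random(len(events)) < keep_fraction
    return events[mask]

half_dose = undersample(events, 0.5, rng)   # ~50% of the original counts
```

The retained stream is then fed to the ordinary reconstruction; because deletion is random in time, count statistics scale with `keep_fraction` while acquisition duration and decay timing are preserved.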

  18. Data Mining of Network Logs

    NASA Technical Reports Server (NTRS)

    Collazo, Carlimar

    2011-01-01

    The statement of purpose is to analyze network monitoring logs to support the computer incident response team. Specifically, gain a clear understanding of the Uniform Resource Locator (URL) and its structure, and provide a way to break down a URL into protocol, host name, domain name, path, and other attributes. Finally, provide a method to perform data reduction by identifying the different types of advertisements shown on a webpage for incident data analysis. The procedure used for analysis and data reduction is a computer program that analyzes the URL and distinguishes advertisement links from actual content links.
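A minimal sketch of the URL breakdown described above, using Python's standard `urllib.parse`. The advertisement blocklist and the two-label domain heuristic are illustrative assumptions, not the program from the report:

```python
from urllib.parse import urlparse

# Hypothetical blocklist of known advertisement hosts/domains.
AD_HOSTS = {"ads.example.com", "doubleclick.net"}

def breakdown(url):
    """Split a URL into protocol, host name, domain name, path, and query."""
    p = urlparse(url)
    host = p.hostname or ""
    # Crude registered-domain guess: last two labels. Production code
    # should consult a public-suffix list instead.
    domain = ".".join(host.split(".")[-2:]) if host else ""
    return {"protocol": p.scheme, "host": host,
            "domain": domain, "path": p.path, "query": p.query}

def is_advertisement(url):
    """Data-reduction step: flag links whose host or domain is a known ad server."""
    parts = breakdown(url)
    return parts["host"] in AD_HOSTS or parts["domain"] in AD_HOSTS

parts = breakdown("https://ads.example.com/banner/728x90?id=7")
```

Content links that fail the `is_advertisement` test are kept for incident analysis; flagged links are dropped, which is the reduction step the abstract describes.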

  19. RECOZ data reduction and analysis: Programs and procedures

    NASA Technical Reports Server (NTRS)

    Reed, E. I.

    1984-01-01

    The RECOZ data reduction programs transform data from the RECOZ photometer to ozone number density and overburden as a function of altitude. Required auxiliary data are the altitude profile versus time and, for appropriate corrections to the ozone cross sections and scattering effects, air pressure and temperature profiles. Air temperature and density profiles may also be used to transform the ozone density versus geometric altitude to other units, such as ozone partial pressure or mixing ratio versus pressure altitude. There are seven programs used to accomplish this: RADAR, LISTRAD, RAW OZONE, EDIT OZONE, MERGE, SMOOTH, and PROFILE.

  20. Airborne data measurement system errors reduction through state estimation and control optimization

    NASA Astrophysics Data System (ADS)

    Sebryakov, G. G.; Muzhichek, S. M.; Pavlov, V. I.; Ermolin, O. V.; Skrinnikov, A. A.

    2018-02-01

    The paper discusses the problem of airborne data measurement system errors reduction through state estimation and control optimization. The approaches are proposed based on the methods of experiment design and the theory of systems with random abrupt structure variation. The paper considers various control criteria as applied to an aircraft data measurement system. The physics of each criterion is explained, and the mathematical description and the sequence of steps for its application are shown. A formula is given for posterior estimation of the airborne data measurement system state vector for systems with structure variations.

  1. Kinetic modeling of liquefied petroleum gas (LPG) reduction of titania in MATLAB

    NASA Astrophysics Data System (ADS)

    Yin, Tan Wei; Ramakrishnan, Sivakumar; Rezan, Sheikh Abdul; Noor, Ahmad Fauzi Mohd; Izah Shoparwe, Noor; Alizadeh, Reza; Roohi, Parham

    2017-04-01

    In the present study, reduction of Titania (TiO2) by liquefied petroleum gas (LPG)-hydrogen-argon gas mixture was investigated by experimental and kinetic modelling in MATLAB. The reduction experiments were carried out in the temperature range of 1100-1200°C with a reduction time from 1-3 hours and 10-20 minutes of LPG flowing time. A shrinking core model (SCM) was employed for the kinetic modelling in order to determine the rate and extent of reduction. The highest experimental extent of reduction of 38% occurred at a temperature of 1200°C with 3 hours reduction time and 20 minutes of LPG flowing time. The SCM gave a predicted extent of reduction of 82.1% due to assumptions made in the model. The deviation between SCM and experimental data was attributed to porosity, thermodynamic properties and minute thermal fluctuations within the sample. In general, the reduction rates increased with increasing reduction temperature and LPG flowing time.
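The shrinking core model used for the kinetic predictions above has a simple closed form in the surface-reaction-controlled limit, 1 − (1 − X)^(1/3) = k·t. A minimal sketch follows; the rate constant is an assumed illustrative value, not one fitted to the study's data:

```python
import numpy as np

def scm_extent(t_hours, k):
    """Extent of reduction X(t) from the reaction-controlled shrinking core
    model: 1 - (1 - X)**(1/3) = k*t,  i.e.  X = 1 - (1 - k*t)**3.
    Conversion is complete (X = 1) once k*t reaches 1."""
    kt = np.clip(k * np.asarray(t_hours, dtype=float), 0.0, 1.0)
    return 1.0 - (1.0 - kt) ** 3

# Assumed rate constant (1/h) for illustration only.
X = scm_extent([1.0, 2.0, 3.0], k=0.15)
```

Fitting k against the measured extents at each temperature (e.g. by least squares) is what allows the model to predict the extent of reduction versus time; deviations from the data, as noted above, then reflect effects the model omits, such as porosity changes.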

  2. Data reduction, radial velocities and stellar parameters from spectra in the very low signal-to-noise domain

    NASA Astrophysics Data System (ADS)

    Malavolta, Luca

    2013-10-01

    Large astronomical facilities usually provide data-reduction pipelines designed to deliver ready-to-use scientific data, and too often astronomers rely on these to avoid the most difficult part of an astronomer's job. Standard data-reduction pipelines, however, are usually designed and tested to perform well on data with average signal-to-noise ratio (SNR), and the issues related to the reduction of data in the very low SNR domain are not properly taken into account. As a result, the information in low-SNR data is not optimally exploited. During the last decade our group has collected thousands of spectra using the GIRAFFE spectrograph at the Very Large Telescope (Chile) of the European Southern Observatory (ESO) to determine the geometrical distance and dynamical state of several Galactic globular clusters, but ultimately the analysis has been hampered by systematics in data reduction, calibration and radial velocity measurements. Moreover, these data have never been exploited to obtain other information, such as stellar temperatures and metallicities, because they were considered too noisy for these kinds of analyses. In this thesis we focus our attention on the data reduction and analysis of spectra with very low SNR. The dataset we analyze comprises 7250 spectra for 2771 stars of the globular cluster M 4 (NGC 6121) in the wavelength region 5145-5360 Å obtained with GIRAFFE. Stars from the upper Red Giant Branch down to the Main Sequence have been observed in very different conditions, including nights close to full moon, with SNR ~ 10 for many spectra in the dataset. We first review the basic steps of data reduction and spectral extraction, adapting techniques well tested in other fields (such as photometry) but still under-developed in spectroscopy.
We improve the wavelength dispersion solution and the correction of the radial velocity shift between day-time calibrations and science observations by following a completely different approach from the ESO pipeline. We then analyze in depth the best way to perform sky subtraction and continuum normalization, the most important sources of noise and of systematics, respectively, in radial velocity determination and chemical analysis of spectra. The huge number of spectra in our dataset requires an automatic but robust approach, which we provide. We finally determine radial velocities for the stars in the sample with unprecedented precision compared with previous works on similar data, and we recover the same stellar atmosphere parameters as other studies performed on the same cluster but on brighter stars, with higher spectral resolution and a wavelength range ten times larger than ours. In the final chapter of the thesis we face a similar problem from a completely different perspective. High-resolution, high-SNR data from the High Accuracy Radial Velocity Planet Searcher spectrograph (HARPS) at La Silla (Chile) have been used to calibrate the atmospheric stellar parameters as functions of the main characteristics of cross-correlation functions, specifically built by including spectral lines with different sensitivities to the stellar atmosphere parameters. These tools have been designed to be quick and easy to implement in an instrument pipeline for real-time determination; nevertheless, they provide accurate parameters even for lower-SNR spectra.
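The cross-correlation approach mentioned in the final chapter, correlating an observed spectrum against a template and locating the peak of the cross-correlation function, can be illustrated on synthetic data. One Gaussian line and an integer pixel shift are assumed here; real CCF masks and sub-pixel peak fitting are considerably more involved:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.arange(400, dtype=float)
template = np.exp(-0.5 * ((x - 200.0) / 3.0) ** 2)   # one synthetic spectral line
true_shift = 5                                        # pixels
spectrum = np.roll(template, true_shift) + rng.normal(0.0, 0.01, x.size)

def ccf_peak(spec, tmpl, max_lag=20):
    """Return the integer lag that maximizes the cross-correlation function."""
    lags = np.arange(-max_lag, max_lag + 1)
    ccf = [float(np.dot(np.roll(tmpl, lag), spec)) for lag in lags]
    return int(lags[int(np.argmax(ccf))])

shift = ccf_peak(spectrum, template)   # recovers true_shift
```

Converting the recovered pixel shift to a radial velocity then only requires the wavelength dispersion solution, which is why its accuracy is stressed above.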

  3. Tuition Reductions: A Quantitative Analysis of the Prevalence, Circumstances and Outcomes of an Emerging Pricing Strategy in Higher Education

    ERIC Educational Resources Information Center

    Kottich, Sarah

    2017-01-01

    This study analyzed tuition reductions in the private not-for-profit sector of higher education, utilizing a quantitative descriptive and correlational approach with secondary data analysis. It resulted in a listing of 45 institutions with verified tuition reductions from 2007 to 2017, more than previously thought. It found that the…

  4. Implications of bed reduction in an acute psychiatric service.

    PubMed

    Bastiampillai, Tarun J; Bidargaddi, Niranjan P; Dhillon, Rohan S; Schrader, Geoffrey D; Strobel, Jörg E; Galley, Philip J

    2010-10-04

    To evaluate the impact of psychiatric inpatient bed closures, accompanied by a training program aimed at enhancing team effectiveness and incorporating data-driven practices, in a mental health service. Retrospective comparison of the changes in services within three consecutive financial years: baseline period - before bed reduction (2006-07); observation period - after bed reduction (2007-08); and intervention period - second year after bed reduction (2008-09). The study was conducted at Cramond Clinic, Queen Elizabeth Hospital, Adelaide. Length of stay, 28-day readmission rates, discharges, bed occupancy rates, emergency department (ED) presentations, ED waiting time, seclusions, locality of treatment, and follow-up in the community within 7 days. Reduced bed numbers were associated with reduced length of stay, fewer referrals from the community and subsequently shorter waiting times in the ED, without significant change in readmission rates. A higher proportion of patients was treated in the local catchment area, with improved community follow-up and a significant reduction in inpatient seclusions. Our findings should reassure clinicians concerned about psychiatric bed numbers that service redesign with planned bed reductions will not necessarily affect clinical care, provided data literacy and team training programs are in place to ensure smooth transition of patients across ED, inpatient and community services.

  5. Victorian Audit of Surgical Mortality is associated with improved clinical outcomes.

    PubMed

    Beiles, C Barry; Retegan, Claudia; Maddern, Guy J

    2015-11-01

    Improved outcomes are desirable results of clinical audit. The aim of this study was to use data from the Victorian Audit of Surgical Mortality (VASM) and the Victorian Admitted Episodes Dataset (VAED) to highlight specific areas of clinical improvement and reduction in mortality over the duration of the audit process. This study used retrospective, observational data from VASM and VAED. VASM data were reported by participating public and private health services, the Coroner and self-reporting surgeons across Victoria. Aggregated VAED data were supplied by the Victorian Department of Health. Assessment of outcomes was performed using chi-squared trend analysis over successive annual audit periods. Because initial collection of data was incomplete in the recruitment phase, statistical analysis was confined to the last 3-year period, 2010-2013. A 20% reduction in surgical mortality over the past 5 years has been identified from the VAED data. Progressive increase in both surgeon and hospital participation, significant reduction in both errors in management as perceived by assessors and increased direct consultant involvement in cases returned to theatre have been documented. The benefits of VASM are reflected in the association with a reduction of mortality and adverse clinical outcomes, which have clinical and financial benefits. It is a purely educational exercise and continued participation in this audit will ensure the highest standards of surgical care in Australia. This also highlights the valuable collaboration between the Victorian Department of Health and the RACS. © 2014 Royal Australasian College of Surgeons.

  6. SMURF: SubMillimeter User Reduction Facility

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Chapin, Edward L.; Berry, David S.; Gibb, Andy G.; Tilanus, Remo P. J.; Balfour, Jennifer; Tilanus, Vincent; Currie, Malcolm J.

    2013-10-01

    SMURF reduces submillimeter single-dish continuum and heterodyne data. It is mainly targeted at data produced by the James Clerk Maxwell Telescope, but data from other telescopes have been reduced using the package. SMURF is released as part of the bundle that comprises Starlink (ascl:1110.012) and most of the packages that use it. The two key commands are MAKEMAP for the creation of maps from submillimeter continuum data and MAKECUBE for the creation of data cubes from heterodyne array instruments. The software can also convert data from legacy JCMT file formats to the modern form to allow it to be processed by MAKECUBE. SMURF is a core component of the ORAC-DR (ascl:1310.001) data reduction pipeline for JCMT.

  7. Coupled dimensionality reduction and classification for supervised and semi-supervised multilabel learning

    PubMed Central

    Gönen, Mehmet

    2014-01-01

    Coupled training of dimensionality reduction and classification was proposed previously to improve the prediction performance for single-label problems. Following this line of research, in this paper, we first introduce a novel Bayesian method that combines linear dimensionality reduction with linear binary classification for supervised multilabel learning and present a deterministic variational approximation algorithm to learn the proposed probabilistic model. We then extend the proposed method to find intrinsic dimensionality of the projected subspace using automatic relevance determination and to handle semi-supervised learning using a low-density assumption. We perform supervised learning experiments on four benchmark multilabel learning data sets by comparing our method with baseline linear dimensionality reduction algorithms. These experiments show that the proposed approach achieves good performance values in terms of hamming loss, average AUC, macro F1, and micro F1 on held-out test data. The low-dimensional embeddings obtained by our method are also very useful for exploratory data analysis. We also show the effectiveness of our approach in finding intrinsic subspace dimensionality and semi-supervised learning tasks. PMID:24532862

  8. Coupled dimensionality reduction and classification for supervised and semi-supervised multilabel learning.

    PubMed

    Gönen, Mehmet

    2014-03-01

    Coupled training of dimensionality reduction and classification was proposed previously to improve the prediction performance for single-label problems. Following this line of research, in this paper, we first introduce a novel Bayesian method that combines linear dimensionality reduction with linear binary classification for supervised multilabel learning and present a deterministic variational approximation algorithm to learn the proposed probabilistic model. We then extend the proposed method to find intrinsic dimensionality of the projected subspace using automatic relevance determination and to handle semi-supervised learning using a low-density assumption. We perform supervised learning experiments on four benchmark multilabel learning data sets by comparing our method with baseline linear dimensionality reduction algorithms. These experiments show that the proposed approach achieves good performance values in terms of hamming loss, average AUC, macro F1, and micro F1 on held-out test data. The low-dimensional embeddings obtained by our method are also very useful for exploratory data analysis. We also show the effectiveness of our approach in finding intrinsic subspace dimensionality and semi-supervised learning tasks.
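The paper's coupled Bayesian model is not reproduced here, but the kind of baseline it is compared against, linear dimensionality reduction followed by an independently trained linear classifier, can be sketched on synthetic multilabel data. All data and parameters below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic multilabel data: informative variance concentrated in the first
# three features so that PCA retains them (illustrative, not the paper's data).
scale = np.array([5.0, 4.0, 3.0] + [1.0] * 17)
X = rng.normal(size=(200, 20)) * scale
Y = np.column_stack([X[:, 0] + X[:, 1] > 0,
                     X[:, 1] - X[:, 2] > 0]).astype(float)  # two binary labels

def pca_project(X, r):
    """Linear dimensionality reduction via SVD of the centered data."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:r].T

Z = pca_project(X, r=5)                                # reduce ...
W, *_ = np.linalg.lstsq(Z, Y - 0.5, rcond=None)        # ... then classify
pred = (Z @ W > 0).astype(float)
accuracy = (pred == Y).mean()                          # training accuracy
```

The baseline learns the projection and the classifiers in sequence; the coupled formulation in the paper learns them jointly, which is exactly the gap the coupling is meant to close when the label-relevant directions are not the highest-variance ones.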

  9. Impact of changes in blood pressure during the treatment of acute decompensated heart failure on renal and clinical outcomes†

    PubMed Central

    Testani, Jeffrey M.; Coca, Steven G.; McCauley, Brian D.; Shannon, Richard P.; Kimmel, Stephen E.

    2011-01-01

    Aims: One of the primary determinants of blood flow in regional vascular beds is perfusion pressure. Our aim was to investigate if reduction in blood pressure during the treatment of decompensated heart failure would be associated with worsening renal function (WRF). Our secondary aim was to evaluate the prognostic significance of this potentially treatment-induced form of WRF. Methods and results: Subjects included in the Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness (ESCAPE) trial limited data were studied (386 patients). Reduction in systolic blood pressure (SBP) was greater in patients experiencing WRF (−10.3 ± 18.5 vs. −2.8 ± 16.0 mmHg, P < 0.001) with larger reductions associated with greater odds for WRF (odds ratio = 1.3 per 10 mmHg reduction, P < 0.001). Systolic blood pressure reduction (relative change > median) was associated with greater doses of in-hospital oral vasodilators (P ≤ 0.017), thiazide diuretic use (P = 0.035), and greater weight reduction (P = 0.023). In patients with SBP-reduction, WRF was not associated with worsened survival [adjusted hazard ratio (HR) = 0.76, P = 0.58]. However, in patients without SBP-reduction, WRF was strongly associated with increased mortality (adjusted HR = 5.3, P < 0.001, P interaction = 0.001). Conclusion: During the treatment of decompensated heart failure, significant blood pressure reduction is strongly associated with WRF. However, WRF that occurs in the setting of SBP-reduction is not associated with an adverse prognosis, whereas WRF in the absence of this provocation is strongly associated with increased mortality. These data suggest that WRF may represent the final common pathway of several mechanistically distinct processes, each with potentially different prognostic implications. PMID:21693504

  10. Impact of changes in blood pressure during the treatment of acute decompensated heart failure on renal and clinical outcomes.

    PubMed

    Testani, Jeffrey M; Coca, Steven G; McCauley, Brian D; Shannon, Richard P; Kimmel, Stephen E

    2011-08-01

    One of the primary determinants of blood flow in regional vascular beds is perfusion pressure. Our aim was to investigate if reduction in blood pressure during the treatment of decompensated heart failure would be associated with worsening renal function (WRF). Our secondary aim was to evaluate the prognostic significance of this potentially treatment-induced form of WRF. Subjects included in the Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness (ESCAPE) trial limited data were studied (386 patients). Reduction in systolic blood pressure (SBP) was greater in patients experiencing WRF (-10.3 ± 18.5 vs. -2.8 ± 16.0 mmHg, P < 0.001) with larger reductions associated with greater odds for WRF (odds ratio = 1.3 per 10 mmHg reduction, P < 0.001). Systolic blood pressure reduction (relative change > median) was associated with greater doses of in-hospital oral vasodilators (P ≤ 0.017), thiazide diuretic use (P = 0.035), and greater weight reduction (P = 0.023). In patients with SBP-reduction, WRF was not associated with worsened survival [adjusted hazard ratio (HR) = 0.76, P = 0.58]. However, in patients without SBP-reduction, WRF was strongly associated with increased mortality (adjusted HR = 5.3, P < 0.001, P interaction = 0.001). During the treatment of decompensated heart failure, significant blood pressure reduction is strongly associated with WRF. However, WRF that occurs in the setting of SBP-reduction is not associated with an adverse prognosis, whereas WRF in the absence of this provocation is strongly associated with increased mortality. These data suggest that WRF may represent the final common pathway of several mechanistically distinct processes, each with potentially different prognostic implications.

  11. Iterative deblending of simultaneous-source data using a coherency-pass shaping operator

    NASA Astrophysics Data System (ADS)

    Zu, Shaohuan; Zhou, Hui; Mao, Weijian; Zhang, Dong; Li, Chao; Pan, Xiao; Chen, Yangkang

    2017-10-01

    Simultaneous-source acquisition offers substantial economic savings, but it brings an unprecedented challenge of removing the crosstalk interference in the recorded seismic data. In this paper, we propose a novel iterative method to separate simultaneous-source data based on a coherency-pass shaping operator. The coherency-pass filter is used to constrain the model, that is, the unblended data to be estimated, in the shaping regularization framework. In a simultaneous-source survey, the incoherent interference from adjacent shots greatly increases the rank of the frequency-domain Hankel matrix that is formed from the blended record. Thus, a method based on rank reduction is capable of separating the blended record to some extent. However, the shortcoming is that it may leave residual noise when there is strong blending interference. We propose to cascade the rank-reduction and thresholding operators to deal with this issue. In the initial iterations, we adopt a small rank to severely separate the blended interference and a large thresholding value as strong constraints to remove the residual noise in the time domain. In the later iterations, since more and more events have been recovered, we weaken the constraint by increasing the rank and shrinking the threshold to recover weak events and to guarantee convergence. In this way, the combined rank-reduction and thresholding strategy acts as a coherency-pass filter, which passes only the coherent high-amplitude component after rank reduction instead of passing both signal and noise as in traditional rank-reduction-based approaches. Two synthetic examples are tested to demonstrate the performance of the proposed method. In addition, the application to two field data sets (common receiver gathers and stacked profiles) further validates the effectiveness of the proposed method.
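A toy numerical sketch of the combined operator: rank reduction followed by amplitude thresholding, with the rank grown and the threshold shrunk over iterations, applied to a coherent (low-rank) section contaminated by spike-like blending noise. Matrix sizes, schedules and parameters are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

def coherency_pass(D, rank, thresh):
    """Keep only the coherent (low-rank), high-amplitude part of D."""
    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    low = (U[:, :rank] * s[:rank]) @ Vt[:rank]        # rank reduction
    return np.where(np.abs(low) > thresh, low, 0.0)   # amplitude thresholding

def deblend(blended, n_iter=4, rank0=1, thresh0=0.5, step=0.5):
    """Iterate with progressively weaker constraints (larger rank, smaller
    threshold) so weaker events are recovered in later iterations."""
    est = np.zeros_like(blended)
    for i in range(n_iter):
        est = coherency_pass(est + step * (blended - est),
                             rank=rank0 + i,
                             thresh=thresh0 * (n_iter - i) / n_iter)
    return est

# Coherent "events": a rank-1 section; blending noise: sparse strong spikes.
rng = np.random.default_rng(0)
u = np.sin(np.linspace(0.0, 3.0, 40))
v = np.cos(np.linspace(0.0, 2.0, 40))
clean = 8.0 * np.outer(u, v)
spikes = 4.0 * rng.choice([0.0, -1.0, 1.0], size=clean.shape, p=[0.9, 0.05, 0.05])
blended = clean + spikes
est = deblend(blended)
```

Because the spikes are incoherent, they raise the rank of the section without aligning with its leading singular vectors, so the rank-reduction step suppresses them and the threshold removes what leaks through.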

  12. Achieving the WHO sodium target: estimation of reductions required in the sodium content of packaged foods and other sources of dietary sodium.

    PubMed

    Eyles, Helen; Shields, Emma; Webster, Jacqui; Ni Mhurchu, Cliona

    2016-08-01

    Excess sodium intake is one of the top 2 dietary risk factors contributing to the global burden of disease. As such, many countries are now developing national sodium reduction strategies, a key component of which is a sodium reduction model that includes sodium targets for packaged foods and other sources of dietary sodium. We sought to develop a sodium reduction model to determine the reductions required in the sodium content of packaged foods and other dietary sources of sodium to reduce adult population salt intake by ∼30% toward the optimal WHO target of 5 g/d. Nationally representative household food-purchasing data for New Zealand were linked with branded food composition information to determine the mean contribution of major packaged food categories to total population sodium consumption. Discretionary salt use and the contribution of sodium from fresh foods and foods consumed away from the home were estimated with the use of national nutrition survey data. Reductions required in the sodium content of packaged foods and other dietary sources of sodium to achieve a 30% reduction in dietary sodium intakes were estimated. A 36% reduction (1.6 g salt or 628 mg Na) in the sodium content of packaged foods in conjunction with a 40% reduction in discretionary salt use and the sodium content of foods consumed away from the home would reduce total population salt intake in New Zealand by 35% (from 8.4 to 5.5 g/d) and thus meet the WHO 2025 30% relative reduction target. Key reductions required include a decrease of 21% in the sodium content of white bread, 27% for hard cheese, 42% for sausages, and 54% for ready-to-eat breakfast cereals. Achieving the WHO sodium target in New Zealand will take considerable efforts by both food manufacturers and consumers and will likely require a national government-led sodium reduction strategy. © 2016 American Society for Nutrition.
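The arithmetic of such a sodium reduction model is a share-weighted sum over dietary sources. The source shares below are illustrative placeholders (the study derives the real shares from NZ purchasing and survey data); the 8.4 g/d baseline and the per-source reduction targets are taken from the abstract:

```python
# Assumed (illustrative) shares of total sodium intake by source; must sum to 1.
shares = {"packaged": 0.60, "discretionary_salt": 0.15,
          "away_from_home": 0.15, "fresh_foods": 0.10}

# Per-source reductions from the abstract: 36% for packaged foods, 40% for
# discretionary salt and foods eaten away from home; none assumed for fresh.
cuts = {"packaged": 0.36, "discretionary_salt": 0.40,
        "away_from_home": 0.40, "fresh_foods": 0.00}

baseline_salt_g = 8.4                                  # g/d, from the abstract
overall_cut = sum(shares[k] * cuts[k] for k in shares)
new_intake_g = baseline_salt_g * (1.0 - overall_cut)
```

With these placeholder shares the overall cut works out to roughly a third, in the neighborhood of the study's 35% (8.4 to 5.5 g/d); the exact figure depends entirely on the true source shares.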

  13. Optimized deployment of emission reduction technologies for large fleets.

    DOT National Transportation Integrated Search

    2011-06-01

    This research study produced an optimization framework for determining the most efficient emission reduction strategies among vehicles and equipment in a large fleet. The Texas Department of Transportation's (TxDOT's) fleet data were utilized...

  14. Worksite trip reduction model and manual

    DOT National Transportation Integrated Search

    2004-04-01

    According to the Institute of Transportation Engineers, assessing the trip reduction claims from transportation demand management (TDM) programs is an issue for estimating future traffic volumes from trip generation data. To help assess those claims, a W...

  15. A Concept for the One Degree Imager (ODI) Data Reduction Pipeline and Archiving System

    NASA Astrophysics Data System (ADS)

    Knezek, Patricia; Stobie, B.; Michael, S.; Valdes, F.; Marru, S.; Henschel, R.; Pierce, M.

    2010-05-01

    The One Degree Imager (ODI), currently being built by the WIYN Observatory, will provide tremendous possibilities for conducting diverse scientific programs. ODI will be a complex instrument, using non-conventional Orthogonal Transfer Array (OTA) detectors. Due to its large field of view, small pixel size, use of OTA technology, and expected frequent use, ODI will produce vast amounts of astronomical data. If ODI is to achieve its full potential, a data reduction pipeline must be developed. Long-term archiving must also be incorporated into the pipeline system to ensure the continued value of ODI data. This paper presents a concept for an ODI data reduction pipeline and archiving system. To limit costs and development time, our plan leverages existing software and hardware, including existing pipeline software, Science Gateways, Computational Grid & Cloud Technology, Indiana University's Data Capacitor and Massive Data Storage System, and TeraGrid compute resources. Existing pipeline software will be augmented to add functionality required to meet challenges specific to ODI, enhance end-user control, and enable the execution of the pipeline on grid resources including national grid resources such as the TeraGrid and Open Science Grid. The planned system offers consistent standard reductions and end-user flexibility when working with images beyond the initial instrument signature removal. It also gives end-users access to computational and storage resources far beyond what are typically available at most institutions. Overall, the proposed system provides a wide array of software tools and the necessary hardware resources to use them effectively.

  16. Regional oxygen reduction and denitrification rates in groundwater from multi-model residence time distributions, San Joaquin Valley, USA

    USGS Publications Warehouse

    Green, Christopher T.; Jurgens, Bryant; Zhang, Yong; Starn, Jeffrey; Singleton, Michael J.; Esser, Bradley K.

    2016-01-01

    Rates of oxygen and nitrate reduction are key factors in determining the chemical evolution of groundwater. Little is known about how these rates vary and covary in regional groundwater settings, as few studies have focused on regional datasets with multiple tracers and methods of analysis that account for effects of mixed residence times on apparent reaction rates. This study provides insight into the characteristics of residence times and rates of O2 reduction and denitrification (NO3− reduction) by comparing reaction rates using multi-model analytical residence time distributions (RTDs) applied to a data set of atmospheric tracers of groundwater age and geochemical data from 141 well samples in the Central Eastern San Joaquin Valley, CA. The RTD approach accounts for mixtures of residence times in a single sample to provide estimates of in-situ rates. Tracers included SF6, CFCs, 3H, He from 3H (tritiogenic He), 14C, and terrigenic He. Parameter estimation and multi-model averaging were used to establish RTDs with lower error variances than those produced by individual RTD models. The set of multi-model RTDs was used in combination with NO3− and dissolved gas data to estimate zero-order and first-order rates of O2 reduction and denitrification. Results indicated that O2 reduction and denitrification rates followed approximately log-normal distributions. Rates of O2 and NO3− reduction were correlated and, on an electron milliequivalent basis, denitrification rates tended to exceed O2 reduction rates. Estimated historical NO3− trends were similar to historical measurements. Results show that the multi-model approach can improve estimation of age distributions, and that relatively easily measured O2 rates can provide information about trends in denitrification rates, which are more difficult to estimate.
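The core forward computation, combining a residence time distribution with an input history and a first-order reaction to predict the concentration seen in a well, can be sketched as a discrete convolution. The exponential RTD, the rising input history, and the rate constant below are all illustrative assumptions:

```python
import numpy as np

tau = np.arange(60.0)                       # residence-time grid, years
mean_age = 15.0
rtd = np.exp(-tau / mean_age)
rtd /= rtd.sum()                            # discrete exponential RTD

years = np.arange(1900, 2020)               # calendar years of the input record
c_in = np.linspace(0.0, 10.0, years.size)   # rising NO3- input history, mg/L

# Concentration that recharged tau years before the (2019) sample:
history = c_in[::-1][:tau.size]

k = 0.05                                    # assumed first-order rate, 1/yr
c_sample = float(np.dot(rtd * np.exp(-k * tau), history))   # with denitrification
c_no_reaction = float(np.dot(rtd, history))                 # conservative tracer
```

Fitting k and the RTD parameters so that predicted tracer and NO3− concentrations match the well measurements is the inverse problem the study solves, with multi-model averaging over several candidate RTD shapes.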

  17. Regional oxygen reduction and denitrification rates in groundwater from multi-model residence time distributions, San Joaquin Valley, USA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Christopher T.; Jurgens, Bryant C.; Zhang, Yong

    Rates of oxygen and nitrate reduction are key factors in determining the chemical evolution of groundwater. Little is known about how these rates vary and covary in regional groundwater settings, as few studies have focused on regional datasets with multiple tracers and methods of analysis that account for effects of mixed residence times on apparent reaction rates. This study provides insight into the characteristics of residence times and rates of O2 reduction and denitrification (NO3− reduction) by comparing reaction rates using multi-model analytical residence time distributions (RTDs) applied to a data set of atmospheric tracers of groundwater age and geochemical data from 141 well samples in the Central Eastern San Joaquin Valley, CA. The RTD approach accounts for mixtures of residence times in a single sample to provide estimates of in-situ rates. Tracers included SF6, CFCs, 3H, He from 3H (tritiogenic He), 14C, and terrigenic He. Parameter estimation and multi-model averaging were used to establish RTDs with lower error variances than those produced by individual RTD models. The set of multi-model RTDs was used in combination with NO3− and dissolved gas data to estimate zero-order and first-order rates of O2 reduction and denitrification. Results indicated that O2 reduction and denitrification rates followed approximately log-normal distributions. Rates of O2 and NO3− reduction were correlated and, on an electron milliequivalent basis, denitrification rates tended to exceed O2 reduction rates. Estimated historical NO3− trends were similar to historical measurements. Here, results show that the multi-model approach can improve estimation of age distributions, and that relatively easily measured O2 rates can provide information about trends in denitrification rates, which are more difficult to estimate.

  18. Estimate of the benefits of a population-based reduction in dietary sodium additives on hypertension and its related health care costs in Canada.

    PubMed

    Joffres, Michel R; Campbell, Norm R C; Manns, Braden; Tu, Karen

    2007-05-01

    Hypertension is the leading risk factor for mortality worldwide. One-quarter of the adult Canadian population has hypertension, and more than 90% of the population is estimated to develop hypertension if they live an average lifespan. Reductions in dietary sodium additives significantly lower systolic and diastolic blood pressure, and population reductions in dietary sodium are recommended by major scientific and public health organizations. To estimate the reduction in hypertension prevalence and specific hypertension management cost savings associated with a population-wide reduction in dietary sodium additives. Based on data from clinical trials, reducing dietary sodium additives by 1840 mg/day would result in a decrease of 5.06 mmHg in systolic and 2.7 mmHg in diastolic blood pressure. Using Canadian Heart Health Survey data, the resulting reduction in hypertension was estimated. Costs of laboratory testing and physician visits were based on 2001 to 2003 Ontario Health Insurance Plan data, and the number of physician visits and costs of medications for patients with hypertension were taken from 2003 IMS Canada. To estimate the reduction in total physician visits and laboratory costs, current estimates of aware hypertensive patients in Canada were used from the Canadian Community Health Survey. Reducing dietary sodium additives may decrease hypertension prevalence by 30%, resulting in one million fewer hypertensive patients in Canada, and almost double the treatment and control rate. Direct cost savings related to fewer physician visits, laboratory tests and lower medication use are estimated to be approximately $430 million per year. Physician visits and laboratory costs would decrease by 6.5%, and 23% fewer treated hypertensive patients would require medications for control of blood pressure. Based on these estimates, lowering dietary sodium additives would lead to a large reduction in hypertension prevalence and result in health care cost savings in Canada.

  19. Estimate of the benefits of a population-based reduction in dietary sodium additives on hypertension and its related health care costs in Canada

    PubMed Central

    Joffres, Michel R; Campbell, Norm RC; Manns, Braden; Tu, Karen

    2007-01-01

    BACKGROUND: Hypertension is the leading risk factor for mortality worldwide. One-quarter of the adult Canadian population has hypertension, and more than 90% of the population is estimated to develop hypertension if they live an average lifespan. Reductions in dietary sodium additives significantly lower systolic and diastolic blood pressure, and population reductions in dietary sodium are recommended by major scientific and public health organizations. OBJECTIVES: To estimate the reduction in hypertension prevalence and specific hypertension management cost savings associated with a population-wide reduction in dietary sodium additives. METHODS: Based on data from clinical trials, reducing dietary sodium additives by 1840 mg/day would result in a decrease of 5.06 mmHg in systolic and 2.7 mmHg in diastolic blood pressure. Using Canadian Heart Health Survey data, the resulting reduction in hypertension was estimated. Costs of laboratory testing and physician visits were based on 2001 to 2003 Ontario Health Insurance Plan data, and the number of physician visits and costs of medications for patients with hypertension were taken from 2003 IMS Canada. To estimate the reduction in total physician visits and laboratory costs, current estimates of aware hypertensive patients in Canada were used from the Canadian Community Health Survey. RESULTS: Reducing dietary sodium additives may decrease hypertension prevalence by 30%, resulting in one million fewer hypertensive patients in Canada, and almost double the treatment and control rate. Direct cost savings related to fewer physician visits, laboratory tests and lower medication use are estimated to be approximately $430 million per year. Physician visits and laboratory costs would decrease by 6.5%, and 23% fewer treated hypertensive patients would require medications for control of blood pressure. CONCLUSIONS: Based on these estimates, lowering dietary sodium additives would lead to a large reduction in hypertension prevalence and result in health care cost savings in Canada. PMID:17487286

  20. Regional oxygen reduction and denitrification rates in groundwater from multi-model residence time distributions, San Joaquin Valley, USA

    DOE PAGES

    Green, Christopher T.; Jurgens, Bryant C.; Zhang, Yong; ...

    2016-05-14

    Rates of oxygen and nitrate reduction are key factors in determining the chemical evolution of groundwater. Little is known about how these rates vary and covary in regional groundwater settings, as few studies have focused on regional datasets with multiple tracers and methods of analysis that account for effects of mixed residence times on apparent reaction rates. This study provides insight into the characteristics of residence times and rates of O2 reduction and denitrification (NO3− reduction) by comparing reaction rates using multi-model analytical residence time distributions (RTDs) applied to a data set of atmospheric tracers of groundwater age and geochemical data from 141 well samples in the Central Eastern San Joaquin Valley, CA. The RTD approach accounts for mixtures of residence times in a single sample to provide estimates of in-situ rates. Tracers included SF6, CFCs, 3H, He from 3H (tritiogenic He), 14C, and terrigenic He. Parameter estimation and multi-model averaging were used to establish RTDs with lower error variances than those produced by individual RTD models. The set of multi-model RTDs was used in combination with NO3− and dissolved gas data to estimate zero-order and first-order rates of O2 reduction and denitrification. Results indicated that O2 reduction and denitrification rates followed approximately log-normal distributions. Rates of O2 and NO3− reduction were correlated and, on an electron milliequivalent basis, denitrification rates tended to exceed O2 reduction rates. Estimated historical NO3− trends were similar to historical measurements. Here, results show that the multi-model approach can improve estimation of age distributions, and that relatively easily measured O2 rates can provide information about trends in denitrification rates, which are more difficult to estimate.
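
    In the simplest case, the zero-order and first-order rate estimates described above reduce to fitting concentration against residence time. Below is a minimal sketch with synthetic numbers; the study itself fits rates through full residence-time distributions, not single mean ages.

```python
import numpy as np

# Synthetic illustration only, not data from the study.
ages = np.array([2.0, 5.0, 10.0, 20.0, 40.0])   # residence time, years
c0 = 9.0                                        # initial O2, mg/L

# Zero-order decay: C(t) = C0 - k0*t  ->  k0 from a linear fit of C vs. t
k0_true = 0.2                                   # mg/L per year
conc_zero = np.clip(c0 - k0_true * ages, 0.0, None)
k0_est = -np.polyfit(ages, conc_zero, 1)[0]

# First-order decay: C(t) = C0*exp(-k1*t)  ->  k1 from a fit of ln(C) vs. t
k1_true = 0.05                                  # per year
conc_first = c0 * np.exp(-k1_true * ages)
k1_est = -np.polyfit(ages, np.log(conc_first), 1)[0]

print(round(float(k0_est), 3), round(float(k1_est), 3))
```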

  1. Lateral conduction effects on heat-transfer data obtained with the phase-change paint technique

    NASA Technical Reports Server (NTRS)

    Maise, G.; Rossi, M. J.

    1974-01-01

    A computerized tool, CAPE (Conduction Analysis Program using Eigenvalues), has been developed to account for lateral heat conduction in wind tunnel models during data reduction for the phase-change paint technique. The tool also accounts for the effects of finite thickness (thin wings) and surface curvature. A special reduction procedure using just one time of melt is also possible on leading edges. A novel iterative numerical scheme was used, with discretized spatial coordinates but analytic integration in time, to solve the inverse conduction problem involved in the data reduction. A yes-no chart is provided that tells the test engineer when the various corrections are large enough that CAPE should be used. The accuracy of the phase-change paint technique in the presence of finite thickness and lateral conduction is also investigated.

  2. Mapping of power consumption and friction reduction in piezoelectrically-assisted ultrasonic lubrication

    NASA Astrophysics Data System (ADS)

    Dong, Sheng; Dapino, Marcelo J.

    2015-04-01

    Ultrasonic lubrication has been proven effective in reducing dynamic friction. This paper investigates the relationship between friction reduction, power consumption, linear velocity, and normal stress. A modified pin-on-disc tribometer was adopted as the experimental set-up, and a LabVIEW system was utilized for signal generation and data acquisition. Friction reduction was quantified for 0.21 to 5.31 W of electric power, 50 to 200 mm/s of linear velocity, and 23 to 70 MPa of normal stress. Friction reduction near 100% can be achieved under certain conditions. Lower linear velocity and higher electric power result in greater friction reduction, while normal stress has little effect on friction reduction. Contour plots of friction reduction, power consumption, linear velocity, and normal stress were created. An efficiency coefficient was proposed to calculate the electric power required for a given friction reduction, or the friction reduction attainable for a given electric power.

  3. Semi-closed reduction of tripod fractures of zygoma under intraoperative assessment using ultrasonography.

    PubMed

    Soejima, Kazutaka; Sakurai, Hiroyuki; Nozaki, Motohiro; Kitazawa, Yoshihiko; Takeuchi, Masaki; Yamaki, Takashi; Kono, Taro

    2009-04-01

    We conducted semi-closed reduction of isolated tripod fractures of the zygoma through only a brow incision, under intraoperative assessment with ultrasonography. Twenty-three patients with unilateral, non-comminuted tripod fractures of the zygoma were selected for this method at Tokyo Women's Medical University and Tokyo Metropolitan Hiroo General Hospital between April 2002 and April 2006. Patients with orbital floor blowout fractures were excluded. A skin incision was made only at the lateral brow region, and the reduction was performed by inserting an elevator beneath the zygomatic arch. Bone alignment was assessed intraoperatively by ultrasonography. When the reduction was accurate, the frontozygomatic suture was immobilised with a mini-plate under direct visualisation, and transmalar Kirschner wire fixation was performed. The accuracy of the reduction and postoperative movement were evaluated by computed tomography (CT) scans taken at 1 and 6 months. In five cases, the DICOM (Digital Imaging and Communications in Medicine) data from the CT were analysed with 3D imaging software (V-works, CyberMed Co., Korea). In all cases, accurate reduction was obtained. The analysis of the 3D imaging data revealed that postoperative movement of the bone fragment was minimal. When accurate reduction is confirmed intraoperatively, semi-closed reduction and one-plate fixation with a transmalar Kirschner wire are sufficient to treat simple tripod fractures of the zygoma. This method is minimally invasive and requires less operative time.

  4. Network-based de-noising improves prediction from microarray data.

    PubMed

    Kato, Tsuyoshi; Murata, Yukio; Miura, Koh; Asai, Kiyoshi; Horton, Paul B; Koji, Tsuda; Fujibuchi, Wataru

    2006-03-20

    Prediction of human cell response to anti-cancer drugs (compounds) from microarray data is a challenging problem, due to the noise properties of microarrays and the high variance of living-cell responses to drugs. Hence there is a strong need for methods more practical and robust than the standard approaches to real-value prediction. We devised an extended version of the off-subspace noise-reduction (de-noising) method to incorporate heterogeneous network data, such as sequence similarity or protein-protein interactions, into a single framework. Using that method, we first de-noise the gene expression data for the training and test sets, as well as the drug-response data for the training set. Then we predict the unknown responses of each drug from the de-noised input data. To ascertain whether de-noising improves prediction, we carry out 12-fold cross-validation, using Pearson's correlation coefficient between the true and predicted response values as the measure of prediction performance. De-noising improves the prediction performance for 65% of drugs. Furthermore, we found that this noise-reduction method is robust and effective even when a large amount of artificial noise is added to the input data. We conclude that our extended off-subspace noise-reduction method, combining heterogeneous biological data, is successful and quite useful for improving the prediction of human cancer cell drug responses from microarray data.
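
    The evaluation protocol in this record (k-fold cross-validation scored by Pearson's correlation between true and predicted responses) can be sketched as follows. The ordinary least-squares predictor and synthetic data are placeholders for illustration, not the paper's off-subspace de-noising method.

```python
import numpy as np

# Synthetic regression problem standing in for the drug-response data.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
w = rng.normal(size=5)
y = X @ w + 0.1 * rng.normal(size=60)

def pearson(a, b):
    """Pearson correlation coefficient between two 1-D arrays."""
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

# 12-fold cross-validation, as in the abstract's assessment step.
k = 12
folds = np.array_split(np.arange(60), k)
scores = []
for i in range(k):
    test_idx = folds[i]
    train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
    coef, *_ = np.linalg.lstsq(X[train_idx], y[train_idx], rcond=None)
    scores.append(pearson(y[test_idx], X[test_idx] @ coef))
mean_score = float(np.mean(scores))
print(round(mean_score, 3))
```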

  5. The Influence of Function, Topography, and Setting on Noncontingent Reinforcement Effect Sizes for Reduction in Problem Behavior: A Meta-Analysis of Single-Case Experimental Design Data

    ERIC Educational Resources Information Center

    Ritter, William A.; Barnard-Brak, Lucy; Richman, David M.; Grubb, Laura M.

    2018-01-01

    Richman et al. ("J Appl Behav Anal" 48:131-152, 2015) completed a meta-analytic analysis of single-case experimental design data on noncontingent reinforcement (NCR) for the treatment of problem behavior exhibited by individuals with developmental disabilities. Results showed that (1) NCR produced very large effect sizes for reduction in…

  6. Reduction of lithologic-log data to numbers for use in the digital computer

    USGS Publications Warehouse

    Morgan, C.O.; McNellis, J.M.

    1971-01-01

    The development of a standardized system for conveniently coding lithologic-log data for use in the digital computer has long been needed. The technique suggested involves a reduction of the original written alphanumeric log to a numeric log by use of computer programs. This numeric log can then be retrieved as a written log, interrogated for pertinent information, or analyzed statistically. ?? 1971 Plenum Publishing Corporation.
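
    Reducing a written lithologic log to a numeric log might look like the following sketch. The code table and the `encode_log` helper are invented for illustration; the actual USGS system defined its own standardized codes.

```python
# Hypothetical lithology-to-number code table (not the USGS codes).
LITHOLOGY_CODES = {
    "clay": 10,
    "silt": 20,
    "sand": 30,
    "gravel": 40,
    "limestone": 50,
    "shale": 60,
}

def encode_log(entries):
    """Convert (top_depth, bottom_depth, description) records to numeric rows."""
    rows = []
    for top, bottom, desc in entries:
        code = LITHOLOGY_CODES.get(desc.strip().lower())
        if code is None:
            raise ValueError(f"unknown lithology: {desc!r}")
        rows.append((top, bottom, code))
    return rows

# A numeric log like this can then be queried or analyzed statistically.
numeric_log = encode_log([(0, 12, "Clay"), (12, 30, "Sand"), (30, 45, "Shale")])
print(numeric_log)
```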

  7. The UIST image slicing integral field unit

    NASA Astrophysics Data System (ADS)

    Ramsay Howat, S.; Todd, S.; Wells, M.; Hastings, P.

    2006-06-01

    The UKIRT Imager Spectrometer (UIST) contains a deployable integral field unit which is one of the most popular modes of this common-user instrument. In this paper, we review all aspects of the UIST IFU from the design and production of the aluminium mirrors to the integration with the telescope system during commissioning. Reduction of the integral field data is fully supported by the UKIRT data reduction pipeline, ORAC-DR.

  8. Impact of Medicare payment reductions on access to surgical services.

    PubMed Central

    Mitchell, J B; Cromwell, J

    1995-01-01

    OBJECTIVE. This study evaluates the impact of surgical fee reductions under Medicare on the utilization of surgical services. DATA SOURCES. Medicare physician claims data were obtained from 11 states for a five-year time period (1985-1989). STUDY DESIGN. Under OBRA-87, Medicare reduced payments for 11 surgical procedures. A fixed effects regression method was used to determine the impact of these payment reductions on access to care for potentially vulnerable Medicare beneficiaries: joint Medicaid-eligibles, blacks, and the very old. DATA COLLECTION/EXTRACTION METHODS. Medicare claims and enrollment data were used to construct a cross-section time-series of population-based surgical rates from 1985 through 1989. PRINCIPAL FINDINGS. Reductions in surgical fees led to small but significant increases in use for three procedures, small decreases in use for two procedures, and no impact on the remaining six procedures. There was little evidence that access to surgery was impaired for potentially vulnerable enrollees; in fact, declining fees often led to greater rates of increases for some subgroups. CONCLUSIONS. Our results suggest that volume responses by surgeons to payment changes under the Medicare Fee Schedule may be smaller than HCFA's original estimates. Nevertheless, both access and quality of care should continue to be closely monitored. PMID:8537224

  9. The Gravity Probe B `Niobium bird' experiment: Verifying the data reduction scheme for estimating the relativistic precession of Earth-orbiting gyroscopes

    NASA Technical Reports Server (NTRS)

    Uemaatsu, Hirohiko; Parkinson, Bradford W.; Lockhart, James M.; Muhlfelder, Barry

    1993-01-01

    Gravity Probe B (GP-B) is a relativity gyroscope experiment begun at Stanford University in 1960 and supported by NASA since 1963. This experiment will check, for the first time, the relativistic precession of an Earth-orbiting gyroscope predicted by Einstein's General Theory of Relativity, to an accuracy of 1 milliarcsecond per year or better. A drag-free satellite will carry four gyroscopes in a polar orbit to observe their relativistic precession. The primary sensor for measuring the direction of the gyroscope spin axis is the SQUID (superconducting quantum interference device) magnetometer. The data reduction scheme designed for the GP-B program processes the signal from the SQUID magnetometer and estimates the relativistic precession rates. We formulated the data reduction scheme and designed the Niobium bird experiment to verify its performance experimentally, with an actual SQUID magnetometer in the test loop. This paper reports the results from the first phase of the Niobium bird experiment, which used a commercially available SQUID magnetometer as its primary sensor, and addresses the issues those results raised. The first phase revealed a large, temperature-dependent bias drift, motivating a temperature-insensitive design and a temperature regulation scheme.

  10. Structural parameters that influence the noise reduction characteristics of typical general aviation materials

    NASA Technical Reports Server (NTRS)

    Roskam, J.; Grosveld, F.

    1980-01-01

    The effects of panel curvature and oblique angle of sound incidence on the noise-reduction characteristics of an aluminum panel are experimentally investigated. The panel-curvature results show a significant increase in stiffness, with a corresponding decrease in sound transmission through the panel in the frequency region below the panel/cavity resonance frequency. Noise-reduction data were obtained for aluminum panels with clamped, bonded, and riveted edge conditions; these edge conditions are shown to influence the noise-reduction characteristics of aluminum panels. Experimentally measured noise-reduction characteristics of flat aluminum panels with uniaxial and biaxial in-plane stresses are presented and discussed. The results indicate a marked improvement in the noise reduction of these panels in the frequency range below the fundamental panel/cavity resonance frequency.

  11. A toolbox for alleviating traffic congestion and enhancing mobility

    DOT National Transportation Integrated Search

    2012-06-01

    This report presents the Tolling Data Test Plan for the national evaluation of the Los Angeles (LA) Congestion Reduction Demonstration (Metro ExpressLanes) under the United States Department of Transportation (U.S. DOT) Congestion Reduction Demonstra...

  12. Alternative Fuels Data Center: Strategies to Conserve Fuel

    Science.gov Websites

    Find ways to save fuel and money through idle reduction, efficient driving practices, and outfitting your fleet's vehicles with fuel-saving parts and equipment.

  13. Ion Association, Solubilities, and Reduction Potentials in Aqueous Solution.

    ERIC Educational Resources Information Center

    Russo, Steven O.; Hanania, George I. H.

    1989-01-01

    Incorporates the combined effects of ionic strength and ion association to show how calculations involving ionic equilibria are carried out. Examines the variability of reduction potential data for two aqueous redox systems. Provides several examples. (MVL)

  14. Share the road campaign research study : final report

    DOT National Transportation Integrated Search

    2012-07-01

    This report presents the Environmental Data Test Plan for the national evaluation of the Los Angeles (LA) Congestion Reduction Demonstration (Metro ExpressLanes) under the United States Department of Transportation (U.S. DOT) Congestion Reduction Dem...

  15. Cluster Correspondence Analysis.

    PubMed

    van de Velden, M; D'Enza, A Iodice; Palumbo, F

    2017-03-01

    A method is proposed that combines dimension reduction and cluster analysis for categorical data by simultaneously assigning individuals to clusters and optimal scaling values to categories, in such a way that a single between-cluster variance maximization objective is achieved. In a unified framework, a brief review of alternative methods is provided, and we show that the proposed method is equivalent to GROUPALS applied to categorical data. Performance of the methods is appraised by means of a simulation study. The results of the joint dimension reduction and clustering methods are compared with the so-called tandem approach, a sequential analysis of dimension reduction followed by cluster analysis. The tandem approach is conjectured to perform worse when variables are added that are unrelated to the cluster structure. Our simulation study confirms this conjecture. Moreover, the results of the simulation study indicate that the proposed method also consistently outperforms alternative joint dimension reduction and clustering methods.
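
    For reference, a minimal sketch of the tandem approach that the record compares against: dimension reduction (here PCA via SVD) followed by a basic Lloyd's k-means on the reduced scores. The synthetic continuous data are for illustration only; the paper's joint method instead optimizes both steps against a single objective for categorical data.

```python
import numpy as np

# Two well-separated synthetic groups in a 4-variable space.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (20, 4)),
               rng.normal(2.0, 0.3, (20, 4))])

# Step 1: PCA -- project centered data onto the top 2 right singular vectors.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

# Step 2: Lloyd's k-means (k=2) on the reduced scores, seeded with one
# point from each end of the sample.
centers = Z[[0, -1]].copy()
for _ in range(20):
    labels = np.argmin(((Z[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([Z[labels == j].mean(axis=0) for j in range(2)])
print(labels.tolist())
```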

  16. Exploring the CAESAR database using dimensionality reduction techniques

    NASA Astrophysics Data System (ADS)

    Mendoza-Schrock, Olga; Raymer, Michael L.

    2012-06-01

    The Civilian American and European Surface Anthropometry Resource (CAESAR) database, containing over 40 anthropometric measurements on over 4000 humans, has been extensively explored for pattern recognition and classification purposes using the raw, original data [1-4]. However, some of the anthropometric variables would be impossible to collect in an uncontrolled environment. Here, we explore the use of dimensionality reduction methods, in concert with a variety of classification algorithms, for gender classification using only those variables that are readily observable in an uncontrolled environment. Several dimensionality reduction techniques are employed to learn the underlying structure of the data. These techniques include linear projections, such as classical Principal Components Analysis (PCA), and non-linear (manifold learning) techniques, such as Diffusion Maps and Isomap. This paper briefly describes all three techniques and compares three different classifiers, Naïve Bayes, AdaBoost, and Support Vector Machines (SVM), for gender classification in conjunction with each of these three dimensionality reduction approaches.
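
    A rough sketch of such a reduce-then-classify pipeline, with two assumptions made for brevity: the measurements are synthetic (not drawn from CAESAR), and a simple nearest-centroid rule stands in for the Naïve Bayes / AdaBoost / SVM classifiers named above.

```python
import numpy as np

# Two synthetic "gender" groups in a 6-measurement space.
rng = np.random.default_rng(2)
n = 100
X = np.vstack([rng.normal(0.0, 1.0, (n, 6)),
               rng.normal(1.5, 1.0, (n, 6))])
y = np.array([0] * n + [1] * n)

# Dimensionality reduction: PCA to 2 components via SVD.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

# Hold out every 4th sample for testing; classify by nearest class centroid
# in the reduced space.
test = np.arange(2 * n) % 4 == 0
train = ~test
cents = np.array([Z[train & (y == c)].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z[test][:, None] - cents[None]) ** 2).sum(-1), axis=1)
accuracy = float((pred == y[test]).mean())
print(round(accuracy, 3))
```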

  17. Separate Flow Nozzle Test Status Meeting

    NASA Technical Reports Server (NTRS)

    Saiyed, Naseem H. (Editor)

    2000-01-01

    NASA Glenn, in partnership with US industry, completed an exhaustive experimental study of jet noise reduction from separate-flow nozzle exhaust systems. The study developed a database on nozzles of various bypass ratios, screened the quietest configurations, and acquired pertinent data for predicting the plume behavior and ultimately its corresponding jet noise. Several exhaust system configurations provided over 2.5 EPNdB of jet noise reduction at take-off power. These data were disseminated to the US aerospace industry at a conference hosted by NASA GRC, whose proceedings are presented in this report.

  18. Reduction of solar vector magnetograph data using a microMSP array processor

    NASA Technical Reports Server (NTRS)

    Kineke, Jack

    1990-01-01

    The processing of raw data obtained by the solar vector magnetograph at NASA-Marshall requires extensive arithmetic operations on large arrays of real numbers. The objectives of this summer faculty fellowship study are to: (1) learn the programming language of the MicroMSP Array Processor and adapt some existing data reduction routines to exploit its capabilities; and (2) identify other applications and/or existing programs which lend themselves to array processor utilization which can be developed by undergraduate student programmers under the provisions of project JOVE.

  19. GIXSGUI : a MATLAB toolbox for grazing-incidence X-ray scattering data visualization and reduction, and indexing of buried three-dimensional periodic nanostructured films

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Zhang

    GIXSGUI is a MATLAB toolbox that offers both a graphical user interface and script-based access to visualize and process grazing-incidence X-ray scattering data from nanostructures on surfaces and in thin films. It provides routine surface scattering data-reduction methods such as geometric correction, one-dimensional intensity linecuts, two-dimensional intensity reshaping, etc. Three-dimensional indexing is also implemented to determine the space group and lattice parameters of buried organized nanoscopic structures in supported thin films.

  20. The observational basis for JPL's DE 200, the planetary ephemerides of the Astronomical Almanac

    NASA Astrophysics Data System (ADS)

    Standish, E. M., Jr.

    1990-07-01

    This paper documents the planetary observational data used in a series of ephemerides produced at JPL over six years preceding the creation of DE118/LE62, the set which transformed directly into the JD2000-based set, DE200/LE200. Details of the data reduction procedures are presented, and techniques to overcome the uncertainties due to planetary topography are described. For the spacecraft data, the basic reductions are augmented by formulations for locating the transponder, whether in orbit or landed on the surface of a planet.

  1. ORAC-DR -- imaging data reduction

    NASA Astrophysics Data System (ADS)

    Currie, Malcolm J.; Cavanagh, Brad

    ORAC-DR is a general-purpose automatic data-reduction pipeline environment. This document describes its use to reduce imaging data collected at the United Kingdom Infrared Telescope (UKIRT) with the UFTI, UIST, IRCAM, and Michelle instruments; at the Anglo-Australian Telescope (AAT) with the IRIS2 instrument; at the Very Large Telescope with ISAAC and NACO; from Magellan's Classic Cam; at Gemini with NIRI; and from the Isaac Newton Group using INGRID. It outlines the algorithms used, how to make minor modifications to them, and how to correct for errors made at the telescope.

  2. ORAC-DR: Overview and General Introduction

    NASA Astrophysics Data System (ADS)

    Economou, Frossie; Jenness, Tim; Currie, Malcolm J.; Adamson, Andy; Allan, Alasdair; Cavanagh, Brad

    ORAC-DR is a general purpose automatic data reduction pipeline environment. It currently supports data reduction for the United Kingdom Infrared Telescope (UKIRT) instruments UFTI, IRCAM, UIST and CGS4, for the James Clerk Maxwell Telescope (JCMT) instrument SCUBA, for the William Herschel Telescope (WHT) instrument INGRID, for the European Southern Observatory (ESO) instrument ISAAC and for the Anglo-Australian Telescope (AAT) instrument IRIS-2. This document describes the general pipeline environment. For specific information on how to reduce the data for a particular instrument, please consult the appropriate ORAC-DR instrument guide.

  3. An estimating equation approach to dimension reduction for longitudinal data

    PubMed Central

    Xu, Kelin; Guo, Wensheng; Xiong, Momiao; Zhu, Liping; Jin, Li

    2016-01-01

    Sufficient dimension reduction has been extensively explored in the context of independent and identically distributed data. In this article we generalize sufficient dimension reduction to longitudinal data and propose an estimating equation approach to estimating the central mean subspace. The proposed method accounts for the covariance structure within each subject and improves estimation efficiency when the covariance structure is correctly specified. Even if the covariance structure is misspecified, our estimator remains consistent. In addition, our method relaxes distributional assumptions on the covariates and is doubly robust. To determine the structural dimension of the central mean subspace, we propose a Bayesian-type information criterion. We show that the estimated structural dimension is consistent and that the estimated basis directions are root-n consistent, asymptotically normal and locally efficient. Simulations and an analysis of the Framingham Heart Study data confirm the effectiveness of our approach. PMID:27017956

  4. Data reduction of isotope-resolved LC-MS spectra.

    PubMed

    Du, Peicheng; Sudha, Rajagopalan; Prystowsky, Michael B; Angeletti, Ruth Hogue

    2007-06-01

    Data reduction of liquid chromatography-mass spectrometry (LC-MS) spectra can be a challenge due to the inherent complexity of biological samples, noise, and non-flat baselines. We present a new algorithm, LCMS-2D, for reliable data reduction of LC-MS proteomics data. LCMS-2D can reliably reduce LC-MS spectra with multiple scans to a list of elution peaks, and subsequently to a list of peptide masses. It is capable of noise removal, and of deconvoluting peaks that overlap in m/z, in retention time, or both, by using a novel iterative peak-picking step, a 'rescue' step, and a modified variable selection method. LCMS-2D performs well with three sets of annotated LC-MS spectra, yielding results that are better than those from PepList, msInspect, and the vendor software BioAnalyst. The software LCMS-2D is available under the GNU General Public License from http://www.bioc.aecom.yu.edu/labs/angellab/ as a standalone C program running on LINUX.
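
    The elution-peak-picking idea can be illustrated with a toy thresholded local-maximum detector on a synthetic one-dimensional chromatogram. This is only a sketch of the general concept; LCMS-2D's iterative peak-picking, 'rescue', and variable selection steps are far more involved.

```python
import numpy as np

# Synthetic chromatogram: two Gaussian elution peaks plus noise.
t = np.linspace(0.0, 10.0, 1001)
signal = (np.exp(-((t - 3.0) / 0.2) ** 2)
          + 0.6 * np.exp(-((t - 7.0) / 0.3) ** 2))
rng = np.random.default_rng(3)
noisy = signal + 0.02 * rng.normal(size=t.size)

# Detect local maxima above a noise threshold.
thresh = 0.2
is_peak = ((noisy[1:-1] > noisy[:-2])
           & (noisy[1:-1] > noisy[2:])
           & (noisy[1:-1] > thresh))
peak_times = t[1:-1][is_peak]

# Merge maxima closer than 1.0 time units into single elution peaks.
merged = []
for pt in peak_times:
    if not merged or pt - merged[-1] > 1.0:
        merged.append(float(pt))
print(merged)
```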

  5. Clean Cities 2012 Annual Metrics Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Caley

    2013-12-01

    The U.S. Department of Energy's (DOE) Clean Cities program advances the nation's economic, environmental, and energy security by supporting local actions to cut petroleum use in transportation. A national network of nearly 100 Clean Cities coalitions brings together stakeholders in the public and private sectors to deploy alternative and renewable fuels, idle-reduction measures, fuel economy improvements, and new transportation technologies as they emerge. Each year DOE asks Clean Cities coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterizes the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this report.

  6. A system architecture for online data interpretation and reduction in fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Röder, Thorsten; Geisbauer, Matthias; Chen, Yang; Knoll, Alois; Uhl, Rainer

    2010-01-01

    In this paper we present a high-throughput sample screening system that enables real-time data analysis and reduction for live cell analysis using fluorescence microscopy. We propose a novel system architecture capable of analyzing a large amount of samples during the experiment and thus greatly minimizing the post-analysis phase that is the common practice today. By utilizing data reduction algorithms, relevant information of the target cells is extracted from the online collected data stream, and then used to adjust the experiment parameters in real-time, allowing the system to dynamically react on changing sample properties and to control the microscope setup accordingly. The proposed system consists of an integrated DSP-FPGA hybrid solution to ensure the required real-time constraints, to execute efficiently the underlying computer vision algorithms and to close the perception-action loop. We demonstrate our approach by addressing the selective imaging of cells with a particular combination of markers. With this novel closed-loop system the amount of superfluous collected data is minimized, while at the same time the information entropy increases.

  7. Concepts and procedures required for successful reduction of tensor magnetic gradiometer data obtained from an unexploded ordnance detection demonstration at Yuma Proving Grounds, Arizona

    USGS Publications Warehouse

    Bracken, Robert E.; Brown, Philip J.

    2006-01-01

    On March 12, 2003, data were gathered at Yuma Proving Grounds, in Arizona, using a Tensor Magnetic Gradiometer System (TMGS). This report shows how these data were processed and explains concepts required for successful TMGS data reduction. Important concepts discussed include extreme attitudinal sensitivity of vector measurements, low attitudinal sensitivity of gradient measurements, leakage of the common-mode field into gradient measurements, consequences of thermal drift, and effects of field curvature. Spatial-data collection procedures and a spin-calibration method are addressed. Discussions of data-reduction procedures include tracking of axial data by mathematically matching transfer functions among the axes, derivation and application of calibration coefficients, calculation of sensor-pair gradients, thermal-drift corrections, and gradient collocation. For presentation, the magnetic tensor at each data station is converted to a scalar quantity, the I2 tensor invariant, which is easily found by calculating the determinant of the tensor. At important processing junctures, the determinants for all stations in the mapped area are shown in shaded relief map-view. Final processed results are compared to a mathematical model to show the validity of the assumptions made during processing and the reasonableness of the ultimate answer obtained.
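
Since the I2 invariant is simply the determinant of the measured gradient tensor, the map-view quantity described above can be sketched in a few lines. The tensor values below are hypothetical, chosen only to be symmetric and traceless, as a magnetic gradient tensor in source-free air must be:

```python
import numpy as np

def i2_invariant(gradient_tensor):
    """Scalar I2 invariant of a 3x3 magnetic gradient tensor,
    computed as its determinant (the presentation quantity used
    in the report)."""
    return np.linalg.det(np.asarray(gradient_tensor, dtype=float))

# Hypothetical symmetric, traceless gradient tensor (e.g., in nT/m):
G = np.array([[ 1.2,  0.4, -0.3],
              [ 0.4, -0.7,  0.5],
              [-0.3,  0.5, -0.5]])
print(round(i2_invariant(G), 3))  # 0.143
```

Evaluating this determinant at every station yields the scalar field that the report displays in shaded-relief map view.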

  8. Reduction and analysis of data from the plasma wave instruments on the IMP-6 and IMP-8 spacecraft

    NASA Technical Reports Server (NTRS)

    Gurnett, D. A.; Anderson, R. R.

    1983-01-01

    The primary data reduction effort during the reporting period was to process summary plots of the IMP 8 plasma wave data and to submit these data to the National Space Science Data Center. Features of the electrostatic noise are compared with simultaneous observations of the magnetic field, plasma, and energetic electrons. Spectral characteristics of the noise and the results of this comparison both suggest that, at least in its high-frequency part, the noise does not belong to normal modes of plasma waves but represents either quasi-thermal noise in the non-Maxwellian plasma or artificial noise generated by spacecraft interaction with the medium.

  9. The anaerobic degradation of organic matter in Danish coastal sediments - Iron reduction, manganese reduction, and sulfate reduction

    NASA Technical Reports Server (NTRS)

    Canfield, Donald E.; Thamdrup, BO; Hansen, Jens W.

    1993-01-01

    A combination of porewater and solid-phase analyses as well as a series of sediment incubations are used to quantify organic carbon oxidation by dissimilatory Fe reduction, Mn reduction, and sulfate reduction in sediments from the Skagerrak (located off the northeast coast of Jutland, Denmark). Solid-phase data are integrated with incubation results to define the zones of the various oxidation processes. At S(9), surface Mn enrichments of up to 3.5 wt pct were found, and with such a ready source of Mn, dissimilatory Mn reduction was the only significant anaerobic process of carbon oxidation in the surface 10 cm of the sediment. At S(4) and S(6), active Mn reduction occurred; however, most of the Mn reduction may have resulted from the oxidation of acid volatile sulfides and Fe(2+) rather than from dissimilatory Mn reduction. Dissolved Mn(2+) was found to completely adsorb onto sediment containing fully oxidized Mn oxides.

  10. Impact of the UK voluntary sodium reduction targets on the sodium content of processed foods from 2006 to 2011: analysis of household consumer panel data.

    PubMed

    Eyles, Helen; Webster, Jacqueline; Jebb, Susan; Capelin, Cathy; Neal, Bruce; Ni Mhurchu, Cliona

    2013-11-01

    In 2006 the UK Food Standards Agency (FSA) introduced voluntary sodium reduction targets for more than 80 categories of processed food. Our aim was to determine the impact of these targets on the sodium content of processed foods in the UK between 2006 and 2011. Household consumer panel data (n>18,000 households) were used to calculate crude and sales-weighted mean sodium content for 47,337 products in 2006 and 49,714 products in 2011. Two sample t-tests were used to compare means. A secondary analysis was undertaken to explore reformulation efforts and included only products available for sale in both 2006 and 2011. Between 2006 and 2011 there was an overall mean reduction in crude sodium content of UK foods of -26 mg/100g (p ≤ 0.001), equivalent to a 7% fall (356 mg/100g to 330 mg/100g). The corresponding sales-weighted reduction was -21 mg/100g (-6%). For products available for sale in both years the corresponding reduction was -23 mg/100g (p<0.001) or -7%. The UK FSA voluntary targets delivered a moderate reduction in the mean sodium content of UK processed foods between 2006 and 2011. Whilst encouraging, regular monitoring and review of the UK sodium reduction strategy will be essential to ensure continued progress.
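
The headline figure quoted above is a simple relative change between the two crude means; as a minimal sketch, using only the two means reported in the abstract:

```python
def percent_change(before, after):
    """Relative change (%) between two mean sodium contents (mg/100 g)."""
    return 100.0 * (after - before) / before

# Crude means reported for 2006 and 2011:
print(round(percent_change(356, 330), 1))  # -7.3, i.e. roughly a 7% fall
```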

  11. Sodium intake status in United States and potential reduction modeling: an NHANES 2007-2010 analysis.

    PubMed

    Agarwal, Sanjiv; Fulgoni, Victor L; Spence, Lisa; Samuel, Priscilla

    2015-11-01

    Limiting dietary sodium intake has been a consistent dietary recommendation. Using NHANES 2007-2010 data, we estimated current sodium intake and modeled the potential impact of a new sodium reduction technology on sodium intake. NHANES 2007-2010 data were used to assess current sodium intake. The National Cancer Institute method was used for usual intake determination. Suggested sodium reductions using SODA-LO® Salt Microspheres ranged from 20% to 30% in 953 foods, and usual intakes were modeled by using various reduction factors and levels of market penetration. SAS 9.2, SUDAAN 11, and NHANES survey weights were used in all calculations, with assessment across gender and age groups. Current (2007-2010) sodium intake (mg/day) exceeds recommendations across all age-gender groups and has not changed during the last decade. However, sodium intake measured as a function of food intake (mg/g food) has decreased significantly during the last decade. Two food categories contribute about two-thirds of total sodium intake: "Grain Products" and "Meat, Poultry, Fish & Mixtures". Sodium reduction, with 100% market penetration of the new technology, was estimated to be 230-300 mg/day, or 7-9% of intake, depending upon age and gender group. Sodium reduction innovations like SODA-LO® Salt Microspheres could contribute to meaningful reductions in sodium intake.
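
The modeling step described above, applying per-product reduction factors and market-penetration levels to usual intake, can be sketched as follows. All numbers here are purely illustrative, not the paper's NCI-method estimates:

```python
def modeled_intake(current_mg, affected_share, reduction, penetration):
    """Usual sodium intake (mg/day) after scaling back the portion that
    comes from reformulated foods. Arguments are illustrative:
    `affected_share` is the fraction of intake from products carrying the
    reduction, `reduction` the per-product sodium cut, and `penetration`
    the market share reached by the technology."""
    return current_mg * (1 - affected_share * reduction * penetration)

# e.g. 3600 mg/day with 40% of intake affected, a 25% cut, full uptake:
print(round(modeled_intake(3600, 0.40, 0.25, 1.0)))  # 3240
```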

  12. Lessons Learned From Community-Based Approaches to Sodium Reduction

    PubMed Central

    Kane, Heather; Strazza, Karen; Losby, Jan L.; Lane, Rashon; Mugavero, Kristy; Anater, Andrea S.; Frost, Corey; Margolis, Marjorie; Hersey, James

    2017-01-01

    Purpose: This article describes lessons from a Centers for Disease Control and Prevention initiative encompassing sodium reduction interventions in six communities. Design: A multiple case study design was used. Setting: This evaluation examined data from programs implemented in six communities located in New York (Broome County, Schenectady County, and New York City); California (Los Angeles County and Shasta County); and Kansas (Shawnee County). Subjects: Participants (n = 80) included program staff, program directors, state-level staff, and partners. Measures: Measures for this evaluation included challenges, facilitators, and lessons learned from implementing sodium reduction strategies. Analysis: The project team conducted a document review of program materials and semistructured interviews 12 to 14 months after implementation. The team coded and analyzed data deductively and inductively. Results: Five lessons for implementing community-based sodium reduction approaches emerged: (1) build relationships with partners to understand their concerns, (2) involve individuals knowledgeable about specific venues early, (3) incorporate sodium reduction efforts and messaging into broader nutrition efforts, (4) design the program to reduce sodium gradually to take into account consumer preferences and taste transitions, and (5) identify ways to address the cost of lower-sodium products. Conclusion: The experiences of the six communities may assist practitioners in planning community-based sodium reduction interventions. Addressing sodium reduction using a community-based approach can foster meaningful change in dietary sodium consumption. PMID:24575726

  13. A data reduction, management, and analysis system for a 10-terabyte data set

    NASA Technical Reports Server (NTRS)

    DeMajistre, R.; Suther, L.

    1995-01-01

    Within 12 months, a 5-year space-based research investigation with an estimated daily data volume of 10 to 15 gigabytes will be launched. Our instrument/analysis team will analyze 2 to 8 gigabytes per day from this mission. Most of these data will be spatial and multispectral, collected from nine sensors covering the UV/visible/NIR spectrum. The volume and diversity of these data and the nature of their analysis require a very robust reduction and management system. This paper is a summary of the system's requirements and a high-level description of a solution. The paper is intended as a case study of the problems and potential solutions faced by the new generation of Earth observation data support systems.

  14. Data on Support Vector Machines (SVM) model to forecast photovoltaic power.

    PubMed

    Malvoni, M; De Giorgi, M G; Congedo, P M

    2016-12-01

    The data concern the photovoltaic (PV) power forecasted by a hybrid model that considers weather variations and applies a technique to reduce the input data size, as presented in the paper entitled "Photovoltaic forecast based on hybrid pca-lssvm using dimensionality reducted data" (M. Malvoni, M.G. De Giorgi, P.M. Congedo, 2015) [1]. The quadratic Renyi entropy criterion together with principal component analysis (PCA) is applied to the Least Squares Support Vector Machines (LS-SVM) to predict the PV power in the day-ahead time frame. The data shared here represent the results of the proposed approach. Hourly PV power predictions for 1, 3, 6, 12, and 24 hours ahead, and for different data reduction sizes, are provided in the Supplementary material.

  15. Safety Changes in the US Vehicle Fleet since Model Year 1990, Based on NASS Data

    PubMed Central

    Eigen, Ana Maria; Digges, Kennerly; Samaha, Randa Radwan

    2012-01-01

    Based on the National Automotive Sampling System Crashworthiness Data System, since the 1988–1992 model years there has been a reduction in the MAIS 3+ injury rate and the mean HARM for all crash modes. The largest improvement in vehicle safety has been in rollovers. There was an increase in the rollover injury rate in the 1993–1998 model year period, but a reduction since then. When comparing vehicles of model years 1993 to 1998 with later model vehicles, the most profound difference was the reduction of rollover frequency for SUVs, down more than 20% when compared to other crash modes. When considering only model years since 2002, the rollover frequency reduction was nearly 40%. A 26% reduction in the rate of moderate and serious injuries for all drivers in rollovers was observed for model years later than 1998. The overall belt use rate for drivers of late-model vehicles with HARM-weighted injuries was 62%, up from 54% in earlier-model vehicles. However, in rollover crashes, the same belt use rate lagged at 54%. PMID:23169134

  16. The Kinetic Mechanism for Cytochrome P450 Metabolism of Type II Binding Compounds: Evidence Supporting Direct Reduction

    PubMed Central

    Pearson, Joshua; Dahal, Upendra P.; Rock, Daniel; Peng, Chi-Chi; Schenk, James O.; Joswig-Jones, Carolyn; Jones, Jeffrey P.

    2011-01-01

    The metabolic stability of a drug is an important property that should be optimized during drug design and development. Nitrogen incorporation is hypothesized to increase stability by coordination of nitrogen to the heme iron of cytochrome P450, a binding mode that is referred to as type II binding. However, we noticed that the type II binding compound 1 has less metabolic stability at subsaturating conditions than a closely related type I binding compound 3. Three kinetic models are presented for type II binder metabolism: (1) dead-end type II binding, (2) a rapid equilibrium between type I and type II binding modes before reduction, and (3) direct reduction of the type II coordinated heme. Data are presented on reduction rates of iron, the off-rates of substrate (using surface plasmon resonance), and the catalytic rate constants. These data argue against the dead-end and rapid-equilibrium models, leaving the direct-reduction kinetic mechanism for metabolism of the type II binding compound 1. PMID:21530484

  17. A trace ratio maximization approach to multiple kernel-based dimensionality reduction.

    PubMed

    Jiang, Wenhao; Chung, Fu-lai

    2014-01-01

    Most dimensionality reduction techniques are based on one metric or one kernel; hence it is necessary to select an appropriate kernel for kernel-based dimensionality reduction. Multiple kernel learning for dimensionality reduction (MKL-DR) has recently been proposed to learn a kernel from a set of base kernels, which are seen as different descriptions of the data. As MKL-DR does not involve regularization, it might be ill-posed under some conditions, and consequently its applications are hindered. This paper proposes a multiple kernel learning framework for dimensionality reduction based on a regularized trace ratio, termed MKL-TR. Our method aims at learning a transformation into a space of lower dimension and a corresponding kernel from the given base kernels, among which some may not be suitable for the given data. The solutions for the proposed framework can be found based on trace ratio maximization. The experimental results demonstrate its effectiveness on benchmark datasets, which include text, image, and sound datasets, for supervised, unsupervised, as well as semi-supervised settings.
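
The trace-ratio criterion at the heart of MKL-TR can be illustrated with the standard iteration for maximizing Tr(W^T A W)/Tr(W^T B W) over orthonormal W; this generic sketch deliberately omits the kernel-weight learning that is the paper's actual contribution, and the matrices below are random stand-ins for scatter matrices:

```python
import numpy as np

def trace_ratio(A, B, k, iters=50):
    """Iterative trace-ratio maximization over orthonormal W (n x k):
    repeatedly take the top-k eigenvectors of A - lam*B, then update
    lam to the current ratio (a generic solver, not MKL-TR itself)."""
    lam = 0.0
    for _ in range(iters):
        _, vecs = np.linalg.eigh(A - lam * B)   # ascending eigenvalues
        W = vecs[:, -k:]                        # top-k eigenvectors
        lam = np.trace(W.T @ A @ W) / np.trace(W.T @ B @ W)
    return W, lam

# Hypothetical scatter-like matrices: A positive semidefinite, B positive definite.
rng = np.random.default_rng(3)
M = rng.normal(size=(6, 6)); A = M @ M.T
N = rng.normal(size=(6, 6)); B = N @ N.T + 6 * np.eye(6)
W, lam = trace_ratio(A, B, 2)
print(lam > 0)  # the converged ratio is positive for these matrices
```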

  18. Effect of sampling rate and record length on the determination of stability and control derivatives

    NASA Technical Reports Server (NTRS)

    Brenner, M. J.; Iliff, K. W.; Whitman, R. K.

    1978-01-01

    Flight data from five aircraft were used to assess the effects of sampling rate and record length reductions on estimates of stability and control derivatives produced by a maximum likelihood estimation method. Derivatives could be extracted from flight data with the maximum likelihood estimation method even if there were considerable reductions in sampling rate and/or record length. Small-amplitude pulse maneuvers showed greater degradation of the derivative estimates than large-amplitude pulse maneuvers when these reductions were made. Reducing the sampling rate was found to be more desirable than reducing the record length as a method of lessening the total computation time required without greatly degrading the quality of the estimates.

  19. Thermodynamic Investigation of the Reduction-Distillation Process for Rare Earth Metals Production

    NASA Astrophysics Data System (ADS)

    Judge, W. D.; Azimi, G.

    2017-10-01

    Owing to their high vapor pressure, the four rare earth metals samarium, europium, thulium, and ytterbium are produced by reduction-distillation whereby their oxides are reduced with metallic lanthanum in vacuo, and the produced metal is subsequently vaporized off. Here, we performed a thorough thermodynamic investigation to establish a fundamental understanding of the reduction-distillation process. Thermodynamic functions including vapor pressures, Gibbs free energies, and enthalpies of reaction were calculated and compared with available experimental data. Furthermore, the kinetics of the process was explored and theoretical evaporation rates were calculated from thermodynamic data. The thermodynamic model developed in this work can help optimize processing conditions to maximize the yield and improve the overall process.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garlapati, Shravan; Kuruganti, Teja; Buehrer, Michael R.

    The utilization of state-of-the-art 3G cellular CDMA technologies in a utility-owned AMI network results in a large amount of control traffic relative to data traffic and increases the average packet delay, and hence is not an appropriate choice for smart grid distribution applications. Like the CDG, we consider a utility-owned cellular-like CDMA network for smart grid distribution applications and classify the distribution smart grid data as scheduled data and random data. We also propose the SMAC protocol, which changes its mode of operation based on the type of data being collected, to reduce the data collection latency and control overhead when compared to the 3G cellular CDMA2000 MAC. The reduction in data collection latency and control overhead aids in increasing the number of smart meters served by a base station within the periodic data collection interval, which further reduces the number of base stations needed by a utility or reduces the bandwidth needed to collect data from all the smart meters. The reduction in the number of base stations and/or the data transmission bandwidth reduces the CAPital EXpenditure (CAPEX) and OPerational EXpenditure (OPEX) of the AMI network. Finally, the proposed SMAC protocol is analyzed using a Markov chain; analytical expressions for average throughput and average packet delay are derived, and simulation results are provided to verify the analysis.

  1. Plasma diagnostics package. Volume 2: Spacelab 2 section. Part B: Thesis projects

    NASA Technical Reports Server (NTRS)

    Pickett, Jolene S. (Compiler); Frank, L. A. (Compiler); Kurth, W. S. (Compiler)

    1988-01-01

    This volume (Volume 2, consisting of Parts A and B) of the Plasma Diagnostics Package (PDP) Final Science Report contains a summary of all of the data reduction and scientific analyses which were performed using PDP data obtained on STS-51F as part of the Spacelab 2 (SL-2) payload. This work was performed during the period from launch, July 29, 1985, through June 30, 1988. During this period the primary data reduction effort consisted of processing summary plots of the data received by 12 of the 14 instruments located on the PDP and submitting these data to the National Space Science Data Center (NSSDC). Three Master's and three Ph.D. theses were written using PDP instrumentation data. These theses are listed in Volume 2, Part B.

  2. Light curves of flat-spectrum radio sources (Jenness+, 2010)

    NASA Astrophysics Data System (ADS)

    Jenness, T.; Robson, E. I.; Stevens, J. A.

    2010-05-01

    Calibrated data for 143 flat-spectrum extragalactic radio sources are presented at a wavelength of 850um, covering a 5-yr period from 2000 April. The data, obtained at the James Clerk Maxwell Telescope using the Submillimetre Common-User Bolometer Array (SCUBA) camera in pointing mode, were analysed using an automated pipeline process based on the Observatory Reduction and Acquisition Control - Data Reduction (ORAC-DR) system. This paper describes the techniques used to analyse and calibrate the data, and presents the database of results along with a representative sample of the better-sampled light curves. A re-analysis of previously published data from 1997 to 2000 is also presented. The combined catalogue, comprising 10493 flux density measurements, provides a unique and valuable resource for studies of extragalactic radio sources. (2 data files).

  3. Integrated Component-based Data Acquisition Systems for Aerospace Test Facilities

    NASA Technical Reports Server (NTRS)

    Ross, Richard W.

    2001-01-01

    The Multi-Instrument Integrated Data Acquisition System (MIIDAS), developed by the NASA Langley Research Center, uses commercial off-the-shelf (COTS) products, integrated with custom software, to provide a broad range of capabilities at a low cost throughout the system's entire life cycle. MIIDAS combines data acquisition capabilities with online and post-test data reduction computations. COTS products lower purchase and maintenance costs by reducing the level of effort required to meet system requirements. Object-oriented methods are used to enhance modularity, encourage reusability, and promote adaptability, reducing software development costs. Using only COTS products and custom software supported on multiple platforms reduces the cost of porting the system to other platforms. The post-test data reduction capabilities of MIIDAS have been installed at four aerospace testing facilities at NASA Langley Research Center. The systems installed at these facilities provide a common user interface, reducing the training time required for personnel who work across multiple facilities. The techniques employed by MIIDAS enable NASA to build a system with a lower initial purchase price and reduced sustaining maintenance costs. With MIIDAS, NASA has built a highly flexible next-generation data acquisition and reduction system for aerospace test facilities that meets customer expectations.

  4. Algorithm for AEEG data selection leading to wireless and long term epilepsy monitoring.

    PubMed

    Casson, Alexander J; Yates, David C; Patel, Shyam; Rodriguez-Villegas, Esther

    2007-01-01

    High-quality, wireless ambulatory EEG (AEEG) systems that can operate over extended periods of time are not currently feasible due to the high power consumption of wireless transmitters. Previous work has thus proposed data reduction by transmitting only sections of data that contain candidate epileptic activity. This paper investigates algorithms by which this data selection can be carried out. It is essential that the algorithm is low power and that all possible features are identified, even at the expense of more false detections. Given this, a brief review of spike detection algorithms is carried out with a view to using these algorithms to drive the data reduction process. A CWT-based algorithm is deemed most suitable for use; the algorithm is described in detail and its performance tested. It is found that over 90% of expert-marked spikes are identified whilst giving a 40% reduction in the amount of data to be transmitted and analysed. The performance varies with the recording duration in response to each detection, and this effect is also investigated. The proposed algorithm will form the basis of a new AEEG system that allows wireless and longer-term epilepsy monitoring.
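
A minimal version of the selection idea, thresholding a wavelet response and transmitting only the windows around detections, might look like the sketch below. The widths, thresholds, and window lengths are hypothetical, not the paper's tuned values:

```python
import numpy as np

def ricker(points, a):
    """Ricker ('Mexican hat') wavelet, a kernel commonly used for
    CWT-based spike detection."""
    t = np.arange(points) - (points - 1) / 2.0
    x = t / a
    return (1 - x**2) * np.exp(-x**2 / 2)

def select_segments(signal, width=8, threshold=4.0, window=64):
    """Keep only windows around samples whose wavelet response exceeds
    `threshold` robust standard deviations (all parameters hypothetical)."""
    resp = np.convolve(signal, ricker(10 * width, width), mode="same")
    sigma = np.median(np.abs(resp)) / 0.6745 + 1e-12   # robust noise scale
    keep = np.zeros(len(signal), dtype=bool)
    for i in np.flatnonzero(np.abs(resp) > threshold * sigma):
        keep[max(0, i - window): i + window] = True
    return keep

rng = np.random.default_rng(0)
eeg = rng.normal(0.0, 1.0, 5000)
eeg[2500:2510] += 25.0            # injected spike-like transient
keep = select_segments(eeg)
print(keep[2505], round(keep.mean(), 3))  # spike window retained;
                                          # only a small fraction kept
```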

  5. Comparative analysis of death by suicide in Brazil and in the United States: descriptive, cross-sectional time series study.

    PubMed

    Abuabara, Alexander; Abuabara, Allan; Tonchuk, Carin Albino Luçolli

    2017-01-01

    The World Health Organization recognizes suicide as a public health priority. Increased knowledge of suicide risk factors is needed in order to be able to adopt effective prevention strategies. The aim of this study was to analyze and compare the association between the Gini coefficient (which is used to measure inequality) and suicide death rates over a 14-year period (2000-2013) in Brazil and in the United States (US). The hypothesis put forward was that reduction of income inequality is accompanied by reduction of suicide rates. Descriptive cross-sectional time-series study in Brazil and in the US. Population, death and suicide death data were extracted from the DATASUS database in Brazil and from the National Center for Health Statistics in the US. Gini coefficient data were obtained from the World Development Indicators. Time series analysis was performed on Brazilian and American official data regarding the number of deaths caused by suicide between 2000 and 2013 and the Gini coefficients of the two countries. The suicide trends were examined and compared. Brazil and the US present converging Gini coefficients, mainly due to reduction of inequality in Brazil over the last decade. However, suicide rates are not converging as hypothesized, but are in fact rising in both countries. The hypothesis that reduction of income inequality is accompanied by reduction of suicide rates was not verified.

  6. Characterization of drinking water treatment for virus risk assessment.

    PubMed

    Teunis, P F M; Rutjes, S A; Westrell, T; de Roda Husman, A M

    2009-02-01

    Removal or inactivation of viruses in drinking water treatment processes can be quantified by measuring the concentrations of viruses or virus indicators in water before and after treatment. Virus reduction is then calculated from the ratio of these concentrations. Most often only the average reduction is reported. That is not sufficient when treatment efficiency must be characterized in quantitative risk assessment. We present three simple models allowing statistical analysis of series of counts before and after treatment: distribution of the ratio of concentrations, and distribution of the probability of passage for unpaired and paired water samples. Performance of these models is demonstrated for several processes (long and short term storage, coagulation/filtration, coagulation/sedimentation, slow sand filtration, membrane filtration, and ozone disinfection) using microbial indicator data from full-scale treatment processes. All three models allow estimation of the variation in (log) reduction as well as its uncertainty; the results can be easily used in risk assessment. Although they have different characteristics and are present in vastly different concentrations, different viruses and/or bacteriophages appear to show similar reductions in a particular treatment process, allowing generalization of the reduction for each process type across virus groups. The processes characterized in this paper may be used as reference for waterborne virus risk assessment, to check against location specific data, and in case no such data are available, to use as defaults.
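
The basic quantity behind all three models is the (log) reduction computed from concentrations before and after a treatment step; as a minimal sketch with hypothetical counts:

```python
import math

def log10_reduction(c_before, c_after):
    """Decimal log reduction between mean virus concentrations
    before and after treatment (same units for both)."""
    return math.log10(c_before / c_after)

# Hypothetical concentrations: 5000 pfu/L entering, 5 pfu/L leaving.
print(round(log10_reduction(5000, 5), 3))  # 3.0, i.e. 99.9% removal
```

The paper's contribution is putting distributions and uncertainty around this point estimate, which the sketch above deliberately omits.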

  7. Flight Data Reduction of Wake Velocity Measurements Using an Instrumented OV-10 Airplane

    NASA Technical Reports Server (NTRS)

    Vicroy, Dan D.; Stuever, Robert A.; Stewart, Eric C.; Rivers, Robert A.

    1999-01-01

    A series of flight tests to measure the wake of a Lockheed C-130 airplane and the accompanying atmospheric state has been conducted. A specially instrumented North American Rockwell OV-10 airplane was used to measure the wake and atmospheric conditions. An integrated database has been compiled for wake characterization and validation of wake vortex computational models. This paper describes the wake-measurement flight-data reduction process.

  8. Characterization of Particle Combustion in a Rijke Burner

    DTIC Science & Technology

    1988-11-01

    Contents fragments: Chapter 3, The Rijke Burner: 3.1 Introduction; 3.2 Acoustics; 3.3 Experimental Procedure (3.3.1 Apparatus; 3.3.2 Data Reduction); 3.4 Burner [...]. The recoverable abstract text describes (1) the acoustic response of the modified Rijke burner, (2) the experimental procedures, including design modifications of the burner and data reduction, and (3) the major design changes made in the modified Rijke burner.

  9. Chaotic reconfigurable ZCMT precoder for OFDM data encryption and PAPR reduction

    NASA Astrophysics Data System (ADS)

    Chen, Han; Yang, Xuelin; Hu, Weisheng

    2017-12-01

    A secure orthogonal frequency division multiplexing (OFDM) transmission scheme precoded by a chaotic Zadoff-Chu matrix transform (ZCMT) is proposed and demonstrated. It is proved that the reconfigurable ZCMT matrices after row/column permutations can be applied as an alternative precoder for peak-to-average power ratio (PAPR) reduction. The permutations and the reconfigurable parameters in the ZCMT matrix are generated by a hyper digital chaos, in which a huge key space of ∼10^800 is created for physical-layer OFDM data encryption. An encrypted data transmission of 8.9 Gb/s optical OFDM signals is successfully demonstrated over 20 km of standard single-mode fiber (SSMF) for 16-QAM. The BER performance of the encrypted signals is improved by ∼2 dB (at a BER of 10^-3), which is mainly attributed to the effective reduction of PAPR via chaotic ZCMT precoding. Moreover, the chaotic ZCMT precoding scheme requires no sideband information, so the spectral efficiency is enhanced during transmission.
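
PAPR itself, the quantity the chaotic precoder is designed to lower, is straightforward to measure on a baseband OFDM symbol. The 256-subcarrier 16-QAM symbol below is hypothetical and unrelated to the demonstrated 8.9 Gb/s system:

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband symbol, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Hypothetical OFDM symbol: 256 subcarriers of random 16-QAM data.
rng = np.random.default_rng(1)
levels = np.array([-3.0, -1.0, 1.0, 3.0])
qam = rng.choice(levels, 256) + 1j * rng.choice(levels, 256)
symbol = np.fft.ifft(qam)
print(round(papr_db(symbol), 1), "dB")
```

A precoder such as the ZCMT scheme is judged by how far it pulls this figure down across many random symbols.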

  10. “We as Drug Addicts Need that Program”: Insight from Rural African American Cocaine Users on Designing a Sexual Risk Reduction Intervention for Their Community

    PubMed Central

    Montgomery, Brooke E. E.; Stewart, Katharine E.; Wright, Patricia B.; McSweeney, Jean; Booth, Brenda M.

    2013-01-01

    This focused ethnographic study examines data collected in 2007 from four gender- and age-specific focus groups (FGs) (N = 31) to inform the development of a sexual risk reduction intervention for African American cocaine users in rural Arkansas. A semi-structured protocol was used to guide audio-recorded FGs. Data were entered into Ethnograph and analyzed using constant comparison and content analysis. Four codes with accompanying factors emerged from the data and revealed recommendations for sexual risk reduction interventions with similar populations. Intervention design implications and challenges, study limitations, and future research are discussed. The study was supported by funds from the National Institute of Nursing Research (P20 NR009006-01) and the National Institute on Drug Abuse (1R01DA024575-01 and F31 DA026286-01). PMID:22216991

  11. Analyzing best practices in employee health management: how age, sex, and program components relate to employee engagement and health outcomes.

    PubMed

    Terry, Paul E; Grossmeier, Jessica; Mangen, David J; Gingerich, Stefan B

    2013-04-01

    Examine the influence of employee health management (EHM) best practices on registration, participation, and health behavior change in telephone-based coaching programs. Individual health assessment data, EHM program data, and health coaching participation data were analyzed for associations with coaching program enrollment, active participation, and risk reduction. Multivariate analyses occurred at the individual (n = 205,672) and company levels (n = 55). Considerable differences were found in how age and sex impacted typical EHM evaluation metrics. Cash incentives for the health assessment were associated with more risk reduction for men than for women. Providing either a noncash or a benefits-integrated incentive for completing the health assessment, or a noncash incentive for lifestyle management, strengthened the relationship between age and risk reduction. In EHM programs, one size does not fit all. These results can help employers tailor engagement strategies for their specific population.

  12. Dimension Reduction of Hyperspectral Data on Beowulf Clusters

    NASA Technical Reports Server (NTRS)

    El-Ghazawi, Tarek

    2000-01-01

    Traditional remote sensing instruments are multispectral, where observations are collected at a few different spectral bands. Recently, many hyperspectral instruments, which can collect observations at hundreds of bands, have become operational. Furthermore, there have been ongoing research efforts on ultraspectral instruments that can produce observations at thousands of spectral bands. While these remote sensing technology developments hold great promise for new findings in the area of Earth and space science, they present many challenges. These include the need for faster processing of such increased data volumes, and methods for data reduction. A spectral transformation widely used in remote sensing for dimension reduction is Principal Components Analysis (PCA). In light of the growing number of spectral channels of modern instruments, the paper reports on the development of a parallel PCA and its implementation on two Beowulf cluster configurations, one with a fast Ethernet switch and the other with a Myrinet interconnect.
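
The serial PCA kernel that such a parallel implementation distributes can be sketched with an SVD; the data cube below is random, a stand-in for a flattened hyperspectral scene:

```python
import numpy as np

def pca_reduce(X, k):
    """Project pixel spectra X (n_pixels x n_bands) onto the top-k
    principal components."""
    Xc = X - X.mean(axis=0)              # centre each band
    # SVD of the centred data yields the PCA basis directly.
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:k].T

rng = np.random.default_rng(2)
cube = rng.normal(size=(1000, 224))      # hypothetical 224-band image, flattened
reduced = pca_reduce(cube, 10)
print(reduced.shape)                     # (1000, 10)
```

In the parallel setting it is the covariance accumulation and the projection that are split across cluster nodes; the sketch above is the single-node baseline.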

  13. A biomechanical characterisation of acellular porcine super flexor tendons for use in anterior cruciate ligament replacement: Investigation into the effects of fat reduction and bioburden reduction bioprocesses

    PubMed Central

    Herbert, Anthony; Jones, Gemma L.; Ingham, Eileen; Fisher, John

    2015-01-01

The decellularisation of xenogenic and allogeneic biological grafts offers a promising solution to replacement of the anterior cruciate ligament (ACL). The purpose of this investigation was to determine the biomechanical effects of additional fat reduction and bioburden reduction steps in the decellularisation of porcine super flexor tendon (pSFT). Study 1 investigated the use of acetone or chloroform–methanol as a fat reduction agent. The most effective of these was then carried forward into Study 2, which investigated the use of antibiotics or peracetic acid (PAA) as a bioburden reduction agent. Stress relaxation data was analysed using a Maxwell–Wiechert viscoelastic model and, in addition to classical material properties, the tangent modulus of the toe-region was determined from strength testing data. In both studies, the majority of decellularised groups demonstrated no statistical differences for material properties such as tensile strength and Young’s modulus compared to native controls. Different trends were observed for many of the viscoelastic parameters, but also for the tangent modulus in the toe-region, indicating a change in performance at low strains. The most severe deviations from the profile of the native tangent modulus were found to occur in Study 2 when PAA was used for bioburden reduction. Classic material properties (E, UTS, etc.) are often used to compare the characteristics of native and decellularised tissues; however, they may not highlight changes occurring in the tissues at low strains. In this study, this represented the physiological strains encountered by substitute acellular ACL grafts. Acetone was chosen as the fat reduction step, whereas antibiotics were preferable to PAA as a bioburden reduction step. PMID:25443884
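The Maxwell–Wiechert model referenced above represents the relaxation modulus as an equilibrium spring in parallel with several Maxwell branches. A minimal sketch, with hypothetical branch parameters rather than the paper's fitted values:

```python
import numpy as np

def maxwell_wiechert(t, e_inf, branches):
    """Stress-relaxation modulus of a Maxwell-Wiechert model:
    E(t) = E_inf + sum_i E_i * exp(-t / tau_i)."""
    t = np.asarray(t, dtype=float)
    return e_inf + sum(e_i * np.exp(-t / tau_i) for e_i, tau_i in branches)

# hypothetical two-branch fit (moduli in MPa, relaxation times in s)
t = np.linspace(0.0, 100.0, 5)
e = maxwell_wiechert(t, e_inf=50.0, branches=[(30.0, 2.0), (20.0, 40.0)])
print(e[0])  # 100.0 at t=0: E_inf + E_1 + E_2
```

Fitting such a sum of exponentials to measured stress-relaxation curves yields the viscoelastic parameters the study compares across treatment groups.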

  14. Effects of T-state and R-state stabilization on deoxyhemoglobin-nitrite reactions and stimulation of nitric oxide signaling

    PubMed Central

    Cantu-Medellin, Nadiezhda; Vitturi, Dario A.; Rodriguez, Cilina; Murphy, Serena; Dorman, Scott; Shiva, Sruti; Zhou, Yipin; Jia, Yiping; Palmer, Andre F.; Patel, Rakesh P.

    2011-01-01

    Recent data suggest that transitions between the relaxed (R) and tense (T) state of hemoglobin control the reduction of nitrite to nitric oxide (NO) by deoxyhemoglobin. This reaction may play a role in physiologic NO homeostasis and be a novel consideration for the development of the next generation of hemoglobin-based blood oxygen carriers (HBOCs, i.e. artificial blood substitutes). Herein we tested the effects of chemical stabilization of bovine hemoglobin in either the T- (THb) or R-state (RHb) on nitrite reduction kinetics, NO-gas formation and ability to stimulate NO-dependent signaling. These studies were performed over a range of fractional saturations that is expected to mimic biological conditions. The initial rate for nitrite-reduction decreased in the following order RHb > bHb > THb, consistent with the hypothesis that the rate constant for nitrite reduction is faster with R-state Hb and slower with T-state Hb. Moreover, RHb produced more NO-gas and inhibited mitochondrial respiration more potently than both bHb and THb. Interestingly, at low oxygen fractional saturations, THb produced more NO and stimulated nitrite-dependent vasodilation more potently than bHb despite both derivatives having similar initial rates for nitrite reduction and a more negative reduction potential in THb versus bHb. These data suggest that cross-linking of bovine hemoglobin in the T-state conformation leads to a more effective coupling of nitrite reduction to NO-formation. Our results support the model of allosteric regulation of nitrite reduction by deoxyhemoglobin and show that cross-linking hemoglobins in distinct quaternary states can generate products with increased NO yields from nitrite reduction that could be harnessed to promote NO-signaling in vivo. PMID:21277987

  15. Automating U-Pb IDTIMS data reduction and reporting: Cyberinfrastructure meets geochronology

    NASA Astrophysics Data System (ADS)

    Bowring, J. F.; McLean, N.; Walker, J. D.; Ash, J. M.

    2009-12-01

    We demonstrate the efficacy of an interdisciplinary effort between software engineers and geochemists to produce working cyberinfrastructure for geochronology. This collaboration between CIRDLES, EARTHTIME and EarthChem has produced the software programs Tripoli and U-Pb_Redux as the cyber-backbone for the ID-TIMS community. This initiative incorporates shared isotopic tracers, data-reduction algorithms and the archiving and retrieval of data and results. The resulting system facilitates detailed inter-laboratory comparison and a new generation of cooperative science. The resolving power of geochronological data in the earth sciences is dependent on the precision and accuracy of many isotopic measurements and corrections. Recent advances in U-Pb geochronology have reinvigorated its application to problems such as precise timescale calibration, processes of crustal evolution, and early solar system dynamics. This project provides a heretofore missing common data reduction protocol, thus promoting the interpretation of precise geochronology and enabling inter-laboratory comparison. U-Pb_Redux is an open-source software program that provides end-to-end support for the analysis of uranium-lead geochronological data. The system reduces raw mass spectrometer data to U-Pb dates, allows users to interpret ages from these data, and then provides for the seamless federation of the results, coming from many labs, into a community web-accessible database using standard and open techniques. This EarthChem GeoChron database depends also on keyed references to the SESAR sample database. U-Pb_Redux currently provides interactive concordia and weighted mean plots and uncertainty contribution visualizations; it produces publication-quality concordia and weighted mean plots and customizable data tables. 
This initiative has achieved the goal of standardizing the data elements of a complete reduction and analysis of uranium-lead data, which are expressed using extensible markup language schema definition (XSD) artifacts. U-Pb_Redux leverages the freeware program Tripoli, which imports raw mass spectrometer data files and supports interactive review and archiving of isotopic data. Tripoli facilitates the visualization of temporal trends and scatter during measurement, supports statistically rigorous filtering of data, and provides oxide and fractionation corrections. The Cyber Infrastructure Research and Development Lab for the Earth Sciences (CIRDLES) collaboratively integrates domain-specific software engineering with the efforts of EARTHTIME and EarthChem. The EARTHTIME initiative pursues consensus-based approaches to geochemical data reduction, and the EarthChem initiative pursues the creation of data repositories for all geochemical data. CIRDLES develops software and systems for geochronology. This collaboration benefits the earth sciences by enabling geochemists to focus on their specialties using robust software that produces reliable results, and it benefits software engineering by providing research opportunities to improve process methodologies used in the design and implementation of domain-specific solutions.
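One of the standard reductions U-Pb_Redux visualizes, the uncertainty-weighted mean date, can be sketched as follows; the dates and uncertainties are invented for the example, and the program's actual algorithm may differ in detail:

```python
import numpy as np

def weighted_mean(dates, sigmas):
    """Inverse-variance weighted mean of dates with 1-sigma uncertainties,
    plus the MSWD (reduced chi-square) used to judge excess scatter."""
    dates = np.asarray(dates, dtype=float)
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    mean = np.sum(w * dates) / np.sum(w)
    sigma_mean = np.sqrt(1.0 / np.sum(w))
    mswd = np.sum(w * (dates - mean) ** 2) / (len(dates) - 1)
    return mean, sigma_mean, mswd

# hypothetical dates (Ma) and 1-sigma uncertainties from several analyses
mean, sig, mswd = weighted_mean([252.10, 252.14, 252.08, 252.12],
                                [0.05, 0.04, 0.06, 0.05])
print(round(mean, 3), round(sig, 3))
```

An MSWD near 1 indicates the analyses scatter as expected from their stated uncertainties, which is the statistic weighted-mean plots typically report.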

  16. Modeling and Reduction With Applications to Semiconductor Processing

    DTIC Science & Technology

    1999-01-01

Only fragmentary excerpts of this report are indexed, mixing acknowledgment text with technical front matter. The recoverable technical content concerns a general state-space model reduction methodology, reduction with POC data and a balancing approach, and framing of the model reduction problem as one of finding a systematic methodology within a given mathematical framework to produce an efficient or optimal trade-off.

  17. The Great White Ocean

    NASA Technical Reports Server (NTRS)

    Parkinson, Claire L.

    1999-01-01

    Satellite data have revealed overall decreases in the Arctic sea ice cover since the late 1970s, although with substantial interannual variability. The ice reductions are likely tied to an overall warming in the Arctic region over the same time period, although both the warming and the ice reductions could be connected to large-scale oscillations within the system. Should the ice reductions continue, consequences to the Arctic ecosystems and climate could be considerable.

  18. Electrochemical reductive dehalogenation of iodine-containing contrast agent pharmaceuticals: Examination of reactions of diatrizoate and iopamidol using the method of rotating ring-disc electrode (RRDE).

    PubMed

    Yan, Mingquan; Chen, Zhanghao; Li, Na; Zhou, Yuxuan; Zhang, Chenyang; Korshin, Gregory

    2018-06-01

This study examined the electrochemical (EC) reduction of iodinated contrast media (ICM) exemplified by iopamidol and diatrizoate. The method of rotating ring-disc electrode (RRDE) was used to elucidate rates and mechanisms of the EC reactions of the selected ICMs. Experiments were carried out at varying hydrodynamic conditions and concentrations of iopamidol, diatrizoate, natural organic matter (NOM) and model compounds (resorcinol, catechol, guaiacol), which were used to examine interactions between products of the EC reduction of ICMs and halogenation-active species. The data showed that iopamidol and diatrizoate were EC-reduced at potentials < -0.45 V vs. s.c.e. In the range of potentials -0.65 to -0.85 V their reduction was mass transfer-controlled. The presence of NOM and model compounds did not affect the EC reduction of iopamidol and diatrizoate, but active iodine species formed as a result of the EC-induced transformations of these ICMs reacted readily with NOM and model compounds. These data provide more insight into the nature of generation of iodine-containing by-products in the case of reductive degradation of ICMs.
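Rotating-disc measurements in the mass transfer-controlled regime are conventionally interpreted through the Levich equation for the limiting current. A sketch with hypothetical parameter values, not those measured in the study:

```python
import math

def levich_current(n, area_cm2, d_cm2_s, omega_rad_s, nu_cm2_s, c_mol_cm3):
    """Levich equation for the mass-transfer-limited current (A) at a
    rotating disc electrode:
    i_L = 0.620 n F A D^(2/3) omega^(1/2) nu^(-1/6) C  (CGS units)."""
    F = 96485.0  # Faraday constant, C/mol
    return (0.620 * n * F * area_cm2 * d_cm2_s ** (2.0 / 3.0)
            * math.sqrt(omega_rad_s) * nu_cm2_s ** (-1.0 / 6.0) * c_mol_cm3)

# hypothetical case: 2-electron reduction, 0.2 cm2 disc, D = 1e-5 cm2/s,
# 1600 rpm rotation, aqueous kinematic viscosity, 0.1 mM contrast agent
omega = 1600.0 * 2.0 * math.pi / 60.0  # rpm -> rad/s
i_lim = levich_current(2, 0.2, 1e-5, omega, 0.01, 1e-7)
print(i_lim)  # limiting current in amperes (tens of microamps here)
```

Linearity of the measured limiting current against the square root of rotation rate is the usual diagnostic of mass-transfer control.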

  19. Reduction factors for wooden houses due to external γ-radiation based on in situ measurements after the Fukushima nuclear accident.

    PubMed

    Yoshida-Ohuchi, Hiroko; Hosoda, Masahiro; Kanagami, Takashi; Uegaki, Masaki; Tashima, Hideo

    2014-12-18

For estimation of residents' exposure dose after a nuclear accident, the reduction factor, which is the ratio of the indoor dose to the outdoor dose, is essential, as most individuals spend a large portion of their time indoors. After the Fukushima nuclear accident, we evaluated the median reduction factor with an interquartile range of 0.43 (0.34-0.53) based on 522 survey results for 69 detached wooden houses in two evacuation zones, Iitate village and Odaka district. The results indicated no statistically significant difference between the median reduction factor and the representative value of 0.4 given in the International Atomic Energy Agency (IAEA)-TECDOC-225 and 1162. However, with regard to the representative range of the reduction factor, we recommend the wider range of 0.2 to 0.7 or at least 0.2 to 0.6, which covered 87.7% and 80.7% of the data, respectively, rather than 0.2 to 0.5 given in the IAEA document, which covered only 66.5% of the data. We found that the location of the room within the house, the area topography, and the use of cement roof tiles had the greatest influence on the reduction factor.
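The coverage figures quoted for the candidate ranges are straightforward to compute from per-house reduction factors; a sketch with invented data:

```python
def range_coverage(factors, lo, hi):
    """Fraction of measured reduction factors (indoor/outdoor dose ratios)
    that fall inside the candidate representative range [lo, hi]."""
    inside = sum(lo <= f <= hi for f in factors)
    return inside / len(factors)

# hypothetical house-by-house reduction factors
factors = [0.25, 0.34, 0.41, 0.43, 0.47, 0.53, 0.58, 0.65]
print(range_coverage(factors, 0.2, 0.7))  # 1.0: wider range covers all
print(range_coverage(factors, 0.2, 0.5))  # 0.625: narrower range covers fewer
```

The paper's recommendation amounts to choosing the narrowest range whose coverage of the survey data is acceptably high.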

  20. Design of a practical model-observer-based image quality assessment method for CT imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Cao, Guangzhi; Kupinski, Matthew A.; Sainath, Paavana

    2014-03-01

The channelized Hotelling observer (CHO) is a powerful method for quantitative image quality evaluation of CT systems and their image reconstruction algorithms. It has recently been used to validate the dose reduction capability of iterative image-reconstruction algorithms implemented on CT imaging systems. The use of the CHO for routine and frequent system evaluations is desirable both for quality assurance evaluations and for further system optimization. The use of channels substantially reduces the amount of data required to achieve accurate estimates of observer performance. However, the number of scans required is still large even with the use of channels. This work explores different data reduction schemes and designs a new approach that requires only a few CT scans of a phantom. For this work, the leave-one-out likelihood (LOOL) method developed by Hoffbeck and Landgrebe is studied as an efficient method of estimating the covariance matrices needed to compute CHO performance. Three kinds of approaches are included in the study: a conventional CHO estimation technique with a large sample size, a conventional technique with fewer samples, and the new LOOL-based approach with fewer samples. The mean and standard deviation of the area under the ROC curve (AUC) were estimated by a shuffle method. Both simulation and real data results indicate that an 80% data reduction can be achieved without loss of accuracy. This data reduction makes the proposed approach a practical tool for routine CT system assessment.
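The CHO computes a Hotelling template in a low-dimensional channel space, which is why it needs far fewer samples than a pixel-space observer. A minimal numpy sketch with toy random orthonormal channels; practical studies use e.g. Gabor or Laguerre-Gauss channels, and the LOOL covariance estimator rather than the plain sample covariance used here:

```python
import numpy as np

def cho_snr(signal_imgs, noise_imgs, channels):
    """Channelized Hotelling observer detectability index (SNR).

    signal_imgs, noise_imgs: (n_samples, n_pixels) image stacks
    channels: (n_pixels, n_channels) channel matrix
    """
    vs = signal_imgs @ channels              # signal-present channel outputs
    vn = noise_imgs @ channels               # signal-absent channel outputs
    dv = vs.mean(axis=0) - vn.mean(axis=0)   # mean channel-output difference
    s = 0.5 * (np.cov(vs.T) + np.cov(vn.T))  # pooled channel covariance
    w = np.linalg.solve(s, dv)               # Hotelling template in channel space
    return float(np.sqrt(dv @ w))

rng = np.random.default_rng(1)
n_pix, n_ch = 64, 4
channels, _ = np.linalg.qr(rng.normal(size=(n_pix, n_ch)))  # toy channels
signal = np.zeros(n_pix)
signal[:8] = 1.0                             # simple low-contrast "lesion"
noise_imgs = rng.normal(size=(200, n_pix))
signal_imgs = rng.normal(size=(200, n_pix)) + signal
snr = cho_snr(signal_imgs, noise_imgs, channels)
print(snr)
```

Only the small channel-space covariance must be estimated, so reducing its estimation error (the LOOL step) is what lets the scan count drop.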

  1. Jet Noise Reduction by Microjets - A Parametric Study

    NASA Technical Reports Server (NTRS)

    Zaman, K. B. M. Q.

    2010-01-01

The effect of injecting tiny secondary jets (microjets) on the radiated noise from a subsonic primary jet is studied experimentally. The microjets are injected onto the primary jet near the nozzle exit with variable port geometry, working fluid and driving pressure. A clear noise reduction is observed that improves with increasing injection pressure. It is found that smaller diameter ports with higher driving pressure, but involving less thrust and mass fraction, can produce better noise reduction. A collection of data from the present as well as past experiments is examined in an attempt to correlate the noise reduction with the operating parameters. The results indicate that turbulent mixing noise reduction, as monitored by OASPL at a shallow angle, correlates with the ratio of microjet to primary jet driving pressures normalized by the ratio of corresponding diameters (p_d d/p_j D). With gaseous injection, the spectral amplitudes decrease at lower frequencies while an increase is noted at higher frequencies. It is apparent that this amplitude crossover is at least partly due to shock-associated noise from the underexpanded microjets themselves. Such crossover is not seen with water injection since the flow in that case is incompressible and there is no shock-associated noise. Centerline velocity data show that larger noise reduction is accompanied by faster jet decay as well as significant reduction in turbulence intensities. While a physical understanding of the dependence of noise reduction on p_d d/p_j D remains unclear, given this correlation, an analysis explains the observed dependence of the effect on various other parameters.

  2. Sodium intake in US ethnic subgroups and potential impact of a new sodium reduction technology: NHANES Dietary Modeling.

    PubMed

    Fulgoni, Victor L; Agarwal, Sanjiv; Spence, Lisa; Samuel, Priscilla

    2014-12-18

Because excessive dietary sodium intake is a major contributor to hypertension, a reduction in dietary sodium has been recommended for the US population. Using the National Health and Nutrition Examination Survey (NHANES) 2007-2010 data, we estimated current sodium intake in US population ethnic subgroups and modeled the potential impact of a new sodium reduction technology on sodium intake. NHANES 2007-2010 data were analyzed using the National Cancer Institute method to estimate usual intake in population subgroups. The potential impact of SODA-LO® Salt Microspheres sodium reduction technology on sodium intake was modeled using suggested sodium reductions of 20-30% in 953 foods and assuming various market penetrations. SAS 9.2, SUDAAN 11, and NHANES survey weights were used in all calculations, with assessment across age, gender and ethnic groups. Current sodium intake across all population subgroups exceeds the Dietary Guidelines 2010 recommendations and has not changed during the last decade. However, sodium intake measured as a function of food intake has decreased significantly during the last decade for all ethnicities. "Grain Products" and "Meat, Poultry, Fish, & Mixtures" contribute about two-thirds of total sodium intake. Sodium reduction using SODA-LO® Salt Microspheres sodium reduction technology (with 100% market penetration) was estimated to be 185-323 mg/day, or 6.3-8.4% of intake, depending upon age, gender and ethnic group. Current sodium intake in US ethnic subgroups exceeds the recommendations, and sodium reduction technologies could potentially help reduce dietary sodium intake among those groups.
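The modeling step, applying category-level sodium cuts scaled by market penetration, reduces to simple arithmetic. The baseline intake, category shares and penetration below are invented for illustration and are not the NHANES estimates:

```python
def modeled_intake(baseline_mg, food_shares, reductions, penetration):
    """Estimated daily sodium intake after reformulating food categories.

    food_shares: fraction of total sodium coming from each category
    reductions:  fractional sodium cut applied in that category (e.g. 0.25)
    penetration: market share of the reformulated products (0-1)
    """
    saved = sum(baseline_mg * share * cut * penetration
                for share, cut in zip(food_shares, reductions))
    return baseline_mg - saved, saved

# hypothetical: 3600 mg/day baseline, two categories carrying ~2/3 of sodium,
# 25% reduction each, 50% market penetration
after, saved = modeled_intake(3600.0, [0.40, 0.27], [0.25, 0.25], 0.5)
print(round(saved, 1))  # 301.5 mg/day saved in this hypothetical scenario
```

Varying the penetration argument reproduces the kind of sensitivity analysis the study reports across market-uptake assumptions.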

  3. Educational inequality in adult mortality: an assessment with death certificate data from Michigan.

    PubMed

    Christenson, B A; Johnson, N E

    1995-05-01

    Education was added to the U.S. Standard Certificate of Death in 1989. The current study uses Michigan's 1989-1991 death certificates, together with the 1990 Census, to evaluate the quality of data on education from death certificates and to examine educational differences in mortality rates. With log-rates modeling, we systematically analyze the variability in educational differences in mortality by race and sex across the adult life cycle. The relative differences in mortality rates between educational levels decline with age at the same pace for all sex and race categories. Women gain a slightly greater reduction in mortality than men by reaching the secondary-education level, but a modestly smaller reduction by advancing beyond it. Blacks show a reduction in predicted mortality rates comparable to whites' by moving from the secondary to the postsecondary level of education but experience less reduction than whites by moving from the primary to the secondary level. Thus, the secular decline in mortality rates that generally accompanies historical improvements in education might actually be associated with an increase in the relative differences between blacks' and whites' mortality. We discuss limitations of the data and directions for future research.

  4. Reduction of EEG artefacts induced by vibration in the MR-environment.

    PubMed

    Rothlübbers, Sven; Relvas, Vânia; Leal, Alberto; Figueiredo, Patrícia

    2013-01-01

The EEG acquired simultaneously with functional magnetic resonance imaging (fMRI) is distorted by a number of artefacts related to the presence of strong magnetic fields. In order to allow for a useful interpretation of the EEG data, it is necessary to reduce these artefacts. For the two most prominent artefacts, associated with magnetic field gradient switching and the heartbeat, reduction methods have been developed and applied successfully. Due to their repetitive nature, such artefacts can be reduced by subtraction of the respective template retrieved by averaging across cycles. In this paper, we investigate additional artefacts related to the MR environment and propose a method for the reduction of the vibration artefact caused by the cryo-cooler compression pump system. Data were collected from the EEG cap placed on an MR head phantom, in order to characterise the MR environment related artefacts. Since the vibration artefact was found to be repetitive, a template subtraction method was developed for its reduction, and this was then adjusted to meet the specific requirements of patient data. The developed methodology successfully reduced the vibration artefact by about 90% in five EEG-fMRI datasets collected from two epilepsy patients.
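Template (average-artifact) subtraction as described above can be sketched in a few lines; the synthetic signal, onsets and epoch length are invented for the example:

```python
import numpy as np

def template_subtract(signal, onsets, length):
    """Average-artifact subtraction: build a template by averaging epochs
    locked to the artifact onsets, then subtract it from each epoch."""
    epochs = np.array([signal[o:o + length] for o in onsets])
    template = epochs.mean(axis=0)
    cleaned = signal.copy()
    for o in onsets:
        cleaned[o:o + length] -= template
    return cleaned, template

# synthetic EEG: background noise plus a repetitive vibration-like artifact
rng = np.random.default_rng(2)
artifact = np.sin(np.linspace(0.0, 6.0 * np.pi, 50))
eeg = rng.normal(scale=0.1, size=1000)
onsets = list(range(0, 1000, 100))
for o in onsets:
    eeg[o:o + 50] += artifact
cleaned, tpl = template_subtract(eeg, onsets, 50)
# artifact power within each epoch is strongly reduced after subtraction
```

Averaging across cycles cancels the uncorrelated EEG while preserving the repetitive artifact, which is why the method only works for artifacts with a stable waveform and detectable onsets.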

  5. Development of an Efficient Binaural Simulation for the Analysis of Structural Acoustic Data

    NASA Technical Reports Server (NTRS)

    Johnson, Marty E.; Lalime, Aimee L.; Grosveld, Ferdinand W.; Rizzi, Stephen A.; Sullivan, Brenda M.

    2003-01-01

Applying binaural simulation techniques to structural acoustic data can be very computationally intensive, as the number of discrete noise sources can be very large. Typically, Head Related Transfer Functions (HRTFs) are used to individually filter the signals from each of the sources in the acoustic field. Therefore, creating a binaural simulation implies the use of potentially hundreds of real-time filters. This paper details two methods of reducing the number of real-time computations required: (i) using the singular value decomposition (SVD) to reduce the complexity of the HRTFs by breaking them into dominant singular values and vectors, and (ii) using equivalent source reduction (ESR) to reduce the number of sources to be analyzed in real time by replacing sources on the scale of a structural wavelength with sources on the scale of an acoustic wavelength. The ESR and SVD reduction methods can be combined to provide an estimated computation time reduction of 99.4% for the structural acoustic data tested. In addition, preliminary tests have shown that there is a 97% correlation between the results of the combined reduction methods and the results found with the current binaural simulation techniques.
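The SVD step replaces per-source HRTF filters with a small set of shared basis filters plus cheap per-source weights. A minimal sketch with random stand-in filter data; real HRTF banks and the paper's exact factorization may differ:

```python
import numpy as np

def svd_reduce_filters(hrtf_matrix, rank):
    """Approximate a bank of HRTF FIR filters (taps x sources) by a low-rank
    factorization: `basis` (taps x rank) filters shared by all sources, and
    `weights` (rank x sources) per-source mixing gains."""
    u, s, vt = np.linalg.svd(hrtf_matrix, full_matrices=False)
    basis = u[:, :rank] * s[:rank]   # rank common filters, scaled
    weights = vt[:rank, :]           # cheap per-source gains
    return basis, weights

rng = np.random.default_rng(3)
hrtfs = rng.normal(size=(128, 200))  # stand-in: 200 sources, 128-tap filters
basis, weights = svd_reduce_filters(hrtfs, rank=8)
approx = basis @ weights
print(approx.shape)  # (128, 200)
```

Instead of running 200 full-length filters, the simulation runs 8 shared filters and mixes their outputs with scalar gains, which is where the real-time saving comes from.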

  6. Wind-Tunnel Investigations of Blunt-Body Drag Reduction Using Forebody Surface Roughness

    NASA Technical Reports Server (NTRS)

    Whitmore, Stephen A.; Sprague, Stephanie; Naughton, Jonathan W.; Curry, Robert E. (Technical Monitor)

    2001-01-01

This paper presents results of wind-tunnel tests that demonstrate a novel drag reduction technique for blunt-based vehicles. For these tests, the forebody roughness of a blunt-based model was modified using micromachined surface overlays. As forebody roughness increases, the boundary layer at the model aft thickens and reduces the shearing effect of external flow on the separated flow behind the base region, resulting in reduced base drag. For vehicle configurations with large base drag, existing data predict that a small increment in forebody friction drag will result in a relatively large decrease in base drag. If the added increment in forebody skin-friction drag is optimized with respect to base drag, reducing the total drag of the configuration is possible. The wind-tunnel test results conclusively demonstrate the existence of a forebody drag/base drag optimal point. The data demonstrate that the base drag coefficient corresponding to the drag minimum lies between 0.225 and 0.275, referenced to the base area. Most importantly, the data show a drag reduction of approximately 15% when the drag optimum is reached. When this drag reduction is scaled to the X-33 base area, drag savings approaching 45,000 N (10,000 lbf) can be realized.

  7. CCDLAB: A Graphical User Interface FITS Image Data Reducer, Viewer, and Canadian UVIT Data Pipeline

    NASA Astrophysics Data System (ADS)

    Postma, Joseph E.; Leahy, Denis

    2017-11-01

    CCDLAB was originally developed as a FITS image data reducer and viewer, and development was then continued to provide ground support for the development of the UVIT detector system provided by the Canadian Space Agency to the Indian Space Research Organization’s ASTROSAT satellite and UVIT telescopes. After the launch of ASTROSAT and during UVIT’s first-light and PV phase starting in 2015 December, necessity required the development of a data pipeline to produce scientific images out of the Level 1 format data produced for UVIT by ISRO. Given the previous development of CCDLAB for UVIT ground support, the author provided a pipeline for the new Level 1 format data to be run through CCDLAB with the additional satellite-dependent reduction operations required to produce scientific data. Features of the pipeline are discussed with focus on the relevant data-reduction challenges intrinsic to UVIT data.

  8. HyspIRI On-Board Science Data Processing

    NASA Technical Reports Server (NTRS)

    Flatley, Tom

    2010-01-01

    Topics include on-board science data processing, on-board image processing, software upset mitigation, on-board data reduction, on-board "VSWIR" products, the HyspIRI demonstration testbed, and processor comparison.

  9. PICARD - A PIpeline for Combining and Analyzing Reduced Data

    NASA Astrophysics Data System (ADS)

    Gibb, Andrew G.; Jenness, Tim; Economou, Frossie

    PICARD is a facility for combining and analyzing reduced data, normally the output from the ORAC-DR data reduction pipeline. This document provides an introduction to using PICARD for processing instrument-independent data.

  10. Spacelab user interaction

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The results of the third and final phase of a study undertaken to define means of optimizing the Spacelab experiment data system by interactively manipulating the flow of data are presented. A number of payload-applicable interactive techniques and an integrated interaction system for each of two possible payloads are described. These interaction systems have been functionally defined and are accompanied by block diagrams, hardware specifications, software sizing and speed requirements, operational procedures, and cost/benefits analysis data for both onboard and ground-based system elements. It is shown that the accrued benefits are attributable to a reduction in data processing costs obtained, in general, through a considerable reduction in the quantity of data that would otherwise be generated without interaction. An additional anticipated benefit is the increased scientific value obtained by the quicker return of all useful data.

  11. An experimental investigation of the effect of temperature and space velocity on the performance of a cu-zeolite flow-through SCR and a SCR catalyst on a DPF with and without PM loading

    NASA Astrophysics Data System (ADS)

    Kadam, Vaibhav

    Heavy-duty diesel (HDD) engines use the diesel oxidation catalyst (DOC), catalyzed particulate filter (CPF) and urea-injection-based selective catalytic reduction (SCR) systems in sequential combination to meet the US EPA 2010 PM and NOx emission standards. The SCR along with an NH3 slip control catalyst (AMOX) offers NOx reduction >90% with NH3 slip <20 ppm. However, there is a strong desire to further improve the NOx reduction performance of such systems to meet the California Optional Low NOx Standard implemented since 2015. Integrating SCR functionality into a diesel particulate filter (DPF), by coating the SCR catalyst on the DPF, offers potential to reduce the system cost and packaging weight/volume. It also provides an opportunity to increase the SCR volume without affecting the overall packaging, to achieve NOx reduction efficiencies >95%. In this research, the NOx reduction and NH3 storage performance of a Cu-zeolite SCR and a Cu-zeolite SCR catalyst on a DPF (SCRF®) were experimentally investigated based on engine experimental data at steady-state conditions. The experimental data for the production-2013-SCR and the SCRF® were collected (with and without PM loading in the SCRF®) on a Cummins ISB 2013 engine, at varying inlet temperatures, space velocities, inlet NOx concentrations and NO2/NOx ratios, to evaluate the NOx reduction, NH3 storage and NH3 slip characteristics of the SCR catalyst. The SCRF® was loaded with 2 and 4 g/L of PM prior to the NOx reduction tests to study the effect of PM loading on the NOx reduction and NH3 storage performance of the SCRF®. The experimental setup and test procedures for evaluation of the NOx reduction performance of the SCRF®, with and without PM loading, are described. The 1-D SCR model developed at MTU was calibrated to the engine experimental data obtained from the seven NOx reduction tests conducted with the production-2013-SCR. 
The performance of the 1-D SCR model was validated by comparing the simulation and experimental data for NO, NO2 and NH3 concentrations at the outlet of the SCR. The NO and NO2 concentrations were calibrated to +/-20 ppm and NH3 was calibrated to +/-20 ppm. The experimental results for the production-2013-SCR indicate that a NOx reduction of 80-85% can be achieved for inlet temperatures below 250°C and above 450°C, and a NOx reduction of 90-95% can be achieved for inlet temperatures between 300 and 350°C, at an ammonia to NO2 ratio (ANR) of 1.0, while the NH3 slip out of the SCR was <75 ppm. Conversely, the SCRF® showed 90-95% NOx reduction at an ANR of 1.0, while the NH3 slip out of the SCRF® was >50 ppm, with and without PM loading, for the inlet temperature range of 200-450°C, space velocities in the range of 13 to 48 k/hr and inlet NO2/NOx in the range of 0.2 to 0.5. The NOx reduction in the SCRF® increases to >98% at an ANR of 1.2. However, the NH3 slip out of the SCRF® increases significantly at ANR 1.2. The effect of PM loading at 2 and 4 g/L on the NOx reduction performance of the SCRF® was negligible below 300°C. However, with PM loading in the SCRF®, the NO2 reduction decreased by 3-5% when compared to the clean SCRF®, for inlet temperatures >350°C. Experimental data were also collected by reference [1] to investigate the NO2-assisted PM oxidation in the SCRF® for the inlet temperature range of 260-370°C, with and without urea injection, and the thermal oxidation of PM in the SCRF® for the inlet temperature range of 500-600°C, without urea injection. The experimental data obtained from this study and [1] will be used to develop and calibrate the SCR-F model at Michigan Tech. The NH3 storage for the production-2013-SCR and the SCRF® (with and without PM loading) were determined from the steady-state engine experimental data. 
The NH3 storage for the production-2013-SCR and the SCRF® (without PM loading) were within +/-5 gmol/m3 of the substrate, with maximum NH3 storage of 75-80 gmol/m3 of the substrate, at the SCR/SCRF® inlet temperature of 200°C. The NH3 storage in the SCRF®, with 2 g/L PM loading, decreased by 30% when compared to the NH3 storage in the SCRF® without PM loading. A further increase in the PM loading in the SCRF®, from 2 to 4 g/L, had a negligible effect on NH3 storage.
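The headline metrics in this abstract, NOx conversion efficiency and the ammonia-to-NOx ratio, are simple ratios of measured concentrations. A minimal sketch with hypothetical steady-state values:

```python
def nox_conversion(nox_in_ppm, nox_out_ppm):
    """NOx reduction efficiency across the SCR, in percent."""
    return 100.0 * (nox_in_ppm - nox_out_ppm) / nox_in_ppm

def anr(nh3_in_ppm, nox_in_ppm):
    """Ammonia-to-NOx ratio of the injected, urea-derived NH3."""
    return nh3_in_ppm / nox_in_ppm

# hypothetical steady-state operating point
print(nox_conversion(500.0, 25.0))  # 95.0 (% reduction)
print(anr(500.0, 500.0))            # 1.0 (stoichiometric dosing)
```

Raising ANR above 1.0 increases conversion but, as the study observes, also increases NH3 slip, since the surplus ammonia is not consumed by the SCR reactions.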

  12. Geochemical modeling of iron, sulfur, oxygen and carbon in a coastal plain aquifer

    USGS Publications Warehouse

    Brown, C.J.; Schoonen, M.A.A.; Candela, J.L.

    2000-01-01

    Fe(III) reduction in the Magothy aquifer of Long Island, NY, results in high dissolved-iron concentrations that degrade water quality. Geochemical modeling was used to constrain iron-related geochemical processes and redox zonation along a flow path. The observed increase in dissolved inorganic carbon is consistent with the oxidation of sedimentary organic matter coupled to the reduction of O2 and SO4^2- in the aerobic zone, and to the reduction of SO4^2- in the anaerobic zone; estimated rates of CO2 production through reduction of Fe(III) were relatively minor by comparison. The rates of CO2 production calculated from dissolved inorganic carbon mass transfer (2.55 × 10^-4 to 48.6 × 10^-4 mmol l^-1 yr^-1) generally were comparable to the calculated rates of CO2 production by the combined reduction of O2, Fe(III) and SO4^2- (1.31 × 10^-4 to 15 × 10^-4 mmol l^-1 yr^-1). The overall increase in SO4^2- concentrations along the flow path, together with the results of mass-balance calculations and variations in δ34S values along the flow path, indicates that SO4^2- loss through microbial reduction is exceeded by SO4^2- gain through diffusion from sediments and through the oxidation of FeS2. Geochemical and microbial data on cores indicate that Fe(III) oxyhydroxide coatings on sediment grains in local, organic carbon- and SO4^2--rich zones have been depleted by microbial reduction, resulting in localized SO4^2--reducing zones in which the formation of iron disulfides decreases dissolved iron concentrations. These localized zones of SO4^2- reduction, which are important for assessing zones of low dissolved iron for water-supply development, could be overlooked by aquifer studies that rely only on groundwater data from well-water samples for geochemical modeling.

  13. Feasibility study of microprocessor systems suitable for use in developing a real-time processor for the 4.75 GHz scatterometer

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A class of signal processors suitable for the reduction of radar scatterometer data in real time was developed. The systems were applied to the reduction of single-polarized 13.3 GHz scatterometer data and provided a real-time output of radar scattering coefficient as a function of incident angle. It was proposed that a system for processing C-band radar data be constructed to support a scatterometer system currently under development. The establishment of a feasible design approach to the development of this processor system utilizing microprocessor technology was emphasized.

  14. R suite for the Reduction and Analysis of UFO Orbit Data

    NASA Astrophysics Data System (ADS)

    Campbell-Burns, P.; Kacerek, R.

    2016-02-01

    This paper presents work undertaken by UKMON to compile a suite of simple R scripts for the reduction and analysis of meteor data. The application of R in this context is by no means an original idea and there is no doubt that it has been used already in many reports to the IMO. However, we are unaware of any common libraries or shared resources available to the meteor community. By sharing our work we hope to stimulate interest and discussion. Graphs shown in this paper are illustrative and are based on current data from both EDMOND and UKMON.

  15. Amateur Image Pipeline Processing using Python plus PyRAF

    NASA Astrophysics Data System (ADS)

    Green, Wayne

    2012-05-01

    A template pipeline spanning observing planning to publishing is offered as a basis for establishing a long term observing program. The data reduction pipeline encapsulates all policy and procedures, providing an accountable framework for data analysis and a teaching framework for IRAF. This paper introduces the technical details of a complete pipeline processing environment using Python, PyRAF and a few other languages. The pipeline encapsulates all processing decisions within an auditable framework. The framework quickly handles the heavy lifting of image processing. It also serves as an excellent teaching environment for astronomical data management and IRAF reduction decisions.

  16. Chromium isotope variation along a contaminated groundwater plume: a coupled Cr(VI)- reduction, advective mixing perspective

    NASA Astrophysics Data System (ADS)

    Bullen, T.; Izbicki, J.

    2007-12-01

    Chromium (Cr) is a common contaminant in groundwater, used in electroplating, leather tanning, wood preservation, and as an anti-corrosion agent. Cr occurs in two oxidation states in groundwater: Cr(VI) is highly soluble and mobile, and is a carcinogen; Cr(III) is generally insoluble, immobile and less toxic than Cr(VI). Reduction of Cr(VI) to Cr(III) is thus a central issue in approaches to Cr(VI) contaminant remediation in aquifers. Aqueous Cr(VI) occurs mainly as the chromate (CrO42-) and bichromate (HCrO4-) oxyanions, while Cr(III) is mainly "hexaquo" Cr(H2O)63+. Cr has four naturally-occurring stable isotopes: 50Cr, 52Cr, 53Cr and 54Cr. When Cr(VI) is reduced to Cr(III), the strong Cr-O bond must be broken, resulting in isotopic fractionation. Ellis et al. (2002) demonstrated that for reduction of Cr(VI) on magnetite and in natural sediment slurries, the change of isotopic composition of the remnant Cr(VI) pool was described by a Rayleigh fractionation model having fractionation factor ɛCr(VI)-Cr(III) = 3.4‰. We attempted to use Cr isotopes as a monitor of Cr(VI) reduction at a field site in Hinkley, California (USA) where groundwater contaminated with Cr(VI) has been under assessment for remediation. Groundwater containing up to 5 ppm Cr(VI) has migrated down-gradient from the contamination source through the fluvial to alluvial sediments to form a well-defined plume. Uncontaminated groundwater in the aquifer immediately adjacent to the plume has naturally-occurring Cr(VI) of 4 ppb or less (CH2M-Hill). In early 2006, colleagues from CH2M-Hill collected 17 samples of groundwater from within and adjacent to the plume. On a plot of δ53Cr vs. log Cr(VI), the data array is strikingly linear and differs markedly from the trend predicted for reduction of Cr(VI) in the contaminated water. There appear to be two groups of data: four samples with δ53Cr >+2‰ and Cr(VI) <4 ppb, and 13 samples with δ53Cr <+2‰ and Cr(VI) >15 ppb. 
Simple mixing lines between the groundwater samples having <4 ppb Cr(VI), taken to be representative of regional groundwater, and the contaminated water do not pass through the remainder of the data, discounting a simple advective mixing scenario. We hypothesize a more likely scenario that involves both Cr(VI) reduction and advective mixing. As the plume initially expands downgradient, Cr(VI) in water at the leading edge encounters reductant in the aquifer resulting in limited Cr(VI) reduction. As a result of reduction, δ53Cr of Cr(VI) remaining in solution at the leading edge increases along the "reduction" trend from 0 to ~+2‰. Inevitable mixing of this water at the leading edge with regional groundwater results in a suitable mixing end-member to combine with Cr(VI) within the plume in order to explain the bulk of the remaining data. Neither Cr(VI) reduction nor advective mixing of plume and regional groundwaters can explain the data on their own, implying an interplay of at least these two processes during plume evolution. Ellis, A.S., Johnson, T.M. and Bullen, T.D. 2002, Science, 295, 2060-2062.
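
    The Rayleigh fractionation model and the two-endmember mixing relation invoked above can be sketched numerically. This is a minimal illustration, assuming the common small-epsilon logarithmic approximation and concentration-weighted isotope mixing; the function names and example numbers are ours, not from the study.

```python
import math

EPSILON = 3.4  # per mil, fractionation factor from Ellis et al. (2002)

def rayleigh_delta(delta0, f, eps=EPSILON):
    """delta53Cr of the remnant Cr(VI) pool when a fraction f remains.

    Uses the small-epsilon approximation delta ≈ delta0 - eps*ln(f); the
    sign convention makes the remaining pool isotopically heavier as f -> 0.
    """
    return delta0 - eps * math.log(f)

def mix_delta(c1, d1, c2, d2, x):
    """Concentration-weighted delta for mixing fraction x of water 1
    with fraction (1 - x) of water 2; returns (Cr_mix, delta_mix)."""
    c_mix = x * c1 + (1 - x) * c2
    d_mix = (x * c1 * d1 + (1 - x) * c2 * d2) / c_mix
    return c_mix, d_mix

# 50% reduction of the leading-edge Cr(VI) pool, starting at 0 per mil:
print(round(rayleigh_delta(0.0, 0.5), 2))  # prints 2.36
```

    With f = 0.5 the remnant pool shifts by about +2.4‰, the order of the ~+2‰ shift described for the plume's leading edge.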

  17. ORAC-DR: A generic data reduction pipeline infrastructure

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Economou, Frossie

    2015-03-01

    ORAC-DR is a general purpose data reduction pipeline system designed to be instrument and observatory agnostic. The pipeline works with instruments as varied as infrared integral field units, imaging arrays and spectrographs, and sub-millimeter heterodyne arrays and continuum cameras. This paper describes the architecture of the pipeline system and the implementation of the core infrastructure. We finish by discussing the lessons learned since the initial deployment of the pipeline system in the late 1990s.

  18. Development of a guideline for work zone diversion rate and capacity reduction.

    DOT National Transportation Integrated Search

    2016-03-01

    This study develops a comprehensive guideline to estimate the traffic diversion rates and capacity reduction for work zones. The analysis of the traffic diversion patterns with data from past work zones in the metro freeway network in Minnesota r...

  19. EMISSIONS REDUCTION DATA FOR GRID-CONNECTED PHOTOVOLTAIC POWER SYSTEMS

    EPA Science Inventory

    This study measured the pollutant emission reduction potential of 29 photovoltaic (PV) systems installed on residential and commercial building rooftops across the U.S. from 1993 through 1997. The U.S. Environmental Protection Agency (EPA) and 21 electric power companies sponsor...

  20. Identification of seedling cabbages and weeds using hyperspectral imaging

    USDA-ARS?s Scientific Manuscript database

    Target detection is one of the research focuses for precision chemical application. This study developed a method to identify seedling cabbages and weeds using hyperspectral imaging. In processing the image data, with ENVI software, after dimension reduction, noise reduction, de-correlation for h...

  1. QUANTITATIVE STRUCTURE-ACTIVITY RELATIONSHIPS FOR CHEMICAL REDUCTIONS OF ORGANIC CONTAMINANTS

    EPA Science Inventory

    Sufficient kinetic data on abiotic reduction reactions involving organic contaminants are now available that quantitative structure-activity relationships (QSARs) for these reactions can be developed. Over 50 QSARs have been reported, most in just the last few years, and they ar...

  2. Noise reduction tests of large-scale-model externally blown flap using trailing-edge blowing and partial flap slot covering. [jet aircraft noise reduction

    NASA Technical Reports Server (NTRS)

    Mckinzie, D. J., Jr.; Burns, R. J.; Wagner, J. M.

    1976-01-01

    Noise data were obtained with a large-scale cold-flow model of a two-flap, under-the-wing, externally blown flap proposed for use on future STOL aircraft. The noise suppression effectiveness of locating a slot conical nozzle at the trailing edge of the second flap and of applying partial covers to the slots between the wing and flaps was evaluated. Overall-sound-pressure-level reductions of 5 dB occurred below the wing in the flyover plane. Existing models of several noise sources were applied to the test results. The resulting analytical relation compares favorably with the test data. The noise source mechanisms were analyzed and are discussed.

  3. The SCUBA map reduction cookbook

    NASA Astrophysics Data System (ADS)

    Sandell, G.; Jessop, N.; Jenness, T.

    This cookbook tells you how to reduce and analyze maps obtained with SCUBA using the off-line SCUBA reduction package, SURF, and the Starlink KAPPA, Figaro, GAIA and CONVERT applications. The easiest way of using these packages is to run ORAC-DR, a general-purpose pipeline for reducing data from any telescope. A set of data-reduction recipes is available to ORAC-DR for use when working with SCUBA maps; these recipes utilize the SURF and KAPPA packages. This cookbook makes no attempt to explain why and how; for that, consult the comprehensive Starlink User Note 216, which documents all the software tasks in SURF and covers the details of each task and how it really works.

  4. Restricted Boltzmann machines based oversampling and semi-supervised learning for false positive reduction in breast CAD.

    PubMed

    Cao, Peng; Liu, Xiaoli; Bao, Hang; Yang, Jinzhu; Zhao, Dazhe

    2015-01-01

    The false-positive reduction (FPR) is a crucial step in the computer aided detection system for the breast. The issues of imbalanced data distribution and the limitation of labeled samples complicate the classification procedure. To overcome these challenges, we propose oversampling and semi-supervised learning methods based on the restricted Boltzmann machines (RBMs) to solve the classification of imbalanced data with a few labeled samples. To evaluate the proposed method, we conducted a comprehensive performance study and compared its results with the commonly used techniques. Experiments on benchmark dataset of DDSM demonstrate the effectiveness of the RBMs based oversampling and semi-supervised learning method in terms of geometric mean (G-mean) for false positive reduction in Breast CAD.
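
    The geometric mean (G-mean) used above to score false-positive reduction balances recall on both classes, which plain accuracy does not under class imbalance. A minimal sketch; the confusion-matrix counts in the example are invented, not DDSM results.

```python
import math

def g_mean(tp, fn, tn, fp):
    """Geometric mean of sensitivity and specificity from confusion-matrix
    counts -- the imbalance-robust metric used to evaluate FPR classifiers."""
    sensitivity = tp / (tp + fn)   # recall on the rare true-lesion class
    specificity = tn / (tn + fp)   # recall on the abundant normal class
    return math.sqrt(sensitivity * specificity)

# A classifier that ignores the minority class scores 0 despite ~99% accuracy:
print(round(g_mean(0, 10, 990, 0), 3))   # prints 0.0
print(round(g_mean(9, 1, 900, 90), 3))   # prints 0.905
```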

  5. Biochemical characterization of ethanol-dependent reduction of furfural by alcohol dehydrogenases.

    PubMed

    Li, Qunrui; Metthew Lam, L K; Xun, Luying

    2011-11-01

    Lignocellulosic biomass is usually converted to hydrolysates, which consist of sugars and sugar derivatives, such as furfural. Before yeast ferments sugars to ethanol, it reduces toxic furfural to non-inhibitory furfuryl alcohol in a prolonged lag phase. Bioreduction of furfural may shorten the lag phase. Cupriavidus necator JMP134 rapidly reduces furfural with a Zn-dependent alcohol dehydrogenase (FurX) at the expense of ethanol (Li et al. 2011). The mechanism of the ethanol-dependent reduction of furfural by FurX and three homologous alcohol dehydrogenases was investigated. The reduction consisted of two individual reactions: ethanol-dependent reduction of NAD(+) to NADH and then NADH-dependent reduction of furfural to furfuryl alcohol. The kinetic parameters of the coupled reaction and the individual reactions were determined for the four enzymes. The data indicated that limited NADH was released in the coupled reaction. The enzymes had high affinities for NADH (e.g., Kd of 0.043 μM for the FurX-NADH complex) and relatively low affinities for NAD(+) (e.g., Kd of 87 μM for FurX-NAD(+)). The kinetic data suggest that the four enzymes are efficient "furfural reductases" with either ethanol or NADH as the reducing power. The standard free energy change (ΔG°') for ethanol-dependent reduction of furfural was determined to be -1.1 kJ mol(-1). The physiological benefit for ethanol-dependent reduction of furfural is likely to replace toxic and recalcitrant furfural with less toxic and more biodegradable acetaldehyde.
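
    The reported ΔG°' of -1.1 kJ mol(-1) translates into an equilibrium constant via ΔG°' = -RT ln K'. A quick sketch, assuming standard biochemical conditions (T = 298.15 K); the interpretation comment is ours:

```python
import math

R = 8.314    # J mol^-1 K^-1, gas constant
T = 298.15   # K, assumed standard temperature

def equilibrium_constant(delta_g_kj):
    """K' from the standard free energy change via delta_G = -RT ln K."""
    return math.exp(-delta_g_kj * 1000.0 / (R * T))

# delta_G°' = -1.1 kJ/mol for ethanol-dependent furfural reduction gives a
# K' only slightly above 1, i.e. the coupled reaction is just mildly
# favorable -- consistent with the limited NADH release reported for FurX.
print(round(equilibrium_constant(-1.1), 2))  # prints 1.56
```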

  6. Broadband Shock Noise Reduction in Turbulent Jets by Water Injection

    NASA Technical Reports Server (NTRS)

    Kandula, Max

    2008-01-01

    The concept of effective jet properties introduced by the author (AIAA-2007-3 645) has been extended to the estimation of broadband shock noise reduction by water injection in supersonic jets. Comparison of the predictions with the test data for cold underexpanded supersonic nozzles shows a satisfactory agreement. The results also reveal the range of water mass flow rates over which saturation of mixing noise reduction and existence of parasitic noise are manifest.

  7. Translations on USSR Science and Technology, Physical Sciences and Technology, Number 49.

    DTIC Science & Technology

    1978-09-20

    significant reduction in the times and now a reduction in the cost of the work), and data from the surveys of the incomes of families of workers...computer equipment, it provides comprehensive elaboration of the accounting and statistical material with a reduction in the cost of the work, and...themselves, while actively developing under conditions of space flight? We have already written about hydrogenous bacteria (TEKHNIKA — MOLODEZHI, No 4

  8. On the use of total aerobic spore bacteria to make treatment decisions due to Cryptosporidium risk at public water system wells.

    PubMed

    Berger, Philip; Messner, Michael J; Crosby, Jake; Vacs Renwick, Deborah; Heinrich, Austin

    2018-05-01

    Spore reduction can be used as a surrogate measure of Cryptosporidium natural filtration efficiency. Estimates of log10 (log) reduction were derived from spore measurements in paired surface and well water samples in Casper Wyoming and Kearney Nebraska. We found that these data were suitable for testing the hypothesis (H0) that the average reduction at each site was 2 log or less, using a one-sided Student's t-test. After establishing data quality objectives for the test (expressed as tolerable Type I and Type II error rates), we evaluated the test's performance as a function of the (a) true log reduction, (b) number of paired samples assayed and (c) variance of observed log reductions. We found that 36 paired spore samples are sufficient to achieve the objectives over a wide range of variance, including the variances observed in the two data sets. We also explored the feasibility of using smaller numbers of paired spore samples to supplement bioparticle counts for screening purposes in alluvial aquifers, to differentiate wells with large volume surface water induced recharge from wells with negligible surface water induced recharge. With key assumptions, we propose a normal statistical test of the same hypothesis (H0), but with different performance objectives. As few as six paired spore samples appear adequate as a screening metric to supplement bioparticle counts to differentiate wells in alluvial aquifers with large volume surface water induced recharge. For the case when all available information (including failure to reject H0 based on the limited paired spore data) leads to the conclusion that wells have large surface water induced recharge, we recommend further evaluation using additional paired biweekly spore samples. Published by Elsevier GmbH.
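
    The one-sided test of H0 (mean log reduction ≤ 2) can be sketched as follows. The sample values are hypothetical, not the Casper or Kearney measurements; the critical values are standard tabled Student-t quantiles.

```python
import math
import statistics

# One-sided 5% Student-t critical values for selected degrees of freedom
T_CRIT = {8: 1.860, 35: 1.690}   # df = n - 1 (e.g. 36 paired samples -> 35)

def t_statistic(log_reductions, h0_mean=2.0):
    """t for H0: true mean log reduction <= h0_mean.

    Reject H0 (conclude >2-log reduction) when t exceeds the one-sided
    critical value for n - 1 degrees of freedom.
    """
    n = len(log_reductions)
    xbar = statistics.fmean(log_reductions)
    s = statistics.stdev(log_reductions)
    return (xbar - h0_mean) / (s / math.sqrt(n))

# Hypothetical paired-sample log reductions (NOT the study's data):
sample = [2.6, 2.4, 2.9, 2.2, 2.7, 2.5, 2.8, 2.3, 2.6]
t = t_statistic(sample)
print(round(t, 2), t > T_CRIT[len(sample) - 1])  # prints 7.25 True
```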

  9. Taking the Initiative: Risk-Reduction Strategies and Decreased Malpractice Costs.

    PubMed

    Raper, Steven E; Rose, Deborah; Nepps, Mary Ellen; Drebin, Jeffrey A

    2017-11-01

    To heighten awareness of attending and resident surgeons regarding strategies for defense against malpractice claims, a series of risk reduction initiatives have been carried out in our Department of Surgery. We hypothesized that emphasis on certain aspects of risk might be associated with decreased malpractice costs. The relative impact of Department of Surgery initiatives was assessed when compared with malpractice experience for the rest of the Clinical Practices of the University of Pennsylvania (CPUP). Surgery and CPUP malpractice claims, indemnity, and expenses were obtained from the Office of General Counsel. Malpractice premium data were obtained from CPUP finance. The Department of Surgery was assessed in comparison with all other CPUP departments. Cost data (yearly indemnity and expenses), and malpractice premiums (total and per physician) were expressed as a percentage of the 5-year mean value preceding implementation of the initiative program. Surgery implemented 38 risk reduction initiatives. Faculty participated in 27 initiatives; house staff participated in 10 initiatives; and advanced practitioners in 1 initiative. Department of Surgery claims were significantly less than CPUP (74.07% vs 81.07%; p < 0.05). The mean yearly indemnity paid by the Department of Surgery was significantly less than that of the other CPUP departments (84.08% vs 122.14%; p < 0.05). Department of Surgery-paid expenses were also significantly less (83.17% vs 104.96%; p < 0.05), and surgical malpractice premiums declined from baseline, but remained significantly higher than CPUP premiums. The data suggest that educating surgeons on malpractice and risk reduction may play a role in decreasing malpractice costs. Additional extrinsic factors may also affect cost data. Emphasis on risk reduction appears to be cumulative and should be part of an ongoing program. Copyright © 2017 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  10. Data on evolutionary relationships between hearing reduction with history of disease and injuries among workers in Abadan Petroleum Refinery, Iran.

    PubMed

    Mohammadi, Mohammad Javad; Ghazlavi, Ebtesam; Gamizji, Samira Rashidi; Sharifi, Hajar; Gamizji, Fereshteh Rashidi; Zahedi, Atefeh; Geravandi, Sahar; Tahery, Noorollah; Yari, Ahmad Reza; Momtazan, Mahboobeh

    2018-02-01

    The present work examined data obtained during the analysis of Hearing Reduction (HR) of Abadan Petroleum Refinery (Abadan PR) workers of Iran with a history of disease and injuries. To this end, all workers in the refinery were chosen. In this research, the effects of history of disease and injury including trauma, electric shock, meningitis-typhoid disease and genetic illness as well as contact with lead, mercury, CO 2 and alcohol consumption were evaluated (Lie, et al., 2016) [1]. After the completion of the questionnaires by workers, the coded data were fed into Excel. Statistical analysis of the data was carried out using SPSS 16.

  11. COSMOS: Carnegie Observatories System for MultiObject Spectroscopy

    NASA Astrophysics Data System (ADS)

    Oemler, A.; Clardy, K.; Kelson, D.; Walth, G.; Villanueva, E.

    2017-05-01

    COSMOS (Carnegie Observatories System for MultiObject Spectroscopy) reduces multislit spectra obtained with the IMACS and LDSS3 spectrographs on the Magellan Telescopes. It can be used for the quick-look analysis of data at the telescope as well as for pipeline reduction of large data sets. COSMOS is based on a precise optical model of the spectrographs, which allows (after alignment and calibration) an accurate prediction of the location of spectra features. This eliminates the line search procedure which is fundamental to many spectral reduction programs, and allows a robust data pipeline to be run in an almost fully automatic mode, allowing large amounts of data to be reduced with minimal intervention.
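
    The idea of predicting feature locations from an optical model rather than running a line search can be caricatured with a polynomial dispersion relation. The coefficients below are invented for illustration and are far simpler than the precise optical model COSMOS actually uses.

```python
def predicted_pixel(wavelength_nm, coeffs):
    """Detector pixel position of a spectral feature from a polynomial
    dispersion model pixel = a0 + a1*w + a2*w^2 + ...

    With such a model, calibration lines can be located directly instead
    of being searched for, which is what makes a pipeline robust enough
    to run almost fully automatically.
    """
    return sum(c * wavelength_nm ** i for i, c in enumerate(coeffs))

# Hypothetical near-linear dispersion for one slit:
coeffs = (-500.0, 2.0, 1e-5)          # made-up a0, a1, a2
print(round(predicted_pixel(656.3, coeffs), 1))  # prints 816.9
```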

  12. Software manual for operating particle displacement tracking data acquisition and reduction system

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1991-01-01

    The software manual is presented. The necessary steps required to record, analyze, and reduce Particle Image Velocimetry (PIV) data using the Particle Displacement Tracking (PDT) technique are described. The new PDT system is an all electronic technique employing a CCD video camera and a large memory buffer frame-grabber board to record low velocity (less than or equal to 20 cm/s) flows. Using a simple encoding scheme, a time sequence of single exposure images are time coded into a single image and then processed to track particle displacements and determine 2-D velocity vectors. All the PDT data acquisition, analysis, and data reduction software is written to run on an 80386 PC.
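
    The final step, turning tracked particle displacements into 2-D velocity vectors, is simple kinematics. A sketch; the calibration scale and sample positions are made up, and the real PDT software of course does far more (encoding, tracking, validation).

```python
def velocity_vectors(tracks, dt, scale=1.0):
    """2-D velocity vectors from tracked particle positions.

    tracks: list of ((x1, y1), (x2, y2)) pixel positions of the same
            particle in consecutive coded exposures
    dt:     time between exposures in seconds
    scale:  cm per pixel (hypothetical calibration value)
    """
    return [(scale * (x2 - x1) / dt, scale * (y2 - y1) / dt)
            for (x1, y1), (x2, y2) in tracks]

# Particle moving 3 px right and 4 px up between frames 1/30 s apart,
# at 0.1 cm/px -> 15 cm/s magnitude, within the <= 20 cm/s regime:
vs = velocity_vectors([((10, 10), (13, 14))], dt=1 / 30, scale=0.1)
print([(round(vx, 1), round(vy, 1)) for vx, vy in vs])  # prints [(9.0, 12.0)]
```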

  13. SMAC: A soft MAC to reduce control overhead and latency in CDMA-based AMI networks

    DOE PAGES

    Garlapati, Shravan; Kuruganti, Teja; Buehrer, Michael R.; ...

    2015-10-26

    The utilization of state-of-the-art 3G cellular CDMA technologies in a utility-owned AMI network results in a large amount of control traffic relative to data traffic and increases the average packet delay, and hence is not an appropriate choice for smart grid distribution applications. Like the CDG, we consider a utility-owned cellular-like CDMA network for smart grid distribution applications and classify the distribution smart grid data as scheduled data and random data. We also propose the SMAC protocol, which changes its mode of operation based on the type of the data being collected, to reduce the data collection latency and control overhead when compared to the 3G cellular CDMA2000 MAC. The reduction in data collection latency and control overhead increases the number of smart meters served by a base station within the periodic data collection interval, which further reduces the number of base stations needed by a utility or the bandwidth needed to collect data from all the smart meters. The reduction in the number of base stations and/or the data transmission bandwidth reduces the CAPital EXpenditure (CAPEX) and OPerational EXpenditure (OPEX) of the AMI network. Finally, the proposed SMAC protocol is analyzed using a Markov chain; analytical expressions for average throughput and average packet delay are derived, and simulation results are provided to verify the analysis.

  14. Nano-JASMINE Data Analysis and Publication

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Hara, T.; Yoshioka, S.; Kobayashi, Y.; Gouda, N.; Miyashita, H.; Hatsutori, Y.; Lammers, U.; Michalik, D.

    2012-09-01

    The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). A collaboration between the Gaia AGIS and Nano-JASMINE teams on the Nano-JASMINE data reduction started in 2007. The Nano-JASMINE team writes code to generate AGIS input, called the Initial Data Treatment (IDT). Identification of observed stars and their observed field of view, and determination of color indices, differ from Gaia's because Nano-JASMINE is an ultra-small satellite. For converting centroiding results on the detector to the celestial sphere, orbit and attitude data of the satellite are used. In Nano-JASMINE, orbit information is derived from on-board GPS data, and attitude is processed from on-board star sensor data and on-ground Kalman filtering. We also present the Nano-JASMINE goals and the status of data publication and utilization, and introduce the next Japanese space astrometric mission.
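
    The on-ground attitude processing is described as Kalman filtering of star-sensor data. A scalar toy version conveys the predict/update cycle; the noise variances and measurement values are illustrative, not mission parameters, and the actual filter is of course multidimensional.

```python
def kalman_smooth(measurements, q=1e-4, r=1e-2):
    """Scalar Kalman filter for a slowly drifting attitude angle.

    q: process noise variance (random-walk attitude model)
    r: star-sensor measurement noise variance
    """
    x, p = measurements[0], 1.0   # initial state estimate and covariance
    estimates = [x]
    for z in measurements[1:]:
        p = p + q                 # predict: drift inflates the covariance
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update: blend prediction and measurement
        p = (1 - k) * p
        estimates.append(x)
    return estimates

noisy = [1.00, 1.02, 0.98, 1.01, 0.99, 1.03]   # made-up sensor angles
print([round(e, 3) for e in kalman_smooth(noisy)])
```

    Because each update is a convex combination of the previous estimate and the new measurement, the filtered trace stays within the range of the data while damping the sensor noise.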

  15. Scientific visualization of volumetric radar cross section data

    NASA Astrophysics Data System (ADS)

    Wojszynski, Thomas G.

    1992-12-01

    For aircraft design and mission planning, designers, threat analysts, mission planners, and pilots require a Radar Cross Section (RCS) central tendency with its associated distribution about a specified aspect and its relation to a known threat. Historically, RCS data sets have been statistically analyzed to evaluate a profile. However, Scientific Visualization, the application of computer graphics techniques to produce pictures of complex physical phenomena, appears to be a more promising tool to interpret this data. This work describes data reduction techniques and a surface rendering algorithm to construct and display a complex polyhedron from adjacent contours of RCS data. Data reduction is accomplished by sectorizing the data and characterizing its statistical properties. Color, lighting, and orientation cues are added to complete the visualization system. The tool may be useful for synthesis, design, and analysis of complex, low-observable air vehicles.
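
    The reduction step described above, sectorizing the data and characterizing each sector statistically, can be sketched as follows. The sector size, sample values, and the choice of median plus standard deviation as the central tendency and spread are illustrative assumptions.

```python
import statistics

def sectorize(samples, sector_deg=10.0):
    """Reduce (aspect_deg, rcs_dbsm) samples to per-sector statistics.

    Returns {sector_index: (median, population std dev)} -- a central
    tendency with its spread for each aspect sector, ready for rendering.
    """
    sectors = {}
    for aspect, rcs in samples:
        sectors.setdefault(int(aspect // sector_deg), []).append(rcs)
    return {k: (statistics.median(v), statistics.pstdev(v))
            for k, v in sorted(sectors.items())}

# Made-up RCS samples: quiet near nose-on, lower signature past 10 degrees
data = [(1.0, -10.0), (4.0, -12.0), (8.0, -11.0), (12.0, -20.0), (17.0, -22.0)]
print(sectorize(data))
```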

  16. Risk factors of chronic periodontitis on healing response: a multilevel modelling analysis.

    PubMed

    Song, J; Zhao, H; Pan, C; Li, C; Liu, J; Pan, Y

    2017-09-15

    Chronic periodontitis is a multifactorial polygenetic disease with an increasing number of associated factors that have been identified over recent decades. Longitudinal epidemiologic studies have demonstrated that the risk factors were related to the progression of the disease. A traditional multivariate regression model was used to find risk factors associated with chronic periodontitis. However, the approach requirement of standard statistical procedures demands individual independence. Multilevel modelling (MLM) data analysis has widely been used in recent years, regarding thorough hierarchical structuring of the data, decomposing the error terms into different levels, and providing a new analytic method and framework for solving this problem. The purpose of our study is to investigate the relationship of clinical periodontal index and the risk factors in chronic periodontitis through MLM analysis and to identify high-risk individuals in the clinical setting. Fifty-four patients with moderate to severe periodontitis were included. They were treated by means of non-surgical periodontal therapy, and then made follow-up visits regularly at 3, 6, and 12 months after therapy. Each patient answered a questionnaire survey and underwent measurement of clinical periodontal parameters. Compared with baseline, probing depth (PD) and clinical attachment loss (CAL) improved significantly after non-surgical periodontal therapy with regular follow-up visits at 3, 6, and 12 months after therapy. The null model and variance component models with no independent variables included were initially obtained to investigate the variance of the PD and CAL reductions across all three levels, and they showed a statistically significant difference (P < 0.001), thus establishing that MLM data analysis was necessary. Site-level had effects on PD and CAL reduction; those variables could explain 77-78% of PD reduction and 70-80% of CAL reduction at 3, 6, and 12 months. 
Other levels only explain 20-30% of PD and CAL reductions. Site-level had the greatest effect on PD and CAL reduction. Non-surgical periodontal therapy with regular follow-up visits had a remarkable curative effect. All three levels had a substantial influence on the reduction of PD and CAL. Site-level had the largest effect on PD and CAL reductions.

  17. Toward automated denoising of single molecular Förster resonance energy transfer data

    NASA Astrophysics Data System (ADS)

    Lee, Hao-Chih; Lin, Bo-Lin; Chang, Wei-Hau; Tu, I.-Ping

    2012-01-01

    A wide-field two-channel fluorescence microscope is a powerful tool as it allows for the study of conformation dynamics of hundreds to thousands of immobilized single molecules by Förster resonance energy transfer (FRET) signals. To date, the data reduction from a movie to a final set containing meaningful single-molecule FRET (smFRET) traces involves human inspection and intervention at several critical steps, greatly hampering the efficiency at the post-imaging stage. To facilitate the data reduction from smFRET movies to smFRET traces and to address the noise-limited issues, we developed a statistical denoising system toward fully automated processing. This data reduction system has embedded several novel approaches. First, as to background subtraction, high-order singular value decomposition (HOSVD) method is employed to extract spatial and temporal features. Second, to register and map the two color channels, the spots representing bleeding through the donor channel to the acceptor channel are used. Finally, correlation analysis and likelihood ratio statistic for the change point detection (CPD) are developed to study the two channels simultaneously, resolve FRET states, and report the dwelling time of each state. The performance of our method has been checked using both simulation and real data.
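
    Downstream of denoising, each donor/acceptor frame pair yields an apparent FRET efficiency E = IA/(ID + IA). A minimal sketch with invented intensities; background values default to zero, and gamma correction and the paper's change-point machinery are omitted.

```python
def fret_efficiency(donor, acceptor, bg_d=0.0, bg_a=0.0):
    """Apparent FRET efficiency per frame: E = IA / (ID + IA) after
    subtracting per-channel backgrounds (gamma correction omitted)."""
    traces = []
    for i_d, i_a in zip(donor, acceptor):
        d, a = i_d - bg_d, i_a - bg_a
        traces.append(a / (d + a))
    return traces

# Invented two-state trace: low-FRET frames followed by high-FRET frames,
# the kind of step a change-point detector would localize.
donor    = [80, 82, 78, 20, 22, 21]
acceptor = [20, 18, 22, 80, 78, 79]
print([round(e, 2) for e in fret_efficiency(donor, acceptor)])
# prints [0.2, 0.18, 0.22, 0.8, 0.78, 0.79]
```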

  18. Nonlinear dimensionality reduction of data lying on the multicluster manifold.

    PubMed

    Meng, Deyu; Leung, Yee; Fung, Tung; Xu, Zongben

    2008-08-01

    A new method, which is called decomposition-composition (D-C) method, is proposed for the nonlinear dimensionality reduction (NLDR) of data lying on the multicluster manifold. The main idea is first to decompose a given data set into clusters and independently calculate the low-dimensional embeddings of each cluster by the decomposition procedure. Based on the intercluster connections, the embeddings of all clusters are then composed into their proper positions and orientations by the composition procedure. Different from other NLDR methods for multicluster data, which consider associatively the intracluster and intercluster information, the D-C method capitalizes on the separate employment of the intracluster neighborhood structures and the intercluster topologies for effective dimensionality reduction. This, on one hand, isometrically preserves the rigid-body shapes of the clusters in the embedding process and, on the other hand, guarantees the proper locations and orientations of all clusters. The theoretical arguments are supported by a series of experiments performed on the synthetic and real-life data sets. In addition, the computational complexity of the proposed method is analyzed, and its efficiency is theoretically analyzed and experimentally demonstrated. Related strategies for automatic parameter selection are also examined.

  19. Decentralized Dimensionality Reduction for Distributed Tensor Data Across Sensor Networks.

    PubMed

    Liang, Junli; Yu, Guoyang; Chen, Badong; Zhao, Minghua

    2016-11-01

    This paper develops a novel decentralized dimensionality reduction algorithm for distributed tensor data across sensor networks. The main contributions are as follows. First, conventional centralized methods, which use the entire data set to simultaneously determine all the vectors of the projection matrix along each tensor mode, are not suitable for the network environment. Here, we relax the simultaneous processing into a one-vector-by-one-vector (OVBOV) manner, i.e., determining the projection vectors (PVs) related to each tensor mode one by one. Second, we prove that in the OVBOV manner each PV can be determined without modifying any tensor data, which simplifies the corresponding computations. Third, we cast the decentralized PV determination problem as a set of subproblems with consensus constraints, so that it can be solved in the network environment using only local computations and information exchange among neighboring nodes. Fourth, we introduce the null space and transform the PV determination problem with complex orthogonality constraints into an equivalent hidden convex problem without any orthogonality constraint, which can be solved by the Lagrange multiplier method. Finally, experimental results show that the proposed algorithm is an effective dimensionality reduction scheme for distributed tensor data across sensor networks.
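
    A toy version of the OVBOV idea for the first projection vector: each node runs a power-iteration step on its local covariance and then averages its iterate with its neighbors. This gossip sketch only illustrates the local-computation-plus-neighbor-communication pattern; the paper's actual algorithm instead solves consensus-constrained subproblems via Lagrange multipliers.

```python
import numpy as np

def decentralized_leading_pv(local_data, adjacency, iters=50):
    # Each node i holds only a local covariance estimate C_i.
    covs = [X.T @ X / len(X) for X in local_data]
    n_nodes, p = len(covs), covs[0].shape[0]
    rng = np.random.default_rng(0)
    v = [rng.normal(size=p) for _ in range(n_nodes)]
    # Row-stochastic mixing weights from the (self-loop-inclusive) adjacency.
    W = adjacency / adjacency.sum(axis=1, keepdims=True)
    for _ in range(iters):
        v = [C @ u for C, u in zip(covs, v)]                  # local power step
        v = [sum(W[i, j] * v[j] for j in range(n_nodes))      # neighbor averaging
             for i in range(n_nodes)]
        v = [u / np.linalg.norm(u) for u in v]                # renormalize
    return v  # per-node estimates of the first projection vector
```

    With a connected network the per-node iterates agree and align with the leading direction of the averaged covariance, which is the spirit of determining PVs one by one without shipping raw tensor data.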

  20. Epidemiological data on US coal miners' pneumoconiosis, 1960 to 1988.

    PubMed

    Attfield, M D; Castellan, R M

    1992-07-01

    Statistics on prevalence of pneumoconiosis among working underground coal miners based on epidemiologic data collected between 1960 and 1988 are presented. The main intent was to examine the time-related trend in prevalence, particularly after 1969, when substantially lower dust levels were mandated by federal act. Data from studies undertaken between 1960 and 1968 were collected and compared. Information for the period 1969 to 1988 was extracted from a large ongoing national epidemiologic study. Tenure-specific prevalence rates and summary statistics derived from the latter data for four consecutive time intervals within the 19-year period were calculated and compared. The results indicate a reduction in pneumoconiosis over time. The trend is similar to that seen in a large radiologic surveillance program of underground miners operated concurrently. Although such factors as x-ray reader variation, changes in x-ray standards, and worker self-selection for examination may have influenced the findings to some extent, adjusted summary rates reveal a reduction in prevalence concurrent with reductions in coal mine dust levels mandated by federal act in 1969.

  1. Methods to Approach Velocity Data Reduction and Their Effects on Conformation Statistics in Viscoelastic Turbulent Channel Flows

    NASA Astrophysics Data System (ADS)

    Samanta, Gaurab; Beris, Antony; Handler, Robert; Housiadas, Kostas

    2009-03-01

    Karhunen-Loeve (KL) analysis of DNS data of viscoelastic turbulent channel flows helps reveal more information on the time-dependent dynamics of the viscoelastic modification of turbulence [Samanta et al., J. Turbulence (in press), 2008]. A selected set of KL modes can be used for data reduction modeling of these flows. However, verification against established DNS results is essential. For this purpose, we compared velocity and conformation statistics and probability density functions (PDFs) of relevant quantities obtained from DNS against fields reconstructed from selected KL modes and their time-dependent coefficients. While the velocity statistics show good agreement between DNS and KL reconstructions even with just hundreds of KL modes, tens of thousands of KL modes are required to adequately capture the trace of the polymer conformation resulting from DNS. New modifications to the KL method have therefore been attempted to account for the differences in conformation statistics. The applicability and impact of these modified KL methods will be discussed from the perspective of data reduction modeling.
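
    The Karhunen-Loeve (proper orthogonal) decomposition behind this kind of data-reduction modeling can be sketched as an SVD of the mean-subtracted snapshot matrix; truncating to the leading k modes gives the reduced reconstruction. The toy field below is illustrative only, as the DNS fields are vastly larger.

```python
import numpy as np

def kl_modes(snapshots):
    # Rows are time snapshots, columns are spatial points.
    mean = snapshots.mean(axis=0)
    U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
    return mean, U * s, Vt  # mean field, temporal coefficients, spatial KL modes

def kl_reconstruct(mean, coeffs, modes, k):
    # Reduced-order reconstruction from the leading k KL modes.
    return mean + coeffs[:, :k] @ modes[:k]

# Synthetic low-rank "flow" plus weak noise: a few modes capture almost everything.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 40)
x = np.linspace(0.0, 1.0, 100)
snaps = (10.0 * np.outer(np.sin(2 * np.pi * t), np.sin(np.pi * x))
         + 2.0 * np.outer(np.cos(2 * np.pi * t), np.sin(3 * np.pi * x))
         + rng.normal(0.0, 0.01, (40, 100)))
mean, coeffs, modes = kl_modes(snaps)
err1 = np.linalg.norm(snaps - kl_reconstruct(mean, coeffs, modes, 1))
err3 = np.linalg.norm(snaps - kl_reconstruct(mean, coeffs, modes, 3))
```

    The abstract's observation maps directly onto this picture: second-order quantities such as the conformation trace converge much more slowly in k than the velocity statistics do.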

  2. CFHT data processing and calibration ESPaDOnS pipeline: Upena and OPERA (optical spectropolarimetry)

    NASA Astrophysics Data System (ADS)

    Martioli, Eder; Teeple, D.; Manset, Nadine

    2011-03-01

    CFHT is responsible for processing raw ESPaDOnS images, removing instrument-related artifacts, and delivering science-ready data to the PIs. Here we describe the Upena pipeline, the software used to reduce the echelle spectro-polarimetric data obtained with the ESPaDOnS instrument. Upena is an automated pipeline that performs calibration and reduction of raw images. It can perform both real-time reduction on an image-by-image basis and a complete reduction after the observing night. Upena produces polarization and intensity spectra in FITS format. The pipeline performs parallel computing for improved speed, which ensures that the final products are delivered to the PIs before noon HST after each night of observations. We also present the OPERA project, an open-source pipeline to reduce ESPaDOnS data that will be developed as a collaboration between CFHT and the scientific community. OPERA will match the core capabilities of Upena and, in addition, will be open-source, flexible, and extensible.

  3. Identification of differences in health impact modelling of salt reduction

    PubMed Central

    Geleijnse, Johanna M.; van Raaij, Joop M. A.; Cappuccio, Francesco P.; Cobiac, Linda C.; Scarborough, Peter; Nusselder, Wilma J.; Jaccard, Abbygail; Boshuizen, Hendriek C.

    2017-01-01

    We examined whether specific input data and assumptions explain outcome differences in otherwise comparable health impact assessment models. Seven population health models estimating the impact of salt reduction on morbidity and mortality in western populations were compared on four sets of key features, their underlying assumptions, and their input data. Next, assumptions and input data were varied one by one in a default approach (the DYNAMO-HIA model) to examine how each influences the estimated health impact. Major differences in outcome were related to the size and shape of the dose-response relations between salt and blood pressure and between blood pressure and disease. Modifying the effect sizes in the salt-to-health association resulted in the largest change in health impact estimates (33% lower), whereas other changes had less influence. Differences in health impact assessment model structure and input data may affect the health impact estimate; therefore, clearly defined assumptions and transparent reporting are crucial for all models. Nevertheless, the estimated impact of salt reduction was substantial in all of the models used, emphasizing the need for public health action. PMID:29182636

  4. Redundancy and Reduction: Speakers Manage Syntactic Information Density

    ERIC Educational Resources Information Center

    Jaeger, T. Florian

    2010-01-01

    A principle of efficient language production based on information theoretic considerations is proposed: Uniform Information Density predicts that language production is affected by a preference to distribute information uniformly across the linguistic signal. This prediction is tested against data from syntactic reduction. A single multilevel…

  5. Impact of presentation of research results on likelihood of prescribing medications to patients with left ventricular dysfunction.

    PubMed

    Lacy, C R; Barone, J A; Suh, D C; Malini, P L; Bueno, M; Moylan, D M; Kostis, J B

    2001-01-15

    This study was conducted to evaluate willingness to prescribe medication based on identical data presented in different outcome terms to health professionals of varied discipline, geographic location, and level of training. A cross-sectional survey using a self-administered questionnaire was conducted among 400 health professionals (physicians, pharmacists, physicians-in-training, and pharmacy students) in the United States and Europe. Data reflecting a clinical trial were presented in 6 outcome terms: 3 describing identical mortality benefit (relative risk reduction, absolute risk reduction, and the number of patients needed to be treated to prevent 1 death) and 3 distractors (increased life expectancy, decreased hospitalization rate, and decreased cost). Willingness to prescribe and rank order of medication preference, assuming willingness to prescribe, were measured. Willingness to prescribe and first-choice preference were significantly greater when study results were presented as relative risk reduction than when identical mortality data were presented as absolute risk reduction or the number of patients needed to be treated to avoid 1 death (p <0.001). Increase in life expectancy was the most influential distractor. In conclusion, this study, performed in the era of "evidence-based medicine," demonstrates that the method of reporting research trial results has a significant influence on health professionals' willingness to prescribe. The high numerical value of relative risk reduction and the concrete, tangible quality of increased life expectancy exert greater influence on health professionals than other standard outcome terms.
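
    The three mortality framings compared in the survey are simple arithmetic on the trial's event rates; the sketch below uses hypothetical trial numbers, not data from the study.

```python
def risk_summaries(control_deaths, control_n, treated_deaths, treated_n):
    # The three equivalent mortality framings compared in the survey.
    cer = control_deaths / control_n   # control event rate
    eer = treated_deaths / treated_n   # experimental event rate
    arr = cer - eer                    # absolute risk reduction
    rrr = arr / cer                    # relative risk reduction
    nnt = 1.0 / arr                    # number needed to treat to prevent 1 death
    return rrr, arr, nnt

# Hypothetical trial: mortality falls from 10% to 8% of patients.
rrr, arr, nnt = risk_summaries(100, 1000, 80, 1000)
# rrr = 0.20 (a "20% relative reduction"), arr = 0.02, nnt = 50
```

    The same outcome reads as "20% fewer deaths," "2 percentage points fewer deaths," or "treat 50 patients to save one life," which is exactly the framing effect the survey measured.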

  6. An adaptive band selection method for dimension reduction of hyper-spectral remote sensing image

    NASA Astrophysics Data System (ADS)

    Yu, Zhijie; Yu, Hui; Wang, Chen-sheng

    2014-11-01

    Hyper-spectral remote sensing data are acquired by imaging the same area at multiple wavelengths, and a data set normally consists of hundreds of band-images. Hyper-spectral images provide not only spatial information but also high-resolution spectral information, and they have been widely used in environmental monitoring, mineral investigation, and military reconnaissance. However, because of the correspondingly large data volume, hyper-spectral images are very difficult to transmit and store, so dimension-reduction techniques are needed. Because of the high correlation and high redundancy among hyper-spectral bands, dimension reduction is a feasible way to compress the data volume. This paper proposes a novel band-selection-based dimension reduction method that adaptively selects the bands containing more information and detail. The proposed method is based on principal component analysis (PCA): it computes an index for every band, ranks the indexes from large to small, and, based on a threshold, adaptively selects a reasonable subset of bands. The proposed method overcomes the shortcomings of transform-based dimension reduction methods and prevents the original spectral information from being lost. Its performance has been validated in several experiments. The experimental results show that the proposed algorithm can reduce the dimensions of a hyper-spectral image with little information loss by adaptively selecting band images.
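
    A PCA-guided band-selection step can be sketched as follows: score each band by the variance-weighted magnitude of its loadings on the leading principal components, then keep the top-scoring bands. The scoring index here is a plausible stand-in, since the abstract does not specify the paper's exact index.

```python
import numpy as np

def select_bands(cube, n_keep):
    # cube: (height, width, bands). Score bands by variance-weighted PCA loadings.
    h, w, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    X -= X.mean(axis=0)
    cov = X.T @ X / (len(X) - 1)
    evals, evecs = np.linalg.eigh(cov)          # ascending eigenvalues
    evals, evecs = evals[::-1], evecs[:, ::-1]  # descending: leading PCs first
    k = min(3, b)
    scores = (np.abs(evecs[:, :k]) * evals[:k]).sum(axis=1)
    order = np.argsort(scores)[::-1]            # rank bands from large to small
    return np.sort(order[:n_keep]), scores
```

    Because the output is a subset of the original band-images rather than a linear mixture, the selected bands retain their physical spectral meaning, which is the advantage claimed over transform-based reduction.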

  7. Trends and Patterns in Unintentional Injury Fatalities in Australian Agriculture.

    PubMed

    Lower, Tony; Rolfe, Margaret; Monaghan, Noeline

    2017-04-26

    Agriculture is recognized internationally as a hazardous industry. This article describes the trends and patterns of unintentional farm fatalities in Australia. Data from the National Coronial Information System were analyzed to assess all unintentional farm fatalities for the 2001-2015 period. A secondary comparison with earlier coronial system data from 1989-1992 was also completed to ascertain historical changes. There was no statistically significant change in the rate of work-related fatalities per 100,000 workers in the 2001-2015 period. However, there was a significant curvilinear reduction in all cases of fatality (work and non-work related) per 10,000 agricultural establishments, which decreased from 2001 to 2009-2011 and then increased to 2015. The longer-term data from 1989-2015 revealed a reduction of 30% in work-related cases per 100,000 workers and a reduction of 35% in all cases (work and non-work) per 10,000 agricultural establishments. For both work-related and all cases, there was a statistically significant reduction from 1989 to 2005 and then no change thereafter. The longer-term reduction in farm fatalities ceased in the mid-2000s, and the rate has remained stable since. Fatal injuries continue to impose a significant burden on Australian farming communities, with the rate remaining relatively static for the past ten years. New evidence-based interventions targeting priority areas are required to reduce the incidence of fatalities in Australian agriculture. Copyright© by the American Society of Agricultural Engineers.

  8. The Speckle Toolbox: A Powerful Data Reduction Tool for CCD Astrometry

    NASA Astrophysics Data System (ADS)

    Harshaw, Richard; Rowe, David; Genet, Russell

    2017-01-01

    Recent advances in high-speed, low-noise CCD and CMOS cameras, coupled with breakthroughs in data reduction software that runs on desktop PCs, have opened the domain of speckle interferometry and high-accuracy CCD measurements of double stars to amateurs, allowing them to do useful science of high quality. This paper describes how to use a speckle interferometry reduction program, the Speckle Tool Box (STB), to achieve this level of result. For over a year the author (Harshaw) has been using STB (and its predecessor, Plate Solve 3) to obtain measurements of double stars based on CCD camera technology for pairs that are either too wide (the stars not sharing the same isoplanatic patch, roughly 5 arc-seconds in diameter) or too faint to image within the coherence time required for speckle (usually under 40 ms). This same approach - using speckle reduction software to measure CCD pairs with greater accuracy than possible with lucky imaging - has been used, it turns out, for several years by the U. S. Naval Observatory.

  9. Regulation of interleukin-4 signaling by extracellular reduction of intramolecular disulfides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curbo, Sophie; Gaudin, Raphael; Carlsten, Mattias

    2009-12-25

    Interleukin-4 (IL-4) contains three structurally important intramolecular disulfides that are required for the bioactivity of the cytokine. We show that the cell surface of HeLa cells and endotoxin-activated monocytes can reduce IL-4 intramolecular disulfides in the extracellular space and inhibit binding of IL-4 to the IL-4Rα receptor. IL-4 disulfides were reduced in vitro by thioredoxin 1 (Trx1) and protein disulfide isomerase (PDI). Reduction of IL-4 disulfides by the cell surface of HeLa cells was inhibited by auranofin, an inhibitor of thioredoxin reductase, which is an electron donor to both Trx1 and PDI. Both Trx1 and PDI have been shown to be located at the cell surface, and our data suggest that these enzymes are involved in catalyzing the reduction of IL-4 disulfides. The pro-drug N-acetylcysteine (NAC), which promotes T-helper type 1 responses, was also shown to mediate the reduction of IL-4 disulfides. Our data provide evidence for a novel redox-dependent pathway for the regulation of cytokine activity by extracellular reduction of intramolecular disulfides at the cell surface by members of the thioredoxin enzyme family.

  10. Longitudinal change mechanisms for substance use and illegal activity for adolescents in treatment.

    PubMed

    Hunter, Brooke D; Godley, Susan H; Hesson-McInnis, Matthew S; Roozen, Hendrik G

    2014-06-01

    The current study investigated: (a) the relationships of exposure to the Adolescent Community Reinforcement Approach (A-CRA) with reductions in substance use, illegal activity, and juvenile justice system involvement in adolescents diagnosed with a substance use disorder, and (b) the pathways by which reductions in the target behaviors were achieved. This study is a secondary data analysis of longitudinal data from a large-scale implementation effort for A-CRA. The sample consisted of 1,467 adolescents who presented to substance use treatment and reported past-year engagement in illegal activity. Participants had an average age of 15.8 years (SD = 1.3) and were 25% female, 14% African American, 29% Hispanic, 35% Caucasian, 16% mixed ethnicity, and 6% other ethnicity. Path analyses indicated that participation in A-CRA had a significant, direct association with reduced substance use; a significant, indirect association with reduced illegal activity through reductions in substance use; and a significant, indirect association with reduced juvenile justice system involvement through reductions in both substance use and illegal activity. In addition, post hoc analyses using a bootstrapping strategy provided evidence that reductions in substance use partially mediated the relationship between A-CRA and illegal activity.

  11. Travel time data collection handbook

    DOT National Transportation Integrated Search

    1998-03-01

    This Travel Time Data Collection Handbook provides guidance to transportation : professionals and practitioners for the collection, reduction, and presentation : of travel time data. The handbook should be a useful reference for designing : travel ti...

  12. Intelligent data reduction for autonomous power systems

    NASA Technical Reports Server (NTRS)

    Floyd, Stephen A.

    1988-01-01

    Since 1984, Marshall Space Flight Center has been actively engaged in research and development concerning autonomous power systems. Much of the work in this domain has dealt with the development and application of knowledge-based or expert systems to perform tasks previously accomplished only through intensive human involvement. One such task is health status monitoring of electrical power systems, a manpower-intensive task that is vital to mission success. The Hubble Space Telescope testbed and its associated Nickel Cadmium Battery Expert System (NICBES) were designated as the system on which the initial proof of concept for intelligent power system monitoring would be established. The key function performed by an engineer engaged in system monitoring is to analyze the raw telemetry data and identify from the whole only those elements which can be considered significant. This function requires engineering expertise on the functionality of the system, the mode of operation, and the efficient and effective reading of telemetry data. Application of this expertise to extract the significant components of the data is referred to as data reduction. Such a function possesses characteristics which make it a prime candidate for the application of knowledge-based systems technology. Such applications are investigated, and recommendations are offered for the development of intelligent data reduction systems.

  13. Online dimensionality reduction using competitive learning and Radial Basis Function network.

    PubMed

    Tomenko, Vladimir

    2011-06-01

    A general-purpose dimensionality reduction method should preserve data interrelations at all scales. Additional desired features include online projection of new data, processing of nonlinearly embedded manifolds, and handling of large amounts of data. The proposed method, called RBF-NDR, combines these features. RBF-NDR comprises two modules. The first module learns manifolds by utilizing modified topology-representing networks and geodesic distance in data space, and approximates sampled or streaming data with a finite set of reference patterns, thus achieving scalability. Using input from the first module, the dimensionality reduction module constructs mappings between observation and target spaces. Introduction of a specific loss function and synthesis of the training algorithm for the Radial Basis Function network result in global preservation of data structures and online processing of new patterns. RBF-NDR was applied to feature extraction and visualization and compared with Principal Component Analysis (PCA), a neural network for Sammon's projection (SAMANN), and Isomap. With respect to feature extraction, the method outperformed PCA and yielded increased performance of the model describing a wastewater treatment process. As for visualization, RBF-NDR produced superior results compared to PCA and SAMANN and matched Isomap. For the Topic Detection and Tracking corpus, the method successfully separated semantically different topics. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. SPHARA--a generalized spatial Fourier analysis for multi-sensor systems with non-uniformly arranged sensors: application to EEG.

    PubMed

    Graichen, Uwe; Eichardt, Roland; Fiedler, Patrique; Strohmeier, Daniel; Zanow, Frank; Haueisen, Jens

    2015-01-01

    Important requirements for the analysis of multichannel EEG data are efficient techniques for signal enhancement, signal decomposition, feature extraction, and dimensionality reduction. We propose a new approach for spatial harmonic analysis (SPHARA) that extends the classical spatial Fourier analysis to EEG sensors positioned non-uniformly on the surface of the head. The proposed method is based on the eigenanalysis of the discrete Laplace-Beltrami operator defined on a triangular mesh. We present several ways to discretize the continuous Laplace-Beltrami operator and compare the properties of the resulting basis functions computed using these discretization methods. We apply SPHARA to somatosensory evoked potential data from eleven volunteers and demonstrate the ability of the method for spatial data decomposition, dimensionality reduction and noise suppression. When employing SPHARA for dimensionality reduction, a significantly more compact representation can be achieved using the FEM approach, compared to the other discretization methods. Using FEM, to recover 95% and 99% of the total energy of the EEG data, on average only 35% and 58% of the coefficients are necessary. The capability of SPHARA for noise suppression is shown using artificial data. We conclude that SPHARA can be used for spatial harmonic analysis of multi-sensor data at arbitrary positions and can be utilized in a variety of other applications.
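
    The core SPHARA operation can be sketched with a graph Laplacian standing in for the discretized Laplace-Beltrami operator: the Laplacian's eigenvectors, ordered by eigenvalue, form a spatial-harmonic basis from smooth to oscillatory, and truncating the expansion gives dimensionality reduction and spatial low-pass filtering. This simplified sketch omits the mesh geometry and the FEM discretization the paper compares.

```python
import numpy as np

def graph_laplacian(adjacency):
    # Combinatorial Laplacian L = D - A of the sensor neighborhood graph.
    return np.diag(adjacency.sum(axis=1)) - adjacency

def sphara_like_filter(data, adjacency, n_low):
    # Project each time sample onto the n_low smoothest eigenvectors and
    # reconstruct: a spatial low-pass filter in the harmonic basis.
    L = graph_laplacian(adjacency)
    _, vecs = np.linalg.eigh(L)   # eigenvalues ascending: smoothest modes first
    B = vecs[:, :n_low]           # truncated spatial-harmonic basis
    return data @ B @ B.T         # data: (n_times, n_sensors)
```

    Keeping only the leading coefficients per sample is the dimensionality-reduction use; the reconstruction above is the corresponding noise-suppression use.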

  15. Microdensitometer errors: Their effect on photometric data reduction

    NASA Technical Reports Server (NTRS)

    Bozyan, E. P.; Opal, C. B.

    1984-01-01

    The performance of densitometers used for photometric data reduction of high dynamic range electrographic plate material is analyzed. Densitometer repeatability is tested by comparing two scans of one plate. Internal densitometer errors are examined by constructing histograms of digitized densities and finding inoperative bits and differential nonlinearity in the analog to digital converter. Such problems appear common to the four densitometers used in this investigation and introduce systematic algorithm dependent errors in the results. Strategies to improve densitometer performance are suggested.

  16. Speckle Interferometry with the OCA Kuhn 22" Telescope

    NASA Astrophysics Data System (ADS)

    Wasson, Rick

    2018-04-01

    Speckle interferometry measurements of double stars were made in 2015 and 2016, using the Kuhn 22-inch classical Cassegrain telescope of the Orange County Astronomers, a Point Grey Blackfly CMOS camera, and three interference filters. 272 observations are reported for 177 systems, with separations ranging from 0.29" to 2.9". Data reduction was by means of the REDUC and Speckle Tool Box programs. Equipment, observing procedures, calibration, data reduction, and analysis are described, and unusual results for 11 stars are discussed in detail.

  17. Reduction of Human Norovirus GI, GII, and Surrogates by Peracetic Acid and Monochloramine in Municipal Secondary Wastewater Effluent.

    PubMed

    Dunkin, Nathan; Weng, ShihChi; Coulter, Caroline G; Jacangelo, Joseph G; Schwab, Kellogg J

    2017-10-17

    The objective of this study was to characterize human norovirus (hNoV) GI and GII reductions during disinfection by peracetic acid (PAA) and monochloramine in secondary wastewater (WW) and phosphate buffer (PB) as assessed by reverse transcription-qPCR (RT-qPCR). Infectivity and RT-qPCR reductions are also presented for surrogate viruses murine norovirus (MNV) and bacteriophage MS2 under identical experimental conditions to aid in interpretation of hNoV molecular data. In WW, RT-qPCR reductions were less than 0.5 log10 for all viruses at concentration-time (CT) values up to 450 mg-min/L, except for hNoV GI, where 1 log10 reduction was observed at CT values of less than 50 mg-min/L for monochloramine and 200 mg-min/L for PAA. In PB, hNoV GI and MNV exhibited comparable resistance to PAA and monochloramine, with CT values for 2 log10 RT-qPCR reduction between 300 and 360 mg-min/L. Less than 1 log10 reduction was observed for MS2 and hNoV GII in PB at CT values for both disinfectants up to 450 mg-min/L. Our results indicate that hNoVs exhibit genogroup-dependent resistance and that disinfection practices targeting hNoV GII will result in equivalent or greater reductions for hNoV GI. These data provide valuable comparisons between hNoV and surrogate molecular signals that can begin the process of informing regulators and engineers on WW treatment plant design and operational practices necessary to inactivate hNoVs.
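
    The two quantities the abstract works in are both simple: a log10 reduction compares signal before and after disinfection, and a CT value is the disinfectant residual multiplied by contact time. A quick sketch with made-up numbers:

```python
import numpy as np

def log10_reduction(signal_before, signal_after):
    # Log10 reduction of a viral signal (e.g. RT-qPCR genome copies).
    return np.log10(signal_before / signal_after)

def ct_value(residual_mg_per_l, contact_time_min):
    # Disinfectant exposure as concentration x time, in mg-min/L.
    return residual_mg_per_l * contact_time_min

two_log = log10_reduction(1e6, 1e4)   # a 100-fold drop is a 2-log10 reduction
exposure = ct_value(5.0, 60.0)        # 5 mg/L held for 60 min -> 300 mg-min/L
```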

  18. Effects of potential federal funding cuts on graduate medical education: results of a survey of designated institutional officials.

    PubMed

    Holt, Kathleen D; Miller, Rebecca S; Philibert, Ingrid; Nasca, Thomas J

    2014-03-01

    Proposed reductions in federal funding for physician education may affect the United States' ability to produce the number of physicians needed to provide care. Using a survey similar to that used by the ACGME in 2011, we assessed designated institutional officials' (DIOs) perceptions of the impact of potential GME funding reductions. In August 2013, we sent a survey link to the DIOs of all ACGME-accredited institutions (N = 678). A 9-item survey asked how future federal funding would affect the number of residency programs in their institutions under 4 different funding scenarios: stable funding and reductions of 10%, 33%, and 50%. We also asked about changes in the number of residency positions during the last 2 years. The response rate was 47.9% (325 of 678 DIOs); respondents represent 58.9% of accredited institutions with more than 1 program. Most respondents reported no change or an increase under the stable funding scenario. Under a 33% funding reduction, an estimated 17,379 positions (14.8% of all current positions) would be lost, and a 50% reduction would result in a loss of 33,562 positions (28.6%). Primary care specialties (eg, family medicine, internal medicine) would be most affected under the greatest funding reductions. The findings of the 2013 survey are consistent with the 2011 data, with DIOs projecting significant reductions in programs and positions under more severe budget cuts. DIO comments highlighted reduced optimism (compared with the data obtained in 2011) about the effect of funding cuts and concerns about the impact of reductions on patient care and health care personnel at teaching institutions.

  19. Knowledge, attitudes and perception on dietary salt reduction of two communities in Grahamstown, South Africa.

    PubMed

    Mushoriwa, Fadzai; Townsend, Nick; Srinivas, Sunitha

    2017-03-01

    Dietary salt reduction has been identified as a cost-effective way of addressing the global burden of non-communicable diseases (NCDs), particularly cardiovascular diseases. The World Health Organization has recommended three main strategies for achieving population-wide salt reduction in all member states: food reformulation, policies, and consumer awareness campaigns. In 2013, the South African Ministry of Health announced mandatory salt reduction legislation for the food manufacturing sector, set to come into effect on 30 June 2016. This decision was influenced by the need to reduce the incidence of NCDs and the fact that processed food is the source of 54% of the salt consumed in the South African diet. However, with discretionary salt also being a significant contributor, there is a need for consumer awareness campaigns. The aim of this study was to assess the knowledge, attitudes and practices of guardians and cooks at two non-governmental organisations based in Grahamstown, South Africa, towards dietary salt reduction. Data were collected through observation and explorative, voice-recorded semi-structured interviews, and the transcribed data were analysed using NVivo®. At both centres, salt shakers were not placed on the tables during mealtimes. Only 14% of the participants perceived their personal salt intake to be low. No participants were aware of the recommended daily salt intake limit or the relationship between salt and sodium. Only five out of the 19 participants had previously received information on dietary salt reduction from sources such as healthcare professionals and the media. The results from the first phase of this study highlighted gaps in the participants' knowledge, attitudes and practices towards dietary salt reduction. The aim of the second phase of the research is to design and implement a context-specific and culturally appropriate educational intervention on dietary salt reduction.

  20. Projected Impact of a Sodium Consumption Reduction Initiative in Argentina: An Analysis from the CVD Policy Model – Argentina

    PubMed Central

    Konfino, Jonatan; Mekonnen, Tekeshe A.; Coxson, Pamela G.; Ferrante, Daniel; Bibbins-Domingo, Kirsten

    2013-01-01

    Background Cardiovascular disease (CVD) is the leading cause of death in adults in Argentina. Sodium reduction policies targeting processed foods were implemented in 2011 in Argentina, but the impact has not been evaluated. The aims of this study are to use Argentina-specific data on sodium excretion and project the impact of Argentina's sodium reduction policies under two scenarios - the 2-year intervention currently being undertaken or a more persistent 10-year sodium reduction strategy. Methods We used Argentina-specific data on sodium excretion by sex and projected the impact of the current strategy on sodium consumption and blood pressure decrease. We assessed the projected impact of sodium reduction policies on CVD using the Cardiovascular Disease (CVD) Policy Model, adapted to Argentina, modeling two alternative policy scenarios over the next decade. Results Our study finds that the initiative to reduce sodium consumption currently in place in Argentina will have substantial impact on CVD over the next 10 years. Under the current proposed policy of 2-year sodium reduction, mean sodium consumption is projected to decrease by 319-387 mg/day. This decrease is expected to translate into an absolute reduction of systolic blood pressure of 0.93 mmHg to 1.81 mmHg. This would avert about 19,000 all-cause deaths, 13,000 total myocardial infarctions, and 10,000 total strokes over the next decade. A more persistent sodium reduction strategy would yield even greater CVD benefits. Conclusion The Argentinean initiative would be effective in substantially reducing mortality and morbidity from CVD. This paper provides evidence-based support to continue implementing strategies to reduce sodium consumption at a population level. PMID:24040085

  1. Optimal neighborhood indexing for protein similarity search.

    PubMed

    Peterlongo, Pierre; Noé, Laurent; Lavenier, Dominique; Nguyen, Van Hoa; Kucherov, Gregory; Giraud, Mathieu

    2008-12-16

Similarity inference, one of the main bioinformatics tasks, must cope with the exponential growth of biological data. A classical approach to handling this data flow involves heuristics with large seed indexes. To speed up this technique, the index can be enhanced by storing additional information that limits the number of random memory accesses. However, this improvement leads to a larger index that may itself become a bottleneck. In the case of protein similarity search, we propose to decrease the index size by reducing the amino acid alphabet. The paper presents two main contributions. First, we show that an optimal neighborhood indexing combining an alphabet reduction with a longer neighborhood leads to a 35% reduction of the memory involved in the process, without sacrificing either the quality of results or the computational time. Second, our approach led us to develop a new kind of substitution score matrix and the associated e-value parameters. In contrast to the usual matrices, these matrices are rectangular, since they compare amino acid groups from different alphabets. We describe the method used for computing these matrices and provide typical examples that can be used in such comparisons. Supplementary data can be found on the website http://bioinfo.lifl.fr/reblosum. We propose a practical index size reduction for neighborhood data that does not negatively affect the performance of large-scale search in protein sequences. Such an index can be used in any study involving large protein data sets. Moreover, rectangular substitution score matrices and their associated statistical parameters can have applications in any study involving an alphabet reduction.
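As an illustration of the idea (not the paper's own groupings or code), mapping residues onto a reduced alphabet shrinks the bits needed per indexed neighborhood position; the 4-group mapping below is a hypothetical example:

```python
import math

# Hypothetical 4-group amino acid alphabet (illustrative only; the paper
# derives its own optimal groupings).
GROUPS = {
    "h": "AVLIMC",   # hydrophobic
    "a": "FWYH",     # aromatic
    "p": "STNQGP",   # polar / small
    "c": "DEKR",     # charged
}
AA_TO_GROUP = {aa: g for g, members in GROUPS.items() for aa in members}

def reduce_seq(seq: str) -> str:
    """Map a protein sequence onto the reduced alphabet."""
    return "".join(AA_TO_GROUP[aa] for aa in seq)

def bits_per_position(alphabet_size: int) -> float:
    """Bits needed to store one position of a neighborhood."""
    return math.log2(alphabet_size)

reduced = reduce_seq("MKTAYIAK")
saving = 1 - bits_per_position(4) / bits_per_position(20)
```

With 4 groups instead of 20 residues, each stored position needs log2(4) = 2 bits instead of about 4.32; the paper's 35% figure reflects its particular alphabet and neighborhood-length trade-off, not this toy grouping.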

  2. Optimal neighborhood indexing for protein similarity search

    PubMed Central

    Peterlongo, Pierre; Noé, Laurent; Lavenier, Dominique; Nguyen, Van Hoa; Kucherov, Gregory; Giraud, Mathieu

    2008-01-01

Background Similarity inference, one of the main bioinformatics tasks, must cope with the exponential growth of biological data. A classical approach to handling this data flow involves heuristics with large seed indexes. To speed up this technique, the index can be enhanced by storing additional information that limits the number of random memory accesses. However, this improvement leads to a larger index that may itself become a bottleneck. In the case of protein similarity search, we propose to decrease the index size by reducing the amino acid alphabet. Results The paper presents two main contributions. First, we show that an optimal neighborhood indexing combining an alphabet reduction with a longer neighborhood leads to a 35% reduction of the memory involved in the process, without sacrificing either the quality of results or the computational time. Second, our approach led us to develop a new kind of substitution score matrix and the associated e-value parameters. In contrast to the usual matrices, these matrices are rectangular, since they compare amino acid groups from different alphabets. We describe the method used for computing these matrices and provide typical examples that can be used in such comparisons. Supplementary data can be found on the website http://bioinfo.lifl.fr/reblosum. Conclusion We propose a practical index size reduction for neighborhood data that does not negatively affect the performance of large-scale search in protein sequences. Such an index can be used in any study involving large protein data sets. Moreover, rectangular substitution score matrices and their associated statistical parameters can have applications in any study involving an alphabet reduction. PMID:19087280

  3. Estimation of Broadband Shock Noise Reduction in Turbulent Jets by Water Injection

    NASA Technical Reports Server (NTRS)

    Kandula, Max; Lonerjan, Michael J.

    2008-01-01

    The concept of effective jet properties introduced by the authors (AIAA-2007-3645) has been extended to the estimation of broadband shock noise reduction by water injection in supersonic jets. Comparison of the predictions with the test data for cold underexpanded supersonic nozzles shows a satisfactory agreement. The results also reveal the range of water mass flow rates over which saturation of mixing noise reduction and existence of parasitic noise are manifest.

  4. Reduction in Tribological Energy Losses in the Transportation and Electric Utilities Sectors

    DTIC Science & Technology

    1985-09-01

    detailed a flow tree as possible. A major difficulty here lies in the fact that such detailed data are not always available or known. Another is that...Followers 14% Spring Load Reduction Same + Rolling Element Fulcrum Bearing Camshaft 0 FIGURE 4.27. Type IV 500 1000 1500 2000 2500 3000... Camshaft rpm Valve Train Friction Reduction with Various Improvements reduced the valve train torque 82% at 1500 rpm and the projected vehicle eco

  5. An Experimental Investigation into NO sub X Control of a Gas Turbine Combustor and Augmentor Tube Incorporating a Catalytic Reduction System

    DTIC Science & Technology

    1990-03-01

    An initial experimental investigation was conducted to examine the feasibility of NOx emission control using catalytic reduction techniques in the ...current configuration impractical. Recommendations for alternative configurations are presented. The results of the investigation have proven that further study is warranted....used as a gas generator and catalytic reduction system. Four data runs were made. Three runs were completed without the catalyst installed

  6. Waste reduction possibilities for manufacturing systems in the industry 4.0

    NASA Astrophysics Data System (ADS)

    Tamás, P.; Illés, B.; Dobos, P.

    2016-11-01

Industry 4.0 creates new possibilities for manufacturing companies' waste reduction, for example through the appearance of cyber-physical systems, the big-data concept, and the spread of the "Internet of Things" (IoT). This paper presents in detail the fourth industrial revolution's more important achievements and tools. In addition, numerous new research directions connected with the waste reduction possibilities of manufacturing systems are outlined.

  7. Analysis of NSWC Ocean EM Observatory Test Data

    DTIC Science & Technology

    2016-09-01

deployment locations. 15. SUBJECT TERMS: magnetic anomaly detection (MAD), oceanographic magnetic fields, coherence, magnetic noise reduction ... analyses ... Analysis of magnetic data ... Appendix B: Feb 11 underwater magnetic data

  8. Nitrogenase of Klebsiella pneumoniae. Hydrazine is a product of azide reduction.

    PubMed Central

    Dilworth, M J; Thorneley, R N

    1981-01-01

Klebsiella pneumoniae nitrogenase reduced azide, at 30 degrees C and pH 6.8-8.2, to yield ammonia (NH3), dinitrogen (N2) and hydrazine (N2H4). Reduction of (15N=14N=14N)-, followed by mass-spectrometric analysis, showed that no new nitrogen-nitrogen bonds were formed. During azide reduction, added 15N2H4 did not contribute 15N to NH3, indicating a lack of equilibration between the enzyme-bound intermediates giving rise to N2H4 and the N2H4 in solution. When azide reduction to N2H4 was partially inhibited by 15N2, label appeared in NH3 but not in N2H4. Product balances combined with the labelling data indicate that azide is reduced according to the following equations: (formula: see text). N2 was a competitive inhibitor and CO a non-competitive inhibitor of azide reduction to N2H4. The percentage of total electron flux used for H2 evolution concomitant with azide reduction fell from 26% at pH 6.8 to 0% at pH 8.2. Pre-steady-state kinetic data suggest that N2H4 is formed by cleavage of the alpha-beta nitrogen-nitrogen bond of bound azide to leave a nitride (=N) intermediate that subsequently yields NH3. PMID:7030315

  9. Targeted Assessment for Prevention of Healthcare-Associated Infections: A New Prioritization Metric.

    PubMed

    Soe, Minn M; Gould, Carolyn V; Pollock, Daniel; Edwards, Jonathan

    2015-12-01

To develop a method for calculating the number of healthcare-associated infections (HAIs) that must be prevented to reach an HAI reduction goal, and for identifying and prioritizing healthcare facilities where the largest reductions can be achieved. The setting was acute care hospitals that report HAI data to the Centers for Disease Control and Prevention's National Healthcare Safety Network. The cumulative attributable difference (CAD) is calculated by subtracting a numerical prevention target from an observed number of HAIs. The prevention target is the product of the predicted number of HAIs and a standardized infection ratio goal, which represents the HAI reduction goal. When positive, the CAD is the number of infections that must be prevented to reach the HAI reduction goal. We calculated the CAD for catheter-associated urinary tract infections for each of the 3,639 hospitals that reported such data to the National Healthcare Safety Network in 2013 and ranked the hospitals by their CAD values in descending order. Of 1,578 hospitals with positive CAD values, preventing 10,040 catheter-associated urinary tract infections at the 293 hospitals (19%) with the highest CAD would enable achievement of the national 25% catheter-associated urinary tract infection reduction goal. The CAD is a new metric that facilitates ranking of facilities, and locations within facilities, to prioritize HAI prevention efforts where the greatest impact can be achieved toward an HAI reduction goal.
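The CAD arithmetic described above is simple enough to sketch directly; the facility numbers below are made up for illustration:

```python
def cad(observed: int, predicted: float, sir_goal: float) -> float:
    """Cumulative attributable difference: observed HAIs minus the
    prevention target (predicted HAIs times the SIR goal)."""
    return observed - predicted * sir_goal

# Toy facility data (hypothetical numbers): (name, observed, predicted)
facilities = [
    ("A", 120, 100.0),
    ("B", 30, 50.0),   # already below its target -> negative CAD
    ("C", 90, 60.0),
    ("D", 15, 10.0),
]

sir_goal = 0.75  # corresponds to a 25% national reduction goal
ranked = sorted(
    ((name, cad(obs, pred, sir_goal)) for name, obs, pred in facilities),
    key=lambda item: item[1],
    reverse=True,
)
# Facilities with positive CAD, highest first, are where prevention
# efforts buy the most progress toward the goal.
positive = [(n, c) for n, c in ranked if c > 0]
```

Ranking by CAD rather than by SIR alone concentrates effort where the absolute number of preventable infections is largest, which is the mechanism behind the "293 hospitals (19%)" result above.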

  10. Assessment of In-Situ Reductive Dechlorination Using Compound-Specific Stable Isotopes, Functional-Gene Pcr, and Geochemical Data

    PubMed Central

    Carreón-Diazconti, Concepción; Santamaría, Johanna; Berkompas, Justin; Field, James A.; Brusseau, Mark L.

    2010-01-01

    Isotopic analysis and molecular-based bioassay methods were used in conjunction with geochemical data to assess intrinsic reductive dechlorination processes for a chlorinated-solvent contaminated site in Tucson, Arizona. Groundwater samples were obtained from monitoring wells within a contaminant plume comprising tetrachloroethene and its metabolites trichloroethene, cis-1,2-dichloroethene, vinyl chloride, and ethene, as well as compounds associated with free-phase diesel present at the site. Compound specific isotope (CSI) analysis was performed to characterize biotransformation processes influencing the transport and fate of the chlorinated contaminants. PCR analysis was used to assess the presence of indigenous reductive dechlorinators. The target regions employed were the 16s rRNA gene sequences of Dehalococcoides sp. and Desulfuromonas sp., and DNA sequences of genes pceA, tceA, bvcA, and vcrA, which encode reductive dehalogenases. The results of the analyses indicate that relevant microbial populations are present and that reductive dechlorination is presently occurring at the site. The results further show that potential degrader populations as well as biotransformation activity is non-uniformly distributed within the site. The results of laboratory microcosm studies conducted using groundwater collected from the field site confirmed the reductive dechlorination of tetrachloroethene to dichloroethene. This study illustrates the use of an integrated, multiple-method approach for assessing natural attenuation at a complex chlorinated-solvent contaminated site. PMID:19603638

  11. Impact of a national nutritional support programme on loss to follow-up after tuberculosis diagnosis in Kenya.

    PubMed

    Mansour, O; Masini, E O; Kim, B-S J; Kamene, M; Githiomi, M M; Hanson, C L

    2018-06-01

    Undernourishment is prevalent among tuberculosis (TB) patients. Nutritional support is given to TB patients to prevent and treat undernourishment; it is also used to improve treatment outcomes and as an incentive to keep patients on treatment. To determine whether nutritional support is associated with a reduction in the risk of loss to follow-up (LTFU) among TB patients in Kenya. This was a retrospective cohort study using national programmatic data. Records of 362 685 drug-susceptible TB patients from 2012 to 2015 were obtained from Treatment Information from Basic Unit (TIBU), a national case-based electronic data recording system. Patients who were LTFU were compared with those who completed treatment. Nutrition counselling was associated with an 8% reduction in the risk of LTFU (RR 0.92, 95%CI 0.89-0.95), vitamins were associated with a 7% reduction (adjusted RR [aRR] 0.93, 95%CI 0.90-0.96) and food support was associated with a 10% reduction (aRR 0.90, 95%CI 0.87-0.94). Among patients who received food support, the addition of nutrition counselling was associated with a 23% reduction in the risk of LTFU (aRR 0.77, 95%CI 0.67-0.88). Nutritional support was associated with a reduction in the risk of LTFU. Providing nutrition counselling is important for patients receiving food support.

  12. Clean Cities 2013 Annual Metrics Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, C.; Singer, M.

    2014-10-01

Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2013 Annual Metrics Report.

  13. Scientific data processing for Hipparcos

    NASA Astrophysics Data System (ADS)

    van der Marel, H.

    1988-04-01

The scientific aims of the ESA Hipparcos astrometric satellite are reviewed, and the fundamental principles and practical implementation of the data-analysis and data-reduction procedures are discussed in detail. Hipparcos is to determine the positions and proper motions of a catalog of 110,000 stars to a limit of 12 mag with an accuracy of a few milliarcseconds, and to obtain photometric observations of 400,000 stars (the Tycho mission). Consideration is given to the organization of the data-processing consortia FAST, NDAC, and TDAC; the basic problems of astrometry; the measurement principle; the large amounts of data to be generated during the 2.5-year mission; and the three-step iterative method to be applied (positional reconstruction and reduction to a reference great circle, spherical reconstruction, and extraction of the astrometric parameters). Diagrams and a flow chart are provided.

  14. Clean Cities 2014 Annual Metrics Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Caley; Singer, Mark

Each year, the U.S. Department of Energy asks its Clean Cities program coordinators to submit annual reports of their activities and accomplishments for the previous calendar year. Data and information are submitted via an online database that is maintained as part of the Alternative Fuels Data Center (AFDC) at the National Renewable Energy Laboratory (NREL). Coordinators submit a range of data that characterize the membership, funding, projects, and activities of their coalitions. They also submit data about sales of alternative fuels, deployment of alternative fuel vehicles (AFVs) and hybrid electric vehicles (HEVs), idle-reduction (IR) initiatives, fuel economy activities, and programs to reduce vehicle miles traveled (VMT). NREL analyzes the data and translates them into petroleum-use reduction impacts, which are summarized in this 2014 Annual Metrics Report.

  15. Development of a simple, self-contained flight test data acquisition system

    NASA Technical Reports Server (NTRS)

    Renz, R. R. L.

    1981-01-01

A low-cost flight test data acquisition system, applicable to general aviation airplanes, was developed which meets the criteria for longitudinal and lateral stability analysis. The package consists of (1) a microprocessor controller and data acquisition module; (2) a transducer module; and (3) a power supply module. The system is easy to install and occupies space in the cabin or baggage compartment of the airplane. All transducers are contained in these modules except the total pressure tube, the static pressure and air temperature transducers, and the control position transducers. The NASA-developed MMLE program was placed on a microcomputer, on which all data reduction is done. The flight testing program undertaken proved both the flight testing hardware and the data reduction method to be applicable to the current field of general aviation airplanes.

  16. Intraocular pressure reduction and regulation system

    NASA Technical Reports Server (NTRS)

    Baehr, E. F.; Burnett, J. E.; Felder, S. F.; Mcgannon, W. J.

    1979-01-01

    An intraocular pressure reduction and regulation system is described and data are presented covering performance in: (1) reducing intraocular pressure to a preselected value, (2) maintaining a set minimum intraocular pressure, and (3) reducing the dynamic increases in intraocular pressure resulting from external loads applied to the eye.

  17. Mechanisms of Symptom Reduction in Treatment for Obsessions

    ERIC Educational Resources Information Center

    Woody, Sheila R.; Whittal, Maureen L.; McLean, Peter D.

    2011-01-01

    Objective: We explored the dynamic relationship between cognition and obsession severity during 2 different treatments for primary obsessions, examining evidence for the hypothesis that symptom reduction would be mediated by appraisals about the meaning of unwanted intrusive thoughts. Method: Data from a recent randomized controlled trial were…

  18. Alternative Fuels Data Center

    Science.gov Websites

Provides U.S. Environmental Protection Agency Diesel Emissions Reduction Act (DERA) funding for projects that reduce diesel emissions in New Hampshire. Funding for between 25% and 100% of eligible project ... or adding idle reduction equipment. For more information, including funding amounts and how to apply ...

  19. Developing and Evaluating a Cardiovascular Risk Reduction Project.

    ERIC Educational Resources Information Center

    Brownson, Ross C.; Mayer, Jeffrey P.; Dusseault, Patricia; Dabney, Sue; Wright, Kathleen; Jackson-Thompson, Jeannette; Malone, Bernard; Goodman, Robert

    1997-01-01

    Describes the development and baseline evaluation data from the Ozark Heart Health Project, a community-based cardiovascular disease risk reduction program in rural Missouri that targeted smoking, physical inactivity, and poor diet. Several Ozark counties participated in either intervention or control groups, and researchers conducted surveillance…

  20. Increasing conclusiveness of clinical breath analysis by improved baseline correction of multi capillary column - ion mobility spectrometry (MCC-IMS) data.

    PubMed

    Szymańska, Ewa; Tinnevelt, Gerjen H; Brodrick, Emma; Williams, Mark; Davies, Antony N; van Manen, Henk-Jan; Buydens, Lutgarde M C

    2016-08-05

    Current challenges of clinical breath analysis include large data size and non-clinically relevant variations observed in exhaled breath measurements, which should be urgently addressed with competent scientific data tools. In this study, three different baseline correction methods are evaluated within a previously developed data size reduction strategy for multi capillary column - ion mobility spectrometry (MCC-IMS) datasets. Introduced for the first time in breath data analysis, the Top-hat method is presented as the optimum baseline correction method. A refined data size reduction strategy is employed in the analysis of a large breathomic dataset on a healthy and respiratory disease population. New insights into MCC-IMS spectra differences associated with respiratory diseases are provided, demonstrating the additional value of the refined data analysis strategy in clinical breath analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
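Top-hat baseline correction is a morphological opening (erosion then dilation) subtracted from the signal; broad baseline structure is removed while narrow peaks survive. A minimal NumPy sketch follows, where the window size and the synthetic spectrum are illustrative assumptions, not the paper's MCC-IMS settings:

```python
import numpy as np

def white_tophat_1d(signal: np.ndarray, size: int) -> np.ndarray:
    """White top-hat: signal minus its morphological opening
    (erosion followed by dilation with a flat window of `size` points).
    Removes any baseline that is broad compared with `size`."""
    n = len(signal)
    half = size // 2
    padded = np.pad(signal, half, mode="edge")
    # erosion: running minimum over the window
    eroded = np.array([padded[i:i + size].min() for i in range(n)])
    padded2 = np.pad(eroded, half, mode="edge")
    # dilation: running maximum over the window
    opened = np.array([padded2[i:i + size].max() for i in range(n)])
    return signal - opened

# Synthetic spectrum: slow linear drift plus one narrow peak
x = np.linspace(0, 1, 400)
baseline = 2.0 + 1.5 * x                              # broad drift
peak = np.exp(-0.5 * ((x - 0.5) / 0.01) ** 2)          # narrow feature
corrected = white_tophat_1d(baseline + peak, size=41)
peak_idx = int(corrected.argmax())
```

Because opening never exceeds the original signal, the corrected spectrum stays non-negative, which is convenient for subsequent peak detection.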

  1. Data Reduction Approaches for Dissecting Transcriptional Effects on Metabolism

    PubMed Central

    Schwahn, Kevin; Nikoloski, Zoran

    2018-01-01

The availability of high-throughput data from transcriptomics and metabolomics technologies provides the opportunity to characterize the transcriptional effects on metabolism. Here we propose and evaluate two computational approaches rooted in data reduction techniques to identify and categorize transcriptional effects on metabolism by combining data on gene expression and metabolite levels. The approaches determine the partial correlation between two metabolite data profiles upon control of given principal components extracted from the transcriptomics data profiles. Therefore, they allow us to investigate both data types with all features simultaneously, without preselecting genes. The proposed approaches allow us to categorize the relation between pairs of metabolites as being under transcriptional or post-transcriptional regulation. The resulting classification is compared to existing literature and accumulated evidence about regulatory mechanisms of reactions and pathways in the cases of Escherichia coli, Saccharomyces cerevisiae, and Arabidopsis thaliana. PMID:29731765
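The core computation, partial correlation of two metabolite profiles while controlling for transcriptome principal components, can be sketched as follows; the synthetic data and the choice of a single component are illustrative assumptions, not the paper's pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Hypothetical data: a shared "transcriptional" factor drives both
# metabolites, and the transcriptome (50 genes) also reflects that factor.
factor = rng.normal(size=n)
genes = np.outer(factor, rng.normal(size=50)) + 0.1 * rng.normal(size=(n, 50))
met1 = factor + 0.3 * rng.normal(size=n)
met2 = factor + 0.3 * rng.normal(size=n)

def partial_corr_given_pcs(x, y, expr, k=1):
    """Correlation of x and y after regressing out the top-k principal
    components of the expression matrix `expr`."""
    centered = expr - expr.mean(axis=0)
    # top-k PCs via SVD of the centered expression matrix
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    pcs = u[:, :k] * s[:k]
    design = np.column_stack([np.ones(len(x)), pcs])
    def residual(v):
        beta, *_ = np.linalg.lstsq(design, v, rcond=None)
        return v - design @ beta
    rx, ry = residual(x), residual(y)
    return float(np.corrcoef(rx, ry)[0, 1])

raw = float(np.corrcoef(met1, met2)[0, 1])
partial = partial_corr_given_pcs(met1, met2, genes, k=1)
```

A metabolite pair whose correlation collapses once the transcriptome components are controlled for, as here, would be categorized as transcriptionally driven; a pair whose correlation survives would point to post-transcriptional regulation.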

  2. Radiant Heat Transfer Between Nongray Parallel Plates of Tungsten

    NASA Technical Reports Server (NTRS)

    Branstetter, J. Robert

    1961-01-01

Net radiant heat flow between two infinite, parallel tungsten plates was computed by summing the monochromatic energy exchange; the results are graphically presented as a function of the temperatures of the two surfaces. In general, these fluxes range up to approximately 25 percent greater than the results of gray-body computations based on the same emissivity data. The selection of spectral emissivity data and the computational procedure are discussed. The present analytical procedure is so arranged that, as spectral emissivity data for a material become available, these data can be readily introduced into the NASA data-reduction equipment, which has been programmed to compute the net heat flux for the particular geometry and basic assumptions cited in the text. Nongray-body computational techniques for determining radiant heat flux appear practical, provided the combination of select spectral emissivity data and the proper mechanized data-reduction equipment is brought to bear on the problem.
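The summation of monochromatic exchange uses the standard parallel-plate relation q_lambda = (E_b,lambda(T1) - E_b,lambda(T2)) / (1/eps1,lambda + 1/eps2,lambda - 1), integrated over wavelength. The sketch below verifies itself against the gray-body closed form; the wavelength-dependent emissivity is a hypothetical stand-in for tabulated tungsten data:

```python
import numpy as np

SIGMA = 5.670374419e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
C1 = 3.741771852e-16          # first radiation constant 2*pi*h*c^2, W m^2
C2 = 1.438776877e-2           # second radiation constant h*c/k_B, m K

def planck(lam, T):
    """Blackbody spectral emissive power E_b,lambda (W m^-3)."""
    return C1 / (lam**5 * (np.exp(C2 / (lam * T)) - 1.0))

def net_flux(T1, T2, eps1, eps2, lam):
    """Net radiant flux between infinite parallel plates, summing the
    monochromatic exchange; eps1/eps2 are spectral emissivities on the
    wavelength grid `lam`."""
    q_lam = (planck(lam, T1) - planck(lam, T2)) / (1.0/eps1 + 1.0/eps2 - 1.0)
    # trapezoidal integration over wavelength
    return float(np.sum(0.5 * (q_lam[1:] + q_lam[:-1]) * np.diff(lam)))

# Wavelength grid wide enough to capture nearly all the emission
lam = np.linspace(0.2e-6, 200e-6, 200000)

# Gray-body check: constant emissivities must reproduce the closed form
eps = np.full_like(lam, 0.3)
q_gray_numeric = net_flux(2000.0, 1000.0, eps, eps, lam)
q_gray_exact = SIGMA * (2000.0**4 - 1000.0**4) / (1/0.3 + 1/0.3 - 1)

# Hypothetical nongray emissivity falling with wavelength (tungsten-like)
eps_ng = 0.45 / (1.0 + lam / 2e-6)
q_nongray = net_flux(2000.0, 1000.0, eps_ng, eps_ng, lam)
```

Because a falling spectral emissivity weights the short-wavelength region where the hotter plate emits most, the nongray integral generally differs from a gray calculation using an averaged emissivity, which is the effect the report quantifies.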

  3. Manifold Learning in MR spectroscopy using nonlinear dimensionality reduction and unsupervised clustering.

    PubMed

    Yang, Guang; Raschke, Felix; Barrick, Thomas R; Howe, Franklyn A

    2015-09-01

To investigate whether nonlinear dimensionality reduction improves unsupervised classification of 1H MRS brain tumor data compared with a linear method. In vivo single-voxel 1H magnetic resonance spectroscopy (55 patients) and 1H magnetic resonance spectroscopic imaging (MRSI) (29 patients) data were acquired from histopathologically diagnosed gliomas. Data reduction using Laplacian eigenmaps (LE) or independent component analysis (ICA) was followed by k-means clustering or agglomerative hierarchical clustering (AHC) for unsupervised learning, to assess tumor grade and for tissue-type segmentation of MRSI data. An accuracy of 93% in classification of glioma grade II and grade IV, with 100% accuracy in distinguishing tumor and normal spectra, was obtained by LE with unsupervised clustering, but not with the combination of k-means and ICA. With 1H MRSI data, LE provided a more linear distribution of data for cluster analysis and better cluster stability than ICA. LE combined with k-means or AHC provided 91% accuracy for classifying tumor grade and 100% accuracy for identifying normal tissue voxels. Color-coded visualization of normal brain, tumor core, and infiltration regions was achieved with LE combined with AHC. The LE method is promising for unsupervised clustering to separate brain and tumor tissue, with automated color-coding for visualization of 1H MRSI data after cluster analysis. © 2014 Wiley Periodicals, Inc.
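A minimal Laplacian-eigenmaps sketch is shown below on synthetic "spectra"; the Gaussian kernel width and the median-split labeling are illustrative simplifications of the LE-plus-clustering pipeline, not the study's implementation:

```python
import numpy as np

def laplacian_eigenmap(X, sigma=1.0, dim=1):
    """Minimal Laplacian-eigenmaps embedding: Gaussian affinities,
    symmetric normalized graph Laplacian, eigenvectors of the smallest
    nonzero eigenvalues."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)
    deg = W.sum(1)
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    L_sym = np.eye(len(X)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L_sym)
    # skip the trivial eigenvector (eigenvalue ~ 0)
    return vecs[:, 1:1 + dim]

# Two well-separated synthetic clusters (hypothetical stand-ins for
# tumor and normal spectra)
rng = np.random.default_rng(1)
A = rng.normal(0.0, 0.3, size=(20, 5))
B = rng.normal(5.0, 0.3, size=(20, 5))
X = np.vstack([A, B])

emb = laplacian_eigenmap(X, sigma=1.0, dim=1).ravel()
labels = (emb > np.median(emb)).astype(int)
```

For two well-separated groups the second eigenvector (the Fiedler vector) takes opposite signs on the two components, so even this crude threshold recovers the cluster structure; real MRSI data would feed the embedding into k-means or AHC as in the study.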

  4. Alendronate for fracture prevention in postmenopause.

    PubMed

    Holder, Kathryn K; Kerley, Sara Shelton

    2008-09-01

    Osteoporosis is an abnormal reduction in bone mass and bone deterioration leading to increased fracture risk. Alendronate (Fosamax) belongs to the bisphosphonate class of drugs, which act to inhibit bone resorption by interfering with the activity of osteoclasts. To assess the effectiveness of alendronate in the primary and secondary prevention of osteoporotic fractures in postmenopausal women. The authors searched Central, Medline, and EMBASE for relevant randomized controlled trials published from 1966 to 2007. The authors undertook study selection and data abstraction in duplicate. The authors performed meta-analysis of fracture outcomes using relative risks, and a relative change greater than 15 percent was considered clinically important. The authors assessed study quality through reporting of allocation concealment, blinding, and withdrawals. Eleven trials representing 12,068 women were included in the review. Relative and absolute risk reductions for the 10-mg dose were as follows. For vertebral fractures, a 45 percent relative risk reduction was found (relative risk [RR] = 0.55; 95% confidence interval [CI], 0.45 to 0.67). This was significant for primary prevention, with a 45 percent relative risk reduction (RR = 0.55; 95% CI, 0.38 to 0.80) and 2 percent absolute risk reduction; and for secondary prevention, with 45 percent relative risk reduction (RR = 0.55; 95% CI, 0.43 to 0.69) and 6 percent absolute risk reduction. For nonvertebral fractures, a 16 percent relative risk reduction was found (RR = 0.84; 95% CI, 0.74 to 0.94). This was significant for secondary prevention, with a 23 percent relative risk reduction (RR = 0.77; 95% CI, 0.64 to 0.92) and a 2 percent absolute risk reduction, but not for primary prevention (RR = 0.89; 95% CI, 0.76 to 1.04). 
There was a 40 percent relative risk reduction in hip fractures (RR = 0.60; 95% CI, 0.40 to 0.92), but only secondary prevention was significant, with a 53 percent relative risk reduction (RR = 0.47; 95% CI, 0.26 to 0.85) and a 1 percent absolute risk reduction. The only significance found for wrist fractures was in secondary prevention, with a 50 percent relative risk reduction (RR = 0.50; 95% CI, 0.34 to 0.73) and a 2 percent absolute risk reduction. For adverse events, the authors found no statistically significant difference in any included study. However, observational data raise concerns about potential risk for upper gastrointestinal injury and, less commonly, osteonecrosis of the jaw. At 10 mg of alendronate per day, clinically important and statistically significant reductions in vertebral, nonvertebral, hip, and wrist fractures were observed for secondary prevention. The authors found no statistically significant results for primary prevention, with the exception of vertebral fractures, for which the reduction was clinically important.
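The quoted relative and absolute risk reductions follow from simple arithmetic on the reported relative risks; the number needed to treat (NNT) computed below is a derived quantity not stated in the review:

```python
def rrr(rr: float) -> float:
    """Relative risk reduction from a relative risk."""
    return 1.0 - rr

def nnt(arr: float) -> float:
    """Number needed to treat from an absolute risk reduction."""
    return 1.0 / arr

# Vertebral fractures, 10 mg alendronate (numbers from the review):
rr_vertebral = 0.55
rrr_vertebral = rrr(rr_vertebral)   # 0.45 -> the quoted 45% reduction
arr_primary = 0.02                  # 2% absolute risk reduction
nnt_primary = nnt(arr_primary)      # women treated per fracture avoided
arr_secondary = 0.06
nnt_secondary = nnt(arr_secondary)
```

The contrast between the two scales is the review's key point: the same 45% relative reduction corresponds to treating roughly 50 women in primary prevention, but only about 17 in secondary prevention, to avoid one vertebral fracture.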

  5. Enhanced data reduction of the velocity data on CETA flight experiment. [Crew and Equipment Translation Aid

    NASA Technical Reports Server (NTRS)

    Finley, Tom D.; Wong, Douglas T.; Tripp, John S.

    1993-01-01

    A newly developed technique for enhanced data reduction provides an improved procedure that allows least squares minimization to become possible between data sets with an unequal number of data points. This technique was applied in the Crew and Equipment Translation Aid (CETA) experiment on the STS-37 Shuttle flight in April 1991 to obtain the velocity profile from the acceleration data. The new technique uses a least-squares method to estimate the initial conditions and calibration constants. These initial conditions are estimated by least-squares fitting the displacements indicated by the Hall-effect sensor data to the corresponding displacements obtained from integrating the acceleration data. The velocity and displacement profiles can then be recalculated from the corresponding acceleration data using the estimated parameters. This technique, which enables instantaneous velocities to be obtained from the test data instead of only average velocities at varying discrete times, offers more detailed velocity information, particularly during periods of large acceleration or deceleration.
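The estimation step, least-squares fitting of integrated-accelerometer displacements to sparse absolute displacement fixes, can be sketched with synthetic data; all signals, sample rates, and the Hall-sensor spacing below are illustrative, not the CETA flight values:

```python
import numpy as np

rng = np.random.default_rng(2)
dt = 0.05
t = np.arange(0, 10, dt)                      # 200 samples

# Synthetic truth (hypothetical stand-in for the CETA accelerometer data)
a_true = 0.4 * np.sin(0.6 * t)
v0_true, x0_true = 1.2, 3.0
a_meas = a_true + 0.01 * rng.normal(size=t.size)

def cumtrapz(y, dt):
    """Cumulative trapezoidal integral starting at zero."""
    out = np.zeros_like(y)
    out[1:] = np.cumsum(0.5 * (y[1:] + y[:-1]) * dt)
    return out

v_rel = cumtrapz(a_meas, dt)                  # velocity up to unknown v0
x_rel = cumtrapz(v_rel, dt)                   # displacement up to x0 + v0*t

# Hall-effect sensor gives absolute displacement at sparse times
idx = np.arange(0, t.size, 20)                # every 20th sample
x_hall = x0_true + v0_true * t[idx] + cumtrapz(cumtrapz(a_true, dt), dt)[idx]

# Least-squares estimate of the initial conditions x0, v0 from
# x_hall ~= x0 + v0 * t + x_rel at the sensor times
A = np.column_stack([np.ones(idx.size), t[idx]])
(x0_est, v0_est), *_ = np.linalg.lstsq(A, x_hall - x_rel[idx], rcond=None)

velocity = v0_est + v_rel                     # full-rate velocity profile
```

This mirrors the idea in the abstract: the sparse sensor fixes pin down the integration constants, after which the high-rate acceleration data yield instantaneous rather than interval-averaged velocities.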

  6. sTools - a data reduction pipeline for the GREGOR Fabry-Pérot Interferometer and the High-resolution Fast Imager at the GREGOR solar telescope

    NASA Astrophysics Data System (ADS)

    Kuckein, C.; Denker, C.; Verma, M.; Balthasar, H.; González Manrique, S. J.; Louis, R. E.; Diercke, A.

    2017-10-01

A huge amount of data has been acquired with the GREGOR Fabry-Pérot Interferometer (GFPI), large-format facility cameras, and, since 2016, the High-resolution Fast Imager (HiFI). These data are processed in standardized procedures with the aim of providing science-ready data for the solar physics community. For this purpose, we have developed a user-friendly data reduction pipeline called "sTools", based on the Interactive Data Language (IDL) and licensed under a Creative Commons license. The pipeline delivers reduced and image-reconstructed data with a minimum of user interaction. Furthermore, quick-look data are generated, as well as a webpage with an overview of the observations and their statistics. All the processed data are stored online at the GREGOR GFPI and HiFI data archive of the Leibniz Institute for Astrophysics Potsdam (AIP). The principles of the pipeline are presented together with selected high-resolution spectral scans and images processed with sTools.

  7. Three Averaging Techniques for Reduction of Antenna Temperature Variance Measured by a Dicke Mode, C-Band Radiometer

    NASA Technical Reports Server (NTRS)

    Mackenzie, Anne I.; Lawrence, Roland W.

    2000-01-01

    As new radiometer technologies provide the possibility of greatly improved spatial resolution, their performance must also be evaluated in terms of expected sensitivity and absolute accuracy. As aperture size increases, the sensitivity of a Dicke mode radiometer can be maintained or improved by application of any or all of three digital averaging techniques: antenna data averaging with a greater than 50% antenna duty cycle, reference data averaging, and gain averaging. An experimental, noise-injection, benchtop radiometer at C-band showed a 68.5% reduction in Delta-T after all three averaging methods had been applied simultaneously. For any one antenna integration time, the optimum 34.8% reduction in Delta-T was realized by using an 83.3% antenna/reference duty cycle.

  8. AGS a set of UNIX commands for neutron data reduction

    NASA Astrophysics Data System (ADS)

    Bastian, C.

    1997-02-01

The output of a detector system recording neutron-induced nuclear reactions consists of a set of multichannel spectra and of scaler/counter values. These data must be reduced, i.e., corrected and combined, to produce a clean energy spectrum of the reaction cross-section with a covariance estimate suitable for evaluation. The reduction process may be broken down into a sequence of operations. We present a set of reduction operations implemented as commands on a UNIX system. Every operation reads spectra from a file and appends its results as new spectra to the same file. The binary file format AGS used thereby records the spectra as named entities, each including a set of neutron energy values and a corresponding set of values with their correlated and uncorrelated uncertainties.
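The append-only reduction model described here can be sketched as follows; the `Spectrum` fields and the `divide` operation are hypothetical stand-ins for AGS commands, and adding the fully correlated components linearly is a simplifying assumption:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Spectrum:
    """A named spectrum in the AGS spirit: energies, values, and
    separate uncorrelated/correlated uncertainty components."""
    name: str
    energy: np.ndarray
    value: np.ndarray
    unc_uncorr: np.ndarray   # statistical, point-to-point
    unc_corr: np.ndarray     # systematic, fully correlated

def divide(store: dict, num: str, den: str, out: str) -> Spectrum:
    """One reduction step: ratio of two spectra on a common energy grid,
    appended to the store under a new name. Uncorrelated relative parts
    add in quadrature; correlated relative parts add linearly."""
    a, b = store[num], store[den]
    ratio = a.value / b.value
    rel_u = np.sqrt((a.unc_uncorr / a.value)**2 + (b.unc_uncorr / b.value)**2)
    rel_c = a.unc_corr / a.value + b.unc_corr / b.value
    result = Spectrum(out, a.energy, ratio, rel_u * ratio, rel_c * ratio)
    store[out] = result          # append; inputs are never overwritten
    return result

e = np.array([1.0, 2.0, 4.0])
store = {
    "counts": Spectrum("counts", e, np.array([100.0, 200.0, 400.0]),
                       np.array([10.0, 14.1, 20.0]), np.array([1.0, 2.0, 4.0])),
    "flux": Spectrum("flux", e, np.array([50.0, 50.0, 50.0]),
                     np.array([5.0, 5.0, 5.0]), np.array([0.5, 0.5, 0.5])),
}
xs = divide(store, "counts", "flux", "cross_section")
```

Keeping every intermediate spectrum in the same store mirrors the AGS design: the full reduction history stays in one file, so any step can be audited or redone later.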

  9. Skin friction reduction in supersonic flow by injection through slots, porous sections and combinations of the two

    NASA Technical Reports Server (NTRS)

    Schetz, J. A.; Vanovereem, J.

    1975-01-01

    An experimental study of skin friction reduction in a Mach 3.0 air stream with gaseous injection through a tangential slot, a porous wall section, and combinations of the two was conducted. The primary data obtained were wall shear values measured directly with a floating element balance and also inferred from Preston Tube measurements. Detailed profiles at several axial stations, wall pressure distributions and schlieren photographs are presented. The data indicate that a slot provides the greatest skin friction reduction in comparison with a reference flat plate experiment. The porous wall section arrangement suffers from an apparent roughness-induced rise in skin friction at low injection rates compared to the flat plate. The combination schemes demonstrated a potential for gain.

  10. Importance of interatomic spacing in catalytic reduction of oxygen in phosphoric acid

    NASA Technical Reports Server (NTRS)

    Jalan, V.; Taylor, E. J.

    1983-01-01

    A correlation between the nearest-neighbor distance and the oxygen reduction activity of various platinum alloys is reported. It is proposed that the distance between nearest-neighbor Pt atoms on the surface of a supported catalyst is not ideal for dual-site adsorption of O2 or 'HO2' and that the introduction of foreign atoms which reduce the Pt nearest-neighbor spacing would result in higher oxygen reduction activity. This may allow the critical O-O bond interatomic distance and hence the optimum Pt-Pt separation for bond rupture to be determined from quantum chemical calculations. A composite analysis shows that the data on supported Pt alloys are consistent with Appleby's (1970) data on bulk metals with respect to specific activity, activation energy, preexponential factor, and percent d-band character.

  11. Flight flutter testing technology at Grumman. [automated telemetry station for on line data reduction

    NASA Technical Reports Server (NTRS)

    Perangelo, H. J.; Milordi, F. W.

    1976-01-01

    Analysis techniques used in the automated telemetry station (ATS) for on line data reduction are encompassed in a broad range of software programs. Concepts that form the basis for the algorithms used are mathematically described. The control the user has in interfacing with various on line programs is discussed. The various programs are applied to an analysis of flight data which includes unimodal and bimodal response signals excited via a swept frequency shaker and/or random aerodynamic forces. A nonlinear response error modeling analysis approach is described. Preliminary results in the analysis of a hard spring nonlinear resonant system are also included.

  12. A method for the measurement and analysis of ride vibrations of transportation systems

    NASA Technical Reports Server (NTRS)

    Catherines, J. J.; Clevenson, S. A.; Scholl, H. F.

    1972-01-01

    The measurement and recording of ride vibrations which affect passenger comfort in transportation systems and the subsequent data-reduction methods necessary for interpreting the data present exceptional instrumentation requirements and necessitate the use of computers for specialized analysis techniques. A method is presented for both measuring and analyzing ride vibrations of the type encountered in ground and air transportation systems. A portable system for measuring and recording low-frequency, low-amplitude accelerations and specialized data-reduction procedures are described. Sample vibration measurements in the form of statistical parameters representative of typical transportation systems are also presented to demonstrate the utility of the techniques.

  13. Echelle Data Reduction Cookbook

    NASA Astrophysics Data System (ADS)

    Clayton, Martin

    This document is the first version of the Starlink Echelle Data Reduction Cookbook. It contains scripts and procedures developed by regular or heavy users of the existing software packages. These scripts are generally of two types: templates, which readers may be able to modify to suit their particular needs, and utilities, which carry out a particular common task and can probably be used `off-the-shelf'. In the nature of this subject the recipes given are quite strongly tied to the software packages, rather than being science-data led. The major part of this document is divided into two sections dealing with scripts to be used with IRAF and with Starlink software (SUN/1).

  14. Nano-JASMINE: use of AGIS for the next astrometric satellite

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Gouda, N.; Lammers, U.

    The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). The collaboration started in 2007, prompted by Uwe Lammers' proposal. In addition to the similar design and operating principles of the two missions, this is possible thanks to the encapsulation of all Gaia-specific aspects of AGIS in a Parameter Database. Nano-JASMINE will be the test bench for the Gaia AGIS software. We present this idea in detail and the necessary practical steps to make AGIS work with Nano-JASMINE data. We also show the key mission parameters, goals, and status of the data reduction for Nano-JASMINE.

  15. An Eigensystem Realization Algorithm (ERA) for modal parameter identification and model reduction

    NASA Technical Reports Server (NTRS)

    Juang, J. N.; Pappa, R. S.

    1985-01-01

    A method, called the Eigensystem Realization Algorithm (ERA), is developed for modal parameter identification and model reduction of dynamic systems from test data. A new approach is introduced in conjunction with the singular value decomposition technique to derive the basic formulation of minimum order realization which is an extended version of the Ho-Kalman algorithm. The basic formulation is then transformed into modal space for modal parameter identification. Two accuracy indicators are developed to quantitatively identify the system modes and noise modes. For illustration of the algorithm, examples are shown using simulation data and experimental data for a rectangular grid structure.
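The core of ERA can be sketched in a few lines for a noise-free single-input, single-output case: build Hankel matrices from the Markov (impulse-response) parameters, truncate the SVD at the chosen model order, and form the minimum-order realization. This is a minimal sketch of the published formulation, assuming the model order is known in advance; the function name and the two-mode test system are illustrative:

```python
import numpy as np

def era(markov, n_states, rows=10, cols=10):
    """Minimal ERA sketch for a SISO system: identify (A, B, C) from
    Markov parameters Y_k = C A^(k-1) B, with markov[k-1] = Y_k."""
    # Hankel matrix H0 and its one-step-shifted counterpart H1
    H0 = np.array([[markov[i + j] for j in range(cols)] for i in range(rows)])
    H1 = np.array([[markov[i + j + 1] for j in range(cols)] for i in range(rows)])
    # SVD of H0, truncated at the chosen model order
    U, s, Vt = np.linalg.svd(H0)
    U, s, Vt = U[:, :n_states], s[:n_states], Vt[:n_states, :]
    sqrt_s = np.sqrt(s)
    S_half_inv = np.diag(1.0 / sqrt_s)
    # Ho-Kalman / ERA minimum-order realization formulas
    A = S_half_inv @ U.T @ H1 @ Vt.T @ S_half_inv
    B = (np.diag(sqrt_s) @ Vt)[:, :1]   # first column of controllability factor
    C = (U @ np.diag(sqrt_s))[:1, :]    # first row of observability factor
    return A, B, C

# Illustrative system with modes at 0.9 and 0.5: Y_k = 0.9^(k-1) + 0.5^(k-1)
markov = [0.9 ** k + 0.5 ** k for k in range(20)]
A, B, C = era(markov, n_states=2)
```

For noise-free data of the correct order the eigenvalues of the identified A recover the system modes exactly; with measured data, the paper's accuracy indicators separate system modes from noise modes.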

  16. Data set on the effects of conifer control and slash burning on soil carbon, total N, organic matter and extractable micro-nutrients.

    PubMed

    Bates, Jonathan D; Davies, Kirk W

    2017-10-01

    Conifer control in sagebrush steppe of the western United States causes various levels of site disturbance, influencing vegetation recovery and resource availability. The data set presented in this article includes growing-season availability of soil micronutrients and levels of total soil carbon, organic matter, and N spanning a six-year period following western juniper (Juniperus occidentalis spp. occidentalis) reduction by mechanical cutting and prescribed fire of western juniper woodlands in southeast Oregon. These data can be useful to further evaluate the impacts of conifer woodland reduction on soil resources in sagebrush steppe plant communities.

  17. Numerical Prediction of Chevron Nozzle Noise Reduction using Wind-MGBK Methodology

    NASA Technical Reports Server (NTRS)

    Engblom, W.A.; Bridges, J.; Khavaran, A.

    2005-01-01

    Numerical predictions for single-stream chevron nozzle flow performance and farfield noise production are presented. Reynolds Averaged Navier Stokes (RANS) solutions, produced via the WIND flow solver, are provided as input to the MGBK code for prediction of farfield noise distributions. This methodology is applied to a set of sensitivity cases involving varying degrees of chevron inward bend angle relative to the core flow, for both cold and hot exhaust conditions. The sensitivity study results illustrate the effect of increased chevron bend angle and exhaust temperature on enhancement of fine-scale mixing, initiation of core breakdown, nozzle performance, and noise reduction. Direct comparisons with experimental data, including stagnation pressure and temperature rake data, PIV turbulent kinetic energy fields, and 90 degree observer farfield microphone data are provided. Although some deficiencies in the numerical predictions are evident, the correct farfield noise spectra trends are captured by the WIND-MGBK method, including the noise reduction benefit of chevrons. Implications of these results to future chevron design efforts are addressed.

  18. Artifact reduction in short-scan CBCT by use of optimization-based reconstruction

    PubMed Central

    Zhang, Zheng; Han, Xiao; Pearson, Erik; Pelizzari, Charles; Sidky, Emil Y; Pan, Xiaochuan

    2017-01-01

    Increasing interest in optimization-based reconstruction in research on, and applications of, cone-beam computed tomography (CBCT) exists because it has been shown to have the potential to reduce artifacts observed in reconstructions obtained with the Feldkamp–Davis–Kress (FDK) algorithm (or its variants), which is used extensively for image reconstruction in current CBCT applications. In this work, we carried out a study on optimization-based reconstruction for possible reduction of artifacts in FDK reconstruction, specifically from short-scan CBCT data. The investigation includes a set of optimization programs such as image-total-variation (TV)-constrained data-divergence minimization, data-weighting matrices such as the Parker weighting matrix, and objects of practical interest for demonstrating and assessing the degree of artifact reduction. Results of this investigation reveal that appropriately designed optimization-based reconstruction, including image-TV-constrained reconstruction, can reduce significant artifacts observed in FDK reconstruction in CBCT with a short-scan configuration. PMID:27046218
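The image-total-variation constraint referred to above penalizes the summed absolute differences between neighboring pixels, favoring piecewise-smooth reconstructions. A minimal sketch of the anisotropic TV measure (illustrative; the paper's exact formulation may differ):

```python
import numpy as np

def total_variation(img):
    """Anisotropic image total variation: sum of absolute differences
    between vertically and horizontally adjacent pixels (illustrative
    form of the regularizer constrained in TV-based reconstruction)."""
    return (np.abs(np.diff(img, axis=0)).sum()
            + np.abs(np.diff(img, axis=1)).sum())
```

A constant image has zero TV, while streak and shading artifacts raise it, which is why bounding the image TV during data-divergence minimization suppresses them.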

  19. Utility Sector Impacts of Reduced Electricity Demand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coughlin, Katie

    2014-12-01

    This report presents a new approach to estimating the marginal utility sector impacts associated with electricity demand reductions. The method uses publicly available data and provides results in the form of time series of impact factors. The input data are taken from the Energy Information Administration's Annual Energy Outlook (AEO) projections of how the electric system might evolve in the reference case and in a number of side cases that incorporate different efficiency and other policy assumptions. The data published with the AEO are used to define quantitative relationships between demand-side electricity reductions by end use and supply-side changes to capacity by plant type, generation by fuel type, and emissions of CO2, Hg, NOx and SO2. The impact factors define the change in each of these quantities per unit reduction in site electricity demand. We find that the relative variation in these impacts by end use is small, but the time variation can be significant.
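An impact factor as defined above reduces to a simple ratio. A sketch with hypothetical numbers (the 680 t CO2 per 1000 MWh figure is purely illustrative, not taken from the report):

```python
def impact_factor(supply_side_delta, demand_reduction_mwh):
    """Impact factor as described above: change in a supply-side quantity
    (capacity, generation, or emissions) per unit reduction in site
    electricity demand. Units here are hypothetical for illustration."""
    return supply_side_delta / demand_reduction_mwh

# e.g. 680 t of avoided CO2 per 1000 MWh of demand reduction -> 0.68 t/MWh
co2_factor = impact_factor(680.0, 1000.0)
```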

  20. Spectral Regression Discriminant Analysis for Hyperspectral Image Classification

    NASA Astrophysics Data System (ADS)

    Pan, Y.; Wu, J.; Huang, H.; Liu, J.

    2012-08-01

    Dimensionality reduction algorithms, which aim to select a small set of efficient and discriminant features, have attracted great attention for hyperspectral image classification. Manifold learning methods are popular for dimensionality reduction, such as Locally Linear Embedding, Isomap, and Laplacian Eigenmap. However, a disadvantage of many manifold learning methods is that their computations usually involve eigen-decomposition of dense matrices, which is expensive in both time and memory. In this paper, we introduce a new dimensionality reduction method, called Spectral Regression Discriminant Analysis (SRDA). SRDA casts the problem of learning an embedding function into a regression framework, which avoids eigen-decomposition of dense matrices. Also, with the regression-based framework, different kinds of regularizers can be naturally incorporated into our algorithm, which makes it more flexible. It can make efficient use of data points to discover the intrinsic discriminant structure in the data. Experimental results on Washington DC Mall and AVIRIS Indian Pines hyperspectral data sets demonstrate the effectiveness of the proposed method.
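The central idea — replacing eigen-decomposition of dense matrices with a regression fit to class-derived targets — can be sketched for the two-class case as a ridge regression against a centered class indicator. This is a simplified illustration of the SRDA idea, not the authors' implementation:

```python
import numpy as np

def srda_fit(X, y, alpha=0.01):
    """Two-class sketch of Spectral Regression Discriminant Analysis:
    regress a centered class-indicator target on centered features with
    an L2 regularizer, avoiding dense eigen-decomposition."""
    X = X - X.mean(axis=0)                       # center features
    t = np.where(y == np.unique(y)[0], 1.0, -1.0)
    t = t - t.mean()                             # centered 'spectral' target
    d = X.shape[1]
    # Ridge normal equations: (X^T X + alpha I) w = X^T t
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ t)

# Illustrative data: two separated Gaussian classes in 5 dimensions
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 5)), rng.normal(3.0, 1.0, (50, 5))])
y = np.array([0] * 50 + [1] * 50)
w = srda_fit(X, y)
z = (X - X.mean(axis=0)) @ w   # 1-D discriminant projection
```

The regression framework is what lets different regularizers slot in naturally, as the abstract notes.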

  1. Motofit - integrating neutron reflectometry acquisition, reduction and analysis into one, easy to use, package

    NASA Astrophysics Data System (ADS)

    Nelson, Andrew

    2010-11-01

    The efficient use of complex neutron scattering instruments is often hindered by the complex nature of their operating software. This complexity exists at each experimental step: data acquisition, reduction and analysis, with each step being as important as the previous. For example, whilst command line interfaces are powerful at automated acquisition they often reduce accessibility by novice users and sometimes reduce the efficiency for advanced users. One solution to this is the development of a graphical user interface which allows the user to operate the instrument by a simple and intuitive "push button" approach. This approach was taken by the Motofit software package for analysis of multiple contrast reflectometry data. Here we describe the extension of this package to cover the data acquisition and reduction steps for the Platypus time-of-flight neutron reflectometer. Consequently, the complete operation of an instrument is integrated into a single, easy to use, program, leading to efficient instrument usage.

  2. An efficient algorithm using matrix methods to solve wind tunnel force-balance equations

    NASA Technical Reports Server (NTRS)

    Smith, D. L.

    1972-01-01

    An iterative procedure applying matrix methods to accomplish an efficient algorithm for automatic computer reduction of wind-tunnel force-balance data has been developed. Balance equations are expressed in a matrix form that is convenient for storing balance sensitivities and interaction coefficient values for online or offline batch data reduction. The convergence of the iterative values to a unique solution of this system of equations is investigated, and it is shown that for balances which satisfy the criteria discussed, this type of solution does occur. Methods for making sensitivity adjustments and initial load effect considerations in wind-tunnel applications are also discussed, and the logic for determining the convergence accuracy limits for the iterative solution is given. This more efficient data reduction program is compared with the technique presently in use at the NASA Langley Research Center, and computational times on the order of one-third or less are demonstrated by use of this new program.
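The iterative scheme described can be sketched under an assumed form of the balance equations (the sensitivity matrix S and second-order interaction coefficients C2 below are hypothetical; the paper's exact model and convergence logic are not reproduced here):

```python
import numpy as np

def solve_balance(readings, S, C2, tol=1e-12, max_iter=100):
    """Sketch of iterative force-balance data reduction under an assumed
    model: readings R = S @ F + C2 @ (F*F), with S the sensitivity matrix
    and C2 hypothetical second-order interaction coefficients.
    Iterate F <- S^-1 (R - C2 @ (F*F)) to a fixed point."""
    S_inv = np.linalg.inv(S)
    F = S_inv @ readings               # first estimate ignores interactions
    for _ in range(max_iter):
        F_new = S_inv @ (readings - C2 @ (F * F))
        if np.max(np.abs(F_new - F)) < tol:
            break
        F = F_new
    return F

# Round-trip check with a known load vector
S = np.array([[2.0, 0.1], [0.1, 3.0]])
C2 = np.array([[0.01, 0.02], [0.005, 0.01]])
F_true = np.array([1.0, 2.0])
R = S @ F_true + C2 @ (F_true * F_true)
F_est = solve_balance(R, S, C2)
```

Because the interaction terms are small relative to the primary sensitivities, the iteration contracts rapidly — the property the paper's convergence criteria formalize.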

  3. Energy reduction using multi-channels optical wireless communication based OFDM

    NASA Astrophysics Data System (ADS)

    Darwesh, Laialy; Arnon, Shlomi

    2017-10-01

    In recent years, an increasing number of data center networks (DCNs) have been built to provide various cloud applications. Major challenges in the design of next-generation DC networks include reduction of the energy consumption, high flexibility and scalability, high data rates, minimum latency, and high cyber security. Use of optical wireless communication (OWC) to augment the DC network could help to confront some of these challenges. In this paper we present an OWC multi-channel communication method that could lead to significant energy reduction in the communication equipment. The method is to convert a high-speed serial data stream into many slower, parallel streams, and vice versa at the receiver. We implement this multi-channel concept using the optical orthogonal frequency division multiplexing (O-OFDM) method; in our scheme, we use asymmetrically clipped optical OFDM (ACO-OFDM). Our results show that the multi-channel ACO-OFDM method reduces the total energy consumption exponentially as the number of channels rises.
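The serial-to-parallel conversion step can be sketched as a round-robin demultiplexer; each of the n resulting streams runs at 1/n of the serial symbol rate (illustrative sketch only; the paper's exact mapping onto O-OFDM subcarriers is not shown):

```python
import numpy as np

def serial_to_parallel(bits, n_channels):
    """Round-robin split of a high-speed serial stream into n slower
    parallel streams, the conversion step described in the abstract."""
    bits = np.asarray(bits)
    pad = (-len(bits)) % n_channels        # zero-pad to a whole number of symbols
    bits = np.concatenate([bits, np.zeros(pad, dtype=bits.dtype)])
    return bits.reshape(-1, n_channels).T  # row i is channel i's stream

channels = serial_to_parallel([1, 0, 1, 1, 0, 1], 2)
```

The receiver inverts the mapping (parallel-to-serial) to recover the original stream.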

  4. Skin friction drag reduction in turbulent flow using spanwise traveling surface waves

    NASA Astrophysics Data System (ADS)

    Musgrave, Patrick F.; Tarazaga, Pablo A.

    2017-04-01

    A major technological driver in current aircraft and other vehicles is the improvement of fuel efficiency. One way to increase the efficiency is to reduce the skin friction drag on these vehicles. This experimental study presents an active drag reduction technique which decreases the skin friction using spanwise traveling waves. A novel method is introduced for generating traveling waves which is low-profile, non-intrusive, and operates under various flow conditions. This wave generation method is discussed and the resulting traveling waves are presented. These waves are then tested in a low-speed wind tunnel to determine their drag reduction potential. To calculate the drag reduction, the momentum integral method is applied to turbulent boundary layer data collected using a pitot tube and traversing system. The skin friction coefficients are then calculated and the drag reduction determined. Preliminary results yielded a drag reduction of ≈ 5% for 244 Hz traveling waves. Thus, this novel wave generation method possesses the potential to yield an easily implementable, non-invasive drag reduction technology.
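In the momentum integral method mentioned above, the local skin friction for a zero-pressure-gradient flat plate follows from the streamwise growth of the momentum thickness, Cf = 2 dθ/dx. A sketch of the momentum-thickness integral evaluated from a measured profile (trapezoidal rule; the linear test profile is illustrative, not the study's data):

```python
import numpy as np

def momentum_thickness(y, u, u_inf):
    """Momentum thickness theta = integral of (u/U)(1 - u/U) dy computed
    from a boundary-layer velocity profile by the trapezoidal rule."""
    r = u / u_inf
    f = r * (1.0 - r)
    return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(y))

# Illustrative linear profile u = U * y / delta gives theta = delta / 6
y = np.linspace(0.0, 1.0, 1001)
theta = momentum_thickness(y, y.copy(), 1.0)
```

Comparing θ(x) with and without wave actuation gives the skin-friction change quoted in the study.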

  5. Carrier air wing mishap reduction using a human factors classification system and risk management.

    PubMed

    Belland, Kris M; Olsen, Cara; Lawry, Russell

    2010-11-01

    In 1998, the Navy's center of excellence for advanced air wing combat operations, namely the Naval Strike and Air Warfare Center (NSAWC), had a spike in Class A flight mishaps. The spike triggered an intense review of prior mishaps and current mishap-reduction practices using the Human Factors Analysis and Classification System (HFACS). The review resulted in NSAWC instituting a comprehensive multifactorial mishap reduction plan applying Operational Risk Management (ORM) precepts. This is a nonrandomized investigational study with use of a historical comparison population. The Class A mishap rate per flight hour covering 10 yr prior to the mishap reduction efforts was estimated and compared to the Class A mishap rate per flight hour for the 10 yr after implementation using Poisson regression. Combined Fleet and NSAWC data show a 27% reduction in mishap rate, but the 21% reduction in the Fleet alone was not statistically significant. The mishap reduction at NSAWC was statistically significant, with an 84% reduction. Fallon carrier air wing mishap rates after the ORM mishap reduction efforts are approaching those seen in the Fleet, but are still elevated overall (3.7 vs. 2.4). The incidence rate ratio was 80% lower at Fallon than in the rest of the Fleet, indicating a significantly greater reduction in NSAWC air wing mishaps and suggesting that focused aviation mishap reduction efforts in similar circumstances could result in similar reductions.

  6. Dimensionality reduction of collective motion by principal manifolds

    NASA Astrophysics Data System (ADS)

    Gajamannage, Kelum; Butail, Sachit; Porfiri, Maurizio; Bollt, Erik M.

    2015-01-01

    While the existence of low-dimensional embedding manifolds has been shown in patterns of collective motion, the current battery of nonlinear dimensionality reduction methods is not amenable to the analysis of such manifolds. This is mainly due to the necessary spectral decomposition step, which limits control over the mapping from the original high-dimensional space to the embedding space. Here, we propose an alternative approach that demands a two-dimensional embedding which topologically summarizes the high-dimensional data. In this sense, our approach is closely related to the construction of one-dimensional principal curves that minimize orthogonal error to data points subject to smoothness constraints. Specifically, we construct a two-dimensional principal manifold directly in the high-dimensional space using cubic smoothing splines, and define the embedding coordinates in terms of geodesic distances. Thus, the mapping from the high-dimensional data to the manifold is defined in terms of local coordinates. Through representative examples, we show that compared to existing nonlinear dimensionality reduction methods, the principal manifold retains the original structure even in noisy and sparse datasets. The principal manifold finding algorithm is applied to configurations obtained from a dynamical system of multiple agents simulating a complex maneuver called predator mobbing, and the resulting two-dimensional embedding is compared with that of a well-established nonlinear dimensionality reduction method.

  7. The effects of compensatory workplace exercises to reduce work-related stress and musculoskeletal pain

    PubMed Central

    de Freitas-Swerts, Fabiana Cristina Taubert; Robazzi, Maria Lúcia do Carmo Cruz

    2014-01-01

    OBJECTIVES: to assess the effect of a compensatory workplace exercise program on workers with the purpose of reducing work-related stress and musculoskeletal pain. METHOD: quasi-experimental research with quantitative analysis of the data, involving 30 administrative workers from a Higher Education Public Institution. For data collection, questionnaires were used to characterize the workers, as well as the Workplace Stress Scale and the Corlett Diagram. The research took place in three stages: first: pre-test with the application of the questionnaires to the subjects; second: Workplace Exercise taking place twice a week, for 15 minutes, during a period of 10 weeks; third: post-test in which the subjects answered the questionnaires again. For data analysis, the descriptive statistics and non-parametric statistics were used through the Wilcoxon Test. RESULTS: work-related stress was present in the assessed workers, but there was no statistically significant reduction in the scores after undergoing Workplace Exercise. However, there was a statistically significant pain reduction in the neck, cervical, upper, middle and lower back, right thigh, left leg, right ankle and feet. CONCLUSION: the Workplace Exercise promoted a significant pain reduction in the spine, but did not result in a significant reduction in the levels of work-related stress. PMID:25296147

  8. Structured Ordinary Least Squares: A Sufficient Dimension Reduction approach for regressions with partitioned predictors and heterogeneous units.

    PubMed

    Liu, Yang; Chiaromonte, Francesca; Li, Bing

    2017-06-01

    In many scientific and engineering fields, advanced experimental and computing technologies are producing data that are not just high dimensional, but also internally structured. For instance, statistical units may have heterogeneous origins from distinct studies or subpopulations, and features may be naturally partitioned based on experimental platforms generating them, or on information available about their roles in a given phenomenon. In a regression analysis, exploiting this known structure in the predictor dimension reduction stage that precedes modeling can be an effective way to integrate diverse data. To pursue this, we propose a novel Sufficient Dimension Reduction (SDR) approach that we call structured Ordinary Least Squares (sOLS). This combines ideas from existing SDR literature to merge reductions performed within groups of samples and/or predictors. In particular, it leads to a version of OLS for grouped predictors that requires far less computation than recently proposed groupwise SDR procedures, and provides an informal yet effective variable selection tool in these settings. We demonstrate the performance of sOLS by simulation and present a first application to genomic data. The R package "sSDR," publicly available on CRAN, includes all procedures necessary to implement the sOLS approach. © 2016, The International Biometric Society.
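The single-group ingredient of sOLS — the OLS direction Sigma_xx^{-1} Cov(X, y) used as a sufficient-dimension-reduction kernel — can be sketched as follows; per the abstract, sOLS merges such reductions across groups of samples and/or predictors, which this illustration omits:

```python
import numpy as np

def ols_direction(X, y):
    """The OLS kernel of sufficient dimension reduction:
    beta = Sigma_xx^{-1} Cov(X, y), estimated from centered data."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    n = len(y)
    return np.linalg.solve(Xc.T @ Xc / n, Xc.T @ yc / n)

# When y depends on X only through X @ b, the OLS direction recovers b
# up to scale (under linearity conditions on the predictors; illustrative)
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
b = np.array([1.0, -2.0, 0.0, 0.5])
y = X @ b + 0.1 * rng.normal(size=500)
beta = ols_direction(X, y)
```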

  9. Influences of age, sex, and LDL-C change on cardiovascular risk reduction with pravastatin treatment in elderly Japanese patients: A post hoc analysis of data from the Pravastatin Anti-atherosclerosis Trial in the Elderly (PATE)

    PubMed Central

    Ouchi, Yasuyoshi; Ohashi, Yasuo; Ito, Hideki; Saito, Yasushi; Ishikawa, Toshitsugu; Akishita, Masahiro; Shibata, Taro; Nakamura, Haruo; Orimo, Hajime

    2006-01-01

    Background: The Pravastatin Anti-atherosclerosis Trial in the Elderly (PATE) found that the prevalence of cardiovascular events (CVEs) was significantly lower with standard-dose (10–20 mg/d) pravastatin treatment compared with low-dose (5 mg/d) pravastatin treatment in elderly (aged ⩾ 60 years) Japanese patients with hypercholesterolemia. Small differences in on-treatment total cholesterol and low-density lipoprotein cholesterol (LDL-C) levels between the 2 dose groups in the PATE study were associated with significant differences in CVE prevalence. However, the reasons for these differences have not been determined. How sex and age differences influence the effectiveness of pravastatin also remains unclear. Objectives: The aims of this study were to determine the relationship between reduction in LDL-C level and CVE risk reduction in the PATE study and to assess the effects of sex and age on the effectiveness of pravastatin treatment (assessed using CVE risk reduction). Methods: In this post hoc analysis, Cox regression analysis was performed to study the relationship between on-treatment (pravastatin 5–20 mg/d) LDL-C level and CVE risk reduction using age, sex, smoking status, presence of diabetes mellitus and/or hypertension, history of cardiovascular disease (CVD), and high-density lipoprotein cholesterol level as adjustment factors. To explore risk reduction due to unspecified mechanisms other than LDL-C reduction, an estimated Kaplan-Meier curve from the Cox regression analysis was calculated and compared with the empirical (observed) Kaplan-Meier curve. Results: A total of 665 patients (527 women, 138 men; mean [SD] age, 72.8 [5.7] years) were enrolled in PATE and were followed up for a mean of 3.9 years (range, 3–5 years). Of those patients, 50 men and 173 women were ⩾75 years of age. Data from 619 patients were included in the present analysis.
In the calculation of model-based Kaplan-Meier curves, data from an additional 32 patients were excluded from the LDL-C analysis because there were no data on pretreatment LDL levels; hence, the data from 587 patients were analyzed. A reduction in LDL-C level of 20 mg/dL was associated with an estimated CVE risk reduction of 24.7% (hazard ratio [HR] = 0.753; 95% CI, 0.625-0.907; P = 0.003). Risk was reduced by 22.2% in patients aged <75 years (HR = 0.778; 95% CI, 0.598–1.013; P = NS) and 29.9% in patients aged ⩾75 years (HR = 0.701; 95% CI, 0.526–0.934; P = 0.015). The risk reductions were 19.8% in women (HR = 0.802; 95% CI, 0.645–0.996; P = 0.046) and 35.8% in men (HR = 0.642; 95% CI, 0.453–0.911; P = 0.013). The risk reduction was 32.4% in patients without a history of CVD at enrollment (HR = 0.676; 95% CI, 0.525–0.870; P = 0.002) and 15.1% in those with a history of CVD (HR = 0.849; 95% CI, 0.630–1.143; P = NS). The estimated Kaplan-Meier curve strongly suggested that the effects of pravastatin were only partially associated with changes in LDL-C level. Conclusions: The results from this post hoc analysis suggest that pravastatin 5 to 20 mg/d might elicit CVE risk reduction by mechanisms other than cholesterol-lowering effects alone. They also suggest that pravastatin treatment might be effective in reducing the risk for CVEs in both female and male patients aged ⩾75 years. PMID:24678100
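The risk-reduction percentages quoted above are simple transformations of the hazard ratios, (1 - HR) x 100%:

```python
def risk_reduction_pct(hazard_ratio):
    """Relative risk reduction implied by a hazard ratio, as quoted in
    the abstract: (1 - HR) * 100%."""
    return (1.0 - hazard_ratio) * 100.0
```

For example, HR = 0.753 corresponds to the 24.7% risk reduction reported for a 20 mg/dL reduction in LDL-C.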

  10. Influences of age, sex, and LDL-C change on cardiovascular risk reduction with pravastatin treatment in elderly Japanese patients: A post hoc analysis of data from the Pravastatin Anti-atherosclerosis Trial in the Elderly (PATE).

    PubMed

    Ouchi, Yasuyoshi; Ohashi, Yasuo; Ito, Hideki; Saito, Yasushi; Ishikawa, Toshitsugu; Akishita, Masahiro; Shibata, Taro; Nakamura, Haruo; Orimo, Hajime

    2006-07-01

    The Pravastatin Anti-atherosclerosis Trial in the Elderly (PATE) found that the prevalence of cardiovascular events (CVEs) was significantly lower with standard-dose (10-20 mg/d) pravastatin treatment compared with low-dose (5 mg/d) pravastatin treatment in elderly (aged ⩾ 60 years) Japanese patients with hypercholesterolemia. Small differences in on-treatment total cholesterol and low-density lipoprotein cholesterol (LDL-C) levels between the 2 dose groups in the PATE study were associated with significant differences in CVE prevalence. However, the reasons for these differences have not been determined. How sex and age differences influence the effectiveness of pravastatin also remains unclear. The aims of this study were to determine the relationship between reduction in LDL-C level and CVE risk reduction in the PATE study and to assess the effects of sex and age on the effectiveness of pravastatin treatment (assessed using CVE risk reduction). In this post hoc analysis, Cox regression analysis was performed to study the relationship between on-treatment (pravastatin 5-20 mg/d) LDL-C level and CVE risk reduction using age, sex, smoking status, presence of diabetes mellitus and/or hypertension, history of cardiovascular disease (CVD), and high-density lipoprotein cholesterol level as adjustment factors. To explore risk reduction due to unspecified mechanisms other than LDL-C reduction, an estimated Kaplan-Meier curve from the Cox regression analysis was calculated and compared with the empirical (observed) Kaplan-Meier curve. A total of 665 patients (527 women, 138 men; mean [SD] age, 72.8 [5.7] years) were enrolled in PATE and were followed up for a mean of 3.9 years (range, 3-5 years). Of those patients, 50 men and 173 women were ⩾75 years of age. Data from 619 patients were included in the present analysis.
In the calculation of model-based Kaplan-Meier curves, data from an additional 32 patients were excluded from the LDL-C analysis because there were no data on pretreatment LDL levels; hence, the data from 587 patients were analyzed. A reduction in LDL-C level of 20 mg/dL was associated with an estimated CVE risk reduction of 24.7% (hazard ratio [HR] = 0.753; 95% CI, 0.625-0.907; P = 0.003). Risk was reduced by 22.2% in patients aged <75 years (HR = 0.778; 95% CI, 0.598-1.013; P = NS) and 29.9% in patients aged ⩾75 years (HR = 0.701; 95% CI, 0.526-0.934; P = 0.015). The risk reductions were 19.8% in women (HR = 0.802; 95% CI, 0.645-0.996; P = 0.046) and 35.8% in men (HR = 0.642; 95% CI, 0.453-0.911; P = 0.013). The risk reduction was 32.4% in patients without a history of CVD at enrollment (HR = 0.676; 95% CI, 0.525-0.870; P = 0.002) and 15.1% in those with a history of CVD (HR = 0.849; 95% CI, 0.630-1.143; P = NS). The estimated Kaplan-Meier curve strongly suggested that the effects of pravastatin were only partially associated with changes in LDL-C level. The results from this post hoc analysis suggest that pravastatin 5 to 20 mg/d might elicit CVE risk reduction by mechanisms other than cholesterol-lowering effects alone. They also suggest that pravastatin treatment might be effective in reducing the risk for CVEs in both female and male patients aged ⩾75 years.

  11. New reductions of the Astrographic Catalogue. Plate adjustments of the Algiers, Oxford I and II, and Vatican Zones.

    NASA Astrophysics Data System (ADS)

    Urban, S. E.; Martin, J. C.; Jackson, E. S.; Corbin, T. E.

    1996-07-01

    The U.S. Naval Observatory is in the process of making new reductions of the Astrographic Catalogue using a modern reference catalog, the ACRS, and new data analysis and reduction software. Currently ten AC zones have been reduced. This paper discusses the reduction models and results from the Algiers, Oxford I and II, and Vatican zones (those of the Cape zone are discussed elsewhere). The resulting star positions will be combined with those of the U.S. Naval Observatory's Twin Astrograph Catalog to produce a catalog of positions and proper motions in support of the Sloan Digital Sky Survey.

  12. Blade-Mounted Flap Control for BVI Noise Reduction Proof-of-Concept Test

    NASA Technical Reports Server (NTRS)

    Dawson, Seth; Hassan, Ahmed; Straub, Friedrich; Tadghighi, Hormoz

    1995-01-01

This report describes a wind tunnel test of the McDonnell Douglas Helicopter Systems (MDHS) Active Flap Model Rotor at the NASA Langley 14- by 22-Foot Subsonic Tunnel. The test demonstrated that BVI noise reductions and vibration reductions were possible with the use of an active flap. Aerodynamic results supported the acoustic data trends, showing a reduction in the strength of the tip vortex with the deflection of the flap. Acoustic results showed that the flap deployment, depending on the peak deflection angle and azimuthal shift in its deployment schedule, can produce BVI noise reductions of as much as 6 dB on the advancing and retreating sides. The noise reduction was accompanied by an increase in low-frequency harmonic noise and high-frequency broadband noise. A brief assessment of the effect of the flap on vibration showed that significant reductions were possible. The greatest vibration reductions (as much as 76%) were found in the four-per-rev pitching moment at the hub. Performance improvement results were inconclusive, as the improvements were predicted to be smaller than the resolution of the rotor balance.

  13. Technologies for Aircraft Noise Reduction

    NASA Technical Reports Server (NTRS)

    Huff, Dennis L.

    2006-01-01

    Technologies for aircraft noise reduction have been developed by NASA over the past 15 years through the Advanced Subsonic Technology (AST) Noise Reduction Program and the Quiet Aircraft Technology (QAT) project. This presentation summarizes highlights from these programs and anticipated noise reduction benefits for communities surrounding airports. Historical progress in noise reduction and technologies available for future aircraft/engine development are identified. Technologies address aircraft/engine components including fans, exhaust nozzles, landing gear, and flap systems. New "chevron" nozzles have been developed and implemented on several aircraft in production today that provide significant jet noise reduction. New engines using Ultra-High Bypass (UHB) ratios are projected to provide about 10 EPNdB (Effective Perceived Noise Level in decibels) engine noise reduction relative to the average fleet that was flying in 1997. Audio files are embedded in the presentation that estimate the sound levels for a 35,000 pound thrust engine for takeoff and approach power conditions. The predictions are based on actual model scale data that was obtained by NASA. Finally, conceptual pictures are shown that look toward future aircraft/propulsion systems that might be used to obtain further noise reduction.

  14. Overview of harm reduction in prisons in seven European countries.

    PubMed

    Sander, Gen; Scandurra, Alessio; Kamenska, Anhelita; MacNamara, Catherine; Kalpaki, Christina; Bessa, Cristina Fernandez; Laso, Gemma Nicolás; Parisi, Grazia; Varley, Lorraine; Wolny, Marcin; Moudatsou, Maria; Pontes, Nuno Henrique; Mannix-McNamara, Patricia; Libianchi, Sandro; Antypas, Tzanetos

    2016-10-07

    While the last decade has seen a growth of support for harm reduction around the world, the availability and accessibility of quality harm reduction services in prison settings is uneven and continues to be inadequate compared to the progress achieved in the broader community. This article provides a brief overview of harm reduction in prisons in Catalonia (Spain), Greece, Ireland, Italy, Latvia, Poland, and Portugal. While each country provides a wide range of harm reduction services in the broader community, the majority fail to provide these same services or the same quality of these services, in prison settings, in clear violation of international human rights law and minimum standards on the treatment of prisoners. Where harm reduction services have been available and easily accessible in prison settings for some time, better health outcomes have been observed, including significantly reduced rates of HIV and HCV incidence. While the provision of harm reduction in each of these countries' prisons varies considerably, certain key themes and lessons can be distilled, including around features of an enabling environment for harm reduction, resource allocation, collection of disaggregated data, and accessibility of services.

  15. 77 FR 28597 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-15

    ... population of the United States. This one-year clearance request seeks approval to pre- test: (1) Data..., expiration date 12/31/2014) data collection. The proposed pretest will test the data collection procedures...

  16. Bosch CO2 Reduction System Development

    NASA Technical Reports Server (NTRS)

    Holmes, R. F.; King, C. D.; Keller, E. E.

    1976-01-01

    Development of a Bosch process CO2 reduction unit was continued, and, by means of hardware modifications, the performance was substantially improved. Benefits of the hardware upgrading were demonstrated by extensive unit operation and data acquisition in the laboratory. This work was accomplished on a cold seal configuration of the Bosch unit.

  17. Alternative Fuels Data Center

    Science.gov Websites

Idle Reduction Equipment Excise Tax Exemption: Qualified on-board idle reduction devices and advanced insulation are exempt from the federal excise tax imposed on the retail sale of heavy-duty highway vehicles. See the SmartWay Technology Program Federal Excise Tax Exemption website. The exemption applies to equipment that

  18. Special Diabetes Program for Indians: Retention in Cardiovascular Risk Reduction

    ERIC Educational Resources Information Center

    Manson, Spero M.; Jiang, Luohua; Zhang, Lijing; Beals, Janette; Acton, Kelly J.; Roubideaux, Yvette

    2011-01-01

    Purpose: This study examined the associations between participant and site characteristics and retention in a multisite cardiovascular disease risk reduction project. Design and Methods: Data were derived from the Special Diabetes Program for Indians Healthy Heart Demonstration Project, an intervention to reduce cardiovascular risk among American…

  19. 48 CFR 1652.215-70 - Rate Reduction for Defective Pricing or Defective Cost or Pricing Data.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Rate Reduction for... Carrier certifies to the Contracting Officer that, to the best of the Carrier's knowledge and belief, the... 36387, June 8, 2000; 70 FR 31383, June 1, 2005] ...

  20. 48 CFR 1652.215-70 - Rate Reduction for Defective Pricing or Defective Cost or Pricing Data.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 6 2014-10-01 2014-10-01 false Rate Reduction for... Carrier certifies to the Contracting Officer that, to the best of the Carrier's knowledge and belief, the... 36387, June 8, 2000; 70 FR 31383, June 1, 2005] ...

  1. 48 CFR 1652.215-70 - Rate Reduction for Defective Pricing or Defective Cost or Pricing Data.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Rate Reduction for... Carrier certifies to the Contracting Officer that, to the best of the Carrier's knowledge and belief, the... 36387, June 8, 2000; 70 FR 31383, June 1, 2005] ...

  2. 48 CFR 1652.215-70 - Rate Reduction for Defective Pricing or Defective Cost or Pricing Data.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 6 2012-10-01 2012-10-01 false Rate Reduction for... Carrier certifies to the Contracting Officer that, to the best of the Carrier's knowledge and belief, the... 36387, June 8, 2000; 70 FR 31383, June 1, 2005] ...

  3. 76 FR 7841 - Agency Information Collection Activities; Proposed Collections; Toxic Chemical Release Reporting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... prevention and waste management data, including recycling information, for such chemicals. 42 U.S.C. 13106... reporting forms and associated instructions, but these changes are estimated to have a negligible effect on... source reduction activities, and provide additional optional information on source reduction, recycling...

  4. ESTIMATION OF MICROBIAL REDUCTIVE TRANSFORMATION RATES FOR CHLORINATED BENZENES AND PHENOLS USING A QUANTITATIVE STRUCTURE-ACTIVITY RELATIONSHIP APPROACH

    EPA Science Inventory

    A set of literature data was used to derive several quantitative structure-activity relationships (QSARs) to predict the rate constants for the microbial reductive dehalogenation of chlorinated aromatics. Dechlorination rate constants for 25 chloroaromatics were corrected for th...

  5. Method for indexing and retrieving manufacturing-specific digital imagery based on image content

    DOEpatents

    Ferrell, Regina K.; Karnowski, Thomas P.; Tobin, Jr., Kenneth W.

    2004-06-15

A method for indexing and retrieving manufacturing-specific digital images based on image content comprises three steps. First, at least one feature vector can be extracted from a manufacturing-specific digital image stored in an image database. In particular, each extracted feature vector corresponds to a particular characteristic of the manufacturing-specific digital image, for instance, a digital image modality and overall characteristic, a substrate/background characteristic, and an anomaly/defect characteristic. Notably, the extracting step includes generating a defect mask using a detection process. Second, using an unsupervised clustering method, each extracted feature vector can be indexed in a hierarchical search tree. Third, a manufacturing-specific digital image associated with a feature vector stored in the hierarchical search tree can be retrieved, wherein the manufacturing-specific digital image has image content comparably related to the image content of the query image. More particularly, the retrieving step can include two data reductions, the first performed based upon a query vector extracted from a query image. Subsequently, a user can select relevant images resulting from the first data reduction. From the selection, a prototype vector can be calculated, from which a second-level data reduction can be performed. The second-level data reduction can result in a subset of feature vectors comparable to the prototype vector, and further comparable to the query vector. An additional fourth step can include managing the hierarchical search tree by substituting a vector average for several redundant feature vectors encapsulated by nodes in the hierarchical search tree.
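The two-level data reduction described above can be sketched as follows. This is an illustrative reading of the abstract, not the patented method: the cosine similarity measure, the top-k cutoff, and all vectors are assumptions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two feature vectors (assumed non-zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query, vectors, k):
    """Data reduction: keep the k vectors most similar to the query."""
    return sorted(vectors, key=lambda v: cosine(query, v), reverse=True)[:k]

def prototype(relevant):
    """Average the user-selected relevant vectors into a prototype vector."""
    n = len(relevant)
    return [sum(v[i] for v in relevant) / n for i in range(len(relevant[0]))]

# Illustrative query vector and database of feature vectors
query = [1.0, 0.0, 0.5]
db = [[1.0, 0.1, 0.4], [0.0, 1.0, 0.0], [0.9, 0.0, 0.6], [0.2, 0.8, 0.1]]
first = top_k(query, db, 3)       # first-level reduction from the query vector
proto = prototype(first[:2])      # user marks the best two results as relevant
second = top_k(proto, first, 2)   # second-level reduction from the prototype
```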

  6. Selenium isotope fractionation during reduction by Fe(II)-Fe(III) hydroxide-sulfate (green rust)

    USGS Publications Warehouse

    Johnson, T.M.; Bullen, T.D.

    2003-01-01

We have determined the extent of Se isotope fractionation induced by reduction of selenate by sulfate interlayered green rust (GRSO4), a Fe(II)-Fe(III) hydroxide-sulfate. This compound is known to reduce selenate to Se(0), and it is the only naturally relevant abiotic selenate reduction pathway documented to date. Se reduction reactions, when they occur in nature, greatly reduce Se mobility and bioavailability. Se stable isotope analysis shows promise as an indicator of Se reduction, and Se isotope fractionation by various Se reactions must be known in order to refine this tool. We measured the increase in the 80Se/76Se ratio of dissolved selenate as lighter isotopes were preferentially consumed during reduction by GRSO4. Six different experiments that used GRSO4 made by two methods, with varying solution compositions and pH, yielded identical isotopic fractionations. Regression of all the data yielded an instantaneous isotope fractionation of 7.36 ± 0.24‰. Selenate reduction by GRSO4 induces much greater isotopic fractionation than does bacterial selenate reduction. If selenate reduction by GRSO4 occurs in nature, it may be identifiable on the basis of its relatively large isotopic fractionation. © 2003 Elsevier Science Ltd.
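An instantaneous fractionation like the one regressed above is typically applied through the Rayleigh equation for the residual reactant, delta ≈ delta_0 − eps·ln(f), where f is the fraction of selenate remaining. The sketch below uses eps = 7.36‰ from the abstract; the starting delta value and the sign convention (residual pool grows isotopically heavier as f decreases) are illustrative assumptions.

```python
import math

EPSILON = 7.36  # per mil; instantaneous fractionation from the regression above

def rayleigh_delta(delta0: float, f: float, eps: float = EPSILON) -> float:
    """delta(80/76)Se of the remaining selenate when a fraction f is unreacted.

    Lighter isotopes are preferentially consumed, so the residual selenate
    becomes heavier (delta increases) as reduction proceeds (f -> 0).
    """
    return delta0 - eps * math.log(f)

# Example: starting from an assumed delta0 = 0 permil, after half the
# selenate has been reduced the residual is ~5.1 permil heavier.
residual_at_half = rayleigh_delta(0.0, 0.5)
```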

  7. Catalytic Reduction of Hexavalent Uranium by Formic Acid; RIDUZIONE CATALITICA DELL'URANIO ESAVALENTE MEDIANTE ACIDO FORMICO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cogliati, G.; Lanz, R.; Lepscky, C.

    1963-10-01

The catalytic reduction of U(VI) to U(IV) by means of formic acid has been studied, considering particularly uranyl nitrate solutions. This process will be applied in the urania-thoria mixed-fuel reprocessing plant (PCUT). Various catalysts have been tested, and the influence of formic acid concentration, temperature, and catalyst concentration on the reaction rate has been determined. A possible reduction mechanism consistent with the experimental data is discussed. (auth)

  8. 78 FR 66365 - Proposed Information Collection Activity; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-05

    ... for Needy Families (TANF) program, it imposed a new data requirement that States prepare and submit data verification procedures and replaced other data requirements with new versions including: the TANF Data Report, the SSP-MOE Data Report, the Caseload Reduction Documentation Process, and the Reasonable...

  9. Integration of computer-assisted fracture reduction system and a hybrid 3-DOF-RPS mechanism for assisting the orthopedic surgery

    NASA Astrophysics Data System (ADS)

    Irwansyah; Sinh, N. P.; Lai, J. Y.; Essomba, T.; Asbar, R.; Lee, P. Y.

    2018-02-01

In this paper, we present a study integrating a virtual fracture-reduction simulation tool with a novel hybrid 3-DOF-RPS external fixator to relocate bone fragments back into their anatomically original position. A 3D model of the fractured bone was reconstructed and manipulated using 3D design and modeling software, PhysiGuide. The virtual reduction system was applied to reduce a bilateral femoral shaft fracture of type 32-A3. Measurement data from the fracture reduction and fixation stages were used to set the manipulator pose in the patient's clinical case. The experimental results indicate that merging these two techniques can reduce virtual bone-reduction time and shorten the healing treatment.

  10. Unique phase identification of trimetallic copper iron manganese oxygen carrier using simultaneous differential scanning calorimetry/thermogravimetric analysis during chemical looping combustion reactions with methane

    DOE PAGES

    Benincosa, William; Siriwardane, Ranjani; Tian, Hanjing; ...

    2017-07-05

Chemical looping combustion (CLC) is a promising combustion technology that generates heat and sequestration-ready carbon dioxide that is undiluted by nitrogen from the combustion of carbonaceous fuels with an oxygen carrier, or metal oxide. This process is highly dependent on the reactivity and stability of the oxygen carrier. The development of oxygen carriers remains one of the major barriers for commercialization of CLC. Synthetic oxygen carriers, consisting of multiple metal components, have demonstrated enhanced performance and improved CLC operation compared to single metal oxides. However, identification of the complex mixed metal oxide phases that form after calcination or during CLC reactions has been challenging. Without an understanding of the dominant metal oxide phase, it is difficult to determine reaction parameters and the oxygen carrier reduction pathway, which are necessary for CLC reactor design. This is particularly challenging for complex multi-component oxygen carriers such as copper iron manganese oxide (CuFeMnO4). This study aims to differentiate the unique phase formation of a highly reactive, complex trimetallic oxygen carrier, CuFeMnO4, from its single and bimetallic counterparts using thermochemical and reaction data obtained from simultaneous differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA) during temperature-programmed reductions (TPR) with methane. DSC/TGA experiments during TPR with methane provide heat flow data and corresponding reaction rate data that can be used to determine reaction routes and mechanisms during methane reduction. Furthermore, non-isothermal TPR data provide the advantage of distinguishing reactions that may not be observable in isothermal analysis. The detailed thermochemical and reaction data, obtained during TPR with methane, distinguished a unique reduction pathway for CuFeMnO4 that differed from its single and bimetallic counterparts. This is remarkable, since X-ray diffraction (XRD) data alone could not be used to distinguish the reactive trimetallic oxide phase due to overlapping peaks from various single and mixed metal oxides. The unique reduction pathway of CuFeMnO4 was further characterized in this study using in-situ XRD TPR with methane to determine changes in the dominant trimetallic phase that influenced the thermochemical and reaction rate data.

  11. Unique phase identification of trimetallic copper iron manganese oxygen carrier using simultaneous differential scanning calorimetry/thermogravimetric analysis during chemical looping combustion reactions with methane

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benincosa, William; Siriwardane, Ranjani; Tian, Hanjing

Chemical looping combustion (CLC) is a promising combustion technology that generates heat and sequestration-ready carbon dioxide that is undiluted by nitrogen from the combustion of carbonaceous fuels with an oxygen carrier, or metal oxide. This process is highly dependent on the reactivity and stability of the oxygen carrier. The development of oxygen carriers remains one of the major barriers for commercialization of CLC. Synthetic oxygen carriers, consisting of multiple metal components, have demonstrated enhanced performance and improved CLC operation compared to single metal oxides. However, identification of the complex mixed metal oxide phases that form after calcination or during CLC reactions has been challenging. Without an understanding of the dominant metal oxide phase, it is difficult to determine reaction parameters and the oxygen carrier reduction pathway, which are necessary for CLC reactor design. This is particularly challenging for complex multi-component oxygen carriers such as copper iron manganese oxide (CuFeMnO4). This study aims to differentiate the unique phase formation of a highly reactive, complex trimetallic oxygen carrier, CuFeMnO4, from its single and bimetallic counterparts using thermochemical and reaction data obtained from simultaneous differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA) during temperature-programmed reductions (TPR) with methane. DSC/TGA experiments during TPR with methane provide heat flow data and corresponding reaction rate data that can be used to determine reaction routes and mechanisms during methane reduction. Furthermore, non-isothermal TPR data provide the advantage of distinguishing reactions that may not be observable in isothermal analysis. The detailed thermochemical and reaction data, obtained during TPR with methane, distinguished a unique reduction pathway for CuFeMnO4 that differed from its single and bimetallic counterparts. This is remarkable, since X-ray diffraction (XRD) data alone could not be used to distinguish the reactive trimetallic oxide phase due to overlapping peaks from various single and mixed metal oxides. The unique reduction pathway of CuFeMnO4 was further characterized in this study using in-situ XRD TPR with methane to determine changes in the dominant trimetallic phase that influenced the thermochemical and reaction rate data.

  12. Effects of band selection on endmember extraction for forestry applications

    NASA Astrophysics Data System (ADS)

    Karathanassi, Vassilia; Andreou, Charoula; Andronis, Vassilis; Kolokoussis, Polychronis

    2014-10-01

In spectral unmixing theory, data reduction techniques play an important role, as hyperspectral imagery contains an immense amount of data, posing many challenging problems such as data storage, computational efficiency, and the so-called "curse of dimensionality". Feature extraction and feature selection are the two main approaches to dimensionality reduction. Feature extraction techniques reduce the dimensionality of hyperspectral data by applying transforms to the data. Feature selection techniques retain the physical meaning of the data by selecting a set of bands from the input hyperspectral dataset that mainly contain the information needed for spectral unmixing. Although feature selection techniques are well known for their dimensionality reduction potential, they are rarely used in the unmixing process. The majority of existing state-of-the-art dimensionality reduction methods set criteria on the spectral information derived from the whole wavelength range in order to define the optimum spectral subspace. These criteria are not associated with any particular application but with the data statistics, such as correlation and entropy values. However, each application is associated with specific land cover materials, whose spectral characteristics present variations in specific wavelengths. In forestry, for example, many applications focus on tree leaves, in which specific pigments such as chlorophyll, xanthophyll, etc. determine the wavelengths where tree species, diseases, etc., can be detected. For such applications, when the unmixing process is applied, the tree species, diseases, etc., are considered the endmembers of interest. This paper focuses on investigating the effects of band selection on endmember extraction by exploiting the information of the vegetation absorbance spectral zones. More precisely, it is explored whether endmember extraction can be optimized when specific sets of initial bands related to leaf spectral characteristics are selected. Experiments comprise the application of well-known signal subspace estimation and endmember extraction methods to a hyperspectral image of a forest area. Evaluation of the extracted endmembers showed that more forest species can be extracted as endmembers using selected bands.

  13. TERMITE: An R script for fast reduction of laser ablation inductively coupled plasma mass spectrometry data and its application to trace element measurements.

    PubMed

    Mischel, Simon A; Mertz-Kraus, Regina; Jochum, Klaus Peter; Scholz, Denis

    2017-07-15

High-spatial-resolution Laser Ablation Inductively Coupled Plasma Mass Spectrometry (LA-ICP-MS) determination of trace element concentrations is of great interest for geological and environmental studies. Data reduction is a very important aspect of LA-ICP-MS, and several commercial programs for handling LA-ICP-MS trace element data are available. Each of these software packages has its specific advantages and disadvantages. Here we present TERMITE, an R script for the reduction of LA-ICP-MS data, which can reduce both spot and line-scan measurements. Several parameters can be adjusted by the user, who does not necessarily need prior knowledge of R. Currently, ten reference materials with different matrices for calibration of LA-ICP-MS data are implemented, and additional reference materials can be added by the user. TERMITE also provides an optional outlier test, and the results are provided graphically (as a pdf file) as well as numerically (as a csv file). As an example, we apply TERMITE to a speleothem sample and compare the results with those obtained using the commercial software GLITTER. The two programs give similar results. TERMITE is particularly useful for samples that are homogeneous with respect to their major element composition (in particular for the element used as an internal standard) and when many measurements are performed using the same analytical parameters. In this case, data evaluation using TERMITE is much faster than with all other available software, and the concentrations of more than 100 single spot measurements can be calculated in less than a minute. TERMITE is open-source software for the reduction of LA-ICP-MS data, which is particularly useful for the fast, reproducible evaluation of large datasets of samples that are homogeneous with respect to their major element composition. Copyright © 2017 John Wiley & Sons, Ltd.
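The central reduction step in this kind of LA-ICP-MS workflow is internal-standard normalisation against a reference material. The sketch below shows that generic one-point calibration, not TERMITE's actual implementation; all numbers in the example are invented for illustration.

```python
def concentration(i_analyte_sam, i_is_sam, i_analyte_ref, i_is_ref,
                  c_analyte_ref, c_is_ref, c_is_sam):
    """Generic internal-standard quantification for LA-ICP-MS.

    i_* are background-corrected count rates for the analyte and the
    internal standard (IS) in the sample and the reference material;
    c_* are concentrations, with c_is_sam known independently
    (e.g. from electron microprobe analysis of the sample).
    """
    # Sensitivity of the analyte relative to the internal standard,
    # derived from the reference material.
    rel_sensitivity = (i_analyte_ref / c_analyte_ref) / (i_is_ref / c_is_ref)
    # Normalise the sample signal to the IS, then scale by the known
    # IS concentration in the sample.
    return (i_analyte_sam / i_is_sam) / rel_sensitivity * c_is_sam

# Invented numbers: relative sensitivity works out to 1, the sample's
# analyte signal is half its IS signal, and the IS is at 40 ppm.
c_sample = concentration(200, 400, 1000, 500, 100, 50, 40)  # 20.0 ppm
```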

  14. Advanced Hybrid On-Board Science Data Processor - SpaceCube 2.0

    NASA Technical Reports Server (NTRS)

    Flatley, Tom

    2010-01-01

    Topics include an overview of On-board science data processing, software upset mitigation, on-board data reduction, on-board products, HyspIRI demonstration testbed, SpaceCube 2.0 block diagram, and processor comparison.

  15. AstroCloud: An Agile platform for data visualization and specific analyses in 2D and 3D

    NASA Astrophysics Data System (ADS)

    Molina, F. Z.; Salgado, R.; Bergel, A.; Infante, A.

    2017-07-01

Nowadays, astronomers commonly run their own tools, or distributed computational packages, for data analysis and then visualize the results with generic applications. This chain of processes comes at high cost: (a) analyses are applied manually and are therefore difficult to automate, and (b) data have to be serialized, increasing the cost of parsing and saving intermediary data. We are developing AstroCloud, an agile multipurpose visualization platform intended for specific analyses of astronomical images (https://astrocloudy.wordpress.com). The platform incorporates domain-specific languages, which make it easily extensible. AstroCloud supports customized plug-ins, which reduce the time spent on data analysis. Moreover, it supports 2D and 3D rendering, including interactive features in real time. AstroCloud is under development; we are currently implementing different options for data reduction and physical analyses.

  16. Versatile Software Package For Near Real-Time Analysis of Experimental Data

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D.; Hoadley, Sherwood T.

    1998-01-01

    This paper provides an overview of a versatile software package developed for time- and frequency-domain analyses of experimental wind-tunnel data. This package, originally developed for analyzing data in the NASA Langley Transonic Dynamics Tunnel (TDT), is applicable for analyzing any time-domain data. A Matlab-based software package, TDT-analyzer, provides a compendium of commonly-required dynamic analysis functions in a user-friendly interactive and batch processing environment. TDT-analyzer has been used extensively to provide on-line near real-time and post-test examination and reduction of measured data acquired during wind tunnel tests of aeroelastically-scaled models of aircraft and rotorcraft as well as a flight test of the NASA High Alpha Research Vehicle (HARV) F-18. The package provides near real-time results in an informative and timely manner far exceeding prior methods of data reduction at the TDT.

  17. Pathway-based analyses.

    PubMed

    Kent, Jack W

    2016-02-03

New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of the multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate the power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.
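A "synthetic pathway" in the sense used above is simply a random gene set of matched size drawn under the null. A minimal sketch of using such sets to calibrate a set-level statistic; the mean-score statistic, gene labels, and scores are invented for illustration:

```python
import random

def set_score(genes, gene_scores):
    """Toy set-level statistic: mean per-gene association score."""
    return sum(gene_scores[g] for g in genes) / len(genes)

def null_distribution(all_genes, gene_scores, set_size, n_draws, rng):
    """Scores of random 'synthetic pathways' of the same size as the real one."""
    return [set_score(rng.sample(all_genes, set_size), gene_scores)
            for _ in range(n_draws)]

def empirical_p(observed, null_scores):
    """Permutation-style p-value with the usual +1 correction."""
    hits = sum(1 for s in null_scores if s >= observed)
    return (hits + 1) / (len(null_scores) + 1)

# Illustrative use: 100 genes with made-up association scores
rng = random.Random(42)
all_genes = list(range(100))
gene_scores = {g: (g % 10) / 10 for g in all_genes}
null = null_distribution(all_genes, gene_scores, set_size=10, n_draws=200, rng=rng)
p = empirical_p(observed=0.9, null_scores=null)
```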

  18. Observations of flat-spectrum radio sources at λ850μm from the James Clerk Maxwell Telescope II. April 2000 to June 2005

    NASA Astrophysics Data System (ADS)

    Jenness, T.; Robson, E. I.; Stevens, J. A.

    2010-01-01

    Calibrated data for 143 flat-spectrum extragalactic radio sources are presented at a wavelength of 850μm covering a 5-yr period from 2000 April. The data, obtained at the James Clerk Maxwell Telescope using the Submillimetre Common-User Bolometer Array (SCUBA) camera in pointing mode, were analysed using an automated pipeline process based on the Observatory Reduction and Acquisition Control - Data Reduction (ORAC-DR) system. This paper describes the techniques used to analyse and calibrate the data, and presents the data base of results along with a representative sample of the better-sampled light curves. A re-analysis of previously published data from 1997 to 2000 is also presented. The combined catalogue, comprising 10493 flux density measurements, provides a unique and valuable resource for studies of extragalactic radio sources.

  19. Automated reduction of sub-millimetre single-dish heterodyne data from the James Clerk Maxwell Telescope using ORAC-DR

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Currie, Malcolm J.; Tilanus, Remo P. J.; Cavanagh, Brad; Berry, David S.; Leech, Jamie; Rizzi, Luca

    2015-10-01

    With the advent of modern multidetector heterodyne instruments that can result in observations generating thousands of spectra per minute it is no longer feasible to reduce these data as individual spectra. We describe the automated data reduction procedure used to generate baselined data cubes from heterodyne data obtained at the James Clerk Maxwell Telescope (JCMT). The system can automatically detect baseline regions in spectra and automatically determine regridding parameters, all without input from a user. Additionally, it can detect and remove spectra suffering from transient interference effects or anomalous baselines. The pipeline is written as a set of recipes using the ORAC-DR pipeline environment with the algorithmic code using Starlink software packages and infrastructure. The algorithms presented here can be applied to other heterodyne array instruments and have been applied to data from historical JCMT heterodyne instrumentation.
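Automatic baseline detection of the kind described above can be sketched as a two-step process: flag channels consistent with a robust noise estimate, then fit and subtract a baseline using only those channels. The MAD threshold and the linear (rather than polynomial) baseline here are simplifying assumptions, not the pipeline's actual algorithm.

```python
import statistics

def detect_baseline_channels(spectrum, k=3.5):
    """Return indices of channels that look line-free.

    Channels deviating from the median by more than k robust sigma
    (1.4826 * MAD) are treated as signal and excluded.
    """
    med = statistics.median(spectrum)
    mad = statistics.median(abs(v - med) for v in spectrum)
    sigma = 1.4826 * mad if mad > 0 else 1e-12
    return [i for i, v in enumerate(spectrum) if abs(v - med) <= k * sigma]

def subtract_linear_baseline(spectrum, baseline_idx):
    """Least-squares linear fit over the baseline channels, then subtract."""
    n = len(baseline_idx)
    sx = sum(baseline_idx)
    sy = sum(spectrum[i] for i in baseline_idx)
    sxx = sum(i * i for i in baseline_idx)
    sxy = sum(i * spectrum[i] for i in baseline_idx)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return [v - (slope * i + intercept) for i, v in enumerate(spectrum)]

# Illustrative spectrum: flat baseline with a single line at channel 10
spec = [1.0] * 20
spec[10] = 10.0
line_free = detect_baseline_channels(spec)
corrected = subtract_linear_baseline(spec, line_free)
```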

  20. A novel data reduction technique for single slanted hot-wire measurements used to study incompressible compressor tip leakage flows

    NASA Astrophysics Data System (ADS)

    Berdanier, Reid A.; Key, Nicole L.

    2016-03-01

    The single slanted hot-wire technique has been used extensively as a method for measuring three velocity components in turbomachinery applications. The cross-flow orientation of probes with respect to the mean flow in rotating machinery results in detrimental prong interference effects when using multi-wire probes. As a result, the single slanted hot-wire technique is often preferred. Typical data reduction techniques solve a set of nonlinear equations determined by curve fits to calibration data. A new method is proposed which utilizes a look-up table method applied to a simulated triple-wire sensor with application to turbomachinery environments having subsonic, incompressible flows. Specific discussion regarding corrections for temperature and density changes present in a multistage compressor application is included, and additional consideration is given to the experimental error which accompanies each data reduction process. Hot-wire data collected from a three-stage research compressor with two rotor tip clearances are used to compare the look-up table technique with the traditional nonlinear equation method. The look-up table approach yields velocity errors of less than 5 % for test conditions deviating by more than 20 °C from calibration conditions (on par with the nonlinear solver method), while requiring less than 10 % of the computational processing time.
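The look-up-table idea can be illustrated with a one-dimensional analogue: rather than solving a nonlinear calibration equation for every sample, precompute a monotonic table and interpolate into it. The King's-law form and its coefficients below are invented for illustration; the paper's actual tables are multi-dimensional (simulated triple-wire voltages mapped to three velocity components).

```python
import bisect

# Invented King's-law calibration: E^2 = A + B * U**n
A, B, N = 1.2, 0.8, 0.45

def voltage(u):
    """Forward calibration: bridge voltage for a given velocity."""
    return (A + B * u ** N) ** 0.5

# Precompute a monotonic look-up table over the calibrated velocity range
us = [0.5 + i * 0.1 for i in range(500)]   # 0.5 .. ~50.4 m/s
es = [voltage(u) for u in us]              # monotonically increasing

def velocity_from_lut(e):
    """Invert the calibration by linear interpolation in the table,
    clamping to the calibrated range at the ends."""
    j = bisect.bisect_left(es, e)
    if j <= 0:
        return us[0]
    if j >= len(es):
        return us[-1]
    frac = (e - es[j - 1]) / (es[j] - es[j - 1])
    return us[j - 1] + frac * (us[j] - us[j - 1])
```

The trade-off is the one the paper quantifies: a table lookup costs a bisection and one interpolation per sample instead of an iterative nonlinear solve, at the price of a small, bounded interpolation error set by the table spacing.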

  1. Transonic Drag Reduction Through Trailing-Edge Blowing on the FAST-MAC Circulation Control Model

    NASA Technical Reports Server (NTRS)

    Chan, David T.; Jones, Gregory S.; Milholen, William E., II; Goodliff, Scott L.

    2017-01-01

    A third wind tunnel test of the FAST-MAC circulation control semi-span model was completed in the National Transonic Facility at the NASA Langley Research Center, where the model was configured for transonic testing of the cruise configuration with 0° flap deflection to determine the potential for transonic drag reduction with circulation control blowing. The model allowed independent control of four circulation control plenums producing a high-momentum jet from a blowing slot near the wing trailing edge that was directed over a 15% chord simple-hinged flap. Recent upgrades to transonic semi-span flow control testing at the NTF have demonstrated an improvement to overall data repeatability, particularly for the drag measurement, that allows for increased confidence in the data results. The static thrust generated by the blowing slot was removed from the wind-on data using force and moment balance data from wind-off thrust tares. This paper discusses the impact of trailing-edge blowing on the transonic aerodynamics of the FAST-MAC model in the cruise configuration, where at flight Reynolds numbers the thrust-removed corrected data showed that an overall drag reduction and increased aerodynamic efficiency were realized as a consequence of the blowing.

  2. A new algorithm for five-hole probe calibration, data reduction, and uncertainty analysis

    NASA Technical Reports Server (NTRS)

    Reichert, Bruce A.; Wendt, Bruce J.

    1994-01-01

    A new algorithm for five-hole probe calibration and data reduction using a non-nulling method is developed. The significant features of the algorithm are: (1) two components of the unit vector in the flow direction replace pitch and yaw angles as flow direction variables; and (2) symmetry rules are developed that greatly simplify Taylor's series representations of the calibration data. In data reduction, four pressure coefficients allow total pressure, static pressure, and flow direction to be calculated directly. The new algorithm's simplicity permits an analytical treatment of the propagation of uncertainty in five-hole probe measurement. The objectives of the uncertainty analysis are to quantify uncertainty of five-hole results (e.g., total pressure, static pressure, and flow direction) and determine the dependence of the result uncertainty on the uncertainty of all underlying experimental and calibration measurands. This study outlines a general procedure that other researchers may use to determine five-hole probe result uncertainty and provides guidance to improve measurement technique. The new algorithm is applied to calibrate and reduce data from a rake of five-hole probes. Here, ten individual probes are mounted on a single probe shaft and used simultaneously. Use of this probe is made practical by the simplicity afforded by this algorithm.
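
    A minimal sketch of the non-nulling approach, using unit-vector components as direction variables as described above; the probe response model and the quadratic calibration surface below are invented for illustration, not the paper's actual calibration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration set: flow-direction unit-vector components (u1, u2)
# replace pitch and yaw angles as the direction variables.
alpha = np.radians(rng.uniform(-20, 20, 300))   # pitch
beta = np.radians(rng.uniform(-20, 20, 300))    # yaw
u1, u2 = np.sin(alpha), np.sin(beta)

# Hypothetical probe response: direction coefficients varying smoothly
# (and slightly nonlinearly) with the flow direction.
C_a = 3.0 * u1 + 0.4 * u1 * u2
C_b = 3.0 * u2 - 0.3 * u1 ** 2

# Calibration: least-squares fit of a quadratic surface mapping measured
# coefficients back to each direction component.
def design(Ca, Cb):
    return np.column_stack(
        [np.ones_like(Ca), Ca, Cb, Ca * Cb, Ca ** 2, Cb ** 2])

X = design(C_a, C_b)
coef_u1, *_ = np.linalg.lstsq(X, u1, rcond=None)
coef_u2, *_ = np.linalg.lstsq(X, u2, rcond=None)

# Data reduction: direct (non-iterative) evaluation for new measurements.
u1_pred = design(C_a[:5], C_b[:5]) @ coef_u1
err = np.max(np.abs(u1_pred - u1[:5]))
```

    The key point mirrored here is that reduction becomes a direct polynomial evaluation of measured pressure coefficients, with no nonlinear equation solving per data point.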

  3. A review of the status and development of Kuwait's fisheries.

    PubMed

    Al-Husaini, M; Bishop, J M; Al-Foudari, H M; Al-Baz, A F

    2015-11-30

    The status of Kuwait's fisheries landings and relative abundance for major species was reviewed using research data from the Kuwait Institute for Scientific Research and landing data from Kuwait's Central Statistical Bureau. Landing data showed significant decreases for major commercial species such as zobaidy (Pampus argenteus), suboor (Tenualosa ilisha), hamoor (Epinephelus coioides), newaiby (Otolithes ruber) and hamra (Lutjanus malabaricus), while abundance data for the shrimp Penaeus semisulcatus showed a significant reduction in recent years, mainly because of overfishing. The catch-rate data showed a continuous decline for major species such as zobaidy, newaiby and hamoor, which indicates that stock abundances of these species are low. The reduction in stock abundance is discussed in the context of changes in habitat quality, particularly the effects of reduced discharge of the Shatt Al-Arab.

  4. HYPERDIRE-HYPERgeometric functions DIfferential REduction: Mathematica-based packages for the differential reduction of generalized hypergeometric functions: Lauricella function FC of three variables

    NASA Astrophysics Data System (ADS)

    Bytev, Vladimir V.; Kniehl, Bernd A.

    2016-09-01

    We present a further extension of the HYPERDIRE project, which is devoted to the creation of a set of Mathematica-based program packages for manipulations with Horn-type hypergeometric functions on the basis of differential equations. Specifically, we present the implementation of the differential reduction for the Lauricella function FC of three variables.
    Catalogue identifier: AEPP_v4_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEPP_v4_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU General Public License, version 3
    No. of lines in distributed program, including test data, etc.: 243461
    No. of bytes in distributed program, including test data, etc.: 61610782
    Distribution format: tar.gz
    Programming language: Mathematica
    Computer: All computers running Mathematica
    Operating system: Operating systems running Mathematica
    Classification: 4.4
    Does the new version supersede the previous version?: No, it significantly extends the previous version.
    Nature of problem: Reduction of the hypergeometric function FC of three variables to a set of basis functions.
    Solution method: Differential reduction
    Reasons for new version: The extension package allows the user to handle the Lauricella function FC of three variables.
    Summary of revisions: The previous version remains unchanged.
    Running time: Depends on the complexity of the problem.

  5. Estimating risk reduction required to break even in a health promotion program.

    PubMed

    Ozminkowski, Ronald J; Goetzel, Ron Z; Santoro, Jan; Saenz, Betty-Jo; Eley, Christine; Gorsky, Bob

    2004-01-01

    To illustrate a formula to estimate the amount of risk reduction required to break even on a corporate health promotion program. A case study design was implemented. Base year (2001) health risk and medical expenditure data from the company, along with published information on the relationships between employee demographics, health risks, and medical expenditures, were used to forecast demographics, risks, and expenditures for 2002 through 2011 and estimate the required amount of risk reduction. Motorola. 52,124 domestic employees. Demographics included age, gender, race, and job type. Health risks for 2001 were measured via health risk appraisal. Risks were noted as either high or low and related to exercise/eating habits, body weight, blood pressure, blood sugar levels, cholesterol levels, depression, stress, smoking/drinking habits, and seat belt use. Medical claims for 2001 were used to calculate medical expenditures per employee. Assuming a dollar 282 per employee program cost, Motorola employees would need to reduce their lifestyle-related health risks by 1.08% to 1.42% per year to break even on health promotion programming, depending upon the discount rate. Higher or lower program investments would change the risk reduction percentages. Employers can use information from published studies, along with their own data, to estimate the amount of risk reduction required to break even on their health promotion programs.
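
    A hedged sketch of a break-even calculation in the same spirit (the inputs, savings model, and horizon below are hypothetical and deliberately simple; they do not reproduce the study's formula or Motorola's data):

```python
# Annual risk reduction r lowers risk-attributable cost cumulatively each
# year; break even when discounted savings match the discounted program cost.

def npv_savings(r, cost_per_risk, years=10, discount=0.05):
    """Present value of savings if risks fall by fraction r each year."""
    total = 0.0
    for t in range(1, years + 1):
        cut = min(r * t, 1.0)                   # cumulative risk reduction
        total += cost_per_risk * cut / (1 + discount) ** t
    return total

def breakeven(program_pv, cost_per_risk, years=10, discount=0.05):
    """Bisect for the annual reduction where savings just cover the cost."""
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if npv_savings(mid, cost_per_risk, years, discount) < program_pv:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# A $282/employee/year program against a hypothetical $400/employee/year of
# cost tied to reducible risks, both discounted at 5% over ten years:
program_pv = sum(282 / 1.05 ** t for t in range(1, 11))
r_required = breakeven(program_pv, cost_per_risk=400)
```

    Because discounted savings rise monotonically with the annual reduction, a simple bisection suffices; the required reduction scales directly with the assumed program cost, as the abstract notes.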

  6. Pharmacodynamically optimized erythropoietin treatment combined with phlebotomy reduction predicted to eliminate blood transfusions in selected preterm infants.

    PubMed

    Rosebraugh, Matthew R; Widness, John A; Nalbant, Demet; Cress, Gretchen; Veng-Pedersen, Peter

    2014-02-01

    Preterm very-low-birth-weight (VLBW) infants weighing <1.5 kg at birth develop anemia, often requiring multiple red blood cell transfusions (RBCTx). Because laboratory blood loss is a primary cause of anemia leading to RBCTx in VLBW infants, our purpose was to simulate the extent to which RBCTx can be reduced or eliminated by reducing laboratory blood loss in combination with pharmacodynamically optimized erythropoietin (Epo) treatment. Twenty-six VLBW ventilated infants receiving RBCTx were studied during the first month of life. RBCTx simulations were based on previously published RBCTx criteria and data-driven Epo pharmacodynamic optimization of literature-derived RBC life span and blood volume data corrected for phlebotomy loss. Simulated pharmacodynamic optimization of Epo administration and reduction in phlebotomy by ≥ 55% predicted a complete elimination of RBCTx in 1.0-1.5 kg infants. In infants <1.0 kg with 100% reduction in simulated phlebotomy and optimized Epo administration, a 45% reduction in RBCTx was predicted. The mean blood volume drawn from all infants was 63 ml/kg: 33% required for analysis and 67% discarded. When reduced laboratory blood loss and optimized Epo treatment are combined, marked reductions in RBCTx in ventilated VLBW infants were predicted, particularly among those with birth weights >1.0 kg.

  7. The STARLINK software collection

    NASA Astrophysics Data System (ADS)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  8. SPHARA - A Generalized Spatial Fourier Analysis for Multi-Sensor Systems with Non-Uniformly Arranged Sensors: Application to EEG

    PubMed Central

    Graichen, Uwe; Eichardt, Roland; Fiedler, Patrique; Strohmeier, Daniel; Zanow, Frank; Haueisen, Jens

    2015-01-01

    Important requirements for the analysis of multichannel EEG data are efficient techniques for signal enhancement, signal decomposition, feature extraction, and dimensionality reduction. We propose a new approach for spatial harmonic analysis (SPHARA) that extends the classical spatial Fourier analysis to EEG sensors positioned non-uniformly on the surface of the head. The proposed method is based on the eigenanalysis of the discrete Laplace-Beltrami operator defined on a triangular mesh. We present several ways to discretize the continuous Laplace-Beltrami operator and compare the properties of the resulting basis functions computed using these discretization methods. We apply SPHARA to somatosensory evoked potential data from eleven volunteers and demonstrate the ability of the method for spatial data decomposition, dimensionality reduction and noise suppression. When employing SPHARA for dimensionality reduction, a significantly more compact representation can be achieved using the FEM approach, compared to the other discretization methods. Using FEM, to recover 95% and 99% of the total energy of the EEG data, on average only 35% and 58% of the coefficients are necessary. The capability of SPHARA for noise suppression is shown using artificial data. We conclude that SPHARA can be used for spatial harmonic analysis of multi-sensor data at arbitrary positions and can be utilized in a variety of other applications. PMID:25885290
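
    A toy stand-in for the SPHARA construction: eigenvectors of a combinatorial graph Laplacian on a ring of "sensors" replace the Laplace-Beltrami eigenfunctions on a triangulated head surface, and truncating the basis performs the dimensionality reduction and noise suppression described above (all data synthetic):

```python
import numpy as np

# Ring of sensors standing in for a triangulated sensor surface.
n = 32
W = np.zeros((n, n))
for i in range(n):                      # connect each sensor to its neighbors
    W[i, (i + 1) % n] = W[(i + 1) % n, i] = 1.0
L = np.diag(W.sum(axis=1)) - W          # combinatorial graph Laplacian

eigvals, basis = np.linalg.eigh(L)      # columns = spatial harmonics

# Smooth spatial pattern plus sensor noise.
theta = 2 * np.pi * np.arange(n) / n
rng = np.random.default_rng(1)
signal = np.cos(2 * theta)
noisy = signal + 0.3 * rng.standard_normal(n)

# Dimensionality reduction / denoising: keep the lowest-frequency harmonics.
m = 8
coeffs = basis[:, :m].T @ noisy         # project onto first m harmonics
recon = basis[:, :m] @ coeffs           # compact, denoised reconstruction
```

    The smooth pattern lives in the low-order harmonics, so the truncated representation retains it while discarding most of the noise energy, mirroring the energy-compaction result quoted above.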

  9. Flame: A Flexible Data Reduction Pipeline for Near-Infrared and Optical Spectroscopy

    NASA Astrophysics Data System (ADS)

    Belli, Sirio; Contursi, Alessandra; Davies, Richard I.

    2018-05-01

    We present flame, a pipeline for reducing spectroscopic observations obtained with multi-slit near-infrared and optical instruments. Because of its flexible design, flame can be easily applied to data obtained with a wide variety of spectrographs. The flexibility is due to a modular architecture, which allows changes and customizations to the pipeline, and relegates the instrument-specific parts to a single module. At the core of the data reduction is the transformation from observed pixel coordinates (x, y) to rectified coordinates (λ, γ). This transformation consists of the polynomial functions λ(x, y) and γ(x, y), which are derived from arc or sky emission lines and slit edge tracing, respectively. The use of 2D transformations allows one to wavelength-calibrate and rectify the data using just one interpolation step. Furthermore, the γ(x, y) transformation also includes the spatial misalignment between frames, which can be measured from a reference star observed simultaneously with the science targets. The misalignment can then be fully corrected during the rectification, without having to further resample the data. Sky subtraction can be performed via nodding and/or modeling of the sky spectrum; the combination of the two methods typically yields the best results. We illustrate the pipeline by showing examples of data reduction for a near-infrared instrument (LUCI at the Large Binocular Telescope) and an optical one (LRIS at the Keck telescope).
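
    The λ(x, y) fit can be illustrated with a low-order 2D polynomial regression; the distortion model and the set of polynomial terms below are invented for the example and are not flame's actual calibration:

```python
import numpy as np

# Toy wavelength calibration: fit lambda(x, y) as a low-order 2D polynomial
# to (pixel, wavelength) pairs, analogous to fitting arc/sky line positions.
rng = np.random.default_rng(2)
x = rng.uniform(0, 2048, 500)            # dispersion-axis pixel
y = rng.uniform(0, 100, 500)             # spatial-axis pixel
lam_true = 15000.0 + 2.5 * x + 0.01 * y + 3e-4 * x * y - 1e-5 * x ** 2

def poly_terms(x, y):
    return np.column_stack(
        [np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

coeffs, *_ = np.linalg.lstsq(poly_terms(x, y), lam_true, rcond=None)

def lam(x, y):
    """Rectified wavelength coordinate for any pixel (x, y)."""
    return poly_terms(np.atleast_1d(x), np.atleast_1d(y)) @ coeffs

resid = lam(x, y) - lam_true
```

    Once the 2D polynomial is known, wavelength calibration and rectification can be applied in a single resampling step, which is the design point the abstract emphasizes.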

  10. Noise reduction methods for nucleic acid and macromolecule sequencing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuller, Ivan K.; Di Ventra, Massimiliano; Balatsky, Alexander

    Methods, systems, and devices are disclosed for processing macromolecule sequencing data with substantial noise reduction. In one aspect, a method for reducing noise in a sequential measurement of a macromolecule comprising serial subunits includes cross-correlating multiple measured signals of a physical property of subunits of interest of the macromolecule, the multiple measured signals including the time data associated with the measurement of the signal, to remove or at least reduce signal noise that is not in the same frequency and in phase with the systematic signal contribution of the measured signals.
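
    The principle can be demonstrated with synthetic data: two measurement passes share the systematic signal but carry independent noise, so averaging their cross-spectra suppresses components that are not at the same frequency and in phase in both (segment counts and noise levels below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
nseg, nlen, k = 64, 256, 16            # segments, samples/segment, signal bin
t = np.arange(nlen)
signal = np.sin(2 * np.pi * k * t / nlen)   # systematic contribution

# Average the cross-spectrum of repeated measurement pairs: the common
# signal adds coherently, uncorrelated noise averages toward zero.
acc = np.zeros(nlen // 2 + 1, dtype=complex)
for _ in range(nseg):
    m1 = signal + 2.0 * rng.standard_normal(nlen)   # pass 1, noisy
    m2 = signal + 2.0 * rng.standard_normal(nlen)   # pass 2, noisy
    acc += np.fft.rfft(m1) * np.conj(np.fft.rfft(m2))
avg_cross = np.abs(acc) / nseg
```

    Each pass alone is dominated by noise (the noise RMS is twice the signal amplitude), yet the signal bin stands out clearly in the averaged cross-spectrum.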

  11. The Effect of School Quality on Black-White Health Differences: Evidence From Segregated Southern Schools

    PubMed Central

    Frisvold, David; Golberstein, Ezra

    2013-01-01

    This study assesses the effect of black-white differences in school quality on black-white differences in health in later life resulting from the racial convergence in school quality for cohorts born between 1910 and 1950 in southern states with segregated schools. Using data from the 1984 through 2007 National Health Interview Surveys linked to race-specific data on school quality, we find that reductions in the black-white gap in school quality led to modest reductions in the black-white gap in disability. PMID:23839102

  12. NASADEM Global Elevation Model: Methods and Progress

    NASA Astrophysics Data System (ADS)

    Crippen, R.; Buckley, S.; Agram, P.; Belz, E.; Gurrola, E.; Hensley, S.; Kobrick, M.; Lavalle, M.; Martin, J.; Neumann, M.; Nguyen, Q.; Rosen, P.; Shimada, J.; Simard, M.; Tung, W.

    2016-06-01

    NASADEM is a near-global elevation model that is being produced primarily by completely reprocessing the Shuttle Radar Topography Mission (SRTM) radar data and then merging it with refined ASTER GDEM elevations. The new and improved SRTM elevations in NASADEM result from better vertical control of each SRTM data swath via reference to ICESat elevations and from SRTM void reductions using advanced interferometric unwrapping algorithms. Remnant voids will be filled primarily by GDEM3, but with reduction of GDEM glitches (mostly related to clouds) and therefore with only minor need for secondary sources of fill.

  13. Data reduction formulas for the 16-foot transonic tunnel: NASA Langley Research Center, revision 2

    NASA Technical Reports Server (NTRS)

    Mercer, Charles E.; Berrier, Bobby L.; Capone, Francis J.; Grayston, Alan M.

    1992-01-01

    The equations used by the 16-Foot Transonic Wind Tunnel in the data reduction programs are presented in nine modules. Each module consists of equations necessary to achieve a specific purpose. These modules are categorized in the following groups: (1) tunnel parameters; (2) jet exhaust measurements; (3) skin friction drag; (4) balance loads and model attitudes calculations; (5) internal drag (or exit-flow distribution); (6) pressure coefficients and integrated forces; (7) thrust removal options; (8) turboprop options; and (9) inlet distortion.

  14. Data-Driven Model Reduction and Transfer Operator Approximation

    NASA Astrophysics Data System (ADS)

    Klus, Stefan; Nüske, Feliks; Koltai, Péter; Wu, Hao; Kevrekidis, Ioannis; Schütte, Christof; Noé, Frank

    2018-06-01

    In this review paper, we will present different data-driven dimension reduction techniques for dynamical systems that are based on transfer operator theory as well as methods to approximate transfer operators and their eigenvalues, eigenfunctions, and eigenmodes. The goal is to point out similarities and differences between methods developed independently by the dynamical systems, fluid dynamics, and molecular dynamics communities such as time-lagged independent component analysis, dynamic mode decomposition, and their respective generalizations. As a result, extensions and best practices developed for one particular method can be carried over to other related methods.
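
    As a minimal, self-contained example of one of the methods surveyed (dynamic mode decomposition), applied to snapshots of an invented two-dimensional linear map whose eigenvalues DMD should recover exactly:

```python
import numpy as np

# Minimal DMD sketch: recover the spectrum of a linear dynamical system
# x_{t+1} = A x_t from snapshot data alone.
rng = np.random.default_rng(4)
A = np.array([[0.9, -0.2],
              [0.2,  0.9]])             # true dynamics (unknown to DMD)

X = np.empty((2, 51))
X[:, 0] = rng.standard_normal(2)
for t in range(50):
    X[:, t + 1] = A @ X[:, t]           # snapshot trajectory

X0, X1 = X[:, :-1], X[:, 1:]
A_dmd = X1 @ np.linalg.pinv(X0)         # least-squares propagator
dmd_eigs = np.sort_complex(np.linalg.eigvals(A_dmd))
true_eigs = np.sort_complex(np.linalg.eigvals(A))
```

    The eigenvalues of the fitted propagator approximate those of the underlying transfer operator; in this exactly linear toy case they are recovered to machine precision. Large-scale variants work in a reduced (SVD-projected) basis rather than forming the full propagator.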

  15. Peer mentoring of telescope operations and data reduction at Western Kentucky University

    NASA Astrophysics Data System (ADS)

    Williams, Joshua; Carini, M. T.

    2014-01-01

    Peer mentoring plays an important role in the astronomy program at Western Kentucky University. I will describe how undergraduates teach and mentor other undergraduates the basics of operating our 0.6m telescope and data reduction (IRAF) techniques. This peer to peer mentoring creates a community of undergraduate astronomy scholars at WKU. These scholars bond and help each other with research, coursework, social, and personal issues. This community atmosphere helps to draw in and retain other students interested in astronomy and other STEM careers.

  16. Interpretable dimensionality reduction of single cell transcriptome data with deep generative models.

    PubMed

    Ding, Jiarui; Condon, Anne; Shah, Sohrab P

    2018-05-21

    Single-cell RNA-sequencing has great potential to discover cell types, identify cell states, trace development lineages, and reconstruct the spatial organization of cells. However, dimension reduction to interpret structure in single-cell sequencing data remains a challenge. Existing algorithms are either not able to uncover the clustering structures in the data or lose global information such as groups of clusters that are close to each other. We present a robust statistical model, scvis, to capture and visualize the low-dimensional structures in single-cell gene expression data. Simulation results demonstrate that low-dimensional representations learned by scvis preserve both the local and global neighbor structures in the data. In addition, scvis is robust to the number of data points and learns a probabilistic parametric mapping function to add new data points to an existing embedding. We then use scvis to analyze four single-cell RNA-sequencing datasets, exemplifying interpretable two-dimensional representations of the high-dimensional single-cell RNA-sequencing data.
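
    scvis itself is a deep generative model and is not reproduced here; for contrast, this is the classical linear baseline that such methods improve on, a two-dimensional PCA embedding of synthetic "expression" data with two well-separated cell populations:

```python
import numpy as np

# Two synthetic cell populations in a 50-dimensional "expression" space.
rng = np.random.default_rng(5)
pop_a = rng.standard_normal((100, 50))
pop_b = rng.standard_normal((100, 50)) + 3.0      # shifted cluster
X = np.vstack([pop_a, pop_b])

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
embedding = Xc @ Vt[:2].T                         # 2-D coordinates per cell

# The two populations separate along the leading component.
sep = embedding[:100, 0].mean() - embedding[100:, 0].mean()
```

    A linear projection like this preserves global geometry but can blur fine cluster structure; the nonlinear parametric mapping learned by scvis is aimed at keeping both, and at embedding new cells into an existing map.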

  17. Use of MAGSAT anomaly data for crustal structure and mineral resources in the US Midcontinent

    NASA Technical Reports Server (NTRS)

    Carmichael, R. S. (Principal Investigator)

    1981-01-01

    The analysis and preliminary interpretation of investigator-B MAGSAT data are addressed. The data processing included: (1) removal of spurious data points; (2) statistical smoothing along individual data tracks to reduce the effect of geomagnetic transient disturbances; (3) comparison of data profiles spatially coincident in track location but acquired at different times; (4) reduction of data by weighted averaging to a grid with 1 deg x 1 deg latitude/longitude spacing, with elevations interpolated and weighted to a common datum of 400 km; (5) wavelength filtering; and (6) reduction of the anomaly map to the magnetic pole. Agreement was found between the magnitude-data anomaly map and the reduced-to-the-pole map, supporting the general assumption that, on a large scale (long wavelength), induced crustal magnetization is responsible for the major anomalies. Anomalous features are identified and explanations are suggested with regard to crustal structure, petrologic characteristics, and Curie temperature isotherms.
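
    Step (4), reduction by weighted averaging to a regular grid, can be sketched as follows (synthetic track data and a Gaussian distance weighting chosen for illustration; the original weighting scheme is not specified in the abstract):

```python
import numpy as np

# Scattered "along-track" anomaly samples over a 10 x 10 degree region.
rng = np.random.default_rng(6)
lon = rng.uniform(0, 10, 2000)
lat = rng.uniform(0, 10, 2000)
anom = np.sin(np.radians(36 * lon)) + 0.1 * rng.standard_normal(2000)

# Reduce to a 1-degree grid by weighted averaging of nearby observations,
# with weights falling off with squared distance from each cell center.
glon, glat = np.meshgrid(np.arange(0.5, 10), np.arange(0.5, 10))
grid = np.zeros_like(glon)
for i in range(glon.shape[0]):
    for j in range(glon.shape[1]):
        d2 = (lon - glon[i, j]) ** 2 + (lat - glat[i, j]) ** 2
        w = np.exp(-d2 / 0.5)              # Gaussian distance weights
        grid[i, j] = np.sum(w * anom) / np.sum(w)
```

    The weighted average both regularizes the irregular track sampling and smooths short-wavelength noise, at the cost of mild attenuation of the signal itself.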

  18. Multi-Rate Acquisition for Dead Time Reduction in Magnetic Resonance Receivers: Application to Imaging With Zero Echo Time.

    PubMed

    Marjanovic, Josip; Weiger, Markus; Reber, Jonas; Brunner, David O; Dietrich, Benjamin E; Wilm, Bertram J; Froidevaux, Romain; Pruessmann, Klaas P

    2018-02-01

    For magnetic resonance imaging of tissues with very short transverse relaxation times, radio-frequency excitation must be immediately followed by data acquisition with fast spatial encoding. In zero-echo-time (ZTE) imaging, excitation is performed while the readout gradient is already on, causing data loss due to an initial dead time. One major dead time contribution is the settling time of the filters involved in signal down-conversion. In this paper, a multi-rate acquisition scheme is proposed to minimize dead time due to filtering. Short filters and high output bandwidth are used initially to minimize settling time. With increasing time since the signal onset, longer filters with better frequency selectivity enable stronger signal decimation. In this way, significant dead time reduction is accomplished at only a slight increase in the overall amount of output data. Multi-rate acquisition was implemented with a two-stage filter cascade in a digital receiver based on a field-programmable gate array. In ZTE imaging in a phantom and in vivo, dead time reduction by multi-rate acquisition is shown to improve image quality and expand the feasible bandwidth while increasing the amount of data collected by only a few percent.
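
    The filter-settling argument can be made concrete with a toy moving-average (FIR) filter: the output is only fully settled once the window contains nothing but post-onset samples, so the dead time scales with filter length, which is what motivates starting the cascade with a short filter (tap counts below are arbitrary stand-ins):

```python
import numpy as np

# Step signal standing in for the onset of the acquired FID.
n = 1024
x = np.zeros(n)
x[100:] = 1.0                                   # signal onset at sample 100

def settled_index(x, ntaps):
    """Index of the first output sample where the moving average has fully
    settled to the post-onset level (window holds only post-onset data)."""
    y = np.convolve(x, np.ones(ntaps) / ntaps, mode="full")[:len(x)]
    return int(np.argmax(y >= 0.999))

onset = 100
dead_short = settled_index(x, 8) - onset        # short filter: 7 samples
dead_long = settled_index(x, 128) - onset       # long filter: 127 samples
```

    The dead time equals ntaps - 1 samples in each case, so starting with the short filter and switching to the long, more selective one later recovers the early samples that a single long filter would lose.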

  19. Challenges and Consequences of Reduced Skilled Nursing Facility Lengths of Stay.

    PubMed

    Tyler, Denise A; McHugh, John P; Shield, Renée R; Winblad, Ulrika; Gadbois, Emily A; Mor, Vincent

    2018-06-05

    To identify the challenges that reductions in length of stay (LOS) pose for skilled nursing facilities (SNFs) and their postacute care (PAC) patients. Seventy interviews with staff in 25 SNFs in eight U.S. cities, LOS data for patients in those SNFs. Data were qualitatively analyzed, and key themes were identified. Interview data from SNFs with and without reductions in median risk-adjusted LOS were compared and contrasted. We conducted 70 semistructured interviews. LOS data were derived from minimum dataset (MDS) admission records available for all patients in all U.S. SNFs from 2012 to 2014. Challenges reported regardless of reductions in LOS included frequent and more complicated re-authorization processes, patients becoming responsible for costs, and discharging patients whom staff felt were unsafe at home. Challenges related to reduced LOS included SNFs being pressured to discharge patients within certain time limits. Some SNFs reported instituting programs and processes for following up with patients after discharge. These programs helped alleviate concerns about patients, but they resulted in nonreimbursable costs for facilities. The push for shorter LOS has resulted in unexpected challenges and costs for SNFs and possible unintended consequences for PAC patients. © Health Research and Educational Trust.

  20. Social Media Ratings of Minimally Invasive Fat Reduction Procedures: Benchmarking Against Traditional Liposuction.

    PubMed

    Talasila, Sreya; Evers-Meltzer, Rachel; Xu, Shuai

    2018-06-05

    Minimally invasive fat reduction procedures are rapidly growing in popularity. Evaluate online patient reviews to inform practice management. Data from RealSelf.com, a popular online aesthetics platform, were reviewed for all minimally invasive fat reduction procedures. Reviews were also aggregated based on the primary method of action (e.g., laser, radiofrequency, ultrasound, etc.) and compared with liposuction. A chi-square test was used to assess for differences with the Marascuilo procedure for pairwise comparisons. A total of 13 minimally invasive fat reduction procedures were identified encompassing 11,871 total reviews. Liposuction had 4,645 total reviews and a 66% patient satisfaction rate. Minimally invasive fat reduction procedures had 7,170 aggregate reviews and a global patient satisfaction of 58%. Liposuction had statistically significantly higher patient satisfaction than cryolipolysis (55% satisfied, n = 2,707 reviews), laser therapies (61% satisfied, n = 3,565 reviews), and injectables (49% satisfied, n = 319 reviews) (p < .05). Injectables and cryolipolysis had statistically significantly lower patient satisfaction than radiofrequency therapies (63% satisfied, n = 314 reviews) and laser therapies. Ultrasound therapies had 275 reviews and a 73% patient satisfaction rate. A large number of patient reviews suggest that minimally invasive fat reduction procedures have high patient satisfaction, although liposuction still had the highest total patient satisfaction score. However, there are significant pitfalls in interpreting patient reviews, as they do not provide important data such as a patient's medical history or physician experience and skill.
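
    The chi-square comparison can be reproduced approximately from the counts quoted above (satisfied counts are reconstructed from rounded percentages, so the statistic is approximate; no continuity correction is applied):

```python
import numpy as np

# 2x2 table of satisfied vs. unsatisfied reviews, reconstructed from the
# abstract: liposuction 66% of 4,645 reviews; cryolipolysis 55% of 2,707.
table = np.array([
    [round(0.66 * 4645), 4645 - round(0.66 * 4645)],   # liposuction
    [round(0.55 * 2707), 2707 - round(0.55 * 2707)],   # cryolipolysis
], dtype=float)

# Chi-square test of independence: expected counts from marginal totals.
row_totals = table.sum(axis=1, keepdims=True)
col_totals = table.sum(axis=0, keepdims=True)
expected = row_totals @ col_totals / table.sum()
chi2 = ((table - expected) ** 2 / expected).sum()

significant = chi2 > 3.841        # critical value, df = 1, alpha = .05
```

    With samples this large, even an 11-point difference in satisfaction rate yields a chi-square statistic far beyond the df = 1 critical value, consistent with the reported p < .05.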
