Sample records for reduction method called

  1. A decentralized linear quadratic control design method for flexible structures

    NASA Technical Reports Server (NTRS)

    Su, Tzu-Jeng; Craig, Roy R., Jr.

    1990-01-01

    A decentralized suboptimal linear quadratic control design procedure which combines substructural synthesis, model reduction, decentralized control design, subcontroller synthesis, and controller reduction is proposed for the design of reduced-order controllers for flexible structures. The procedure starts with a definition of the continuum structure to be controlled. An evaluation model of finite dimension is obtained by the finite element method. Then, the finite element model is decomposed into several substructures by using a natural decomposition called substructuring decomposition. Each substructure, at this point, still has too large a dimension and must be reduced to a size that is Riccati-solvable. Model reduction of each substructure can be performed by using any existing model reduction method, e.g., modal truncation, balanced reduction, Krylov model reduction, or the mixed-mode method. Then, based on the reduced substructure model, a subcontroller is designed by an LQ optimal control method for each substructure independently. After all subcontrollers are designed, a controller synthesis method called substructural controller synthesis (SCS) is employed to synthesize all subcontrollers into a global controller. The assembling scheme used is the same as that employed for the structure matrices. Finally, a controller reduction scheme, called the equivalent impulse response energy controller (EIREC) reduction algorithm, is used to reduce the global controller to a reasonable size for implementation. The EIREC reduced controller preserves the impulse response energy of the full-order controller and has the property of matching low-frequency moments and low-frequency power moments. An advantage of the SCS method is that it relieves the computational burden associated with dimensionality. In addition, the SCS design scheme is highly adaptable for structures with varying configuration or varying mass and stiffness properties.

  2. MuLoG, or How to Apply Gaussian Denoisers to Multi-Channel SAR Speckle Reduction?

    PubMed

    Deledalle, Charles-Alban; Denis, Loic; Tabti, Sonia; Tupin, Florence

    2017-09-01

    Speckle reduction is a longstanding topic in synthetic aperture radar (SAR) imaging. Since most current and planned SAR imaging satellites operate in polarimetric, interferometric, or tomographic modes, SAR images are multi-channel and speckle reduction techniques must jointly process all channels to recover polarimetric and interferometric information. The distinctive nature of the SAR signal (complex-valued, corrupted by multiplicative fluctuations) calls for the development of specialized methods for speckle reduction. Image denoising is a very active topic in image processing, with a wide variety of approaches and many denoising algorithms available, almost always designed for additive Gaussian noise suppression. This paper proposes a general scheme, called MuLoG (MUlti-channel LOgarithm with Gaussian denoising), to include such Gaussian denoisers within a multi-channel SAR speckle reduction technique. A new family of speckle reduction algorithms can thus be obtained, benefiting from the ongoing progress in Gaussian denoising and offering several speckle reduction results whose method-specific artifacts can be identified by comparing the outputs.
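    The core idea MuLoG builds on can be illustrated in a single-channel sketch (the full method handles multi-channel covariance data; the function below, with hypothetical names, shows only the log-transform principle): a log transform turns multiplicative speckle into approximately additive noise, so an off-the-shelf Gaussian denoiser can be plugged in.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def log_domain_despeckle(intensity, sigma=2.0):
    """Hypothetical single-channel sketch: for I = R * S (reflectivity R,
    speckle S), log I = log R + log S is approximately additive, so any
    additive-Gaussian denoiser can be applied; plain Gaussian smoothing
    stands in for it here."""
    log_img = np.log(np.maximum(intensity, 1e-12))   # avoid log(0)
    smoothed = gaussian_filter(log_img, sigma=sigma)
    # debias: for 1-look exponential speckle, E[log S] = -Euler gamma
    return np.exp(smoothed + np.euler_gamma)

# toy check: constant reflectivity corrupted by 1-look speckle
rng = np.random.default_rng(0)
clean = np.full((64, 64), 5.0)
speckled = clean * rng.exponential(1.0, size=clean.shape)
restored = log_domain_despeckle(speckled)
```

    Any stronger Gaussian denoiser (BM3D, a CNN, etc.) could replace `gaussian_filter` in this slot, which is the point of the paper's plug-in scheme.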

  3. Evaluating the Workload of On-Call Psychiatry Residents: Which Activities Are Associated with Sleep Loss?

    ERIC Educational Resources Information Center

    Cooke, Brian K.; Cooke, Erinn O.; Sharfstein, Steven S.

    2012-01-01

    Objective: The purpose of this study was to review the workload inventory of on-call psychiatry residents and to evaluate which activities were associated with reductions in on-call sleep. Method: A prospective cohort study was conducted, following 20 psychiatry residents at a 231-bed psychiatry hospital, from July 1, 2008 through June 30, 2009.…

  4. A general soft label based linear discriminant analysis for semi-supervised dimensionality reduction.

    PubMed

    Zhao, Mingbo; Zhang, Zhao; Chow, Tommy W S; Li, Bing

    2014-07-01

    Dealing with high-dimensional data has always been a major problem in pattern recognition and machine learning research, and Linear Discriminant Analysis (LDA) is one of the most popular methods for dimension reduction. However, it uses only labeled samples while neglecting unlabeled samples, which are abundant and can be easily obtained in the real world. In this paper, we propose a new dimension reduction method, called "SL-LDA", that uses unlabeled samples to enhance the performance of LDA. The new method first propagates label information from the labeled set to the unlabeled set via a label propagation process, where the predicted labels of unlabeled samples, called "soft labels", are obtained. It then incorporates the soft labels into the construction of scatter matrices to find a transformation matrix for dimension reduction. In this way, the proposed method can preserve more discriminative information, which is preferable when solving the classification problem. We further propose an efficient approach for solving SL-LDA under a least squares framework, and a flexible version of SL-LDA (FSL-LDA) to better cope with datasets sampled from a nonlinear manifold. Extensive simulations are carried out on several datasets, and the results show the effectiveness of the proposed method. Copyright © 2014 Elsevier Ltd. All rights reserved.
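    The soft-label scatter construction can be sketched as follows (a simplified reading with hypothetical names, not the paper's exact formulation; the label-propagation step is assumed to have already produced the soft label matrix F):

```python
import numpy as np

def soft_label_lda(X, F, dim):
    """Sketch of LDA with soft labels.
    X: (n, d) data; F: (n, c) soft label matrix, rows summing to 1.
    Each sample contributes to every class mean and scatter matrix in
    proportion to its soft label, instead of a hard 0/1 assignment."""
    n, d = X.shape
    mu = X.mean(axis=0)                      # global mean
    Nc = F.sum(axis=0)                       # effective class sizes
    M = (F.T @ X) / Nc[:, None]              # soft class means, (c, d)
    Sb = ((M - mu).T * Nc) @ (M - mu)        # between-class scatter
    St = (X - mu).T @ (X - mu)               # total scatter
    Sw = St - Sb                             # within-class scatter
    # top eigenvectors of Sw^{-1} Sb (small ridge keeps Sw invertible)
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(-evals.real)
    return evecs.real[:, order[:dim]]
```

    With one-hot rows in F this reduces to ordinary LDA; soft rows from label propagation let unlabeled samples shape the scatter matrices.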

  5. Effects of Stigma and Discrimination Reduction Programs Conducted Under the California Mental Health Services Authority

    PubMed Central

    Cerully, Jennifer L.; Collins, Rebecca L.; Wong, Eunice C.; Roth, Elizabeth; Marks, Joyce; Yu, Jennifer

    2016-01-01

    Describes the methods and results of a RAND evaluation of stigma and discrimination reduction efforts by Runyon Saltzman Einhorn, Inc., involving screenings of a documentary film called “A New State of Mind: Ending the Stigma of Mental Illness.” PMID:28083418

  6. Cluster Correspondence Analysis.

    PubMed

    van de Velden, M; D'Enza, A Iodice; Palumbo, F

    2017-03-01

    A method is proposed that combines dimension reduction and cluster analysis for categorical data by simultaneously assigning individuals to clusters and optimal scaling values to categories in such a way that a single between-cluster variance maximization objective is achieved. In a unified framework, we provide a brief review of alternative methods and show that the proposed method is equivalent to GROUPALS applied to categorical data. Performance of the methods is appraised by means of a simulation study. The results of the joint dimension reduction and clustering methods are compared with the so-called tandem approach, a sequential analysis of dimension reduction followed by cluster analysis. The tandem approach is conjectured to perform worse when variables are added that are unrelated to the cluster structure. Our simulation study confirms this conjecture. Moreover, the results of the simulation study indicate that the proposed method also consistently outperforms alternative joint dimension reduction and clustering methods.

  7. Effects of Stigma and Discrimination Reduction Programs Conducted Under the California Mental Health Services Authority: An Evaluation of Runyon Saltzman Einhorn, Inc., Documentary Screening Events.

    PubMed

    Cerully, Jennifer L; Collins, Rebecca L; Wong, Eunice C; Roth, Elizabeth; Marks, Joyce; Yu, Jennifer

    2016-05-09

    Describes the methods and results of a RAND evaluation of stigma and discrimination reduction efforts by Runyon Saltzman Einhorn, Inc., involving screenings of a documentary film called "A New State of Mind: Ending the Stigma of Mental Illness."

  8. Robust Measurements of Phase Response Curves Realized via Multicycle Weighted Spike-Triggered Averages

    NASA Astrophysics Data System (ADS)

    Imai, Takashi; Ota, Kaiichiro; Aoyagi, Toshio

    2017-02-01

    Phase reduction has been extensively used to study rhythmic phenomena. As a result of phase reduction, the rhythm dynamics of a given system can be described using the phase response curve. Measuring this characteristic curve is an important step toward understanding a system's behavior. Recently, a basic idea for a new measurement method (called the multicycle weighted spike-triggered average method) was proposed. This paper confirms the validity of this method by providing an analytical proof and demonstrates its effectiveness in actual experimental systems by applying the method to an oscillating electric circuit. Some practical tips to use the method are also presented.

  9. 76 FR 51944 - Proposed Information Collection; Comment Request; Southeast Region Bycatch Reduction Device...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-19

    ....) fisheries of the exclusive economic zone (EEZ) off the South Atlantic, Caribbean, and Gulf of Mexico under.... Method of Collection Paper applications, electronic reports, and telephone calls are required from participants, and methods of submittal include Internet, electronic forms, and facsimile transmission of paper...

  10. Launch Control System Master Console Event Message Reduction

    NASA Technical Reports Server (NTRS)

    Nguyen, Uyen

    2014-01-01

    System monitoring and control (SMC) message browsers receive many messages daily that operators do not need to see. Important messages are often mixed in among the less important ones. My job is to reduce the messages displayed in the message browser so that warning and emergency messages can be seen easily and therefore responded to promptly. There are multiple methods to achieve this. Firstly, duplicate messages should not appear many times in the message browser. Instead, the message should appear only once, with a counter showing how many times it has occurred. This method is called duplicate message suppression. Secondly, messages that update the most recent state (e.g., up/down) of a component should replace the old-state messages. This method is called state-based message correlation. Thirdly, messages with a "normal" alarm level should be suppressed unless they are a response to an operator action. In addition to message reduction, I also work on correcting the severity levels and text formats of messages.
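    The first two reduction methods described above can be sketched in a few lines (the message format and component names are illustrative, not the actual SMC interfaces):

```python
from collections import OrderedDict

class MessageBrowser:
    """Sketch of duplicate message suppression and state-based
    message correlation for a hypothetical (component, text) feed."""

    def __init__(self):
        self.counts = OrderedDict()   # duplicate message suppression
        self.latest_state = {}        # state-based message correlation

    def add(self, component, text, is_state=False):
        if is_state:
            # a new state message replaces the old-state message
            self.latest_state[component] = text
        else:
            # repeated messages bump a counter instead of a new row
            key = (component, text)
            self.counts[key] = self.counts.get(key, 0) + 1

    def display(self):
        lines = [f"{c}: {t} (x{n})" for (c, t), n in self.counts.items()]
        lines += [f"{c}: state={s}" for c, s in self.latest_state.items()]
        return lines
```

    A message repeated three times yields one display row with a count of three, and an up/down flap leaves only the most recent state visible.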

  11. Towards a Novel Integrated Approach for Estimating Greenhouse Gas Emissions in Support of International Agreements

    NASA Astrophysics Data System (ADS)

    Reimann, S.; Vollmer, M. K.; Henne, S.; Brunner, D.; Emmenegger, L.; Manning, A.; Fraser, P. J.; Krummel, P. B.; Dunse, B. L.; DeCola, P.; Tarasova, O. A.

    2016-12-01

    In the recently adopted Paris Agreement the community of signatory states has agreed to limit the future global temperature increase between +1.5 °C and +2.0 °C, compared to pre-industrial times. To achieve this goal, emission reduction targets have been submitted by individual nations (called Intended Nationally Determined Contributions, INDCs). Inventories will be used for checking progress towards these envisaged goals. These inventories are calculated by combining information on specific activities (e.g. passenger cars, agriculture) with activity-related, typically IPCC-sanctioned, emission factors - the so-called bottom-up method. These calculated emissions are reported on an annual basis and are checked by external bodies by using the same method. A second independent method estimates emissions by translating greenhouse gas measurements made at regionally representative stations into regional/global emissions using meteorologically-based transport models. In recent years this so-called top-down approach has been substantially advanced into a powerful tool and emission estimates at the national/regional level have become possible. This method is already used in Switzerland, in the United Kingdom and in Australia to estimate greenhouse gas emissions and independently support the national bottom-up emission inventories within the UNFCCC framework. Examples of the comparison of the two independent methods will be presented and the added-value will be discussed. The World Meteorological Organization (WMO) and partner organizations are currently developing a plan to expand this top-down approach and to expand the globally representative GAW network of ground-based stations and remote-sensing platforms and integrate their information with atmospheric transport models. 
    This Integrated Global Greenhouse Gas Information System (IG3IS) initiative will help nations improve the accuracy of their country-based emission inventories and their ability to evaluate the success of emission reduction strategies. This could foster trans-national collaboration on methodologies for estimating emissions. Furthermore, more accurate emission knowledge will clarify the value of emission reduction efforts and could encourage countries to strengthen their reduction pledges.

  12. Reduction in Fatalities, Ambulance Calls, and Hospital Admissions for Road Trauma After Implementation of New Traffic Laws

    PubMed Central

    Chan, Herbert; Brasher, Penelope; Erdelyi, Shannon; Desapriya, Edi; Asbridge, Mark; Purssell, Roy; Macdonald, Scott; Schuurman, Nadine; Pike, Ian

    2014-01-01

    Objectives. We evaluated the public health benefits of traffic laws targeting speeding and drunk drivers (British Columbia, Canada, September 2010). Methods. We studied fatal crashes and ambulance dispatches and hospital admissions for road trauma, using interrupted time series with multiple nonequivalent comparison series. We determined estimates of effect using linear regression models incorporating an autoregressive integrated moving average error term. We used neighboring jurisdictions (Alberta, Saskatchewan, Washington State) as external controls. Results. In the 2 years after implementation of the new laws, significant decreases occurred in fatal crashes (21.0%; 95% confidence interval [CI] = 15.3, 26.4) and in hospital admissions (8.0%; 95% CI = 0.6, 14.9) and ambulance calls (7.2%; 95% CI = 1.1, 13.0) for road trauma. We found a very large reduction in alcohol-related fatal crashes (52.0%; 95% CI = 34.5, 69.5), and the benefits of the new laws are likely primarily the result of a reduction in drinking and driving. Conclusions. These findings suggest that laws calling for immediate sanctions for dangerous drivers can reduce road trauma and should be supported. PMID:25121822

  13. Interior noise reduction by alternate resonance tuning

    NASA Technical Reports Server (NTRS)

    Bliss, Donald B.; Gottwald, James A.; Bryce, Jeffrey W.

    1987-01-01

    Existing interior noise reduction techniques for aircraft fuselages perform reasonably well at higher frequencies, but are inadequate at low frequencies, particularly with respect to the low blade passage harmonics with high forcing levels found in propeller aircraft. A method is studied which considers aircraft fuselages lined with panels alternately tuned to frequencies above and below the frequency that must be attenuated. Adjacent panels would oscillate at equal amplitude, to give equal acoustic source strength, but with opposite phase. Provided these adjacent panels are acoustically compact, the resulting cancellation causes the interior acoustic modes to be cut off, and therefore to be nonpropagating and evanescent. This interior noise reduction method, called Alternate Resonance Tuning (ART), is being investigated theoretically and experimentally. Progress to date is discussed.
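    The cancellation mechanism is easy to see numerically: two compact adjacent panels responding at equal amplitude but opposite phase have zero net source strength (the frequency and amplitude below are illustrative, not taken from the study):

```python
import numpy as np

# hypothetical disturbance frequency f; the two panels are tuned above
# and below f, so at f they respond 180 degrees out of phase
f = 100.0                                   # Hz
t = np.linspace(0.0, 0.05, 2000)            # 50 ms of time samples
panel_a = 1.0 * np.sin(2 * np.pi * f * t)            # tuned below f
panel_b = 1.0 * np.sin(2 * np.pi * f * t + np.pi)    # tuned above f

# if the pair is acoustically compact, the interior sees only the sum,
# which cancels identically
net_source = panel_a + panel_b
```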

  14. Building America Top Innovations 2012: Reduced Call-Backs with High-Performance Production Builders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    none,

    2013-01-01

    This Building America Top Innovations profile describes ways Building America teams have helped builders cut call-backs. A Harvard University study found that builders who worked with Building America had a 50% drop in call-backs. One builder reported a 50-fold reduction in the incidence of pipe freezing, a 50% reduction in drywall cracking, and a 60% decline in call-backs.

  15. A component modes projection and assembly model reduction methodology for articulated, multi-flexible body structures

    NASA Technical Reports Server (NTRS)

    Lee, Allan Y.; Tsuha, Walter S.

    1993-01-01

    A two-stage model reduction methodology, combining the classical Component Mode Synthesis (CMS) method and the newly developed Enhanced Projection and Assembly (EP&A) method, is proposed in this research. The first stage of this methodology, called the COmponent Modes Projection and Assembly model REduction (COMPARE) method, involves the generation of CMS mode sets, such as the MacNeal-Rubin mode sets. These mode sets are then used to reduce the order of each component model in the Rayleigh-Ritz sense. The resultant component models are then combined to generate reduced-order system models at various system configurations. A composite mode set which retains important system modes at all system configurations is then selected from these reduced-order system models. In the second stage, the EP&A model reduction method is employed to reduce further the order of the system model generated in the first stage. The effectiveness of the COMPARE methodology has been successfully demonstrated on a high-order, finite-element model of the cruise-configured Galileo spacecraft.

  16. Reduction of speckle noise from optical coherence tomography images using multi-frame weighted nuclear norm minimization method

    NASA Astrophysics Data System (ADS)

    Thapa, Damber; Raahemifar, Kaamran; Lakshminarayanan, Vasudevan

    2015-12-01

    In this paper, we propose a speckle noise reduction method for spectral-domain optical coherence tomography (SD-OCT) images called multi-frame weighted nuclear norm minimization (MWNNM). This method is a direct extension of weighted nuclear norm minimization (WNNM) to the multi-frame framework, since an adequately denoised image could not be achieved with single-frame denoising methods. The MWNNM method exploits multiple B-scans collected from a small area of an SD-OCT volumetric image, and then denoises and averages them together to obtain a high signal-to-noise-ratio B-scan. The results show that the image quality metrics obtained by denoising and averaging only five nearby B-scans with the MWNNM method are considerably better than those of the average image obtained by registering and averaging 40 azimuthally repeated B-scans.

  17. A Rapid Aerodynamic Design Procedure Based on Artificial Neural Networks

    NASA Technical Reports Server (NTRS)

    Rai, Man Mohan

    2001-01-01

    An aerodynamic design procedure that uses neural networks to model the functional behavior of the objective function in design space has been developed. This method incorporates several improvements to an earlier method that employed a strategy called parameter-based partitioning of the design space in order to reduce the computational costs associated with design optimization. As with the earlier method, the current method uses a sequence of response surfaces to traverse the design space in search of the optimal solution. The new method yields significant reductions in computational costs by using composite response surfaces with better generalization capabilities and by exploiting synergies between the optimization method and the simulation codes used to generate the training data. These reductions in design optimization costs are demonstrated for a turbine airfoil design study where a generic shape is evolved into an optimal airfoil.

  18. Substructural controller synthesis

    NASA Technical Reports Server (NTRS)

    Su, Tzu-Jeng; Craig, Roy R., Jr.

    1989-01-01

    A decentralized design procedure which combines substructural synthesis, model reduction, decentralized controller design, subcontroller synthesis, and controller reduction is proposed for the control design of flexible structures. The structure to be controlled is decomposed into several substructures, which are modeled by component mode synthesis methods. For each substructure, a subcontroller is designed by using the linear quadratic optimal control theory. Then, a controller synthesis scheme called Substructural Controller Synthesis (SCS) is used to assemble the subcontrollers into a system controller, which is to be used to control the whole structure.

  19. Decreases in collision risk and derailments attributed to changing at-risk behavior process at Union Pacific.

    DOT National Transportation Integrated Search

    2009-09-01

    Changing At-Risk Behavior (CAB) is a safety process that is being conducted at Union Pacific's San Antonio Service Unit (SASU) with the aim of improving road and yard safety. CAB is an example of a proactive safety risk-reduction method, called Cle...

  20. Admixture Aberration Analysis: Application to Mapping in Admixed Population Using Pooled DNA

    NASA Astrophysics Data System (ADS)

    Bercovici, Sivan; Geiger, Dan

    Admixture mapping is a gene mapping approach used for the identification of genomic regions harboring disease susceptibility genes in the case of recently admixed populations such as African Americans. We present a novel method for admixture mapping, called admixture aberration analysis (AAA), that uses a DNA pool of affected admixed individuals. We demonstrate through simulations that AAA is a powerful and economical mapping method under a range of scenarios, capturing complex human diseases such as hypertension and end stage kidney disease. The method has a low false-positive rate and is robust to deviation from model assumptions. Finally, we apply AAA on 600 prostate cancer-affected African Americans, replicating a known risk locus. Simulation results indicate that the method can yield over 96% reduction in genotyping. Our method is implemented as a Java program called AAAmap and is freely available.

  1. Interval type-2 fuzzy PID controller for uncertain nonlinear inverted pendulum system.

    PubMed

    El-Bardini, Mohammad; El-Nagar, Ahmad M

    2014-05-01

    In this paper, an interval type-2 fuzzy proportional-integral-derivative (IT2F-PID) controller is proposed for controlling an inverted pendulum on a cart system with an uncertain model. The controller is designed using a newly proposed type-reduction method, called the simplified type-reduction method. The proposed IT2F-PID controller is able to handle the effect of structural uncertainties due to the structure of the interval type-2 fuzzy logic system (IT2-FLS). The results of the proposed IT2F-PID controller using the new type-reduction method are compared with those of an IT2F-PID controller using the uncertainty-bound method and of a type-1 fuzzy PID controller (T1F-PID). The simulation and practical results show that the performance of the proposed controller is significantly improved compared with the T1F-PID controller. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  2. Spectral Regression Discriminant Analysis for Hyperspectral Image Classification

    NASA Astrophysics Data System (ADS)

    Pan, Y.; Wu, J.; Huang, H.; Liu, J.

    2012-08-01

    Dimensionality reduction algorithms, which aim to select a small set of efficient and discriminant features, have attracted great attention for hyperspectral image classification. Manifold learning methods, such as Locally Linear Embedding, Isomap, and Laplacian Eigenmap, are popular for dimensionality reduction. However, a disadvantage of many manifold learning methods is that their computations usually involve eigen-decomposition of dense matrices, which is expensive in both time and memory. In this paper, we introduce a new dimensionality reduction method, called Spectral Regression Discriminant Analysis (SRDA). SRDA casts the problem of learning an embedding function into a regression framework, which avoids eigen-decomposition of dense matrices. Also, with the regression-based framework, different kinds of regularizers can be naturally incorporated into our algorithm, which makes it more flexible. It can make efficient use of data points to discover the intrinsic discriminant structure in the data. Experimental results on the Washington DC Mall and AVIRIS Indian Pines hyperspectral data sets demonstrate the effectiveness of the proposed method.
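    The regression step can be sketched as follows. This is a simplified supervised special case with hypothetical names, not the paper's exact derivation (which obtains its response vectors from a spectral analysis of the label graph): response vectors come from the class structure, and projection vectors are found by regularized least squares instead of a dense eigenproblem.

```python
import numpy as np

def srda_fit(X, y, alpha=0.01):
    """Sketch of the spectral-regression idea:
    1) build response vectors from the label structure (here, centered
       class indicator vectors as a simplification);
    2) solve a ridge regression X w ~= r for each response vector,
       avoiding eigen-decomposition of dense matrices."""
    n, d = X.shape
    classes = np.unique(y)
    Y = np.stack([(y == c).astype(float) for c in classes], axis=1)
    Y -= Y.mean(axis=0)                       # centered responses
    A = X.T @ X + alpha * np.eye(d)           # regularized normal matrix
    W = np.linalg.solve(A, X.T @ Y)           # one solve, all responses
    return W                                   # (d, n_classes) projections

# embed new samples with X_new @ W
```

    The ridge parameter `alpha` is the regularizer the abstract alludes to; other penalties could be swapped in at the same point.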

  3. Improved Hybrid Modeling of Spent Fuel Storage Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bibber, Karl van

    This work developed a new computational method for improving the ability to calculate the neutron flux in deep-penetration radiation shielding problems that contain areas with strong streaming. The “gold standard” method for radiation transport is Monte Carlo (MC), as it samples the physics exactly and requires few approximations. Historically, however, MC was not useful for shielding problems because of the computational challenge of following particles through dense shields. Instead, deterministic methods, which are superior in terms of computational effort for these problem types but are not as accurate, were used. Hybrid methods, which use deterministic solutions to improve MC calculations through a process called variance reduction, can make it tractable from a computational time and resource use perspective to use MC for deep-penetration shielding. Perhaps the most widespread and accessible of these methods are the Consistent Adjoint Driven Importance Sampling (CADIS) and Forward-Weighted CADIS (FW-CADIS) methods. For problems containing strong anisotropies, such as power plants with pipes through walls, spent fuel cask arrays, active interrogation, and locations with small air gaps or plates embedded in water or concrete, hybrid methods are still insufficiently accurate. In this work, a new method for generating variance reduction parameters for strongly anisotropic, deep-penetration radiation shielding studies was developed. This method generates an alternate form of the adjoint scalar flux quantity, Φ_Ω, which is used by both CADIS and FW-CADIS to generate variance reduction parameters for local and global response functions, respectively. The new method, called CADIS-Ω, was implemented in the Denovo/ADVANTG software. Results indicate that the flux generated by CADIS-Ω incorporates localized angular anisotropies in the flux more effectively than standard methods, and CADIS-Ω outperformed CADIS in several test problems. This initial work indicates that CADIS-Ω may be highly useful for shielding problems with strong angular anisotropies. This benefits the public by increasing accuracy at lower computational effort for many problems of energy, security, and economic importance.

  4. 38 CFR 9.3 - Waiver or reduction of coverage.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...-time coverage is called or ordered to active duty or active duty for training under a call or order... paid. Termination or reduction of coverage is effective for the entire remaining period of active duty... termination of duty, a member reenters duty (in the same or another uniformed service), a waiver or reduction...

  5. Some variance reduction methods for numerical stochastic homogenization

    PubMed Central

    Blanc, X.; Le Bris, C.; Legoll, F.

    2016-01-01

    We give an overview of a series of recent studies devoted to variance reduction techniques for numerical stochastic homogenization. Numerical homogenization requires that a set of problems be solved at the microscale, the so-called corrector problems. In a random environment, these problems are stochastic and therefore need to be repeatedly solved, for several configurations of the medium considered. An empirical average over all configurations is then performed using the Monte Carlo approach, so as to approximate the effective coefficients necessary to determine the macroscopic behaviour. Variance severely affects the accuracy and the cost of such computations. Variance reduction approaches, borrowed from other contexts in the engineering sciences, can be useful. Some of these variance reduction techniques are presented, studied and tested here. PMID:27002065
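    As a concrete instance of the family of techniques reviewed, antithetic variates pair each sample with its "mirror" so that their errors partially cancel. The toy integrand below (estimating E[e^U] for U uniform on [0,1]) is illustrative only, not a homogenization corrector problem:

```python
import numpy as np

def plain_mc(rng, n):
    """Crude Monte Carlo estimate of E[e^U], U ~ Uniform(0, 1)."""
    u = rng.random(n)
    return np.exp(u).mean()

def antithetic_mc(rng, n):
    """Antithetic variates: pair each u with 1 - u. For a monotone
    integrand the two halves are negatively correlated, so the paired
    average has lower variance at the same total sampling cost."""
    u = rng.random(n // 2)
    return 0.5 * (np.exp(u) + np.exp(1.0 - u)).mean()

rng = np.random.default_rng(42)
reps, n = 2000, 1000
plain = np.array([plain_mc(rng, n) for _ in range(reps)])
anti = np.array([antithetic_mc(rng, n) for _ in range(reps)])
# exact value is e - 1; the antithetic estimator scatters less around it
```

    The same pairing idea applies to corrector problems by solving them on a realization of the random medium and on a suitably "mirrored" realization, then averaging.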

  6. Improved safety culture and labor-management relations attributed to changing at-risk behavior process at Union Pacific.

    DOT National Transportation Integrated Search

    2009-09-01

    Changing At-Risk Behavior (CAB) is a safety process that is being conducted at Union Pacific's San Antonio Service Unit (SASU) with the aim of improving road and yard safety. CAB is an example of a proactive safety risk-reduction method called Clea...

  7. 77 FR 5747 - Security Zones, Seattle's Seafair Fleet Week Moving Vessels, Puget Sound, WA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-06

    ... establishment of security zones. We seek any comments or information that may lead to the discovery of a... This proposed rule would call for no new collection of information under the Paperwork Reduction Act of..., design, or operation; test methods; sampling procedures; and related management systems practices) that...

  8. Evaluation of wetland implementation strategies on phosphorus reduction at a watershed scale

    NASA Astrophysics Data System (ADS)

    Abouali, Mohammad; Nejadhashemi, A. Pouyan; Daneshvar, Fariborz; Adhikari, Umesh; Herman, Matthew R.; Calappi, Timothy J.; Rohn, Bridget G.

    2017-09-01

    Excessive nutrient use in agricultural practices is a major cause of water quality degradation around the world, resulting in eutrophication of freshwater systems. Among the nutrients, phosphorus enrichment has recently drawn considerable attention due to major environmental issues such as the eutrophication of Lake Erie and Chesapeake Bay. One approach for mitigating the impacts of excessive nutrients on water resources is the implementation of wetlands. However, proper site selection is key to effective wetland implementation at the watershed scale, and evaluating site-selection strategies is the goal of this study. In this regard, three conventional and two pseudo-random targeting methods were considered. A watershed model called the Soil and Water Assessment Tool (SWAT) was coupled with another model called the System for Urban Stormwater Treatment and Analysis IntegratioN (SUSTAIN) to simulate the impacts of wetland implementation scenarios in the Saginaw River watershed, located in Michigan. The inter-group similarities of the targeting strategies were investigated, and it was shown that the level of similarity increases as the target area increases (0.54-0.86). In general, the conventional targeting method based on phosphorus load generated per unit area at the subwatershed scale had the highest average reduction among all the scenarios (44.46 t/year). However, when considering the total area of implemented wetlands, the conventional method based on long-term impacts of wetland implementation showed the highest amount of phosphorus reduction (36.44 t/year).

  9. Designing a more efficient, effective and safe Medical Emergency Team (MET) service using data analysis

    PubMed Central

    Bilgrami, Irma; Bain, Christopher; Webb, Geoffrey I.; Orosz, Judit; Pilcher, David

    2017-01-01

    Introduction Hospitals have seen a rise in Medical Emergency Team (MET) reviews. We hypothesised that the commonest MET calls result in similar treatments. Our aim was to design a pre-emptive management algorithm that allowed direct institution of treatment to patients without having to wait for attendance of the MET team, and to model its potential impact on MET call incidence and patient outcomes. Methods Data were extracted for all MET calls from the hospital database. Association rule data mining techniques were used to identify the most common combinations of MET call causes, outcomes and therapies. Results There were 13,656 MET calls during the 34-month study period in 7936 patients. The most common MET call was for hypotension [31% (2459/7936)]. These MET calls were strongly associated with the immediate administration of intravenous fluid (70% [1714/2459] v 13% [739/5477], p<0.001), unless the patient was located on a respiratory ward (adjusted OR 0.41 [95%CI 0.25–0.67], p<0.001), had a cardiac cause for admission (adjusted OR 0.61 [95%CI 0.50–0.75], p<0.001) or was under the care of the heart failure team (adjusted OR 0.29 [95%CI 0.19–0.42], p<0.001). Modelling the effect of a pre-emptive management algorithm for immediate fluid administration without MET activation, on data from a test period of 24 months following the study period, suggested it would lead to a 68.7% (2541/3697) reduction in MET calls for hypotension and a 19.6% (2541/12938) reduction in total METs, without adverse effects on patients. Conclusion Routinely collected data and analytic techniques can be used to develop a pre-emptive management algorithm to administer intravenous fluid therapy to a specific group of hypotensive patients without the need to initiate a MET call. This could lead both to earlier treatment for the patient and to fewer total MET calls. PMID:29281665

  10. Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) for a 3-D Flexible Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J.-W.

    2001-01-01

    The formulation and implementation of an optimization method called Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) are extended from single discipline analysis (aerodynamics only) to multidisciplinary analysis - in this case, static aero-structural analysis - and applied to a simple 3-D wing problem. The method aims to reduce the computational expense incurred in performing shape optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, Finite Element Method (FEM) structural analysis and sensitivity analysis tools. Results for this small problem show that the method reaches the same local optimum as conventional optimization. However, unlike its application to the single-discipline (aerodynamics-only) analysis, the method, as implemented here, may not show a significant reduction in the computational cost. Similar reductions were seen in the two-design-variable (DV) problem results but not in the 8-DV results given here.

  11. Linear reduction method for predictive and informative tag SNP selection.

    PubMed

    He, Jingwu; Westbrooks, Kelly; Zelikovsky, Alexander

    2005-01-01

    Constructing a complete human haplotype map is helpful when associating complex diseases with their related SNPs. Unfortunately, the number of SNPs is very large and it is costly to sequence many individuals. Therefore, it is desirable to reduce the number of SNPs that should be sequenced to a small number of informative representatives called tag SNPs. In this paper, we propose a new linear algebra-based method for selecting and using tag SNPs. We measure the quality of our tag SNP selection algorithm by comparing actual SNPs with SNPs predicted from selected linearly independent tag SNPs. Our experiments show that, for sufficiently long haplotypes, knowing only 0.4% of all SNPs, the proposed linear reduction method predicts an unknown haplotype with an error rate below 2%, based on 10% of the population.
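
The linear-algebra idea can be sketched as Gaussian elimination over GF(2): a SNP column is kept as a tag only if it is not an XOR combination of columns already selected. This is a minimal sketch with invented toy haplotypes, not the paper's data or its exact algorithm:

```python
# Rows: individuals; columns: SNP sites (0/1 alleles). In this toy data,
# column 2 = column 0 XOR column 1, and column 3 = column 0.
haplotypes = [
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]

def select_tag_snps(haps):
    """Return indices of a maximal linearly independent set of SNP columns.

    Each column is packed into an integer bitmask; a column is a tag SNP
    iff Gaussian elimination over GF(2) cannot reduce it to zero using
    the previously selected columns.
    """
    basis = {}   # leading-bit position -> reduced column mask
    tags = []
    for j in range(len(haps[0])):
        col = 0
        for i, row in enumerate(haps):
            col |= row[j] << i
        while col:
            lead = col.bit_length() - 1
            if lead not in basis:
                basis[lead] = col
                tags.append(j)
                break
            col ^= basis[lead]
    return tags

tags = select_tag_snps(haplotypes)   # columns 2 and 3 are XOR combinations
```

Non-tag columns can then be predicted as the corresponding XOR combinations of the sequenced tag columns, which is the prediction step the abstract evaluates.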

  12. 78 FR 68735 - Reduction or Suspension of Safe Harbor Contributions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-15

    ... forth in section 401(k)(3), called the actual deferral percentage (ADP) test, or one of the design-based... design-based safe harbor method under which a CODA is treated as satisfying the ADP test if the... the design-based alternatives in section 401(m)(10), 401(m)(11), or 401(m)(12). The ACP test in...

  13. Large-scale inverse model analyses employing fast randomized data reduction

    NASA Astrophysics Data System (ADS)

    Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan

    2017-08-01

    When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
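
The "sketching" idea can be illustrated with a dense Gaussian sketch, which preserves vector norms in expectation; this is a generic random-projection sketch under illustrative sizes, not the specific sketching operator used in RGA:

```python
import math
import random

random.seed(0)

n_obs = 2000      # stand-in for a "large" observation vector
k = 200           # sketch dimension, k << n_obs

# Gaussian sketching matrix S (k x n) with entries N(0, 1)/sqrt(k), so
# E[||S d||^2] = ||d||^2: norms (and hence misfits) are preserved
# in expectation while the data dimension drops from n_obs to k.
S = [[random.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(n_obs)]
     for _ in range(k)]

d = [math.sin(0.01 * i) for i in range(n_obs)]   # stand-in data vector

sketched = [sum(S[r][i] * d[i] for i in range(n_obs)) for r in range(k)]

norm_full = math.sqrt(sum(x * x for x in d))
norm_sketch = math.sqrt(sum(x * x for x in sketched))
ratio = norm_sketch / norm_full   # concentrates near 1.0
```

An inverse solver can then calibrate against the k sketched observations instead of all n, which is how the costs come to scale with information content rather than data size.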

  14. Some variance reduction methods for numerical stochastic homogenization.

    PubMed

    Blanc, X; Le Bris, C; Legoll, F

    2016-04-28

    We give an overview of a series of recent studies devoted to variance reduction techniques for numerical stochastic homogenization. Numerical homogenization requires that a set of problems is solved at the microscale, the so-called corrector problems. In a random environment, these problems are stochastic and therefore need to be repeatedly solved, for several configurations of the medium considered. An empirical average over all configurations is then performed using the Monte Carlo approach, so as to approximate the effective coefficients necessary to determine the macroscopic behaviour. Variance severely affects the accuracy and the cost of such computations. Variance reduction approaches, borrowed from other contexts in the engineering sciences, can be useful. Some of these variance reduction techniques are presented, studied and tested here. © 2016 The Author(s).
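
One classical variance reduction technique, antithetic variates, can be demonstrated on a scalar Monte Carlo integral; this is a generic illustration of the principle, not one of the homogenization-specific estimators studied in the paper:

```python
import math
import random
import statistics

random.seed(1)

def plain_estimate(n):
    """Crude Monte Carlo estimate of E[exp(U)], U ~ Uniform(0,1)."""
    return sum(math.exp(random.random()) for _ in range(n)) / n

def antithetic_estimate(n):
    """Antithetic variates: pair each draw u with 1-u. For a monotone
    integrand the pair is negatively correlated, cutting the variance."""
    total = 0.0
    for _ in range(n // 2):
        u = random.random()
        total += 0.5 * (math.exp(u) + math.exp(1.0 - u))
    return total / (n // 2)

# Exact value is e - 1. Compare estimator variances at equal sample budget.
plain = [plain_estimate(1000) for _ in range(200)]
anti = [antithetic_estimate(1000) for _ in range(200)]
var_plain = statistics.pvariance(plain)
var_anti = statistics.pvariance(anti)   # much smaller than var_plain
```

In stochastic homogenization the same principle is applied to the corrector-problem outputs across medium configurations, with the antithetic pairing constructed on the random coefficients.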

  15. Aircraft interior noise reduction by alternate resonance tuning

    NASA Technical Reports Server (NTRS)

    Bliss, Donald B.; Gottwald, James A.; Srinivasan, Ramakrishna; Gustaveson, Mark B.

    1990-01-01

    Existing interior noise reduction techniques for aircraft fuselages perform reasonably well at higher frequencies, but are inadequate at lower frequencies, particularly with respect to the low blade passage harmonics with high forcing levels found in propeller aircraft. A method is being studied which considers an aircraft fuselage lined with panels alternately tuned to frequencies above and below the frequency that must be attenuated. Adjacent panels would oscillate at equal amplitude, to give equal source strength, but with opposite phase. Provided these adjacent panels are acoustically compact, the resulting cancellation causes the interior acoustic modes to become cutoff, and therefore be non-propagating and evanescent. This interior noise reduction method, called Alternate Resonance Tuning (ART), is currently being investigated both theoretically and experimentally. This new concept has potential application to reducing interior noise due to the propellers in advanced turboprop aircraft as well as for existing aircraft configurations.
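
The sign flip that ART exploits can be seen in the steady-state response of an undamped oscillator: a panel tuned above the forcing frequency responds in phase, one tuned below responds with opposite phase, and a detuned pair of equal response magnitude cancels. A minimal sketch with illustrative numbers (not the paper's acoustic model):

```python
import math

# Undamped single-degree-of-freedom panel (mass m, resonance w0) driven at
# frequency w: steady-state displacement X = F / (m * (w0**2 - w**2)).
def panel_response(f_force, m, w0, w):
    return f_force / (m * (w0 ** 2 - w ** 2))

w = 100.0                                   # forcing frequency (rad/s)
w_hi = 120.0                                # panel tuned above w
w_lo = math.sqrt(2 * w ** 2 - w_hi ** 2)    # tuned below, equal |response|

x_hi = panel_response(1.0, 1.0, w_hi, w)    # positive: in phase
x_lo = panel_response(1.0, 1.0, w_lo, w)    # negative: opposite phase

net = x_hi + x_lo                           # paired panels nearly cancel
```

With equal amplitudes and opposite phase, the pair's net volume velocity vanishes, which is what drives the interior modes evanescent when the panels are acoustically compact.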

  16. New Similarity Reductions and Compacton Solutions for Boussinesq-Like Equations with Fully Nonlinear Dispersion

    NASA Astrophysics Data System (ADS)

    Yan, Zhen-Ya

    2001-10-01

    In this paper, similarity reductions of Boussinesq-like equations with nonlinear dispersion (simply called B(m,n) equations) u_tt=(u^n)_xx+(u^m)_xxxx, which generalize the Boussinesq equation u_tt=(u^2)_xx+u_xxxx and the modified Boussinesq equation u_tt=(u^3)_xx+u_xxxx, are considered by using the direct reduction method. As a result, several new types of similarity reductions are found. Based on the reduction equations and some simple transformations, we obtain the solitary wave solutions and compacton solutions (which are solitary waves with the property that after colliding with other compacton solutions, they re-emerge with the same coherent shape) of B(1,n) equations and B(m,m) equations, respectively. The project was supported by the National Key Basic Research Development Project Program of China under Grant No. G1998030600 and the Doctoral Foundation of China under Grant No. 98014119.

  17. Thermodynamic Analysis of Chemically Reacting Mixtures-Comparison of First and Second Order Models.

    PubMed

    Pekař, Miloslav

    2018-01-01

    Recently, a method based on non-equilibrium continuum thermodynamics which derives thermodynamically consistent reaction rate models together with thermodynamic constraints on their parameters was analyzed using a triangular reaction scheme. The scheme was kinetically of the first order. Here, the analysis is further developed for several first and second order schemes to gain a deeper insight into the thermodynamic consistency of rate equations and relationships between chemical thermodynamics and kinetics. It is shown that the thermodynamic constraints on the so-called proper rate coefficient are usually simple sign restrictions consistent with the supposed reaction directions. Constraints on the so-called coupling rate coefficients are more complex and weaker. This means more freedom in kinetic coupling between reaction steps in a scheme, i.e., in the kinetic effects of other reactions on the rate of some reaction in a reacting system. When compared with traditional mass-action rate equations, the method allows a reduction in the number of traditional rate constants to be evaluated from data, i.e., a reduction in the dimensionality of the parameter estimation problem. This is due to identifying relationships between mass-action rate constants (relationships which also include thermodynamic equilibrium constants) which have so far been unknown.

  18. Nonlinear dimensionality reduction of data lying on the multicluster manifold.

    PubMed

    Meng, Deyu; Leung, Yee; Fung, Tung; Xu, Zongben

    2008-08-01

    A new method, called the decomposition-composition (D-C) method, is proposed for the nonlinear dimensionality reduction (NLDR) of data lying on the multicluster manifold. The main idea is first to decompose a given data set into clusters and independently calculate the low-dimensional embeddings of each cluster by the decomposition procedure. Based on the intercluster connections, the embeddings of all clusters are then composed into their proper positions and orientations by the composition procedure. Different from other NLDR methods for multicluster data, which consider the intracluster and intercluster information jointly, the D-C method capitalizes on the separate employment of the intracluster neighborhood structures and the intercluster topologies for effective dimensionality reduction. This, on one hand, isometrically preserves the rigid-body shapes of the clusters in the embedding process and, on the other hand, guarantees the proper locations and orientations of all clusters. The theoretical arguments are supported by a series of experiments performed on synthetic and real-life data sets. In addition, the computational complexity of the proposed method is analyzed, and its efficiency is demonstrated both theoretically and experimentally. Related strategies for automatic parameter selection are also examined.

  19. Robust Airfoil Optimization to Achieve Consistent Drag Reduction Over a Mach Range

    NASA Technical Reports Server (NTRS)

    Li, Wu; Huyse, Luc; Padula, Sharon; Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    We prove mathematically that in order to avoid point-optimization at the sampled design points for multipoint airfoil optimization, the number of design points must be greater than the number of free-design variables. To overcome point-optimization at the sampled design points, a robust airfoil optimization method (called the profile optimization method) is developed and analyzed. This optimization method aims at a consistent drag reduction over a given Mach range and has three advantages: (a) it prevents severe degradation in the off-design performance by using a smart descent direction in each optimization iteration, (b) there is no random airfoil shape distortion for any iterate it generates, and (c) it allows a designer to make a trade-off between a truly optimized airfoil and the amount of computing time consumed. For illustration purposes, we use the profile optimization method to solve a lift-constrained drag minimization problem for a 2-D airfoil in Euler flow with 20 free-design variables. A comparison with other airfoil optimization methods is also included.

  20. Potential of a nitrogen tetroxide spill or emission during movement from supplier to user

    NASA Technical Reports Server (NTRS)

    Watje, W. F.

    1978-01-01

    The type of equipment used to transport nitrogen tetroxide is described along with various methods of shipping utilized. Emphasis is placed on reduction of accident rate. Potential areas for leaks, emissions, or spills discussed include: accidental spills, transfer operations, sampling, and the so-called 'empty' trailer. Corrective actions taken to reduce the occurrence of these problems are briefly discussed.

  1. Global quasi-linearization (GQL) versus QSSA for a hydrogen-air auto-ignition problem.

    PubMed

    Yu, Chunkan; Bykov, Viatcheslav; Maas, Ulrich

    2018-04-25

    A recently developed automatic reduction method for systems of chemical kinetics, the so-called Global Quasi-Linearization (GQL) method, has been implemented to study and reduce the dimensions of a homogeneous combustion system. The results of application of the GQL and the Quasi-Steady State Assumption (QSSA) are compared. A number of drawbacks of the QSSA are discussed, e.g., the selection criteria for QSS species and their sensitivity to system parameters, initial conditions, etc. To overcome these drawbacks, the GQL approach has been developed as a robust, automatic and scaling-invariant method for a global analysis of the system timescale hierarchy and subsequent model reduction. In this work the auto-ignition problem of the hydrogen-air system is considered over a wide range of system parameters and initial conditions. The potential of the suggested approach to overcome most of the drawbacks of the standard approaches is illustrated.
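
The QSSA side of the comparison can be checked numerically on the textbook scheme A -> B -> C with a fast intermediate, where the quasi-steady-state value B ≈ k1*A/k2 should track the integrated solution after a short transient. This toy scheme and its rate constants are invented for illustration, not the hydrogen-air mechanism:

```python
# Explicit-Euler integration of A -k1-> B -k2-> C with k2 >> k1,
# compared against the quasi-steady-state approximation for B.
k1, k2 = 1.0, 100.0
a, b, c = 1.0, 0.0, 0.0
dt, t_end = 1e-4, 1.0
for _ in range(int(t_end / dt)):
    da = -k1 * a
    db = k1 * a - k2 * b
    dc = k2 * b
    a += dt * da
    b += dt * db
    c += dt * dc

b_qssa = k1 * a / k2                  # algebraic QSSA value at t = t_end
rel_err = abs(b - b_qssa) / b_qssa    # small once the fast mode has decayed
```

The QSSA replaces the stiff ODE for B with this algebra exactly because the fast timescale 1/k2 has relaxed; the GQL method automates the analogous timescale separation without hand-picking QSS species.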

  2. Protocol: a multi-level intervention program to reduce stress in 9-1-1 telecommunicators.

    PubMed

    Meischke, Hendrika; Lilly, Michelle; Beaton, Randal; Calhoun, Rebecca; Tu, Ann; Stangenes, Scott; Painter, Ian; Revere, Debra; Baseman, Janet

    2018-05-02

    Nationwide, emergency response systems depend on 9-1-1 telecommunicators to prioritize, triage, and dispatch assistance to those in distress. 9-1-1 call center telecommunicators (TCs) are challenged by acute and chronic workplace stressors: tense interactions with citizen callers in crisis; overtime; shift-work; ever-changing technologies; and negative work culture, including co-worker conflict. This workforce is also subject to routine exposures to secondary traumatization while handling calls involving emergency situations and while making time-urgent, high-stakes decisions over the phone. Our study aims to test the effectiveness of a multi-part intervention to reduce stress in 9-1-1 TCs through an online mindfulness training and a toolkit containing workplace stressor reduction resources. The study employs a randomized controlled trial design with three data collection points. The multi-part intervention includes an individual-level online mindfulness training and a call center-level organizational stress reduction toolkit. 160 TCs will be recruited from 9-1-1 call centers, complete a baseline survey at enrollment, and are randomly assigned to an intervention or a control group. Intervention group participants will start a 7-week online mindfulness training developed in-house and tailored to 9-1-1 TCs and their call center environment; control participants will be "waitlisted" and start the training after the study period ends. Following the intervention group's completion of the mindfulness training, all participants complete a second survey. Next, the online toolkit with call-center wide stress reduction resources is made available to managers of all participating call centers. After 3 months, a third survey will be completed by all participants. The primary outcome is 9-1-1 TCs' self-reported symptoms of stress at three time points as measured by the C-SOSI (Calgary Symptoms of Stress Inventory).
Secondary outcomes will include: perceptions of social work environment (measured by metrics of social support and network conflict); mindfulness; and perceptions of social work environment and mindfulness as mediators of stress reduction. This study will evaluate the effectiveness of an online mindfulness training and call center-wide stress reduction toolkit in reducing self-reported stress in 9-1-1 TCs. The results of this study will add to the growing body of research on worksite stress reduction programs. ClinicalTrials.gov Registration Number: NCT02961621 Registered on November 7, 2016 (retrospectively registered).

  3. Direct Telephonic Communication in a Heart Failure Transitional Care Program: An observational study

    PubMed Central

    Ota, Ken S.; Beutler, David S.; Sheikh, Hassam; Weiss, Jessica L.; Parkinson, Dallin; Nguyen, Peter; Gerkin, Richard D.; Loli, Akil I.

    2013-01-01

    Background This study investigated the trend of phone calls in the Banner Good Samaritan Medical Center (BGSMC) Heart Failure Transitional Care Program (HFTCP). The primary goal of the HFTCP is to reduce 30-day readmissions for heart failure patients by using a multi-pronged approach. Methods This study included 104 patients in the HFTCP discharged over a 51-week period who had around-the-clock telephone access to the Transitionalist. Cellular phone records were reviewed. This study evaluated the length and timing of calls. Results A total of 4398 telephone calls were recorded, of which 39% were inbound and 61% were outbound. This averaged 86 calls per week. Eighty-five percent of the total calls were made during the “Weekday Daytime” period. There were 229 calls during the “Weekday Nights” period, with 1.5 inbound calls per week. The “Total Weekend” calls were 10.2% of the total calls, which equated to a weekly average of 8.8. Conclusions Our experience is that direct, physician-patient telephone contact is feasible with a panel of around 100 HF patients for one provider. If the proper financial reimbursements are provided, physicians may be apt to participate in similar transitional care programs. Likewise, third-party payers will benefit from the reduction in unnecessary emergency room visits and hospitalizations. PMID:28352437

  4. Aircraft interior noise reduction by alternate resonance tuning

    NASA Technical Reports Server (NTRS)

    Bliss, Donald B.; Gottwald, James A.; Gustaveson, Mark B.; Burton, James R., III; Castellino, Craig

    1989-01-01

    Existing interior noise reduction techniques for aircraft fuselages perform reasonably well at higher frequencies, but are inadequate at lower frequencies, particularly with respect to the low blade passage harmonics with high forcing levels found in propeller aircraft. A method is being studied which considers aircraft fuselages lined with panels alternately tuned to frequencies above and below the frequency to be attenuated. Adjacent panels would oscillate at equal amplitude, to give equal source strength, but with opposite phase. Provided these adjacent panels are acoustically compact, the resulting cancellation causes the interior acoustic modes to become cut off and therefore be non-propagating and evanescent. This interior noise reduction method, called Alternate Resonance Tuning (ART), is currently being investigated both theoretically and experimentally. This new concept has potential application to reducing interior noise due to the propellers in advanced turboprop aircraft as well as for existing aircraft configurations. This report summarizes the work carried out at Duke University during the third semester of a contract supported by the Structural Acoustics Branch at NASA Langley Research Center.

  5. Model reduction by trimming for a class of semi-Markov reliability models and the corresponding error bound

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Palumbo, Daniel L.

    1991-01-01

    Semi-Markov processes have proved to be an effective and convenient tool to construct models of systems that achieve reliability by redundancy and reconfiguration. These models are able to depict complex system architectures and to capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a modeling and a computational problem. Techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs were written to help the reliability analyst produce models of complex systems. This method, trimming, is easy to implement, and its error bound is easy to compute. Hence, the method lends itself to inclusion in an automatic model generator.

  6. Brushing Your Spacecraft's Teeth: A Review of Biological Reduction Processes for Planetary Protection Missions

    NASA Technical Reports Server (NTRS)

    Pugel, D. E. (Betsy); Rummel, J. D.; Conley, Catharine

    2017-01-01

    Much like keeping your teeth clean, where you brush away biofilms that your dentist calls "plaque," there are various methods to clean spaceflight hardware of biological contamination, known as biological reduction processes. Different approaches clean your hardware's "teeth" in different ways and with different levels of effectiveness. We know that brushing at home with a simple toothbrush is convenient and has a different level of impact vs. getting your teeth cleaned at the dentist. In the same way, there are some approaches to biological reduction that may require simple tools or more complex implementation approaches (think about sonicating or just soaking your dentures, vs. brushing them). There are also some that are more effective for different degrees of cleanliness and still some that have materials compatibility concerns. In this article, we review known and NASA-certified approaches for biological reduction, pointing out materials compatibility concerns and areas where additional research is needed.

  7. Brushing Your Spacecraft's Teeth: A Review of Biological Reduction Processes for Planetary Protection Missions

    NASA Technical Reports Server (NTRS)

    Pugel, D.E. (Betsy); Rummel, J. D.; Conley, C. A.

    2017-01-01

    Much like keeping your teeth clean, where you brush away biofilms that your dentist calls "plaque," there are various methods to clean spaceflight hardware of biological contamination, known as biological reduction processes. Different approaches clean your hardware's "teeth" in different ways and with different levels of effectiveness. We know that brushing at home with a simple toothbrush is convenient and has a different level of impact vs. getting your teeth cleaned at the dentist. In the same way, there are some approaches to biological reduction that may require simple tools or more complex implementation approaches (think about sonicating or just soaking your dentures, vs. brushing them). There are also some that are more effective for different degrees of cleanliness and still some that have materials compatibility concerns. In this article, we review known and NASA-certified approaches for biological reduction, pointing out materials compatibility concerns and areas where additional research is needed.

  8. An Eigensystem Realization Algorithm (ERA) for modal parameter identification and model reduction

    NASA Technical Reports Server (NTRS)

    Juang, J. N.; Pappa, R. S.

    1985-01-01

    A method, called the Eigensystem Realization Algorithm (ERA), is developed for modal parameter identification and model reduction of dynamic systems from test data. A new approach is introduced in conjunction with the singular value decomposition technique to derive the basic formulation of minimum order realization which is an extended version of the Ho-Kalman algorithm. The basic formulation is then transformed into modal space for modal parameter identification. Two accuracy indicators are developed to quantitatively identify the system modes and noise modes. For illustration of the algorithm, examples are shown using simulation data and experimental data for a rectangular grid structure.
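
The Ho-Kalman/SVD step at the core of ERA can be sketched for the special case of a single-mode system, where the Hankel matrix of Markov parameters has rank one and its SVD can be written down directly. This is a deliberately reduced illustration of the idea, not the full algorithm (which uses a general SVD, multiple modes, and noise-mode indicators):

```python
import math

# Discrete SISO system x[k+1] = a*x[k] + b*u[k], y[k] = c*x[k]; its
# impulse-response (Markov) parameters are h[k] = c * a**k * b.
a_true, b_true, c_true = 0.9, 1.0, 2.0
r = 4
h = [c_true * a_true ** k * b_true for k in range(2 * r + 1)]

H0 = [[h[i + j] for j in range(r)] for i in range(r)]       # Hankel matrix
H1 = [[h[i + j + 1] for j in range(r)] for i in range(r)]   # shifted Hankel

# H0 = sigma * u v^T is rank one, so its singular vectors are just the
# normalized first column and first row.
col0 = [row[0] for row in H0]
u_norm = math.sqrt(sum(x * x for x in col0))
u = [x / u_norm for x in col0]
row0 = H0[0]
v_norm = math.sqrt(sum(x * x for x in row0))
v = [x / v_norm for x in row0]
sigma = sum(u[i] * H0[i][j] * v[j] for i in range(r) for j in range(r))

# ERA's realization step: the identified eigenvalue is u^T H1 v / sigma.
a_est = sum(u[i] * H1[i][j] * v[j] for i in range(r) for j in range(r)) / sigma
```

Because H1 equals a*H0 for a single mode, the quotient recovers the pole exactly; in the general algorithm the truncated SVD of H0 plays the role of the rank-one factors here.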

  9. Dimensionality reduction of collective motion by principal manifolds

    NASA Astrophysics Data System (ADS)

    Gajamannage, Kelum; Butail, Sachit; Porfiri, Maurizio; Bollt, Erik M.

    2015-01-01

    While the existence of low-dimensional embedding manifolds has been shown in patterns of collective motion, the current battery of nonlinear dimensionality reduction methods is not amenable to the analysis of such manifolds. This is mainly due to the necessary spectral decomposition step, which limits control over the mapping from the original high-dimensional space to the embedding space. Here, we propose an alternative approach that demands a two-dimensional embedding which topologically summarizes the high-dimensional data. In this sense, our approach is closely related to the construction of one-dimensional principal curves that minimize orthogonal error to data points subject to smoothness constraints. Specifically, we construct a two-dimensional principal manifold directly in the high-dimensional space using cubic smoothing splines, and define the embedding coordinates in terms of geodesic distances. Thus, the mapping from the high-dimensional data to the manifold is defined in terms of local coordinates. Through representative examples, we show that compared to existing nonlinear dimensionality reduction methods, the principal manifold retains the original structure even in noisy and sparse datasets. The principal manifold finding algorithm is applied to configurations obtained from a dynamical system of multiple agents simulating a complex maneuver called predator mobbing, and the resulting two-dimensional embedding is compared with that of a well-established nonlinear dimensionality reduction method.

  10. On the deduction of chemical reaction pathways from measurements of time series of concentrations.

    PubMed

    Samoilov, Michael; Arkin, Adam; Ross, John

    2001-03-01

    We discuss the deduction of reaction pathways in complex chemical systems from measurements of time series of chemical concentrations of reacting species. First we review a technique called correlation metric construction (CMC) and show the construction of a reaction pathway from measurements on a part of glycolysis. Then we present two new improved methods for the analysis of time series of concentrations, entropy metric construction (EMC) and the entropy reduction method (ERM), and illustrate EMC with calculations on a model reaction system. (c) 2001 American Institute of Physics.
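
The CMC step of turning time-series correlations into a metric can be sketched as follows; the distance sqrt(2*(1-|r|)) is one common correlation-based metric and the concentration series are invented, so this illustrates the principle rather than the paper's exact construction:

```python
import math

# Toy concentration time series: s2 tracks s1 (lagged and scaled, as if
# downstream on the same pathway); s3 is unrelated.
s1 = [math.sin(0.3 * t) for t in range(100)]
s2 = [2.0 * math.sin(0.3 * (t - 1)) for t in range(100)]
s3 = [math.cos(1.1 * t) for t in range(100)]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def corr_distance(x, y):
    """Correlation-derived distance: strongly correlated species are close."""
    return math.sqrt(2.0 * (1.0 - abs(pearson(x, y))))

d12 = corr_distance(s1, s2)   # small: same pathway
d13 = corr_distance(s1, s3)   # large: unrelated species
```

Embedding the resulting distance matrix (e.g., by multidimensional scaling) places connected species near one another, which is how a pathway diagram is read off.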

  11. Resilient Software Systems

    DTIC Science & Technology

    2015-06-01

    and tools, called model-integrated computing (MIC) [3] relies on the use of domain-specific modeling languages for creating models of the system to be...hence giving reflective capabilities to it. We have followed the MIC method here: we designed a domain-specific modeling language for modeling...are produced one-off and not for the mass market, the scope for price reduction based on the market demands is non-existent. Processes to create

  12. Anisotropic electrical conduction and reduction in dangling-bond density for polycrystalline Si films prepared by catalytic chemical vapor deposition

    NASA Astrophysics Data System (ADS)

    Niikura, Chisato; Masuda, Atsushi; Matsumura, Hideki

    1999-07-01

    Polycrystalline Si (poly-Si) films with a high crystalline fraction and low dangling-bond density were prepared by catalytic chemical vapor deposition (Cat-CVD), often called hot-wire CVD. Directional anisotropy in electrical conduction, probably due to structural anisotropy, was observed for Cat-CVD poly-Si films. A novel method to separately characterize the crystalline and amorphous phases in poly-Si films using this anisotropic electrical conduction was proposed. On the basis of results obtained by the proposed method and electron spin resonance measurements, a reduction in dangling-bond density for Cat-CVD poly-Si films was achieved by choosing deposition conditions that raise the quality of the included amorphous phase. The properties of Cat-CVD poly-Si films are found to be promising for solar-cell applications.

  13. Spectral Data Reduction via Wavelet Decomposition

    NASA Technical Reports Server (NTRS)

    Kaewpijit, S.; LeMoigne, J.; El-Ghazawi, T.; Rood, Richard (Technical Monitor)

    2002-01-01

    The greatest advantage gained from hyperspectral imagery is that narrow spectral features can be used to give more information about materials than was previously possible with broad-band multispectral imagery. For many applications, however, the larger data volumes from such hyperspectral sensors present a challenge for traditional processing techniques. For example, the actual identification of each ground surface pixel by its corresponding reflecting spectral signature is still one of the most difficult challenges in the exploitation of this advanced technology, because of the immense volume of data collected. Therefore, conventional classification methods require a preprocessing step of dimension reduction to conquer the so-called "curse of dimensionality." Spectral data reduction using wavelet decomposition can be useful, as it not only reduces the data volume but also preserves the distinctions between spectral signatures. This characteristic is related to the intrinsic property of wavelet transforms of preserving high- and low-frequency features during signal decomposition, thereby preserving the peaks and valleys found in typical spectra. Compared with the most widespread dimension reduction technique, Principal Component Analysis (PCA), at the same compression rate, we show that wavelet reduction yields better classification accuracy for hyperspectral data processed with a conventional supervised classification such as a maximum likelihood method.
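
The reduce-while-preserving-peaks property can be illustrated with the simplest wavelet, the Haar transform: keeping only the coarse approximation coefficients cuts the stored data 4x per pixel while still locating a narrow spectral feature. The spectrum below is a toy signal, not hyperspectral data, and the Haar filter stands in for whichever wavelet the paper's experiments used:

```python
import math

def haar_step(signal):
    """One Haar analysis step: averages (low-pass) and details (high-pass)."""
    s = math.sqrt(2.0)
    approx = [(signal[2 * i] + signal[2 * i + 1]) / s for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / s for i in range(len(signal) // 2)]
    return approx, detail

# Toy "spectrum": flat baseline plus one narrow peak near band 80.
spectrum = [math.exp(-((i - 80) ** 2) / 20.0) + 0.1 for i in range(128)]

# Two decomposition levels -> 4x fewer approximation coefficients per pixel.
a1, d1 = haar_step(spectrum)
a2, d2 = haar_step(a1)

# The level-2 approximation still locates the peak (index scaled by 4).
peak_full = max(range(len(spectrum)), key=spectrum.__getitem__)
peak_red = max(range(len(a2)), key=a2.__getitem__)
```

A classifier run on the 32 approximation coefficients instead of the 128 original bands sees the same peak-and-valley structure, which is the intuition behind the reported accuracy advantage over PCA at equal compression.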

  14. Biofouling development on plasma treated samples versus layers coated samples

    NASA Astrophysics Data System (ADS)

    Hnatiuc, B.; Exnar, P.; Sabau, A.; Spatenka, P.; Dumitrache, C. L.; Hnatiuc, M.; Ghita, S.

    2016-12-01

    Biofouling is the most important cause of naval corrosion. In order to reduce biofouling development on naval materials such as steel or resin, different new methods have been tested. These methods could help meet the new IMO environmental regulations, and they could replace a few classic operations performed before the painting of small ships. The replacement of these operations means a reduction in maintenance costs. Their action must influence especially the first two steps of biofouling development, called microfouling, which take about 24 hours. This work presents comparative results on biofouling development on two different classic naval materials, steel and resin, for three treated samples immersed in sea water. Non-thermal plasma, produced by GlidArc technology, is applied to the first sample, called GD. The plasma treatment was set to 10 minutes. The last two samples, called AE9 and AE10, are covered by hydrophobic layers prepared from a special organic-inorganic sol synthesized by the sol-gel method. Theoretically, because of the hydrophobic properties, biofouling formation must be delayed for AE9 and AE10. The biofouling development on each treated sample was compared with a witness (non-treated) sample. The microbiological analyses were carried out over 24 hours by epifluorescence microscopy, available for one single layer.

  15. Using an integrated information system to reduce interruptions and the number of non-relevant contacts in the inpatient pharmacy at tertiary hospital.

    PubMed

    Binobaid, Saleh; Almeziny, Mohammed; Fan, Ip-Shing

    2017-07-01

    Patient care is provided by a multidisciplinary team of healthcare professionals whose goal is high-quality, safe patient care. Accordingly, the team must work synergistically and communicate efficiently. In many hospitals, communication between nursing and pharmacy relies mainly on telephone calls. Numerous studies have reported telephone calls as a source of interruption for both pharmacy and nursing operations; as a result, workload increases and the chance of errors rises. This report describes the implementation of an integrated information system that can reduce telephone calls by providing real-time tracking capabilities and sorting prescriptions by urgency, thus significantly improving the traceability of all prescriptions inside the pharmacy. The research design is a quasi-experiment with pre-post testing using the continuous improvement approach; the improvement project was performed using a six-step method. A survey was conducted in Prince Sultan Military Medical City (PSMMC) to measure the volume and types of telephone calls before and after implementation and so evaluate the impact of the new system. Before the system was implemented, all pharmacies received 4466 calls during the two-week measurement period, the majority being follow-up calls. After the integrated system rollout, there was a significant reduction (p < 0.001) in the volume of telephone calls, to 2630; moreover, the nature of the calls shifted toward professional inquiries (p < 0.001). As a result, avoidable interruptions and workload were reduced.

  16. Suppression of Plutella xylostella and Trichoplusia ni in cole crops with attracticide formulations.

    PubMed

    Maxwell, Elly M; Fadamiro, Henry Y; McLaughlin, John R

    2006-08-01

    The three key lepidopteran pests of cole crops, Brassica oleracea L., in North America are the diamondback moth, Plutella xylostella (L.) (Lepidoptera: Plutellidae); the cabbage looper, Trichoplusia ni (Hübner) (Lepidoptera: Noctuidae); and the imported cabbageworm, Pieris rapae (L.) (Lepidoptera: Pieridae). Two species-specific pheromone-based experimental attracticide formulations were evaluated against these pests: LastCall DBM for P. xylostella and LastCall CL for T. ni. No LastCall formulation was available against P. rapae. Laboratory toxicity experiments confirmed the effectiveness of each LastCall formulation in killing conspecific males that made contact. In replicated small plots of cabbage and collards in central Alabama, over four growing seasons (fall 2003, spring 2004, fall 2004, and spring 2005), an attracticide treatment receiving the two LastCall formulations, each applied multiple times at a rate of 1,600 droplets per acre, was compared against a Bacillus thuringiensis subspecies kurstaki (Bt) spray applied at the action threshold and an untreated control. Efficacy was measured by comparing, among the three treatments, male capture in pheromone-baited traps, larval counts in plots, and crop damage rating at harvest. LastCall provided significant reductions in crop damage, comparable to Bt, in three of the four seasons. Efficacy of LastCall depended on lepidopteran population densities, which fluctuated from season to season. In general, reduction in crop damage was achieved with LastCall at low-to-moderate population densities of the three species, such as typically occur in the fall in central Alabama, but not in the spring, when high P. rapae population pressure typically occurs. Significant reductions in pheromone trap captures did not occur in LastCall plots, suggesting that elimination of males by the toxicant (permethrin), rather than interruption of sexual communication, was the main mechanism of effect.

  17. Speckle reduction of OCT images using an adaptive cluster-based filtering

    NASA Astrophysics Data System (ADS)

    Adabi, Saba; Rashedi, Elaheh; Conforto, Silvia; Mehregan, Darius; Xu, Qiuyun; Nasiriavanaki, Mohammadreza

    2017-02-01

    Optical coherence tomography (OCT) has become a favored device in dermatology due to its moderate resolution and penetration depth. OCT images, however, contain a grainy pattern, called speckle, caused by the broadband source used in the OCT configuration. A variety of filtering techniques has been introduced to reduce speckle in OCT images; most are generic and can be applied to OCT images of different tissues. In this paper, we present a method for speckle reduction of OCT skin images. Given the architectural structure of skin layers, a skin image can benefit from being segmented into differentiable clusters by a clustering method and filtered separately within each cluster using filters such as the Wiener filter. The proposed algorithm was tested on an optical solid phantom with predetermined optical properties, as well as on healthy skin images. The results show that the cluster-based filtering method can reduce speckle and increase the signal-to-noise ratio and contrast while preserving edges in the image.
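    The abstract does not give the clustering or filtering details; as a rough, hypothetical sketch of the cluster-then-filter idea, the snippet below groups pixels of one image row by intensity (a crude stand-in for the clustering step) and smooths each cluster separately, so averaging never mixes pixels from different "layers" and the layer boundary survives.

```python
# Sketch of cluster-based speckle filtering on a 1-D row of pixels.
# Clustering here is a simple intensity threshold; the paper's method
# is more sophisticated, and its filter is Wiener rather than a mean.
def cluster_filter(row, n_clusters=2, passes=3):
    lo, hi = min(row), max(row)
    width = (hi - lo) / n_clusters or 1.0
    labels = [min(int((v - lo) / width), n_clusters - 1) for v in row]
    out = list(row)
    for _ in range(passes):
        new = list(out)
        for i in range(1, len(out) - 1):
            # average only with neighbours in the same cluster
            nbrs = [out[j] for j in (i - 1, i, i + 1) if labels[j] == labels[i]]
            new[i] = sum(nbrs) / len(nbrs)
        out = new
    return out

# Two "layers" with speckle-like jitter; the edge at index 4 survives:
row = [1.0, 1.2, 0.9, 1.1, 5.0, 5.2, 4.9, 5.1]
print([round(v, 2) for v in cluster_filter(row)])
```

Within each cluster the jitter is smoothed away, while the sharp step between the two clusters is untouched, which is the edge-preservation property the paper reports.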

  18. Mobile phone call data as a regional socio-economic proxy indicator.

    PubMed

    Šćepanović, Sanja; Mishkovski, Igor; Hui, Pan; Nurminen, Jukka K; Ylä-Jääski, Antti

    2015-01-01

    The advent of published anonymized call detail records opens the door for temporal and spatial studies of human dynamics. Such studies, besides being useful for creating universal models of mobility patterns, can also be used to create new socio-economic proxy indicators that do not rely only on local or state institutions. In this paper, from the frequency of calls at different times of the day in different small regional units (sub-prefectures) of Côte d'Ivoire, we infer users' home and work sub-prefectures. This division of users enables us to analyze different mobility and calling patterns for the different regions. We then examine how those patterns correlate with data from other sources, such as news reports of particular events in the given period, census data, economic activity, the poverty index, and power plant and energy grid data. Our results show high correlation in many of the cases, revealing the diversity of socio-economic insights that can be inferred using only mobile phone call data. The methods and results may be particularly relevant to policy-makers engaged in poverty reduction initiatives, as they can provide an affordable tool in the context of resource-constrained developing economies such as Côte d'Ivoire's.
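    The paper does not publish its inference code; a minimal sketch of the home/work inference step, under the common assumption that "home" is the region called from most at night and "work" the region called from most during office hours, might look like this (the hour windows and region names are illustrative, not taken from the paper):

```python
# Hypothetical home/work sub-prefecture inference from call records.
# Each record is (hour_of_day, region_of_call).
from collections import Counter

def infer_home_work(call_records, night=(20, 7), office=(9, 17)):
    def in_window(hour, window):
        start, end = window
        if start > end:                 # window wraps past midnight
            return hour >= start or hour < end
        return start <= hour < end
    home_counts = Counter(r for h, r in call_records if in_window(h, night))
    work_counts = Counter(r for h, r in call_records if in_window(h, office))
    home = home_counts.most_common(1)[0][0] if home_counts else None
    work = work_counts.most_common(1)[0][0] if work_counts else None
    return home, work

calls = [(22, "Abobo"), (23, "Abobo"), (6, "Abobo"),
         (10, "Plateau"), (14, "Plateau"), (15, "Plateau"), (12, "Abobo")]
print(infer_home_work(calls))  # -> ('Abobo', 'Plateau')
```

With users labeled this way, per-region aggregates of calling activity can then be correlated against census or poverty data, as the paper describes.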

  19. Integrating social capacity into risk reduction strategies

    NASA Astrophysics Data System (ADS)

    Schneiderbauer, S.; Pedoth, L.; Zebisch, M.

    2012-04-01

    The reduction of risk from the impacts of external stresses and shocks is an important task for communities worldwide, at all levels of government and independent of development status. The importance of building social capacity as part of risk reduction strategies is increasingly recognized; however, there is room for improvement in incorporating related activities into a holistic risk governance approach. The starting point for such enhancements is to promote and improve assessments of what the climate change community calls 'sensitivity' or 'adaptive capacity' and what the hazard risk community calls 'vulnerability' or 'resilience'. Challenging issues that need to be tackled in this context are the integration of concepts and methods as well as the fusion of data. Against this background, we introduce a method for assessing regional adaptive capacity to climate change, focusing on mountain areas and accounting for sector-specific problems. By considering three levels of specificity as the basis for selecting the most appropriate indicators, the study results have the potential to support decision making on the most appropriate adaptation actions. Advantages and shortcomings of adaptive capacity assessment in general, and of the proposed method in particular, are presented.

  20. A note on an attempt at more efficient Poisson series evaluation. [for lunar libration

    NASA Technical Reports Server (NTRS)

    Shelus, P. J.; Jefferys, W. H., III

    1975-01-01

    A substantial reduction has been achieved in the time necessary to compute lunar libration series. The method involves eliminating many of the trigonometric function calls by a suitable transformation and applying a short SNOBOL processor to the FORTRAN coding of the transformed series, which obviates many of the multiplication operations during the course of series evaluation. It is possible to accomplish similar results quite easily with other Poisson series.
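    The note does not spell out the transformation; as a hedged illustration of the general idea, the sine and cosine of any integer combination of fundamental arguments in a Poisson series term can be assembled from the sine and cosine of each argument via the angle-addition formulas, so the trigonometric library is called only once per fundamental argument rather than once per series term (Python here purely for illustration of the arithmetic):

```python
# Eliminating per-term trig calls: build sin/cos(i*a + j*b) from
# sin/cos of the fundamental arguments a and b by angle addition.
import math

def add_angles(sc1, sc2):
    """Combine (sin x, cos x) and (sin y, cos y) into (sin(x+y), cos(x+y))."""
    s1, c1 = sc1
    s2, c2 = sc2
    return (s1 * c2 + c1 * s2, c1 * c2 - s1 * s2)

def multiple(sc, n):
    """(sin(n*x), cos(n*x)) from (sin x, cos x) by repeated addition."""
    result = (0.0, 1.0)              # sin 0, cos 0
    for _ in range(n):
        result = add_angles(result, sc)
    return result

a, b = 0.7, 0.3
sa = (math.sin(a), math.cos(a))      # the only direct trig calls
sb = (math.sin(b), math.cos(b))
s, c = add_angles(multiple(sa, 2), multiple(sb, 3))   # sin/cos(2a + 3b)
print(abs(s - math.sin(2 * a + 3 * b)) < 1e-12)       # -> True
```

In a real series evaluator the multiples would of course be tabulated once and reused across the thousands of terms, which is where the time saving comes from.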

  1. Solitons, τ-functions and hamiltonian reduction for non-Abelian conformal affine Toda theories

    NASA Astrophysics Data System (ADS)

    Ferreira, L. A.; Miramontes, J. Luis; Guillén, Joaquín Sánchez

    1995-02-01

    We consider the Hamiltonian reduction of the "two-loop" Wess-Zumino-Novikov-Witten model (WZNW) based on an untwisted affine Kac-Moody algebra G. The resulting reduced models, called Generalized Non-Abelian Conformal Affine Toda (G-CAT), are conformally invariant and a wide class of them possesses soliton solutions; these models constitute non-Abelian generalizations of the conformal affine Toda models. Their general solution is constructed by the Leznov-Saveliev method. Moreover, the dressing transformations leading to the solutions in the orbit of the vacuum are considered in detail, as well as the τ-functions, which are defined for any integrable highest weight representation of G, irrespective of its particular realization. When the conformal symmetry is spontaneously broken, the G-CAT model becomes a generalized affine Toda model, whose soliton solutions are constructed. Their masses are obtained by exploiting the spontaneous breakdown of the conformal symmetry, and their relation to the fundamental particle masses is discussed. We also introduce what we call the two-loop Virasoro algebra, describing extended symmetries of the two-loop WZNW models.

  2. Correlation Between Residual Displacement and Osteonecrosis of the Femoral Head Following Cannulated Screw Fixation of Femoral Neck Fractures.

    PubMed

    Wang, Chen; Xu, Gui-Jun; Han, Zhe; Jiang, Xuan; Zhang, Cheng-Bao; Dong, Qiang; Ma, Jian-Xiong; Ma, Xin-Long

    2015-11-01

    The aim of the study was to introduce a new method for measuring the residual displacement of the femoral head after internal fixation, to explore the relationship between residual displacement and osteonecrosis of the femoral head, and to evaluate the risk factors associated with osteonecrosis of the femoral head in patients with femoral neck fractures treated by closed reduction and percutaneous cannulated screw fixation. One hundred and fifty patients who sustained intracapsular femoral neck fractures between January 2011 and April 2013 were enrolled in the study. All were treated with closed reduction and percutaneous cannulated screw internal fixation. The residual displacement of the femoral head after surgery was measured by 3-dimensional reconstruction, which evaluated the quality of the reduction. Other data that might affect prognosis were also obtained from outpatient follow-up, telephone calls, or case reviews. Multivariate logistic regression analysis was applied to assess the relationship between the risk factors and osteonecrosis of the femoral head. Osteonecrosis of the femoral head (ONFH) occurred in 27 patients (18%). Significant differences were observed regarding the residual displacement of the femoral head and the preoperative Garden classification. Moreover, with the new technique we found some degree of residual displacement of the femoral head in all patients rated as having a high-quality reduction on x-ray, and there was a close relationship between residual displacement and ONFH. X-ray alone is thus of limited value for evaluating the quality of reduction; 3-dimensional reconstruction and digital measurement offer a more accurate assessment. Residual displacement of the femoral head and the preoperative Garden classification were risk factors for osteonecrosis of the femoral head, and high-quality reduction is necessary to avoid complications.

  3. Online dimensionality reduction using competitive learning and Radial Basis Function network.

    PubMed

    Tomenko, Vladimir

    2011-06-01

    A general-purpose dimensionality reduction method should preserve data interrelations at all scales. Additional desired features include online projection of new data, processing of nonlinearly embedded manifolds, and handling large amounts of data. The proposed method, called RBF-NDR, combines these features. RBF-NDR is comprised of two modules. The first module learns manifolds by utilizing modified topology-representing networks and geodesic distance in data space, and approximates sampled or streaming data with a finite set of reference patterns, thus achieving scalability. Using input from the first module, the dimensionality reduction module constructs mappings between observation and target spaces. Introduction of a specific loss function and synthesis of the training algorithm for the Radial Basis Function network result in global preservation of data structures and online processing of new patterns. RBF-NDR was applied for feature extraction and visualization and compared with Principal Component Analysis (PCA), a neural network for Sammon's projection (SAMANN), and Isomap. With respect to feature extraction, the method outperformed PCA and yielded increased performance of the model describing a wastewater treatment process. As for visualization, RBF-NDR produced superior results compared to PCA and SAMANN and matched Isomap. For the Topic Detection and Tracking corpus, the method successfully separated semantically different topics. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Cavity radiation model for solar central receivers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lipps, F.W.

    1981-01-01

    The Energy Laboratory of the University of Houston has developed a computer simulation program called CREAM (Cavity Radiation Exchange Analysis Model) for application to solar central receiver systems. The zone-generating capability of CREAM has been used in several solar repowering studies. CREAM contains a geometric configuration factor generator based on Nusselt's method; a formulation of Nusselt's method provides support for the FORTRAN subroutine NUSSELT. Numerical results from NUSSELT are compared to analytic values and to values from Sparrow's method, which is based on a double contour integral and its reduction to a single integral approximated by Gaussian methods. Nusselt's method is adequate for the intended engineering applications, but Sparrow's method is found to be an order of magnitude more efficient in many situations.

  5. Future trends which will influence waste disposal.

    PubMed Central

    Wolman, A

    1978-01-01

    The disposal and management of solid wastes are ancient problems. The evolution of practices naturally changed as populations grew and sites for disposal became less acceptable. The central search was for easy disposal at minimum costs. The methods changed from indiscriminate dumping to sanitary landfill, feeding to swine, reduction, incineration, and various forms of re-use and recycling. Virtually all procedures have disabilities and rising costs. Many methods once abandoned are being rediscovered. Promises for so-called innovations outstrip accomplishments. Markets for salvage vary widely or disappear completely. The search for conserving materials and energy at minimum cost must go on forever. PMID:570105

  6. Efficacy of water treatment processes and endemic gastrointestinal illness - A multi-city study in Sweden.

    PubMed

    Tornevi, Andreas; Simonsson, Magnus; Forsberg, Bertil; Säve-Söderbergh, Melle; Toljander, Jonas

    2016-10-01

    Outbreaks of acute gastrointestinal illness (AGI) have been linked to insufficient drinking water treatment on numerous occasions in the industrialized world, but it is largely unknown to what extent public drinking water influences the endemic level of AGI. This paper examines endemic AGI and its relationship with pathogen elimination efficacy in public drinking water treatment processes. Time series of all telephone calls to the Swedish National Healthcare Guide between November 2007 and February 2014 from twenty Swedish cities were obtained. Calls concerning vomiting, diarrhea or abdominal pain (AGI calls) were separated from other concerns (non-AGI calls). Information on the types of microbial barriers used at each drinking water treatment plant in these cities was obtained, together with the barriers' theoretical pathogen log reduction efficacy. The total log reduction in the drinking water plants varied between 0.0 and 6.1 units for viruses, 0.0 and 14.6 units for bacteria, and 0.0 and 7.3 units for protozoa. To obtain one overall efficacy parameter for each plant, a weighted mean of the log reductions (WLR) was calculated, with weights based on how commonly these pathogen groups cause AGI; the WLR in the plants varied between 0.0 and 6.4 units. The effect of pathogen elimination efficacy on the level of AGI calls relative to non-AGI calls was evaluated in regression models controlling for long-term trends, population size, age distribution, and climatological area. Populations receiving drinking water produced with a higher total log reduction had a lower relative number of AGI calls: overall, AGI calls decreased by 4% (OR = 0.96, CI: 0.96-0.97) for each unit increase in the WLR. The findings apply to both groundwater and surface water study sites, but are particularly evident among surface water sites during seasons when viruses are the main cause of AGI. This study proposes that the endemic level of gastroenteritis can indeed be reduced with more advanced treatment processes at many municipal drinking water treatment plants. Copyright © 2016 Elsevier Ltd. All rights reserved.
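    The WLR itself is simple arithmetic: a weighted mean of the per-pathogen-group log reductions. A minimal sketch follows; the weights below are hypothetical placeholders, since the paper's actual weighting of how commonly viruses, bacteria, and protozoa cause AGI is not reproduced here.

```python
# Weighted mean log reduction (WLR) across pathogen groups.
def weighted_log_reduction(log_reductions, weights):
    """Weighted mean of per-pathogen-group log reductions."""
    total_weight = sum(weights.values())
    return sum(log_reductions[g] * w for g, w in weights.items()) / total_weight

# One plant's theoretical log reductions (illustrative values):
plant = {"virus": 3.0, "bacteria": 5.0, "protozoa": 2.5}
# Assumed weights, NOT the paper's: relative AGI burden per group.
weights = {"virus": 0.5, "bacteria": 0.3, "protozoa": 0.2}
print(round(weighted_log_reduction(plant, weights), 2))   # -> 3.5
```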

  7. Reduction of Flow Diagrams to Unfolded Form Modulo Snarls.

    DTIC Science & Technology

    1987-04-14

    the English name of the Greek letter zeta.) 1.) An unintelligent canonical method called the "3-level crossbar/pole" representation (3cp). This... Second, it will make these pictorial representations (all of which go by the name fC. Even though this is an abuse of language, it is in the spirit... received an M.S. degree in computer and communications sciences from the University of Michigan. He is currently teaching a course on assembly language

  8. The Perceived Effect of Duty Hour Restrictions on Learning Opportunities in the Intensive Care Unit.

    PubMed

    Sabri, Nessrine; Sun, Ning-Zi; Cummings, Beth-Ann; Jayaraman, Dev

    2015-03-01

    Many countries have reduced resident duty hours in an effort to promote patient safety and enhance resident quality of life. There are concerns that reducing duty hours may limit residents' learning opportunities. We (1) evaluated residents' perceptions of their current learning opportunities in a context of reduced duty hours, and (2) explored the perceived change in learning opportunities after call length was reduced from 24 continuous hours to 16 hours. We conducted an anonymous, cross-sectional online survey of 240 first-, second-, and third-year residents rotating through 3 McGill University-affiliated intensive care units (ICUs) in Montreal, Quebec, Canada, between July 1, 2012, and June 30, 2013. The survey investigated residents' perceptions of learning opportunities in both the 24-hour and 16-hour systems. Of 240 residents, 168 (70%) completed the survey; of these, 63 (38%) had been exposed to both 24-hour and 16-hour call schedules. The majority of respondents (83%) reported that didactic teaching sessions held by ICU staff physicians were useful. However, of the residents trained in both approaches to overnight call, 44% reported a reduction in learner attendance at didactic teaching sessions, 48% reported a reduction in attendance at midday hospital rounds, and 40% reported a perceived reduction in self-directed reading after the implementation of the new call schedule. A substantial proportion of residents thus perceived a reduction in attendance at instructor-directed teaching and in self-directed reading after the implementation of a 16-hour call schedule in the ICU.

  9. Simultaneous Aerodynamic and Structural Design Optimization (SASDO) for a 3-D Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J.-W.; Newman, Perry A.

    2001-01-01

    The formulation and implementation of an optimization method called Simultaneous Aerodynamic and Structural Design Optimization (SASDO) is shown as an extension of the Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) method. It is extended by the inclusion of structural element sizing parameters as design variables and Finite Element Method (FEM) analysis responses as constraints. The method aims to reduce the computational expense incurred in performing shape and sizing optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, FEM structural analysis, and sensitivity analysis tools. SASDO is applied to a simple, isolated, 3-D wing in inviscid flow. Results show that the method finds the same local optimum as a conventional optimization method, with some reduction in computational cost and without significant modifications to the analysis tools.

  10. Silicene Catalyzed Reduction of Nitrobenzene to Aniline: a Computational Study

    NASA Astrophysics Data System (ADS)

    Morrissey, Christopher; He, Haiying

    The reduction of nitrobenzene to aniline has a broad range of applications in the production of rubbers, dyes, agrochemicals, and pharmaceuticals. Currently, the use of metal catalysts is the most popular method of performing this reaction on a large scale. These metal catalysts usually require high-temperature and/or high-pressure reaction conditions and produce hazardous chemicals. This has led to a call for more environmentally friendly nonmetal catalysts. Recent studies suggest that silicene, the recently discovered silicon counterpart of graphene, could potentially serve as a nonmetal catalyst due to its unique electronic properties and strong interactions with molecules containing nitrogen and oxygen. In this computational study, we have investigated the plausibility of using silicene as a catalyst for the reduction of nitrobenzene. Possible reaction mechanisms will be discussed, with a highlight of the differences between silicene and metal catalysts. All calculations were performed in the framework of density functional theory.

  11. LCS Master Console Event Message Reduction

    NASA Technical Reports Server (NTRS)

    Nguyen, Uyen

    2014-01-01

    System monitoring and control (SMC) message browsers receive so many messages daily that operators are unable to keep track of all of them; important messages are often mixed in among the less important ones. My job is to reduce the messages so that warning and emergency messages can be seen easily and therefore responded to promptly. There are multiple methods to achieve this. First, messages that look the same should not appear many times in the message browser; instead, the message should appear only once, with a number counting how many times it has appeared. This method is called duplicate message suppression, and messages at the "normal" or "advisory" alarm level should be suppressed this way. Second, messages that update the most recent status of a system should replace the old-status messages; this method is called state-based message correlation. Third, some unnecessary messages should be sent straight to history after being displayed, or not displayed at all. For example, normal messages that are not a response to an operator's action should not be displayed. I also work on fixing messages that are not color-coded and formatted properly.
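    As an illustration only (the report gives no code), the first two reduction rules, duplicate message suppression and state-based message correlation, can be sketched in a few lines:

```python
# Sketch of a message browser applying duplicate suppression and
# state-based correlation. Source names and messages are made up.
class MessageBrowser:
    def __init__(self):
        self.rows = {}      # source -> [message text, repeat count]
        self.order = []     # display order of sources

    def receive(self, source, text):
        # Duplicate suppression: a repeat of the identical message
        # only bumps the counter instead of adding a new row.
        if source in self.rows and self.rows[source][0] == text:
            self.rows[source][1] += 1
        else:
            # State-based correlation: a new status from the same
            # source replaces its old-status row.
            if source not in self.rows:
                self.order.append(source)
            self.rows[source] = [text, 1]

    def display(self):
        return [(s, self.rows[s][0], self.rows[s][1]) for s in self.order]

b = MessageBrowser()
b.receive("PUMP1", "pressure low")      # first warning
b.receive("PUMP1", "pressure low")      # suppressed duplicate
b.receive("PUMP1", "pressure nominal")  # replaces the old status
print(b.display())   # -> [('PUMP1', 'pressure nominal', 1)]
```

Three incoming messages collapse to one display row, which is the reduction effect the report aims for.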

  12. Simultaneous minimization of leaf travel distance and tongue-and-groove effect for segmental intensity-modulated radiation therapy.

    PubMed

    Dai, Jianrong; Que, William

    2004-12-07

    This paper introduces a method to simultaneously minimize the leaf travel distance and the tongue-and-groove effect for IMRT leaf sequences delivered in segmental mode. The basic idea is to add a large enough number of openings, by cutting or splitting existing openings, for those leaf pairs with fewer openings than the number of segments, so that all leaf pairs have the same number of openings. The cutting positions are optimally determined with a simulated annealing technique called adaptive simulated annealing; the optimization goal is to minimize the weighted sum of the leaf travel distance and the tongue-and-groove effect. Performance was evaluated with 19 beams from three clinical cases: one brain, one head-and-neck, and one prostate case. The results show that the method can reduce the leaf travel distance and/or the tongue-and-groove effect; the reduction of the leaf travel distance reaches a maximum of about 50% when minimized alone, and the reduction of the tongue-and-groove effect reaches a maximum of about 70% when minimized alone. The maximum reduction in leaf travel distance translates to a 1 to 2 min reduction in treatment delivery time per fraction, depending on leaf speed. If the method is implemented clinically, it could yield significant savings in treatment delivery time and a significant reduction in the wear and tear of MLC mechanics.
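    The paper's adaptive simulated annealing and its clinical cost terms are not reproduced in the abstract; the following is a generic simulated-annealing sketch in which two toy cost functions stand in for leaf travel distance and tongue-and-groove effect, and a weight trades them off exactly as in the weighted-sum objective described above.

```python
# Generic simulated annealing minimizing a weighted sum of two costs.
# travel_cost and tg_cost are toy stand-ins, NOT the paper's terms.
import math
import random

def travel_cost(x):            # stand-in for leaf travel distance
    return sum(abs(x[i + 1] - x[i]) for i in range(len(x) - 1))

def tg_cost(x):                # stand-in for tongue-and-groove effect
    return sum(v * v for v in x)

def anneal(x0, w=0.5, steps=2000, t0=1.0, seed=1):
    rng = random.Random(seed)
    cost = lambda x: w * travel_cost(x) + (1 - w) * tg_cost(x)
    cur, cur_c = list(x0), cost(x0)
    best, best_c = list(cur), cur_c
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9        # linear cooling schedule
        cand = list(cur)
        cand[rng.randrange(len(cand))] += rng.uniform(-0.5, 0.5)
        cand_c = cost(cand)
        # Metropolis acceptance: always take improvements, sometimes
        # accept worse moves early on to escape local minima.
        if cand_c < cur_c or rng.random() < math.exp((cur_c - cand_c) / t):
            cur, cur_c = cand, cand_c
            if cur_c < best_c:
                best, best_c = list(cur), cur_c
    return best, best_c

start = [3.0, -2.0, 4.0, 1.0]
best, best_c = anneal(start)
print(round(best_c, 3))
```

Changing `w` toward 1 or 0 reproduces the "minimized alone" extremes the abstract reports for the two objectives.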

  13. When seconds count: A study of communication variables in the opening segment of emergency calls.

    PubMed

    Penn, Claire; Koole, Tom; Nattrass, Rhona

    2017-09-01

    The opening sequence of an emergency call influences the efficiency of the ambulance dispatch time. The greeting sequences in 105 calls to a South African emergency service were analysed. Initial results suggested the advantage of a specific two-part opening sequence. An on-site experiment aimed at improving call efficiency was conducted during one shift (1100 calls). Results indicated reduced conversational repairs and a significant reduction of 4 seconds in mean call length. Implications for systems and training are derived.

  14. A Multi Directional Perfect Reconstruction Filter Bank Designed with 2-D Eigenfilter Approach: Application to Ultrasound Speckle Reduction.

    PubMed

    Nagare, Mukund B; Patil, Bhushan D; Holambe, Raghunath S

    2017-02-01

    B-mode ultrasound images are degraded by an inherent noise called speckle, which has a considerable impact on image quality. This noise reduces the accuracy of image analysis and interpretation; reducing speckle noise is therefore an essential task that improves the accuracy of clinical diagnostics. In this paper, a multi-directional perfect-reconstruction (PR) filter bank is proposed based on a 2-D eigenfilter approach. The proposed method is used for the design of two-dimensional (2-D) two-channel linear-phase FIR perfect-reconstruction filter banks, including fan-shaped, diamond-shaped, and checkerboard-shaped filters. The quadratic measure of the error function between the passband and stopband of the filter is used as the objective function. First, the low-pass analysis filter is designed, and the PR condition is expressed as a set of linear constraints on the corresponding synthesis low-pass filter; the synthesis filter is then designed using the eigenfilter design method with linear constraints. The newly designed 2-D filters are used in a translation-invariant pyramidal directional filter bank (TIPDFB) for the reduction of speckle noise in ultrasound images. The proposed 2-D filters give better symmetry, regularity, and frequency selectivity in comparison to existing design methods. The proposed method is validated on synthetic and real ultrasound data, ensuring improvement in the quality of ultrasound images and efficient suppression of speckle noise compared to existing methods.

  15. Condomless sex: gay men, barebacking, and harm reduction.

    PubMed

    Shernoff, Michael

    2006-04-01

    Social science research as well as a rise in sexually transmitted diseases and new HIV infections among men who have sex with men point to increasing numbers of gay men engaging in unprotected anal intercourse without condoms, a practice called "barebacking". There is some evidence that barebacking is linked to the rise of crystal methamphetamine use (by men of all races and socioeconomic groups) and surfing the Internet to locate sex partners, although these are not the only factors contributing to this phenomenon. This article summarizes current research findings on sexual risk taking among gay men, discusses psychosocial issues that contribute to barebacking, and suggests a harm-reduction approach to clinical work with gay men who bareback as an effective method of addressing the behavior.

  16. An Analytical Assessment of NASA's N+1 Subsonic Fixed Wing Project Noise Goal

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.; Envia, Edmane; Burley, Casey L.

    2009-01-01

    The Subsonic Fixed Wing Project of NASA's Fundamental Aeronautics Program has adopted a noise reduction goal for new, subsonic, single-aisle, civil aircraft expected to replace current 737 and A320 airplanes. These so-called 'N+1' aircraft - designated in NASA vernacular as such since they will follow the current, in-service, 'N' airplanes - are hoped to achieve certification noise goal levels of 32 cumulative EPNdB under current Stage 4 noise regulations. A notional, N+1, single-aisle, twinjet transport with ultrahigh bypass ratio turbofan engines is analyzed in this study using NASA software and methods. Several advanced noise-reduction technologies are analytically applied to the propulsion system and airframe. Certification noise levels are predicted and compared with the NASA goal.

  17. Sublethal Effects of Neonicotinoid Insecticide on Calling Behavior and Pheromone Production of Tortricid Moths.

    PubMed

    Navarro-Roldán, Miguel A; Gemeno, César

    2017-09-01

    In moths, sexual behavior combines female sex pheromone production and calling behavior. The normal functioning of these periodic events requires an intact nervous system. Neurotoxic insecticide residues in the agroecosystem could impact the normal functioning of pheromone communication through alteration of the nervous system. In this study we assess whether sublethal concentrations of the neonicotinoid insecticide thiacloprid, which competitively modulates nicotinic acetylcholine receptors at the dendrite, affect pheromone production and calling behavior in adults of three economically important tortricid moth pests: Cydia pomonella (L.), Grapholita molesta (Busck), and Lobesia botrana (Denis & Schiffermüller). Thiacloprid significantly reduced the amount of calling in C. pomonella females at LC0.001 (a lethal concentration that kills only 1 in 10^5 individuals) and altered its calling period at LC1, and in both cases the effect was dose-dependent. In the other two species the effect was similar but started at higher LCs, and the effect was relatively small in L. botrana. Pheromone production was altered only in C. pomonella, with a reduction of the major compound, codlemone, and one minor component, starting at LC10. Since sex pheromones and neonicotinoids are used together in the management of these three species, our results could have implications for the interaction between these two pest control methods.

  18. Mobile Phone Call Data as a Regional Socio-Economic Proxy Indicator

    PubMed Central

    Šćepanović, Sanja; Mishkovski, Igor; Hui, Pan; Nurminen, Jukka K.; Ylä-Jääski, Antti

    2015-01-01

    The advent of publishing anonymized call detail records opens the door for temporal and spatial human dynamics studies. Such studies, besides being useful for creating universal models of mobility patterns, could also be used to create new socio-economic proxy indicators that do not rely only on local or state institutions. In this paper, from the frequency of calls at different times of the day in different small regional units (sub-prefectures) in Côte d'Ivoire, we infer users' home and work sub-prefectures. This division of users enables us to analyze different mobility and calling patterns for the different regions. We then compare how those patterns correlate with data from other sources, such as news reports of particular events in the given period, census data, economic activity, the poverty index, and power plant and energy grid data. Our results show high correlation in many of the cases, revealing the diversity of socio-economic insights that can be inferred using only mobile phone call data. The methods and the results may be particularly relevant to policy-makers engaged in poverty reduction initiatives, as they can provide an affordable tool in the context of resource-constrained developing economies, such as Côte d'Ivoire's. PMID:25897957

  19. An adaptive proper orthogonal decomposition method for model order reduction of multi-disc rotor system

    NASA Astrophysics Data System (ADS)

    Jin, Yulin; Lu, Kuan; Hou, Lei; Chen, Yushu

    2017-12-01

    The proper orthogonal decomposition (POD) method is a principal and efficient tool for order reduction of high-dimensional complex systems in many research fields. However, the robustness of this method remains an open problem, although several modified POD methods have been proposed to address it. In this paper, a new adaptive POD method called the interpolation Grassmann manifold (IGM) method is proposed to address the weakness of the local property of the interpolation tangent-space of Grassmann manifold (ITGM) method in a wider parametric region. The method is demonstrated on a nonlinear rotor system with 33 degrees of freedom (DOF), a pair of liquid-film bearings, and a pedestal looseness fault. The motion region of the rotor system is divided into two parts: a simple motion region and a complex motion region. The adaptive POD method is compared with the ITGM method for large and small parameter spans in the two regions to demonstrate the advantage of the proposed method and the disadvantage of the ITGM method. Comparisons of the responses verify the accuracy and robustness of the adaptive POD method, and the computational efficiency is also analyzed. As a result, the new adaptive POD method shows strong robustness and high computational efficiency and accuracy over a wide parameter range.
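Both the IGM and ITGM approaches build on interpolating POD bases in the tangent space of a Grassmann manifold. A minimal sketch of that underlying tangent-space interpolation between two bases (the standard log/exp-map construction; the function name and linear parameter `t` are illustrative, not the authors' code):

```python
import numpy as np

def interpolate_pod_basis(Phi0, Phi1, t):
    """Interpolate between two orthonormal POD bases on the Grassmann
    manifold: map Phi1 into the tangent space at Phi0 (log map), scale
    by t, and map back (exp map). t=0 gives span(Phi0), t=1 span(Phi1)."""
    n = Phi0.shape[0]
    # Tangent-space image of Phi1 at the reference point Phi0 (log map)
    G = (np.eye(n) - Phi0 @ Phi0.T) @ Phi1 @ np.linalg.inv(Phi0.T @ Phi1)
    U, s, Vt = np.linalg.svd(G, full_matrices=False)
    theta = t * np.arctan(s)          # scaled principal angles
    # Exponential map back onto the manifold
    return Phi0 @ Vt.T @ np.diag(np.cos(theta)) @ Vt + U @ np.diag(np.sin(theta)) @ Vt
```

The result stays orthonormal because the two terms are mutually orthogonal; in a parametric study one would interpolate bases computed at neighboring parameter values rather than sweep `t` linearly.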

  20. A robust multifactor dimensionality reduction method for detecting gene-gene interactions with application to the genetic analysis of bladder cancer susceptibility

    PubMed Central

    Gui, Jiang; Andrew, Angeline S.; Andrews, Peter; Nelson, Heather M.; Kelsey, Karl T.; Karagas, Margaret R.; Moore, Jason H.

    2010-01-01

    A central goal of human genetics is to identify and characterize susceptibility genes for common complex human diseases. An important challenge in this endeavor is the modeling of gene-gene interaction or epistasis that can result in non-additivity of genetic effects. The multifactor dimensionality reduction (MDR) method was developed as a machine learning alternative to parametric logistic regression for detecting interactions in the absence of significant marginal effects. The goal of MDR is to reduce the dimensionality inherent in modeling combinations of polymorphisms using a computational approach called constructive induction. Here, we propose a Robust Multifactor Dimensionality Reduction (RMDR) method that performs constructive induction using Fisher's exact test rather than a predetermined threshold. The advantage of this approach is that only those genotype combinations that are determined to be statistically significant are considered in the MDR analysis. We use two simulation studies to demonstrate that this approach will increase the success rate of MDR when there are only a few genotype combinations that are significantly associated with case-control status. We show that there is no loss of success rate when this is not the case. We then apply the RMDR method to the detection of gene-gene interactions in genotype data from a population-based study of bladder cancer in New Hampshire. PMID:21091664
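The key RMDR step described above, admitting a genotype combination only when its case/control split is statistically significant, can be sketched as follows. This is an illustrative reconstruction, not the authors' code, and for simplicity it uses a one-sided hypergeometric (Fisher-type) enrichment test:

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """P(X >= a) for the 2x2 table [[a, b], [c, d]] under the
    hypergeometric null (one-sided Fisher's exact test).
    Rows: carriers / non-carriers; columns: cases / controls."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    return sum(comb(row1, x) * comb(c + d, col1 - x)
               for x in range(a, min(row1, col1) + 1)) / comb(n, col1)

def significant_combinations(counts, n_cases, n_controls, alpha=0.05):
    """Keep only genotype combinations whose case enrichment is significant.
    `counts` maps a combination label to (cases, controls) carrying it."""
    return {g: cc for g, cc in counts.items()
            if fisher_one_sided(cc[0], cc[1],
                                n_cases - cc[0], n_controls - cc[1]) < alpha}
```

The surviving combinations would then be pooled into high-risk/low-risk groups exactly as in standard MDR.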

  1. Reduced modeling of signal transduction – a modular approach

    PubMed Central

    Koschorreck, Markus; Conzelmann, Holger; Ebert, Sybille; Ederer, Michael; Gilles, Ernst Dieter

    2007-01-01

    Background Combinatorial complexity is a challenging problem in detailed and mechanistic mathematical modeling of signal transduction. This subject has been discussed intensively and a lot of progress has been made within the last few years. A software tool (BioNetGen) was developed which allows an automatic rule-based set-up of mechanistic model equations. In many cases these models can be reduced by an exact domain-oriented lumping technique. However, the resulting models can still consist of a very large number of differential equations. Results We introduce a new reduction technique which allows the building of modularized and highly reduced models. Compared to existing approaches, further reduction of signal transduction networks is possible. The method also provides a new modularization criterion, which makes it possible to dissect the model into smaller modules, called layers, that can be modeled independently. Hallmarks of the approach are conservation relations within each layer and connection of layers by signal flows instead of mass flows. The reduced model can be formulated directly without previous generation of detailed model equations. It can be understood and interpreted intuitively, as model variables are macroscopic quantities that are converted by rates following simple kinetics. The proposed technique is applicable without using complex mathematical tools and even without detailed knowledge of the mathematical background. However, we provide a detailed mathematical analysis to show the performance and limitations of the method. For physiologically relevant parameter domains the transient as well as the stationary errors caused by the reduction are negligible. Conclusion The new layer-based reduced modeling method allows the building of modularized and strongly reduced models of signal transduction networks. Reduced model equations can be directly formulated and are intuitively interpretable.
Additionally, the method provides very good approximations especially for macroscopic variables. It can be combined with existing reduction methods without any difficulties. PMID:17854494

  2. Surface Modification of Plastic Substrates Using Atomic Hydrogen

    NASA Astrophysics Data System (ADS)

    Heya, Akira; Matsuo, Naoto

    The surface properties of a plastic substrate were changed by a novel surface treatment called atomic hydrogen annealing (AHA). In this method, a plastic substrate is exposed to atomic hydrogen generated by cracking of hydrogen molecules on a heated tungsten wire. Surface roughness was increased and halogen elements (F and Cl) were selectively etched by AHA. In addition, the plastic surface was reduced by AHA. The surface can be modified by the recombination reaction of atomic hydrogen, the reduction reaction, and the selective etching of halogen atoms. It is concluded that this method is a promising technique for improving adhesion between inorganic films and plastic substrates at low temperatures.

  3. Coarse analysis of collective behaviors: Bifurcation analysis of the optimal velocity model for traffic jam formation

    NASA Astrophysics Data System (ADS)

    Miura, Yasunari; Sugiyama, Yuki

    2017-12-01

    We present a general method for analyzing macroscopic collective phenomena observed in many-body systems. For this purpose, we employ diffusion maps, a dimensionality-reduction technique, to systematically define a few relevant coarse-grained variables for describing macroscopic phenomena. The time evolution of macroscopic behavior is described as a trajectory in the low-dimensional space constructed from these coarse variables. We apply this method to the analysis of a traffic model called the optimal velocity model and reveal a bifurcation structure, which features a transition to the emergence of a moving cluster as a traffic jam.
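The first step described above, extracting coarse variables with a diffusion map, can be sketched generically: build a Gaussian-kernel Markov matrix over the samples and take the leading non-trivial eigenvectors as coarse coordinates (a textbook construction, not the authors' implementation; the kernel width `eps` is a free parameter):

```python
import numpy as np

def diffusion_coordinates(X, eps=1.0, n_coords=2):
    """Coarse-grained coordinates for samples X (n_samples x n_features)
    from the leading non-trivial eigenvectors of the diffusion operator."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # pairwise sq. distances
    K = np.exp(-d2 / eps)                                 # Gaussian kernel
    P = K / K.sum(axis=1, keepdims=True)                  # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    idx = order[1:1 + n_coords]        # skip the trivial constant eigenvector
    return vecs.real[:, idx] * vals.real[idx]             # eigenvalue-weighted coordinates
```

For a many-body simulation, each row of `X` would be one snapshot of the system state, and the macroscopic trajectory is the path traced out in these few coordinates.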

  4. Discriminative components of data.

    PubMed

    Peltonen, Jaakko; Kaski, Samuel

    2005-01-01

    A simple probabilistic model is introduced to generalize classical linear discriminant analysis (LDA) in finding components that are informative of or relevant for data classes. The components maximize the predictability of the class distribution which is asymptotically equivalent to 1) maximizing mutual information with the classes, and 2) finding principal components in the so-called learning or Fisher metrics. The Fisher metric measures only distances that are relevant to the classes, that is, distances that cause changes in the class distribution. The components have applications in data exploration, visualization, and dimensionality reduction. In empirical experiments, the method outperformed, in addition to more classical methods, a Renyi entropy-based alternative while having essentially equivalent computational cost.
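For reference, the classical two-class LDA that this model generalizes reduces to a single discriminative direction w proportional to Sw^(-1)(m1 - m0); a minimal sketch (the generic textbook construction, not the paper's probabilistic model or Fisher-metric components):

```python
import numpy as np

def lda_direction(X0, X1):
    """Fisher discriminant direction for two classes of samples
    (rows are observations): w proportional to Sw^{-1} (m1 - m0)."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # pooled within-class scatter matrix
    Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)
```

The paper's contribution can be read as replacing this single Gaussian-motivated direction with components that maximize mutual information with the class variable.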

  5. Pressure garment design tool to monitor exerted pressures.

    PubMed

    Macintyre, Lisa; Ferguson, Rhona

    2013-09-01

    Pressure garments are used in the treatment of hypertrophic scarring following serious burns. The use of pressure garments is believed to hasten the maturation process, reduce pruritus associated with immature hypertrophic scars and prevent the formation of contractures over flexor joints. Pressure garments are normally made to measure for individual patients from elastic fabrics and are worn continuously for up to 2 years or until scar maturation. There are 2 methods of constructing pressure garments. The most common method, called the Reduction Factor method, involves reducing the patient's circumferential measurements by a certain percentage. The second method uses the Laplace Law to calculate the dimensions of pressure garments based on the circumferential measurements of the patient and the tension profile of the fabric. The Laplace Law method is complicated to utilise manually and no design tool is currently available to aid this process. This paper presents the development and suggested use of 2 new pressure garment design tools that will aid pressure garment design using the Reduction Factor and Laplace Law methods. Both tools calculate the pressure garment dimensions and the mean pressure that will be exerted around the body at each measurement point. Monitoring the pressures exerted by pressure garments and noting the clinical outcome would enable clinicians to build an understanding of the implications of particular pressures on scar outcome, maturation times and patient compliance rates. Once the optimum pressure for particular treatments is known, the Laplace Law method described in this paper can be used to deliver those average pressures to all patients. This paper also presents the results of a small scale audit of measurements taken for the fabrication of pressure garments in two UK hospitals. 
This audit highlights the wide range of pressures that are exerted using the Reduction Factor method and that manual pattern 'smoothing' can dramatically change the actual Reduction Factors used. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
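The Laplace Law sizing described above can be sketched as follows; the linear fabric tension profile (stiffness `fabric_stiffness`) and all parameter values are illustrative assumptions, not data from the paper:

```python
import math

def garment_circumference(body_circ_mm, target_pressure_pa, fabric_stiffness):
    """Size a pressure garment with the Laplace Law, P = T / r
    (P: pressure in Pa, T: fabric tension per unit width in N/m,
    r: body radius in m). Assumes a hypothetical linear tension
    profile T = fabric_stiffness * strain."""
    radius_m = (body_circ_mm / 1000.0) / (2 * math.pi)
    tension = target_pressure_pa * radius_m          # required tension, N/m
    strain = tension / fabric_stiffness              # fractional stretch needed
    return body_circ_mm / (1 + strain)               # reduced garment circumference, mm
```

For example, delivering roughly 24 mmHg (about 3200 Pa) to a 300 mm limb circumference with a stiffness of 1000 N/m gives a garment circumference of about 260 mm; the Reduction Factor method would instead apply a fixed percentage reduction regardless of the fabric's tension profile.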

  6. Comparing and improving proper orthogonal decomposition (POD) to reduce the complexity of groundwater models

    NASA Astrophysics Data System (ADS)

    Gosses, Moritz; Nowak, Wolfgang; Wöhling, Thomas

    2017-04-01

    Physically-based modeling is a wide-spread tool in understanding and management of natural systems. With the high complexity of many such models and the huge amount of model runs necessary for parameter estimation and uncertainty analysis, overall run times can be prohibitively long even on modern computer systems. An encouraging strategy to tackle this problem is model reduction. In this contribution, we compare different proper orthogonal decomposition (POD, Siade et al. (2010)) methods and their potential applications to groundwater models. The POD method performs a singular value decomposition on system states as simulated by the complex (e.g., PDE-based) groundwater model taken at several time-steps, so-called snapshots. The singular vectors with the highest information content resulting from this decomposition are then used as a basis for projection of the system of model equations onto a subspace of much lower dimensionality than the original complex model, thereby greatly reducing complexity and accelerating run times. In its original form, this method is only applicable to linear problems. Many real-world groundwater models are non-linear, though. These non-linearities are introduced either through model structure (unconfined aquifers) or boundary conditions (certain Cauchy boundaries, like rivers with variable connection to the groundwater table). To date, applications of POD have focused on groundwater models simulating pumping tests in confined aquifers with constant head boundaries. In contrast, POD model reduction either greatly loses accuracy or does not significantly reduce model run time if the above-mentioned non-linearities are introduced. We have also found that variable Dirichlet boundaries are problematic for POD model reduction. An extension to the POD method, called POD-DEIM, has been developed for non-linear groundwater models by Stanko et al. (2016).
    This method uses spatial interpolation points to build the equation system in the reduced model space, thereby allowing the recalculation of system matrices at every time-step, as necessary for non-linear models, while retaining the speed of the reduced model. This makes POD-DEIM applicable to groundwater models simulating unconfined aquifers. However, in our analysis, the method struggled to reproduce variable river boundaries accurately and gave no advantage over the original POD method for variable Dirichlet boundaries. We have developed another extension for POD that aims to address these remaining problems by performing a second POD operation on the model matrix on the left-hand side of the equation. The method aims to at least reproduce the accuracy of the other methods where they are applicable while outperforming them for setups with changing river boundaries or variable Dirichlet boundaries. We compared the new extension with original POD and POD-DEIM for different combinations of model structures and boundary conditions. The new method shows the potential of POD extensions for applications to non-linear groundwater systems and complex boundary conditions that go beyond the current, relatively limited range of applications. References: Siade, A. J., Putti, M., and Yeh, W. W.-G. (2010). Snapshot selection for groundwater model reduction using proper orthogonal decomposition. Water Resour. Res., 46(8):W08539. Stanko, Z. P., Boyce, S. E., and Yeh, W. W.-G. (2016). Nonlinear model reduction of unconfined groundwater flow using POD and DEIM. Advances in Water Resources, 97:130-143.
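The snapshot/projection procedure described in this abstract can be sketched generically. This is the textbook POD-Galerkin construction for a linear system, not the authors' groundwater code:

```python
import numpy as np

def pod_basis(snapshots, r):
    """Left singular vectors of the snapshot matrix (n_dof x n_snapshots)
    give the r-dimensional POD basis with the highest information content."""
    U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
    return U[:, :r]

def reduced_solve(A, b, Phi):
    """Galerkin projection of the full system A h = b onto the POD
    subspace spanned by the columns of Phi, then lift back."""
    h_r = np.linalg.solve(Phi.T @ A @ Phi, Phi.T @ b)
    return Phi @ h_r
```

When the true solution lies (approximately) in the span of the training snapshots, the reduced r x r solve reproduces the full n x n solve at a fraction of the cost; the non-linear cases discussed above break this because A itself changes every time-step.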

  7. A parallel algorithm for multi-level logic synthesis using the transduction method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Lim, Chieng-Fai

    1991-01-01

    The Transduction Method has been shown to be a powerful tool in the optimization of multilevel networks. Many tools, such as the SYLON synthesis system (X90), (CM89), (LM90), have been developed based on this method. A parallel implementation of SYLON-XTRANS (XM89) on an eight-processor Encore Multimax shared-memory multiprocessor is presented. It minimizes multilevel networks consisting of simple gates through parallel pruning, gate substitution, gate merging, generalized gate substitution, and gate input reduction. This implementation, called Parallel TRANSduction (PTRANS), also uses partitioning to break up large circuits and performs inter- and intra-partition dynamic load balancing. With this, good speedups and high processor efficiencies are achievable without sacrificing the resulting circuit quality.

  8. An accurate boundary element method for the exterior elastic scattering problem in two dimensions

    NASA Astrophysics Data System (ADS)

    Bao, Gang; Xu, Liwei; Yin, Tao

    2017-11-01

    This paper is concerned with a Galerkin boundary element method for solving the two-dimensional exterior elastic wave scattering problem. The original problem is first reduced to the so-called Burton-Miller [1] boundary integral formulation, and essential mathematical features of its variational form are discussed. In numerical implementations, a newly-derived and analytically accurate regularization formula [2] is employed for the numerical evaluation of the hyper-singular boundary integral operator. A new computational approach based on series expansions of Hankel functions is employed for the computation of weakly-singular boundary integral operators during the reduction of the corresponding Galerkin equations to a discrete linear system. The effectiveness of the proposed numerical methods is demonstrated using several numerical examples.

  9. An object-oriented data reduction system in Fortran

    NASA Technical Reports Server (NTRS)

    Bailey, J.

    1992-01-01

    A data reduction system for the AAO two-degree field project is being developed using an object-oriented approach. Rather than use an object-oriented language (such as C++), the system is written in Fortran and makes extensive use of existing subroutine libraries provided by the UK Starlink project. Objects are created using the extensible N-dimensional Data Format (NDF), which itself is based on the Hierarchical Data System (HDS). The software consists of a class library, with each class corresponding to a Fortran subroutine with a standard calling sequence. The methods of the classes provide operations on NDF objects at a similar level of functionality to the applications of conventional data reduction systems. However, because they are provided as callable subroutines, they can be used as building blocks for more specialist applications. The class library is not dependent on a particular software environment, though it can be used effectively in ADAM applications. It can also be used from standalone Fortran programs. It is intended to develop a graphical user interface for use with the class library to form the 2dF data reduction system.

  10. Flexible conformable hydrophobized surfaces for turbulent flow drag reduction

    NASA Astrophysics Data System (ADS)

    Brennan, Joseph C.; Geraldi, Nicasio R.; Morris, Robert H.; Fairhurst, David J.; McHale, Glen; Newton, Michael I.

    2015-05-01

    In recent years extensive work has focused on using superhydrophobic surfaces for drag reduction applications. Superhydrophobic surfaces retain a gas layer, called a plastron, when submerged underwater in the Cassie-Baxter state, with water in contact with the tops of surface roughness features. In this state the plastron allows slip to occur across the surface, which results in a drag reduction. In this work we report flexible and relatively large area superhydrophobic surfaces produced using two different methods: large roughness features were created by electrodeposition on copper meshes; small roughness features were created by embedding carbon nanoparticles (soot) into polydimethylsiloxane (PDMS). Both samples were made into cylinders with a diameter under 12 mm. To characterize the samples, scanning electron microscope (SEM) images and confocal microscope images were taken. The confocal microscope images were taken with each sample submerged in water to show the extent of the plastron. The hydrophobized electrodeposited copper mesh cylinders showed drag reductions of up to 32% when comparing the superhydrophobic state with a wetted-out state. The soot-covered cylinders achieved a 30% drag reduction when comparing the superhydrophobic state to a plain cylinder. These results were obtained for turbulent flows with Reynolds numbers from 10,000 to 32,500.

  11. Reduced graphene oxide supported gold nanoparticles for electrocatalytic reduction of carbon dioxide

    NASA Astrophysics Data System (ADS)

    Saquib, Mohammad; Halder, Aditi

    2018-02-01

    Electrochemical reduction of carbon dioxide is one of the methods with the capability to recycle CO2 into valuable products for energy and industrial applications. This research article describes a new electrocatalyst, "reduced graphene oxide supported gold nanoparticles", for the selective electrochemical conversion of carbon dioxide to carbon monoxide. The main motivation for converting CO2 to CO lies in the fact that the latter is an important component of syngas (a mixture of hydrogen and carbon monoxide), which is then converted into liquid fuel via the well-known industrial Fischer-Tropsch process. In this work, we have synthesized different composites of gold nanoparticles supported on defective reduced graphene oxide to evaluate the catalytic activity of reduced graphene oxide (RGO)-supported gold nanoparticles and the role of the defective RGO support towards the electrochemical reduction of CO2. Electrochemical and impedance measurements demonstrate that a higher concentration of gold nanoparticles on the graphene support led to a remarkable decrease of 240 mV in the onset potential and an increase in the current density for CO2 reduction. Lower impedance and Tafel slope values also clearly support our findings of better performance of RGOAu than bare Au for CO2 reduction.

  12. An Analytical Assessment of NASA's N(+)1 Subsonic Fixed Wing Project Noise Goal

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.; Envia, Edmane; Burley, Casey L.

    2010-01-01

    The Subsonic Fixed Wing Project of NASA's Fundamental Aeronautics Program has adopted a noise reduction goal for new, subsonic, single-aisle, civil aircraft expected to replace current 737 and A320 airplanes. These so-called "N+1" aircraft--designated in NASA vernacular as such since they will follow the current, in-service, "N" airplanes--are hoped to achieve certification noise goal levels of 32 cumulative EPNdB under current Stage 4 noise regulations. A notional, N+1, single-aisle, twinjet transport with ultrahigh bypass ratio turbofan engines is analyzed in this study using NASA software and methods. Several advanced noise-reduction technologies are empirically applied to the propulsion system and airframe. Certification noise levels are predicted and compared with the NASA goal.

  13. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool

    PubMed Central

    Clark, Neil R.; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D.; Jones, Matthew R.; Ma’ayan, Avi

    2016-01-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community. PMID:26848405

  14. Principal Angle Enrichment Analysis (PAEA): Dimensionally Reduced Multivariate Gene Set Enrichment Analysis Tool.

    PubMed

    Clark, Neil R; Szymkiewicz, Maciej; Wang, Zichen; Monteiro, Caroline D; Jones, Matthew R; Ma'ayan, Avi

    2015-11-01

    Gene set analysis of differential expression, which identifies collectively differentially expressed gene sets, has become an important tool for biology. The power of this approach lies in its reduction of the dimensionality of the statistical problem and its incorporation of biological interpretation by construction. Many approaches to gene set analysis have been proposed, but benchmarking their performance in the setting of real biological data is difficult due to the lack of a gold standard. In a previously published work we proposed a geometrical approach to differential expression which performed highly in benchmarking tests and compared well to the most popular methods of differential gene expression. As reported, this approach has a natural extension to gene set analysis which we call Principal Angle Enrichment Analysis (PAEA). PAEA employs dimensionality reduction and a multivariate approach for gene set enrichment analysis. However, the performance of this method has not been assessed nor its implementation as a web-based tool. Here we describe new benchmarking protocols for gene set analysis methods and find that PAEA performs highly. The PAEA method is implemented as a user-friendly web-based tool, which contains 70 gene set libraries and is freely available to the community.
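The geometric core of PAEA, principal angles between subspaces, can be computed with a standard SVD recipe (a generic sketch; the mapping from angles to enrichment scores in the PAEA tool itself is not reproduced here):

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles (radians, ascending) between the column spaces
    of A and B: arccos of the singular values of Qa^T Qb, where Qa, Qb
    are orthonormal bases of the two subspaces."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))
```

In an enrichment setting, one subspace would span the differential-expression signature and the other a gene set's indicator vectors; small angles indicate a well-aligned, and hence enriched, gene set.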

  15. Optimized and parallelized implementation of the electronegativity equalization method and the atom-bond electronegativity equalization method.

    PubMed

    Vareková, R Svobodová; Koca, J

    2006-02-01

    The most common way to calculate charge distribution in a molecule is ab initio quantum mechanics (QM). Some faster alternatives to QM have also been developed, the so-called "equalization methods" EEM and ABEEM, which are based on DFT. We have implemented and optimized the EEM and ABEEM methods and created the EEM SOLVER and ABEEM SOLVER programs. It has been found that the most time-consuming part of equalization methods is the reduction of the matrix belonging to the equation system generated by the method. Therefore, for both methods this part was replaced by the parallel algorithm WIRS and implemented within the PVM environment. The parallelized versions of the programs EEM SOLVER and ABEEM SOLVER showed promising results, especially on a single computer with several processors (compact PVM). The implemented programs are available through the Web page http://ncbr.chemi.muni.cz/~n19n/eem_abeem.
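The matrix reduction the abstract identifies as the bottleneck is, in EEM, the solution of a dense linear system in the atomic charges plus an equalized-electronegativity multiplier. A minimal dense-solver sketch (parameter values and units are purely illustrative, and the parallel WIRS reduction is not reproduced):

```python
import numpy as np

def eem_charges(chi, eta, coords, total_charge=0.0, kappa=1.0):
    """Solve the EEM system: for each atom i,
       2*eta_i*q_i + kappa*sum_{j!=i} q_j/r_ij - chibar = -chi_i,
    subject to sum_i q_i = total_charge. chi/eta are per-atom
    electronegativity and hardness parameters (illustrative values)."""
    n = len(chi)
    M = np.zeros((n + 1, n + 1))
    rhs = np.zeros(n + 1)
    for i in range(n):
        M[i, i] = 2.0 * eta[i]
        for j in range(n):
            if j != i:
                r = np.linalg.norm(np.asarray(coords[i]) - np.asarray(coords[j]))
                M[i, j] = kappa / r
        M[i, n] = -1.0                 # the unknown equalized electronegativity
        rhs[i] = -chi[i]
    M[n, :n] = 1.0                     # total-charge constraint row
    rhs[n] = total_charge
    return np.linalg.solve(M, rhs)[:n]   # atomic charges q_i
```

Because this solve scales cubically with the number of atoms, parallelizing the matrix reduction, as done in EEM SOLVER and ABEEM SOLVER, is the natural optimization target.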

  16. A Varying Coefficient Model to Measure the Effectiveness of Mass Media Anti-Smoking Campaigns in Generating Calls to a Quitline

    PubMed Central

    Bui, Quang M.; Huggins, Richard M.; Hwang, Wen-Han; White, Victoria; Erbas, Bircan

    2010-01-01

    Background Anti-smoking advertisements are an effective population-based smoking reduction strategy. The Quitline telephone service provides a first point of contact for adults considering quitting. Because of data complexity, the relationship between anti-smoking advertising placement, intensity, and time trends in total call volume is poorly understood. In this study we use a recently developed semi-varying coefficient model to elucidate this relationship. Methods Semi-varying coefficient models comprise parametric and nonparametric components. The model is fitted to the daily number of calls to Quitline in Victoria, Australia to estimate a nonparametric long-term trend and parametric terms for day-of-the-week effects and to clarify the relationship with target audience rating points (TARPs) for the Quit and nicotine replacement advertising campaigns. Results The number of calls to Quitline increased with the TARP value of both the Quit and other smoking cessation advertisement; the TARP values associated with the Quit program were almost twice as effective. The varying coefficient term was statistically significant for peak periods with little or no advertising. Conclusions Semi-varying coefficient models are useful for modeling public health data when there is little or no information on other factors related to the at-risk population. These models are well suited to modeling call volume to Quitline, because the varying coefficient allowed the underlying time trend to depend on fixed covariates that also vary with time, thereby explaining more of the variation in the call model. PMID:20827036

  17. Stretching and Joint Mobilization Exercises Reduce Call-Center Operators’ Musculoskeletal Discomfort and Fatigue

    PubMed Central

    de Castro Lacaze, Denise Helena; Sacco, Isabel de C. N.; Rocha, Lys Esther; de Bragança Pereira, Carlos Alberto; Casarotto, Raquel Aparecida

    2010-01-01

    AIM: We sought to evaluate musculoskeletal discomfort and mental and physical fatigue in the call-center workers of an airline company before and after a supervised exercise program compared with rest breaks during the work shift. INTRODUCTION: This was a longitudinal pilot study conducted in a flight-booking call-center for an airline in São Paulo, Brazil. Occupational health activities are recommended to decrease the negative effects of the call-center working conditions. In practice, exercise programs are commonly recommended for computer workers, but their effects have not been studied in call-center operators. METHODS: Sixty-four call-center operators participated in this study. Thirty-two subjects were placed into the experimental group and attended a 10-min daily exercise session for 2 months. Conversely, 32 participants were placed into the control group and took a 10-min daily rest break during the same period. Each subject was evaluated once a week by means of the Corlett-Bishop body map with a visual analog discomfort scale and the Chalder fatigue questionnaire. RESULTS: Musculoskeletal discomfort decreased in both groups, but the reduction was only statistically significant for the spine and buttocks (p=0.04) and the sum of the segments (p=0.01) in the experimental group. In addition, the experimental group showed significant differences in the level of mental fatigue, especially in questions related to memory Rienzo, #181ff and tiredness (p=0.001). CONCLUSIONS: Our preliminary results demonstrate that appropriately designed and supervised exercise programs may be more efficient than rest breaks in decreasing discomfort and fatigue levels in call-center operators. PMID:20668622

  18. Recovery Support for Adolescents with Substance use Disorders: The Impact of Recovery Support Telephone Calls Provided by Pre-Professional Volunteers

    PubMed Central

    Garner, Bryan R; Godley, Mark D; Passetti, Lora L; Funk, Rodney R; White, William L

    2014-01-01

    The present quasi-experiment examined the direct and indirect effects of recovery support telephone calls following adolescent substance use disorder treatment. Six-month outcome data from 202 adolescents who had received recovery support calls from primarily pre-professional (i.e., college-level social service students) volunteers was compared to 6-month outcome data from a matched comparison sample of adolescents (n = 404). Results suggested adolescents in the recovery support sample had significantly greater reductions in their recovery environment risk relative to the comparison sample (β = -.17). Path analysis also suggested that the reduction in recovery environment risk produced by recovery support calls had indirect impacts (via recovery environment risk) on reductions in social risk (β = .22), substance use (β = .23), and substance-related problems (β = .16). Finally, moderation analyses suggested the effects of recovery support calls did not differ by gender, but were significantly greater for adolescents with lower levels of treatment readiness. In addition to providing rare empirical support for the effectiveness of recovery support services, an important contribution of this study is that it provides evidence that recovery support services do not necessarily have to be “peer-based,” at least in terms of the recovery support service provider having the experiential credentials of being “in recovery.” If replicated, this latter finding may have particularly important implications for helping increase the recovery support workforce. PMID:25574502

  19. The decline in vitamin research funding: a missed opportunity?

    USDA-ARS?s Scientific Manuscript database

    Background: The National Nutrition Research Roadmap has called for support of greater collaborative, interdisciplinary research for multiple areas of nutrition research. However, a substantial reduction in federal funding makes responding to these calls challenging. The objective of this study was t...

  20. A numerical study of different projection-based model reduction techniques applied to computational homogenisation

    NASA Astrophysics Data System (ADS)

    Soldner, Dominic; Brands, Benjamin; Zabihyan, Reza; Steinmann, Paul; Mergheim, Julia

    2017-10-01

    Computing the macroscopic material response of a continuum body commonly involves the formulation of a phenomenological constitutive model. However, the response is mainly influenced by the heterogeneous microstructure. Computational homogenisation can be used to determine the constitutive behaviour on the macro-scale by solving a boundary value problem at the micro-scale for every so-called macroscopic material point within a nested solution scheme. Hence, this procedure requires the repeated solution of similar microscopic boundary value problems. To reduce the computational cost, model order reduction techniques can be applied. An important aspect thereby is the robustness of the obtained reduced model. Within this study, reduced-order modelling (ROM) for the geometrically nonlinear case using hyperelastic materials is applied to the boundary value problem on the micro-scale. This involves the Proper Orthogonal Decomposition (POD) for the primary unknown and hyper-reduction methods for the arising nonlinearity. Therein, three methods for hyper-reduction, differing in how the nonlinearity is approximated and in the subsequent projection, are compared in terms of accuracy and robustness. Introducing interpolation or Gappy-POD based approximations may not preserve the symmetry of the system tangent, rendering the widely used Galerkin projection sub-optimal. Hence, a different projection related to a Gauss-Newton scheme (Gauss-Newton with Approximated Tensors, GNAT) is favoured to obtain an optimal projection and a robust reduced model.
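
    The POD/Galerkin step described above can be sketched in a few lines: build a reduced basis from the SVD of a snapshot matrix, then project a full-order linear solve onto it. The sizes, the random SPD operator, and all variable names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Snapshot matrix: each column stands in for one full-order micro-scale solution
n_dof, n_snapshots, k = 200, 30, 5
S = rng.standard_normal((n_dof, n_snapshots))

# POD basis: leading k left singular vectors of the snapshots
U, _, _ = np.linalg.svd(S, full_matrices=False)
V = U[:, :k]                      # reduced basis, n_dof x k

# Galerkin projection of a (here random SPD) full-order system K u = f
A = rng.standard_normal((n_dof, n_dof))
K = A @ A.T + n_dof * np.eye(n_dof)
f = rng.standard_normal(n_dof)

K_r = V.T @ K @ V                 # k x k reduced operator
f_r = V.T @ f
u_r = np.linalg.solve(K_r, f_r)   # cheap reduced solve
u_approx = V @ u_r                # lift back to full space

print(u_approx.shape)             # full-dimensional approximation
```

    The Galerkin condition makes the residual K·u_approx − f orthogonal to the reduced basis, which is the sense in which the small k×k solve replaces the n_dof×n_dof one.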

  1. Surface Heave Behaviour of Coir Geotextile Reinforced Sand Beds

    NASA Astrophysics Data System (ADS)

    Lal, Dharmesh; Sankar, N.; Chandrakaran, S.

    2017-06-01

    Soil reinforcement with natural fibers is one of the cheapest and most attractive ground improvement techniques. Coir is the most abundant natural fiber available in India and, due to its high lignin content, has a longer life span than other natural fibers. It is widely used in India for erosion control, but its use as a reinforcement material is rather limited. This study focuses on the use of coir geotextile as a reinforcement material to reduce the surface heave phenomenon occurring in shallow foundations. This paper presents the results of laboratory model tests carried out on square footings supported on coir geotextile reinforced sand beds. The influence of various parameters such as depth of reinforcement, length, and number of layers of reinforcement was studied. It was observed that surface heave is considerably reduced by the provision of geotextile. Heave reduction of up to 98.7% can be obtained by the proposed method. Heave reduction is quantified by a non-dimensional parameter called the heave reduction factor.
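
    A plausible definition of that factor (an assumption here; the paper's exact expression is not quoted in the abstract) is the fractional reduction in heave relative to the unreinforced bed:

```python
def heave_reduction_factor(heave_unreinforced: float, heave_reinforced: float) -> float:
    """Relative heave reduction: 0 means no benefit, 1 means heave fully suppressed."""
    if heave_unreinforced <= 0:
        raise ValueError("unreinforced heave must be positive")
    return (heave_unreinforced - heave_reinforced) / heave_unreinforced

# e.g. a hypothetical drop from 8.0 mm to 0.104 mm corresponds to the ~98.7%
# reduction figure reported above
print(round(heave_reduction_factor(8.0, 0.104), 3))  # → 0.987
```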

  2. Optimizing Performance Parameters of Chemically-Derived Graphene/p-Si Heterojunction Solar Cell.

    PubMed

    Batra, Kamal; Nayak, Sasmita; Behura, Sanjay K; Jani, Omkar

    2015-07-01

    Chemically-derived graphene has been synthesized by a modified Hummers method and reduced using sodium borohydride. To explore the potential for photovoltaic applications, graphene/p-silicon (Si) heterojunction devices were fabricated using a simple and cost-effective technique called spin coating. SEM analysis shows the formation of graphene oxide (GO) flakes, which become smooth after reduction. The absence of oxygen-containing functional groups, as observed in FT-IR spectra, reveals the reduction of GO, i.e., reduced graphene oxide (rGO). This was further confirmed by Raman analysis, which shows a slight reduction in G-band intensity with respect to the D-band. Hall effect measurement confirmed the n-type nature of rGO. Therefore, an effort has been made to simulate the rGO/p-Si heterojunction device using one-dimensional solar cell capacitance software, considering the experimentally derived parameters. A detailed analysis of the effects of Si thickness, graphene thickness and temperature on the performance of the device is presented.

  3. Guided SAR image despeckling with probabilistic non local weights

    NASA Astrophysics Data System (ADS)

    Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny

    2017-12-01

    SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction a difficult task. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non-Local Weights) replaces parametric constants based on heuristics in the GGF-BNLM method with dynamically derived values based on image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, achieve a significant improvement in performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.

  4. Interpretation and classification of microvolt T wave alternans tests

    NASA Technical Reports Server (NTRS)

    Bloomfield, Daniel M.; Hohnloser, Stefan H.; Cohen, Richard J.

    2002-01-01

    Measurement of microvolt-level T wave alternans (TWA) during routine exercise stress testing now is possible as a result of sophisticated noise reduction techniques and analytic methods that have become commercially available. Even though this technology is new, the available data suggest that microvolt TWA is a potent predictor of arrhythmia risk in diverse disease states. As this technology becomes more widely available, physicians will be called upon to interpret microvolt TWA tracings. This review seeks to establish uniform standards for the clinical interpretation of microvolt TWA tracings.

  5. 76 FR 44271 - Approval and Promulgation of Implementation Plans; Texas; Revisions to Permits by Rule and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-25

    ..., 1998, revision creates new section 116.116(f) allowing for the use of Discrete Emission Reduction... allows the use of Discrete Emission Reduction Credits (DERCs) to be used to exceed permit allowables and... credits (called discrete emission reduction credits, or DERCs, in the Texas program) by reducing its...

  6. 76 FR 67600 - Approval and Promulgation of Implementation Plans; Texas; Regulations for Control of Air...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-02

    ... July 22, 1998, revision allows for the use of Discrete Emission Reduction Credits (DERC) to exceed... creates a new section at 116.116(f) that allows Discrete Emission Reduction Credits (DERCs) to be used to...-term emission credits (called discrete emission reduction credits, or DERCs, in the Texas program) by...

  7. The Nevada Class Size Reduction Evaluation Study, 1995.

    ERIC Educational Resources Information Center

    Nevada State Dept. of Education, Carson City.

    A primary purpose for reducing the student-teacher ratio in the early grades is to make students more successful in their later years. This document contains two separate, but interrelated reports that examined two aspects of the 1989 Class Size Reduction (CSR) Act in Nevada. The Act called for a reduction in student-teacher ratios for selected…

  8. Differential Binary Encoding Method for Calibrating Image Sensors Based on IOFBs

    PubMed Central

    Fernández, Pedro R.; Lázaro-Galilea, José Luis; Gardel, Alfredo; Espinosa, Felipe; Bravo, Ignacio; Cano, Ángel

    2012-01-01

    Image transmission using incoherent optical fiber bundles (IOFBs) requires prior calibration to obtain the spatial in-out fiber correspondence necessary to reconstruct the image captured by the pseudo-sensor. This information is recorded in a Look-Up Table called the Reconstruction Table (RT), used later for reordering the fiber positions and reconstructing the original image. This paper presents a very fast method based on image-scanning using spaces encoded by a weighted binary code to obtain the in-out correspondence. The results demonstrate that this technique yields a remarkable reduction in processing time and the image reconstruction quality is very good compared to previous techniques based on spot or line scanning, for example. PMID:22666023
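
    The image-scanning idea can be illustrated with a plain power-of-two weighted binary code in one dimension: log2(n) striped illumination patterns replace n individual spot scans, and each output fiber's observed on/off sequence directly encodes its input position. This is a simplified sketch, not the paper's exact encoding.

```python
import random

n = 16                      # input positions along one axis (power of two)
bits = n.bit_length() - 1   # 4 binary patterns suffice instead of 16 spot scans

random.seed(1)
perm = list(range(n))
random.shuffle(perm)        # unknown fiber scrambling: output i sees input perm[i]

# Pattern b lights input position x iff bit b of x is 1
codes = [0] * n
for b in range(bits):
    for out_pos in range(n):
        lit = (perm[out_pos] >> b) & 1   # what this output fiber observes
        codes[out_pos] |= lit << b

# The decoded codes recover the in-out correspondence (the Reconstruction Table)
print(codes == perm)  # → True
```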

  9. Buckling Analysis of Single and Multi Delamination In Composite Beam Using Finite Element Method

    NASA Astrophysics Data System (ADS)

    Simanjorang, Hans Charles; Syamsudin, Hendri; Giri Suada, Muhammad

    2018-04-01

    Delamination is a type of imperfection usually found in composite structures. Delamination may arise from in-service events, where foreign objects striking the composite structure create inner defects, or from poor manufacturing, which causes initial imperfections. Composite structures are susceptible to compressive loading, which leads to an instability phenomenon called buckling. The existence of delamination inside the structure causes a reduction in buckling strength. This paper explains the effect of delamination location on buckling strength. The analysis uses a one-dimensional modelling approach with the two-dimensional finite element method.

  10. Local spatiotemporal time-frequency peak filtering method for seismic random noise reduction

    NASA Astrophysics Data System (ADS)

    Liu, Yanping; Dang, Bo; Li, Yue; Lin, Hongbo

    2014-12-01

    To achieve a higher level of seismic random noise suppression, the Radon transform has been adopted to implement spatiotemporal time-frequency peak filtering (TFPF) in our previous studies. Those studies involved performing TFPF in full-aperture Radon domain, including linear Radon and parabolic Radon. Although the superiority of this method to the conventional TFPF has been tested through processing on synthetic seismic models and field seismic data, there are still some limitations in the method. Both full-aperture linear Radon and parabolic Radon are applicable and effective for some relatively simple situations (e.g., curve reflection events with regular geometry) but inapplicable for complicated situations such as reflection events with irregular shapes, or interlaced events with quite different slope or curvature parameters. Therefore, a localized approach to the application of the Radon transform must be applied. It would serve the filter method better by adapting the transform to the local character of the data variations. In this article, we propose an idea that adopts the local Radon transform referred to as piecewise full-aperture Radon to realize spatiotemporal TFPF, called local spatiotemporal TFPF. Through experiments on synthetic seismic models and field seismic data, this study demonstrates the advantage of our method in seismic random noise reduction and reflection event recovery for relatively complicated situations of seismic data.

  11. Local structure-based image decomposition for feature extraction with applications to face recognition.

    PubMed

    Qian, Jianjun; Yang, Jian; Xu, Yong

    2013-09-01

    This paper presents a robust but simple image feature extraction method, called image decomposition based on local structure (IDLS). It is assumed that in the local window of an image, the macro-pixel (patch) of the central pixel, and those of its neighbors, are locally linear. IDLS captures the local structural information by describing the relationship between the central macro-pixel and its neighbors. This relationship is represented with the linear representation coefficients determined using ridge regression. One image is actually decomposed into a series of sub-images (also called structure images) according to a local structure feature vector. All the structure images, after being down-sampled for dimensionality reduction, are concatenated into one super-vector. Fisher linear discriminant analysis is then used to provide a low-dimensional, compact, and discriminative representation for each super-vector. The proposed method is applied to face recognition and examined using our real-world face image database, NUST-RWFR, and five popular, publicly available, benchmark face image databases (AR, Extended Yale B, PIE, FERET, and LFW). Experimental results show the performance advantages of IDLS over state-of-the-art algorithms.
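
    The local linear-representation step can be sketched with ridge regression: the central macro-pixel is regressed on its neighbors' macro-pixels, and the coefficient vector serves as the local structure feature. Patch sizes, λ, and all names below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def local_structure_coeffs(center_patch, neighbor_patches, lam=0.1):
    """Ridge-regression coefficients representing the central macro-pixel
    as a linear combination of its neighbors' macro-pixels."""
    A = np.column_stack([p.ravel() for p in neighbor_patches])  # d x k
    b = center_patch.ravel()
    k = A.shape[1]
    # closed-form ridge solution: (A^T A + lam I)^(-1) A^T b
    return np.linalg.solve(A.T @ A + lam * np.eye(k), A.T @ b)

rng = np.random.default_rng(0)
patches = [rng.standard_normal((3, 3)) for _ in range(8)]  # 8 neighbor macro-pixels
center = 0.5 * patches[0] + 0.5 * patches[1]               # exactly linear in neighbors
w = local_structure_coeffs(center, patches, lam=1e-6)
print(np.round(w[:2], 3))  # coefficients near [0.5, 0.5]
```

    Concatenating such coefficient vectors over all pixels yields the structure images that are then down-sampled and stacked into the super-vector described above.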

  12. Workshop on Jet Exhaust Noise Reduction for Tactical Aircraft - NASA Perspective

    NASA Technical Reports Server (NTRS)

    Huff, Dennis L.; Henderson, Brenda S.

    2007-01-01

    Jet noise from supersonic, high performance aircraft is a significant problem for takeoff and landing operations near air bases and aircraft carriers. As newer aircraft with higher thrust and performance are introduced, the noise tends to increase due to higher jet exhaust velocities. Jet noise has been a subject of research for over 55 years. Commercial subsonic aircraft benefit from changes to the engine cycle that reduce the exhaust velocities and result in significant noise reduction. Most of the research programs over the past few decades have concentrated on commercial aircraft. Progress has been made by introducing new engines with design features that reduce the noise. NASA has recently started a new program called "Fundamental Aeronautics" where three projects (subsonic fixed wing, subsonic rotary wing, and supersonics) address aircraft noise. For the supersonics project, a primary goal is to understand the underlying physics associated with jet noise so that improved noise prediction tools and noise reduction methods can be developed for a wide range of applications. Highlights from the supersonics project are presented including prediction methods for broadband shock noise, flow measurement methods, and noise reduction methods. Realistic expectations are presented based on past history that indicates significant jet noise reduction cannot be achieved without major changes to the engine cycle. NASA's past experience shows a few EPNdB (effective perceived noise level in decibels) can be achieved using low noise design features such as chevron nozzles. Minimal thrust loss can be expected with these nozzles (< 0.5%) and they may be retrofitted on existing engines. In the long term, it is desirable to use variable cycle engines that can be optimized for lower jet noise during takeoff operations and higher thrust for operational performance.
It is also suggested that noise experts be included early in the design process for engine nozzle systems to participate in decisions that may impact the jet noise.

  13. On modelling three-dimensional piezoelectric smart structures with boundary spectral element method

    NASA Astrophysics Data System (ADS)

    Zou, Fangxin; Aliabadi, M. H.

    2017-05-01

    The computational efficiency of the boundary element method in elastodynamic analysis can be significantly improved by employing high-order spectral elements for boundary discretisation. In this work, for the first time, the so-called boundary spectral element method is utilised to model the piezoelectric smart structures that are widely used in structural health monitoring (SHM) applications. The resultant boundary spectral element formulation has been validated against the finite element method (FEM) and physical experiments. The new formulation has demonstrated a lower demand on computational resources and a higher numerical stability than commercial FEM packages. Compared to the conventional boundary element formulation, a significant reduction in computational expense has been achieved. In summary, the boundary spectral element formulation presented in this paper provides a highly efficient and stable mathematical tool for the development of SHM applications.

  14. Optimal Analyses for 3×n AB Games in the Worst Case

    NASA Astrophysics Data System (ADS)

    Huang, Li-Te; Lin, Shun-Shii

    The past decades have witnessed a growing interest in research on deductive games such as Mastermind and AB game. Because of the complicated behavior of deductive games, tree-search approaches are often adopted to find their optimal strategies. In this paper, a generalized version of deductive games, called 3×n AB games, is introduced. However, traditional tree-search approaches are not appropriate for solving this problem since they can only solve instances with small n. For larger values of n, a systematic approach is necessary. Therefore, intensive analyses of optimal worst-case play of 3×n AB games are conducted, and a sophisticated method, called structural reduction, which aims at characterizing the worst situation in this game, is developed in this study. Furthermore, a formula for calculating the optimal number of guesses required for arbitrary values of n is derived and proven.

  15. A top-down approach to heliostat cost reduction

    NASA Astrophysics Data System (ADS)

    Larmuth, James N.; Landamn, Willem A.; Gauché, Paul

    2016-05-01

    The Technology Innovation Agency (TIA) has funded a South African central receiver collector technology development project, called Helio100. The project aims to provide South Africa's first commercially viable heliostat technology, which is both low in cost and offers high local content potential. A top-down approach is employed for heliostat cost reduction. This approach incorporates interlinked tools which move from high level cost analyses based on qualitative data during early stages of conceptual design, to detailed quantitative analyses in the final stages of design. Low cost heliostat designs are realized by the incorporation of both a top-down and bottom-up method. The current H100 design results in heliostat costs of 155/m2 at 20 000 units p.a. while further industrialisation results in heliostat costs of 126/m2 at 20 000 units.

  16. On the Numerical Formulation of Parametric Linear Fractional Transformation (LFT) Uncertainty Models for Multivariate Matrix Polynomial Problems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.

    1998-01-01

    Robust control system analysis and design is based on an uncertainty description, called a linear fractional transformation (LFT), which separates the uncertain (or varying) part of the system from the nominal system. These models are also useful in the design of gain-scheduled control systems based on Linear Parameter Varying (LPV) methods. Low-order LFT models are difficult to form for problems involving nonlinear parameter variations. This paper presents a numerical computational method for constructing an LFT model for a given LPV model. The method is developed for multivariate polynomial problems, and uses simple matrix computations to obtain an exact low-order LFT representation of the given LPV system without the use of model reduction. Although the method is developed for multivariate polynomial problems, multivariate rational problems can also be solved using this method by reformulating the rational problem into a polynomial form.
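
    For reference, the upper LFT that separates the uncertainty block Δ from the nominal interconnection matrix M takes the standard form (general robust-control notation, not specific to this paper):

```latex
F_u(M, \Delta) \;=\; M_{22} \;+\; M_{21}\,\Delta\,\bigl(I - M_{11}\Delta\bigr)^{-1} M_{12},
\qquad
M = \begin{bmatrix} M_{11} & M_{12} \\ M_{21} & M_{22} \end{bmatrix},
```

    where Δ collects the uncertain or scheduled parameters and the feedback interconnection through M₁₁ is well posed whenever (I − M₁₁Δ) is invertible.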

  17. Simultaneous Genotype Calling and Haplotype Phasing Improves Genotype Accuracy and Reduces False-Positive Associations for Genome-wide Association Studies

    PubMed Central

    Browning, Brian L.; Yu, Zhaoxia

    2009-01-01

    We present a novel method for simultaneous genotype calling and haplotype-phase inference. Our method employs the computationally efficient BEAGLE haplotype-frequency model, which can be applied to large-scale studies with millions of markers and thousands of samples. We compare genotype calls made with our method to genotype calls made with the BIRDSEED, CHIAMO, GenCall, and ILLUMINUS genotype-calling methods, using genotype data from the Illumina 550K and Affymetrix 500K arrays. We show that our method has higher genotype-call accuracy and yields fewer uncalled genotypes than competing methods. We perform single-marker analysis of data from the Wellcome Trust Case Control Consortium bipolar disorder and type 2 diabetes studies. For bipolar disorder, the genotype calls in the original study yield 25 markers with apparent false-positive association with bipolar disorder at a p < 10−7 significance level, whereas genotype calls made with our method yield no associated markers at this significance threshold. Conversely, for markers with replicated association with type 2 diabetes, there is good concordance between genotype calls used in the original study and calls made by our method. Results from single-marker and haplotypic analysis of our method's genotype calls for the bipolar disorder study indicate that our method is highly effective at eliminating genotyping artifacts that cause false-positive associations in genome-wide association studies. Our new genotype-calling methods are implemented in the BEAGLE and BEAGLECALL software packages. PMID:19931040

  18. A Commercial Device Involving the Breathalyzer Test Reaction.

    ERIC Educational Resources Information Center

    Dombrink, Kathleen J.

    1996-01-01

    Describes the working of Final Call, a commercially available breath analyzing device, which uses the chemical reaction involving the reduction of chromium (VI) in the orange dichromate ion to the green chromium (III) ion to detect ethyl alcohol. Presents a demonstration that simulates the use of a Final Call device. (JRH)
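
    The color change the device exploits is the acid dichromate oxidation of ethanol to acetic acid; the standard balanced net ionic equation (general chemistry, not quoted from the article) is:

```latex
2\,\mathrm{Cr_2O_7^{2-}}\;(\text{orange}) \;+\; 3\,\mathrm{C_2H_5OH} \;+\; 16\,\mathrm{H^+}
\;\longrightarrow\;
4\,\mathrm{Cr^{3+}}\;(\text{green}) \;+\; 3\,\mathrm{CH_3COOH} \;+\; 11\,\mathrm{H_2O}
```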

  19. Dimension Reduction of Multivariable Optical Emission Spectrometer Datasets for Industrial Plasma Processes

    PubMed Central

    Yang, Jie; McArdle, Conor; Daniels, Stephen

    2014-01-01

    A new data dimension-reduction method, called Internal Information Redundancy Reduction (IIRR), is proposed for application to Optical Emission Spectroscopy (OES) datasets obtained from industrial plasma processes. For example in a semiconductor manufacturing environment, real-time spectral emission data is potentially very useful for inferring information about critical process parameters such as wafer etch rates, however, the relationship between the spectral sensor data gathered over the duration of an etching process step and the target process output parameters is complex. OES sensor data has high dimensionality (fine wavelength resolution is required in spectral emission measurements in order to capture data on all chemical species involved in plasma reactions) and full spectrum samples are taken at frequent time points, so that dynamic process changes can be captured. To maximise the utility of the gathered dataset, it is essential that information redundancy is minimised, but with the important requirement that the resulting reduced dataset remains in a form that is amenable to direct interpretation of the physical process. To meet this requirement and to achieve a high reduction in dimension with little information loss, the IIRR method proposed in this paper operates directly in the original variable space, identifying peak wavelength emissions and the correlative relationships between them. A new statistic, Mean Determination Ratio (MDR), is proposed to quantify the information loss after dimension reduction and the effectiveness of IIRR is demonstrated using an actual semiconductor manufacturing dataset. As an example of the application of IIRR in process monitoring/control, we also show how etch rates can be accurately predicted from IIRR dimension-reduced spectral data. PMID:24451453
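
    The redundancy-reduction idea, working directly in the original wavelength space so that the kept channels remain physically interpretable, can be sketched as a greedy correlation filter. This is a simplified stand-in for IIRR and its MDR statistic, not the actual algorithm.

```python
import numpy as np

def reduce_redundant_channels(X, corr_threshold=0.95):
    """Keep a spectral channel only if it is not strongly correlated with a
    channel already kept; retained variables stay in the original space."""
    n_channels = X.shape[1]
    corr = np.abs(np.corrcoef(X, rowvar=False))
    kept = []
    for j in range(n_channels):
        if all(corr[j, k] < corr_threshold for k in kept):
            kept.append(j)
    return kept

rng = np.random.default_rng(0)
base = rng.standard_normal((100, 3))        # 3 independent emission lines
X = np.column_stack([
    base[:, 0],
    base[:, 0] * 2.0 + 1e-3 * rng.standard_normal(100),  # duplicates channel 0
    base[:, 1],
    base[:, 2],
])
print(reduce_redundant_channels(X))  # → [0, 2, 3]
```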

  20. Sidelobe reduction and capacity improvement of open-loop collaborative beamforming in wireless sensor networks

    PubMed Central

    2017-01-01

    Collaborative beamforming (CBF) with a finite number of collaborating nodes (CNs) produces sidelobes that are highly dependent on the collaborating nodes’ locations. The sidelobes cause interference and affect the communication rate of unintended receivers located within the transmission range. Nulling is not possible in an open-loop CBF since the collaborating nodes are unable to receive feedback from the receivers. Hence, overall sidelobe reduction is required to avoid interference in the directions of the unintended receivers. However, the impact of sidelobe reduction on the capacity improvement at the unintended receiver has never been reported in previous works. In this paper, the effect of peak sidelobe (PSL) reduction in CBF on the capacity of an unintended receiver is analyzed. Three meta-heuristic optimization methods are applied to perform PSL minimization, namely the genetic algorithm (GA), particle swarm optimization (PSO) and a simplified version of PSO called the weightless swarm algorithm (WSA). An average reduction of 20 dB in PSL alongside 162% capacity improvement is achieved in the worst case scenario with the WSA optimization. It is discovered that the PSL minimization in the CBF provides capacity improvement at an unintended receiver only if the CBF cluster is small and dense. PMID:28464000
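
    The quantity being minimized can be sketched as follows: for nodes at known planar positions, phase-align their transmissions toward a target azimuth and take the largest array-factor magnitude outside the mainlobe. This simple open-loop far-field model and all parameter values are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def peak_sidelobe_db(node_xy, wavelength=1.0, n_angles=721, mainlobe_halfwidth=15.0):
    """Peak sidelobe level in dB relative to the mainlobe for a random planar
    array steered to broadside (azimuth 0), evaluated over all azimuths."""
    phi = np.linspace(-180.0, 180.0, n_angles)
    k = 2 * np.pi / wavelength
    # path-length difference per node between direction phi and the steer direction
    proj = (np.outer(np.sin(np.radians(phi)), node_xy[:, 0])
            + np.outer(np.cos(np.radians(phi)) - 1.0, node_xy[:, 1]))
    af = np.abs(np.exp(1j * k * proj).sum(axis=1)) / len(node_xy)  # |AF| = 1 at phi = 0
    af_db = 20 * np.log10(af + 1e-12)
    sidelobes = af_db[np.abs(phi) > mainlobe_halfwidth]
    return float(sidelobes.max())

rng = np.random.default_rng(2)
nodes = rng.uniform(-2, 2, size=(16, 2))  # 16 collaborating nodes, positions in wavelengths
psl = peak_sidelobe_db(nodes)
print(psl < 0)  # sidelobes sit below the 0 dB mainlobe
```

    A meta-heuristic such as PSO would treat `node_xy` (or the nodes' excitation phases) as the decision variables and `peak_sidelobe_db` as the objective to minimize.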

  1. Missing the target: including perspectives of women with overweight and obesity to inform stigma‐reduction strategies

    PubMed Central

    Himmelstein, M. S.; Gorin, A. A.; Suh, Y. J.

    2017-01-01

    Summary Objective Pervasive weight stigma and discrimination have led to ongoing calls for efforts to reduce this bias. Despite increasing research on stigma‐reduction strategies, perspectives of individuals who have experienced weight stigma have rarely been included to inform this research. The present study conducted a systematic examination of women with high body weight to assess their perspectives about a broad range of strategies to reduce weight‐based stigma. Methods Women with overweight or obesity (N = 461) completed an online survey in which they evaluated the importance, feasibility and potential impact of 35 stigma‐reduction strategies in diverse settings. Participants (91.5% who reported experiencing weight stigma) also completed self‐report measures assessing experienced and internalized weight stigma. Results Most participants assigned high importance to all stigma‐reduction strategies, with school‐based and healthcare approaches accruing the highest ratings. Adding weight stigma to existing anti‐harassment workplace training was rated as the most impactful and feasible strategy. The family environment was viewed as an important intervention target, regardless of participants' experienced or internalized stigma. Conclusion These findings underscore the importance of including people with stigmatized identities in stigma‐reduction research; their insights provide a necessary and valuable contribution that can inform ways to reduce weight‐based inequities and prioritize such efforts. PMID:28392929

  2. Flexible conformable hydrophobized surfaces for turbulent flow drag reduction

    PubMed Central

    Brennan, Joseph C; Geraldi, Nicasio R; Morris, Robert H; Fairhurst, David J; McHale, Glen; Newton, Michael I

    2015-01-01

    In recent years extensive work has been focused onto using superhydrophobic surfaces for drag reduction applications. Superhydrophobic surfaces retain a gas layer, called a plastron, when submerged underwater in the Cassie-Baxter state with water in contact with the tops of surface roughness features. In this state the plastron allows slip to occur across the surface which results in a drag reduction. In this work we report flexible and relatively large area superhydrophobic surfaces produced using two different methods: Large roughness features were created by electrodeposition on copper meshes; Small roughness features were created by embedding carbon nanoparticles (soot) into Polydimethylsiloxane (PDMS). Both samples were made into cylinders with a diameter under 12 mm. To characterize the samples, scanning electron microscope (SEM) images and confocal microscope images were taken. The confocal microscope images were taken with each sample submerged in water to show the extent of the plastron. The hydrophobized electrodeposited copper mesh cylinders showed drag reductions of up to 32% when comparing the superhydrophobic state with a wetted out state. The soot covered cylinders achieved a 30% drag reduction when comparing the superhydrophobic state to a plain cylinder. These results were obtained for turbulent flows with Reynolds numbers 10,000 to 32,500. PMID:25975704

  3. Computer-Aided Breast Cancer Diagnosis with Optimal Feature Sets: Reduction Rules and Optimization Techniques.

    PubMed

    Mathieson, Luke; Mendes, Alexandre; Marsden, John; Pond, Jeffrey; Moscato, Pablo

    2017-01-01

    This chapter introduces a new method for knowledge extraction from databases for the purpose of finding a discriminative set of features that is also a robust set for within-class classification. Our method is generic and we introduce it here in the field of breast cancer diagnosis from digital mammography data. The mathematical formalism is based on a generalization of the k-Feature Set problem called (α, β)-k-Feature Set problem, introduced by Cotta and Moscato (J Comput Syst Sci 67(4):686-690, 2003). This method proceeds in two steps: first, an optimal (α, β)-k-feature set of minimum cardinality is identified and then, a set of classification rules using these features is obtained. We obtain the (α, β)-k-feature set in two phases; first a series of extremely powerful reduction techniques, which do not lose the optimal solution, are employed; and second, a metaheuristic search to identify the remaining features to be considered or disregarded. Two algorithms were tested with a public domain digital mammography dataset composed of 71 malignant and 75 benign cases. Based on the results provided by the algorithms, we obtain classification rules that employ only a subset of these features.
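
    The (α, β)-k-feature-set conditions of Cotta and Moscato can be checked by brute force over example pairs: inter-class pairs must differ on at least α selected features, intra-class pairs must agree on at least β. A small sketch with invented toy data:

```python
from itertools import combinations

def is_alpha_beta_feature_set(data, labels, features, alpha, beta):
    """Verify the (alpha, beta)-k-feature-set conditions on binary data:
    - any two examples in different classes differ on >= alpha selected features
    - any two examples in the same class agree on >= beta selected features"""
    for (xi, yi), (xj, yj) in combinations(zip(data, labels), 2):
        diff = sum(xi[f] != xj[f] for f in features)
        agree = len(features) - diff
        if yi != yj and diff < alpha:
            return False
        if yi == yj and agree < beta:
            return False
    return True

data = [(1, 0, 1), (1, 1, 1), (0, 1, 0), (0, 0, 0)]
labels = [1, 1, 0, 0]
print(is_alpha_beta_feature_set(data, labels, features=[0, 2], alpha=2, beta=2))  # → True
```

    Finding a minimum-cardinality such set is the NP-hard optimization step that the chapter's reduction rules and metaheuristic search address.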

  4. Approach and case-study of green infrastructure screening analysis for urban stormwater control.

    PubMed

    Eaton, Timothy T

    2018-03-01

    Urban stormwater control is an urgent concern in megacities where increased impervious surface has disrupted natural hydrology. Water managers are increasingly turning to more environmentally friendly ways of capturing stormwater, called Green Infrastructure (GI), to mitigate combined sewer overflow (CSO) that degrades local water quality. A rapid screening approach is described to evaluate how GI strategies can reduce the amount of stormwater runoff in a low-density residential watershed in New York City. Among multiple possible tools, the L-THIA LID online software package, using the SCS-CN method, was selected to estimate relative runoff reductions expected with different strategies in areas of different land uses in the watershed. Results are sensitive to the relative areas of different land uses, and show that bioretention and raingardens provide the maximum reduction (∼12%) in this largely residential watershed. Although commercial, industrial and high-density residential areas in the watershed are minor, larger runoff reductions from disconnection strategies and porous pavement in parking lots are also possible. Total stormwater reductions from various combinations of these strategies can reach 35-55% for individual land uses, and between 23% and 42% for the entire watershed. Copyright © 2017. Published by Elsevier Ltd.
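
    The SCS-CN runoff estimate underlying L-THIA can be sketched from its standard formula, Q = (P − 0.2S)² / (P + 0.8S) with S = 1000/CN − 10 (inch units). The CN values below are illustrative assumptions, not taken from the study.

```python
def scs_cn_runoff(precip_in: float, cn: float) -> float:
    """SCS Curve Number runoff depth (inches) for a storm of depth precip_in.
    S = 1000/CN - 10; Q = (P - 0.2S)^2 / (P + 0.8S) when P exceeds the
    initial abstraction 0.2S, else zero runoff."""
    s = 1000.0 / cn - 10.0
    ia = 0.2 * s                      # initial abstraction
    if precip_in <= ia:
        return 0.0
    return (precip_in - ia) ** 2 / (precip_in - ia + s)

# A GI retrofit is modelled as a lower composite CN; runoff reduction follows
q_before = scs_cn_runoff(3.0, 85)     # e.g. dense residential surface
q_after = scs_cn_runoff(3.0, 70)      # same storm, bioretention lowers the CN
print(q_after < q_before)  # → True
```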

  5. 76 FR 12117 - Call for Comments on the Draft Report of the Adult Immunization Working Group to the National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-04

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Call for Comments on the Draft Report of the Adult Immunization Working Group to the National Vaccine Advisory Committee on Adult Immunization: Complex Challenges..., national adult immunization program that will lead to vaccine-preventable disease reduction by improving...

  6. Metamodeling Techniques to Aid in the Aggregation Process of Large Hierarchical Simulation Models

    DTIC Science & Technology

    2008-08-01

    [Slide residue: hierarchical aggregation metamodeling of mission- and campaign-level model outputs; complexity (spatial, temporal, etc.)] ...reduction, are called variance reduction techniques (VRT) [Law, 2006]. The implementation of some type of VRT can prove to be a very valuable tool

  7. An Ultrasonographic Periodontal Probe

    NASA Astrophysics Data System (ADS)

    Bertoncini, C. A.; Hinders, M. K.

    2010-02-01

    Periodontal disease, commonly known as gum disease, affects millions of people. The current method of detecting periodontal pocket depth is painful, invasive, and inaccurate. As an alternative to manual probing, an ultrasonographic periodontal probe is being developed to use ultrasound echo waveforms to measure periodontal pocket depth, which is the main measure of periodontal disease. Wavelet transforms and pattern classification techniques are implemented in artificial intelligence routines that can automatically detect pocket depth. The main pattern classification technique used here, called a binary classification algorithm, compares test objects with only two possible pocket depth measurements at a time and relies on dimensionality reduction for the final determination. This method correctly identifies up to 90% of the ultrasonographic probe measurements within the manual probe's tolerance.
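The binary classification algorithm described above compares candidate pocket depths two at a time. As a rough illustration of that tournament idea (the actual feature extraction and classifier are detailed in the paper; the nearest-centroid rule below is only an assumed stand-in):

```python
import numpy as np

def tournament_classify(centroids, x):
    """Binary tournament over candidate classes: compare two at a time and
    keep the class whose reduced-feature centroid is nearer to the test vector."""
    winner = 0
    for challenger in range(1, len(centroids)):
        if np.linalg.norm(x - centroids[challenger]) < np.linalg.norm(x - centroids[winner]):
            winner = challenger
    return winner

# Hypothetical dimensionally reduced features for three candidate pocket depths.
depth_centroids = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]])
predicted = tournament_classify(depth_centroids, np.array([4.6, 5.2]))
```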

  8. The DEEP-South: Scheduling and Data Reduction Software System

    NASA Astrophysics Data System (ADS)

    Yim, Hong-Suh; Kim, Myung-Jin; Bae, Youngho; Moon, Hong-Kyu; Choi, Young-Jun; Roh, Dong-Goo; the DEEP-South Team

    2015-08-01

    The DEep Ecliptic Patrol of the Southern sky (DEEP-South), started in October 2012, is currently in test runs with the first Korea Microlensing Telescope Network (KMTNet) 1.6 m wide-field telescope located at CTIO in Chile. While the primary objective of the DEEP-South is physical characterization of small bodies in the Solar System, it is expected to discover a large number of such bodies, many of them previously unknown. An automatic observation planning and data reduction software subsystem called "The DEEP-South Scheduling and Data reduction System" (the DEEP-South SDS) is currently being designed and implemented for observation planning, data reduction, and analysis of huge amounts of data with minimal human interaction. The DEEP-South SDS consists of three software subsystems: the DEEP-South Scheduling System (DSS), the Local Data Reduction System (LDR), and the Main Data Reduction System (MDR). The DSS manages observation targets, makes decisions on target priority and observation methods, schedules nightly observations, and archives data using a Database Management System (DBMS). The LDR is designed to detect moving objects in CCD images, while the MDR conducts photometry and reconstructs lightcurves. Based on the analyses made by the LDR and the MDR, the DSS schedules follow-up observations to be conducted at other KMTNet stations. By the end of 2015, we expect the DEEP-South SDS to achieve stable operation. We also plan to improve the SDS with a finely tuned observation strategy and more efficient data reduction in 2016.

  9. Multiadaptive Bionic Wavelet Transform: Application to ECG Denoising and Baseline Wandering Reduction

    NASA Astrophysics Data System (ADS)

    Sayadi, Omid; Shamsollahi, Mohammad B.

    2007-12-01

    We present a new modified wavelet transform, called the multiadaptive bionic wavelet transform (MABWT), that can be applied to ECG signals in order to remove noise from them under a wide range of noise variations. By using the definition of the bionic wavelet transform and adaptively determining both the center frequency of each scale and the [InlineEquation not available: see fulltext.]-function, the problem of desired signal decomposition is solved. Applying a newly proposed thresholding rule works successfully in denoising the ECG. Moreover, by using the multiadaptation scheme, lowpass noisy interference effects on the baseline of the ECG are removed as a direct task. The method was extensively tested with real and simulated ECG signals and showed high noise-reduction performance, comparable to that of the wavelet transform (WT). Quantitative evaluation of the proposed algorithm shows that the average SNR improvement of the MABWT is, in the best case, 1.82 dB higher than the WT-based results. The procedure has also proved largely advantageous over wavelet-based methods for baseline wander cancellation, including both DC components and baseline drifts.
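Wavelet-domain denoising of the kind generalized by the MABWT can be illustrated with an ordinary one-level Haar transform plus soft thresholding (the MABWT itself adapts the center frequency of each scale, which this sketch does not attempt):

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar wavelet soft-threshold denoising (len(x) must be even)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)       # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)       # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)   # soft thresholding
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)             # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2)
    return y

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)
clean = np.sin(2 * np.pi * 5 * t)              # a smooth ECG-like stand-in
noisy = clean + 0.3 * rng.standard_normal(t.size)
denoised = haar_denoise(noisy, thresh=0.5)
```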

  10. Poisson-Box Sampling algorithms for three-dimensional Markov binary mixtures

    NASA Astrophysics Data System (ADS)

    Larmier, Coline; Zoia, Andrea; Malvagi, Fausto; Dumonteil, Eric; Mazzolo, Alain

    2018-02-01

    Particle transport in Markov mixtures can be addressed by the so-called Chord Length Sampling (CLS) methods, a family of Monte Carlo algorithms taking into account the effects of stochastic media on particle propagation by generating on-the-fly the material interfaces crossed by the random walkers during their trajectories. Such methods enable a significant reduction of computational resources as opposed to reference solutions obtained by solving the Boltzmann equation for a large number of realizations of random media. CLS solutions, which neglect correlations induced by the spatial disorder, are faster albeit approximate, and might thus show discrepancies with respect to reference solutions. In this work we propose a new family of algorithms (called 'Poisson Box Sampling', PBS) aimed at improving the accuracy of the CLS approach for transport in d-dimensional binary Markov mixtures. In order to probe the features of PBS methods, we will focus on three-dimensional Markov media and revisit the benchmark problem originally proposed by Adams, Larsen and Pomraning [1] and extended by Brantley [2]: for these configurations we will compare reference solutions, standard CLS solutions and the new PBS solutions for scalar particle flux, transmission and reflection coefficients. PBS will be shown to perform better than CLS at the expense of a reasonable increase in computational time.
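The core of CLS is sampling material interfaces on the fly: in each material, the distance to the next interface is drawn from an exponential chord-length distribution, in competition with the distance to the next collision. A minimal 1D sketch for uncollided transmission through a purely absorbing binary slab (the benchmark configurations in the paper are 3D and include scattering):

```python
import numpy as np

def cls_transmission(L, sigma, mean_chord, n_particles=20000, seed=1):
    """Chord Length Sampling estimate of uncollided transmission through [0, L].

    sigma[i]: total cross section of material i; mean_chord[i]: mean chord length.
    Material interfaces are generated on the fly along each particle's path.
    """
    rng = np.random.default_rng(seed)
    p0 = mean_chord[0] / (mean_chord[0] + mean_chord[1])  # volume fraction of material 0
    transmitted = 0
    for _ in range(n_particles):
        x = 0.0
        mat = 0 if rng.random() < p0 else 1
        while True:
            d_interface = rng.exponential(mean_chord[mat])
            d_collision = rng.exponential(1.0 / sigma[mat]) if sigma[mat] > 0 else np.inf
            if x + min(d_interface, d_collision) >= L:
                transmitted += 1            # reached the far face without colliding
                break
            if d_collision < d_interface:
                break                       # collision inside the slab ends the history
            x += d_interface
            mat = 1 - mat                   # crossed into the other material
    return transmitted / n_particles

# One absorbing material mixed with void: transmission falls with slab thickness.
T_thin = cls_transmission(L=1.0, sigma=(2.0, 0.0), mean_chord=(1.0, 1.0))
T_thick = cls_transmission(L=3.0, sigma=(2.0, 0.0), mean_chord=(1.0, 1.0))
```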

  11. Hybridizable discontinuous Galerkin method for the 2-D frequency-domain elastic wave equations

    NASA Astrophysics Data System (ADS)

    Bonnasse-Gahot, Marie; Calandra, Henri; Diaz, Julien; Lanteri, Stéphane

    2018-04-01

    Discontinuous Galerkin (DG) methods are nowadays actively studied and increasingly exploited for the simulation of large-scale time-domain (i.e. unsteady) seismic wave propagation problems. Although theoretically applicable to frequency-domain problems as well, their use in this context has been hampered by the potentially large number of coupled unknowns they incur, especially in the 3-D case, as compared to classical continuous finite element methods. In this paper, we address this issue in the framework of the so-called hybridizable discontinuous Galerkin (HDG) formulations. As a first step, we study an HDG method for the resolution of the frequency-domain elastic wave equations in the 2-D case. We describe the weak formulation of the method and provide some implementation details. The proposed HDG method is assessed numerically including a comparison with a classical upwind flux-based DG method, showing better overall computational efficiency as a result of the drastic reduction of the number of globally coupled unknowns in the resulting discrete HDG system.

  12. Structural Noise and Acoustic Characteristics Improvement of Transport Power Plants

    NASA Astrophysics Data System (ADS)

    Chaynov, N. D.; Markov, V. A.; Savastenko, A. A.

    2018-03-01

    Reducing the noise generated during the operation of various machines and mechanisms is an urgent task for power plants and, in particular, for internal combustion engines. Sound emission from the vibrating surfaces of body parts is one of the main noise manifestations of a running engine and is called structural noise. The vibration of the outer surfaces of complex body parts and the calculation of their acoustic characteristics are determined with numerical methods. A combination of the finite element and boundary element methods has proved very effective here: the finite element method is used to calculate the vibrations of structural elements, and the boundary element method is used in the structural noise calculation. The main elements of the methodology and the results of structural noise analysis applied to a number of automobile engines are shown.

  13. 3D sensor placement strategy using the full-range pheromone ant colony system

    NASA Astrophysics Data System (ADS)

    Shuo, Feng; Jingqing, Jia

    2016-07-01

    An optimized sensor placement strategy is extremely beneficial for ensuring the safety and reducing the cost of structural health monitoring (SHM) systems. The sensors must be placed such that important dynamic information is obtained while the number of sensors is minimized. Common practice is to select individual sensor directions by one of several 1D sensor placement methods and then install triaxial sensors in those directions for monitoring; however, this may lead to non-optimal placement of many triaxial sensors. In this paper, a new method, called FRPACS, is proposed based on the ant colony system (ACS) to solve the optimal placement of triaxial sensors, which are placed as single units in an optimal fashion. The new method is then compared with other algorithms using the Dalian North Bridge. The computational precision and iteration efficiency of FRPACS are greatly improved compared with the original ACS and the EFI method.

  14. LCAMP: Location Constrained Approximate Message Passing for Compressed Sensing MRI

    PubMed Central

    Sung, Kyunghyun; Daniel, Bruce L; Hargreaves, Brian A

    2016-01-01

    Iterative thresholding methods have been extensively studied as faster alternatives to convex optimization methods for solving large problems in compressed sensing. A novel iterative thresholding method called LCAMP (Location Constrained Approximate Message Passing) is presented for reducing computational complexity and improving reconstruction accuracy when a nonzero-location (or sparse support) constraint can be obtained from view-shared images. LCAMP modifies the existing approximate message passing algorithm by replacing the thresholding stage with a location constraint, which avoids adjusting regularization parameters or thresholding levels. The method is first compared with conventional reconstruction methods using random 1D signals and then applied to dynamic contrast-enhanced breast MRI to demonstrate excellent reconstruction accuracy (less than 2% absolute difference) and low computation time (5 - 10 seconds using Matlab) with highly undersampled 3D data (244 × 128 × 48; overall reduction factor = 10). PMID:23042658
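The key modification in LCAMP, replacing the thresholding stage with a projection onto a known support, can be sketched with a plain gradient iteration (a simplified stand-in, not the AMP recursion with its Onsager correction term):

```python
import numpy as np

def ist_known_support(A, y, support, n_iter=500):
    """Iterative reconstruction with the thresholding stage replaced by a
    projection onto a known support (no threshold level to tune)."""
    x = np.zeros(A.shape[1])
    mask = np.zeros(A.shape[1], dtype=bool)
    mask[support] = True
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # safe gradient step size
    for _ in range(n_iter):
        x = x + step * A.T @ (y - A @ x)        # gradient step on ||y - Ax||^2
        x[~mask] = 0.0                          # location (support) constraint
    return x

rng = np.random.default_rng(0)
n, m, k = 32, 128, 5                            # measurements, signal length, sparsity
A = rng.standard_normal((n, m)) / np.sqrt(n)
x_true = np.zeros(m)
support = rng.choice(m, size=k, replace=False)
x_true[support] = rng.standard_normal(k)
x_hat = ist_known_support(A, A @ x_true, support)
```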

  15. Delivery of dsRNA through topical feeding for RNA interference in the citrus sap piercing-sucking hemipteran, Diaphorina citri.

    PubMed

    Killiny, Nabil; Kishk, Abdelaziz

    2017-06-01

    RNA interference (RNAi) is a powerful means to study functional genomics in insects. The delivery of dsRNA is a challenging step in the development of an RNAi assay. Here, we describe a new delivery method that increases the effectiveness of RNAi in the Asian citrus psyllid, Diaphorina citri. Bromophenol blue droplets were topically applied to fifth-instar nymphs and adults on the ventral side of the thorax between the three pairs of legs. In addition to video recordings that showed sucking of the bromophenol blue by the stylets, dissected guts turned blue, indicating that the uptake was through feeding. Thus, we called the method topical feeding. We targeted the abnormal wing disc gene (awd), also called nucleoside diphosphate kinase (NDPK), as a reporter gene to prove the uptake of dsRNA via this method of delivery. Our results showed that dsRNA-awd caused a reduction in awd expression and nymph mortality. Survival and lifespan of treated adults and of adults that emerged from treated nymphs were affected. Silencing awd caused wing malformation in adults that emerged from treated nymphs. Topical feeding is a highly efficient dsRNA delivery method for both nymphs and adults and could be used to increase the efficiency of RNAi in D. citri and other sap piercing-sucking hemipterans. © 2017 Wiley Periodicals, Inc.

  16. New Research Findings Since the 2007 Surgeon General’s Call to Action to Prevent and Reduce Underage Drinking: A Review

    PubMed Central

    Hingson, Ralph; White, Aaron

    2014-01-01

    Objective: In 2007, the U.S. Department of Health and Human Services issued The Surgeon General’s Call To Action To Prevent And Reduce Underage Drinking, a publication documenting a problem linked to nearly 5,000 injury deaths annually and poor academic performance, potential cognitive deficits, risky sexual behavior, physical and sexual assaults, and other substance use. This report reviews subsequent underage drinking and related traffic fatality trends and research on determinants, consequences, and prevention interventions. Method: New research reports, meta-analyses, and systematic literature reviews were examined. Results: Since the Call to Action, reductions in underage frequency of drinking, heavy drinking occasions, and alcohol-related traffic deaths that began in the 1980s when the drinking age nationally became 21 have continued. Knowledge regarding determinants and consequences, particularly the effects of early-onset drinking, parental alcohol provision, and cognitive effects, has expanded. Additional studies support associations between the legal drinking age of 21, zero tolerance laws, higher alcohol prices, and reduced drinking and related problems. New research suggests that use/lose laws, social host liability, internal possession laws, graduated licensing, and night driving restrictions reduce traffic deaths involving underage drinking drivers. Additional studies support the positive effects of individually oriented interventions, especially screening and brief motivational interventions, web and face-to-face social norms interventions, college web-based interventions, parental interventions, and multicomponent community interventions. Conclusions: Despite reductions in underage alcohol consumption and related traffic deaths, underage drinking remains an enduring problem. 
Continued research is warranted in minimally studied areas, such as prospective studies of alcohol and brain development, policy studies of use/lose laws, internal possession laws, social host liability, and parent–family interventions. PMID:24411808

  17. Synthesis of ferromagnetic nanoparticles, formic acid oxidation catalyst nanocomposites, and late-transition metal-boride intermetallics by unique synthetic methods and single-source precursors

    NASA Astrophysics Data System (ADS)

    Wellons, Matthew S.

    The design, synthesis, and characterization of magnetic alloy nanoparticles, supported formic acid oxidation catalysts, and superhard intermetallic composites are presented. Ferromagnetic equiatomic alloy nanoparticles of FePt, FePd, and CoPt were synthesized utilizing single-source heteronuclear organometallic precursors supported on an inert water-soluble matrix. Direct conversion of the precursor-support composite to supported ferromagnetic nanoparticles occurs under elevated temperatures and reducing conditions, with metal-ion reduction and minimal nanoparticle coalescence. Nanoparticles were easily extracted from the support by addition of water and characterized in structure and magnetic properties. Palladium- and platinum-based nanoparticles were synthesized with microwave-based and chemical metal-ion reduction strategies, respectively, and tested for catalytic performance in a direct formic acid fuel cell (DFAFC). A study of palladium carbide nanocomposites with various carbonaceous supports demonstrated strong activity comparable to commercially available palladium black, but poor catalytic longevity. Platinum-lead alloy nanocomposites synthesized by chemical reduction and supported on Vulcan carbon demonstrated strong activity and excellent catalytic longevity, and were subsequently incorporated into a prototype DFAFC. A new method for the synthesis of superhard ceramics on polymer substrates, called Confined Plasma Chemical Deposition (CPCD), was developed. The CPCD method utilizes a tuned Free Electron Laser to selectively decompose the single-source precursor, Re(CO)4(B3H8), in a plasma-like state, resulting in the superhard intermetallic ReB2 deposited on polymer substrates. Extension of this method to the synthesis of other hard or superhard ceramics (WB4, RuB2, and B4C) was demonstrated.
These three areas of research show new synthetic methods and novel materials of technological importance, resulting in a substantial advance in their respective fields.

  18. Artificial Neural Identification and LMI Transformation for Model Reduction-Based Control of the Buck Switch-Mode Regulator

    NASA Astrophysics Data System (ADS)

    Al-Rabadi, Anas N.

    2009-10-01

    This research introduces a new intelligent control method for the Buck converter using a newly developed small-signal model of the pulse width modulation (PWM) switch. The new method uses a supervised neural network to estimate certain parameters of the transformed system matrix [Ã]. Then, a numerical technique used in robust control, called linear matrix inequality (LMI) optimization, is used to determine the permutation matrix [P] so that a complete system transformation {[B̃], [C̃], [Ẽ]} is possible. The transformed model is then reduced using the method of singular perturbation, and state feedback control is applied to enhance system performance. The experimental results show that the new control methodology simplifies the model of the Buck converter and thus uses a simpler controller that produces the desired system response for performance enhancement.
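Singular-perturbation model reduction, the reduction step named above, residualizes the fast states instead of discarding them, which preserves the DC gain of the full model. A generic sketch for a linear state-space model (illustrative only; the paper applies the reduction after its LMI-based transformation):

```python
import numpy as np

def singular_perturbation_reduce(A, B, C, n_slow):
    """Reduce x' = Ax + Bu, y = Cx by residualizing the fast states:
    partition x = [x1 (slow); x2 (fast)], set x2' = 0, and eliminate x2.
    Unlike plain truncation, this preserves the DC gain of the full model."""
    A11, A12 = A[:n_slow, :n_slow], A[:n_slow, n_slow:]
    A21, A22 = A[n_slow:, :n_slow], A[n_slow:, n_slow:]
    B1, B2 = B[:n_slow], B[n_slow:]
    C1, C2 = C[:, :n_slow], C[:, n_slow:]
    A22inv = np.linalg.inv(A22)
    Ar = A11 - A12 @ A22inv @ A21
    Br = B1 - A12 @ A22inv @ B2
    Cr = C1 - C2 @ A22inv @ A21
    Dr = -C2 @ A22inv @ B2             # feedthrough contributed by the fast states
    return Ar, Br, Cr, Dr

# Two-state example: one slow mode (about -1) and one fast mode (about -10).
A = np.array([[-1.0, 0.5], [0.2, -10.0]])
B = np.array([[1.0], [0.3]])
C = np.array([[1.0, 0.2]])
Ar, Br, Cr, Dr = singular_perturbation_reduce(A, B, C, n_slow=1)
dc_full = -C @ np.linalg.inv(A) @ B
dc_red = Dr - Cr @ np.linalg.inv(Ar) @ Br
```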

  19. An Interactive Procedure to Preserve the Desired Edges during the Image Processing of Noise Reduction

    NASA Astrophysics Data System (ADS)

    Hsu, Chih-Yu; Huang, Hsuan-Yu; Lee, Lin-Tsang

    2010-12-01

    The paper proposes a new four-stage procedure to preserve the desired edges during the image processing of noise reduction. At the first stage, a denoised image is obtained from a noisy image. At the second stage, an edge map is obtained by the Canny edge detector to find the edges of the object contours. Manual modification of the edge map at the third stage is optional, to capture all the desired edges of the object contours. At the final stage, a new method called the Edge Preserved Inhomogeneous Diffusion Equation (EPIDE) is used to smooth the noisy image, or the image denoised at the first stage, while preserving edges. The Optical Character Recognition (OCR) results in the experiments show that the proposed procedure gives the best recognition result because of its edge-preservation capability.
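The published EPIDE formulation is not reproduced here, but the idea of diffusion steered by an edge map can be sketched generically: set the conductivity to zero on edge pixels, so smoothing never carries intensity across a marked contour.

```python
import numpy as np

def edge_preserving_diffusion(img, edge_map, n_iter=20, dt=0.2):
    """Inhomogeneous diffusion with zero conductivity on marked edge pixels,
    so noise is smoothed inside regions but marked contours stay sharp."""
    u = img.astype(float).copy()
    c = np.where(edge_map, 0.0, 1.0)
    for _ in range(n_iter):
        flux = np.zeros_like(u)
        for axis, shift in ((0, 1), (0, -1), (1, 1), (1, -1)):
            # flux with each neighbour, gated by both pixels' conductivity
            flux += c * np.roll(c, shift, axis) * (np.roll(u, shift, axis) - u)
        u += dt * flux                 # explicit step; dt <= 0.25 for stability
    return u

rng = np.random.default_rng(0)
img = np.zeros((16, 16))
img[:, 9:] = 1.0                       # a sharp vertical step
noisy = img + 0.2 * rng.standard_normal(img.shape)
edges = np.zeros((16, 16), dtype=bool)
edges[:, 8] = True                     # mark the step contour
edges[:, 0] = True                     # block np.roll's periodic wrap-around
smooth = edge_preserving_diffusion(noisy, edges)
```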

  20. Analysis and evaluation of process and equipment in tasks 2 and 4 of the Low Cost Solar Array project

    NASA Technical Reports Server (NTRS)

    Goldman, H.; Wolf, M.

    1978-01-01

    Several experimental and projected Czochralski crystal growing process methods were studied and compared to available operations and cost data of recent production Cz-pulling, in order to elucidate the role of the dominant cost-contributing factors. From this analysis, it becomes apparent that substantial cost reductions can be realized from technical advancements which fall into four categories: an increase in furnace productivity; the reduction of crucible cost through use of the crucible for the equivalent of multiple state-of-the-art crystals; the combined effect of several smaller technical improvements; and a carry-over effect of the expected availability of semiconductor-grade polysilicon at greatly reduced prices. A format for techno-economic analysis of solar cell production processes was developed, called the University of Pennsylvania Process Characterization (UPPC) format. The accumulated Cz process data are presented.

  1. Review of Research into the Concept of the Microblowing Technique for Turbulent Skin Friction Reduction

    NASA Technical Reports Server (NTRS)

    2004-01-01

    A new technology for reducing turbulent skin friction, called the Microblowing Technique (MBT), is presented. Results from proof-of-concept experiments show that this technology could potentially reduce turbulent skin friction by more than 50% relative to a solid flat plate for subsonic and supersonic flow conditions. The primary purpose of this review paper is to provide readers with information on the turbulent skin friction reduction obtained in many experiments using the MBT. Although the MBT carries a penalty associated with supplying the microblowing air, some combinations of the MBT with suction boundary layer control methods are an attractive alternative for a real application. Several computational simulations performed to understand the flow physics of the MBT are also included. More experiments and computational fluid dynamics (CFD) computations are needed to understand the unsteady flow nature of the MBT and to optimize this new technology.

  2. Computer program developed for flowsheet calculations and process data reduction

    NASA Technical Reports Server (NTRS)

    Alfredson, P. G.; Anastasia, L. J.; Knudsen, I. E.; Koppel, L. B.; Vogel, G. J.

    1969-01-01

    Computer program PACER-65 is used for flowsheet calculations and is easily adapted to process data reduction. Each unit, vessel, meter, and processing operation in the overall flowsheet is represented by a separate subroutine, which the program calls in the order required to complete an overall flowsheet calculation.
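The one-subroutine-per-unit organization described above maps naturally onto functions called in flowsheet order; a toy sketch (the unit models and stream variables here are invented purely for illustration):

```python
# Each process unit is a function mapping an input stream to output stream(s),
# mirroring the one-subroutine-per-unit organization of a flowsheet program.
def heater(stream):
    """Raise the stream temperature by a fixed 50 degrees (toy model)."""
    return {**stream, "T": stream["T"] + 50.0}

def splitter(stream, frac=0.5):
    """Split the stream into two branches with the given flow fraction."""
    a = {**stream, "flow": stream["flow"] * frac}
    b = {**stream, "flow": stream["flow"] * (1 - frac)}
    return a, b

def run_flowsheet(feed):
    """Call the unit subroutines in the order the flowsheet requires."""
    hot = heater(feed)
    product, recycle = splitter(hot, frac=0.7)
    return product, recycle

feed = {"flow": 100.0, "T": 25.0}
product, recycle = run_flowsheet(feed)
```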

  3. Improved Kalman Filter Method for Measurement Noise Reduction in Multi Sensor RFID Systems

    PubMed Central

    Eom, Ki Hwan; Lee, Seung Joon; Kyung, Yeo Sun; Lee, Chang Won; Kim, Min Chul; Jung, Kyung Kwon

    2011-01-01

    Recently, the range of available Radio Frequency Identification (RFID) tags has been widened to include smart RFID tags which can monitor their varying surroundings. One of the most important factors for better performance of a smart RFID system is accurate measurement from various sensors. In a multi-sensing environment, noisy signals are obtained because of the changing surroundings. We propose in this paper an improved Kalman filter method to reduce noise and obtain correct data. Performance of the Kalman filter is determined by the measurement and system noise covariances, usually called the R and Q variables in the Kalman filter algorithm. Choosing correct R and Q values is one of the most important design factors for good Kalman filter performance. For this reason, we propose an improved Kalman filter that advances the noise-reduction ability of the standard filter. Only the measurement noise covariance was considered, because the system architecture is simple, and it is adjusted by a neural network. With this method, more accurate data can be obtained from smart RFID tags. In a simulation, the proposed improved Kalman filter has 40.1%, 60.4% and 87.5% less Mean Squared Error (MSE) than the conventional Kalman filter method for a temperature sensor, humidity sensor and oxygen sensor, respectively. The performance of the proposed method was also verified with experiments. PMID:22346641
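The role of the R and Q covariances is easiest to see in a scalar Kalman filter. The sketch below hand-tunes R to the sensor noise, whereas the paper's method adjusts the measurement noise covariance with a neural network:

```python
import numpy as np

def kalman_1d(measurements, Q=1e-4, R=0.25, x0=0.0, P0=1.0):
    """Scalar Kalman filter for a near-constant signal:
    x_k = x_{k-1} + w (var Q),  z_k = x_k + v (var R).
    Filter quality hinges on R matching the actual sensor noise."""
    x, P = x0, P0
    estimates = []
    for z in measurements:
        P = P + Q                      # predict
        K = P / (P + R)                # Kalman gain
        x = x + K * (z - x)            # update with the measurement residual
        P = (1 - K) * P
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
truth = 22.0                           # e.g. a constant ambient temperature
z = truth + 0.5 * rng.standard_normal(200)
est = kalman_1d(z, R=0.25)             # R = 0.25 matches the 0.5-std sensor noise
```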

  5. A Combinatorial Approach to Detecting Gene-Gene and Gene-Environment Interactions in Family Studies

    PubMed Central

    Lou, Xiang-Yang; Chen, Guo-Bo; Yan, Lei; Ma, Jennie Z.; Mangold, Jamie E.; Zhu, Jun; Elston, Robert C.; Li, Ming D.

    2008-01-01

    Widespread multifactor interactions present a significant challenge in determining risk factors of complex diseases. Several combinatorial approaches, such as the multifactor dimensionality reduction (MDR) method, have emerged as a promising tool for better detecting gene-gene (G × G) and gene-environment (G × E) interactions. We recently developed a general combinatorial approach, namely the generalized multifactor dimensionality reduction (GMDR) method, which can entertain both qualitative and quantitative phenotypes and allows for both discrete and continuous covariates to detect G × G and G × E interactions in a sample of unrelated individuals. In this article, we report the development of an algorithm that can be used to study G × G and G × E interactions for family-based designs, called pedigree-based GMDR (PGMDR). Compared to the available method, our proposed method has several major improvements, including allowing for covariate adjustments and being applicable to arbitrary phenotypes, arbitrary pedigree structures, and arbitrary patterns of missing marker genotypes. Our Monte Carlo simulations provide evidence that the PGMDR method is superior in performance to identify epistatic loci compared to the MDR-pedigree disequilibrium test (PDT). Finally, we applied our proposed approach to a genetic data set on tobacco dependence and found a significant interaction between two taste receptor genes (i.e., TAS2R16 and TAS2R38) in affecting nicotine dependence. PMID:18834969

  6. Initial Reductive Reactions in Aerobic Microbial Metabolism of 2,4,6-Trinitrotoluene

    PubMed Central

    Vorbeck, Claudia; Lenke, Hiltrud; Fischer, Peter; Spain, Jim C.; Knackmuss, Hans-Joachim

    1998-01-01

    Because of its high electron deficiency, initial microbial transformations of 2,4,6-trinitrotoluene (TNT) are characterized by reductive rather than oxidative reactions. The reduction of the nitro groups seems to be the dominating mechanism, whereas hydrogenation of the aromatic ring, as described for picric acid, appears to be of minor importance. Thus, two bacterial strains enriched with TNT as a sole source of nitrogen under aerobic conditions, a gram-negative strain called TNT-8 and a gram-positive strain called TNT-32, carried out nitro-group reduction. In contrast, both a picric acid-utilizing Rhodococcus erythropolis strain, HL PM-1, and a 4-nitrotoluene-utilizing Mycobacterium sp. strain, HL 4-NT-1, possessed reductive enzyme systems, which catalyze ring hydrogenation, i.e., the addition of a hydride ion to the aromatic ring of TNT. The hydride-Meisenheimer complex thus formed (H−-TNT) was further converted to a yellow metabolite, which by electrospray mass and nuclear magnetic resonance spectral analyses was established as the protonated dihydride-Meisenheimer complex of TNT (2H−-TNT). Formation of hydride complexes could not be identified with the TNT-enriched strains TNT-8 and TNT-32, or with Pseudomonas sp. clone A (2NT−), for which such a mechanism has been proposed. Correspondingly, reductive denitration of TNT did not occur. PMID:16349484

  7. Development of a module for Cost-Benefit analysis of risk reduction measures for natural hazards for the CHANGES-SDSS platform

    NASA Astrophysics Data System (ADS)

    Berlin, Julian; Bogaard, Thom; Van Westen, Cees; Bakker, Wim; Mostert, Eric; Dopheide, Emile

    2014-05-01

    Cost-benefit analysis (CBA) is a well-known method widely used for the assessment of investments in both the private and public sectors. In the context of risk mitigation and the evaluation of risk reduction alternatives for natural hazards, its use is very important for evaluating the effectiveness of such efforts in terms of avoided monetary losses. However, the current method has some disadvantages related to the spatial distribution of the costs and benefits, the geographical distribution of the avoided damage and losses, and the variation in the areas that benefit in terms of invested money and avoided monetary risk. Decision-makers are often interested in how the costs and benefits are distributed among the different administrative units of a large area or region, so that they can compare and analyse the costs and benefits per administrative unit resulting from the implementation of risk reduction projects. In this work we first examine the cost-benefit procedure for natural hazards and how costs are assessed for several structural and non-structural risk reduction alternatives. We also examine the current problems of the method, such as the inclusion of cultural and social considerations that are complex to monetize, the problem of discounting future values using a defined interest rate, and the spatial distribution of costs and benefits. We further examine the additional benefits and the indirect costs associated with the implementation of risk reduction alternatives, such as the cost of an ugly landscape (also called negative benefits). In the last part we examine the current tools and software used in natural hazard assessment with support for CBA, and we propose design considerations for the implementation of the CBA module for the CHANGES-SDSS platform, an initiative of the ongoing 7th Framework Programme "CHANGES" of the European Commission.
Keywords: Risk management, Economics of risk mitigation, EU Flood Directive, resilience, prevention, cost benefit analysis, spatial distribution of costs and benefits
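The discounting step discussed above is standard: future avoided losses are converted to present value with a fixed interest rate, and a benefit-cost ratio above 1 marks a worthwhile measure. A minimal sketch with hypothetical numbers (all figures invented for illustration):

```python
def npv(cashflows, rate):
    """Net present value of yearly cashflows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def benefit_cost_ratio(capital_cost, annual_benefit, years, rate):
    """Discounted avoided losses divided by the upfront mitigation cost."""
    benefits = npv([0.0] + [annual_benefit] * years, rate)
    return benefits / capital_cost

# Hypothetical measure: 1.0 M upfront, 0.15 M/yr avoided flood losses over
# 20 years, discounted at 5% (the kind of figure computed per administrative unit).
bcr = benefit_cost_ratio(capital_cost=1.0, annual_benefit=0.15, years=20, rate=0.05)
```

Note how sensitive the result is to the discount rate: the same measure that passes at 5% can fail at a higher rate, which is one of the criticisms raised above.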

  8. Toda hierarchies and their applications

    NASA Astrophysics Data System (ADS)

    Takasaki, Kanehisa

    2018-05-01

    The 2D Toda hierarchy occupies a central position in the family of integrable hierarchies of the Toda type. The 1D Toda hierarchy and the Ablowitz–Ladik (aka relativistic Toda) hierarchy can be derived from the 2D Toda hierarchy as reductions. These integrable hierarchies have been applied to various problems of mathematics and mathematical physics since the 1990s. A recent example is a series of studies on models of statistical mechanics called the melting crystal model. This research has revealed that the aforementioned two reductions of the 2D Toda hierarchy underlie two different melting crystal models. Technical clues are a fermionic realization of the quantum torus algebra, special algebraic relations therein called shift symmetries, and a matrix factorization problem. The two melting crystal models thus exhibit remarkable similarity with the Hermitian and unitary matrix models for which the two reductions of the 2D Toda hierarchy play the role of fundamental integrable structures.

  9. Routh reduction and Cartan mechanics

    NASA Astrophysics Data System (ADS)

    Capriotti, S.

    2017-04-01

    In the present work a Cartan mechanics version of Routh reduction is considered, as an intermediate step towards Routh reduction in field theory. Motivation for this generalization comes from a scheme for integrable systems (Fehér and Gábor, 2002), used for understanding the occurrence of Toda field theories in the so-called Hamiltonian reduction of WZNW field theories (Fehér et al., 1992). As a way to accomplish this intermediate aim, this article also contains a formulation of the Lagrangian Adler-Kostant-Symes systems discussed in Fehér and Gábor (2002) in terms of Routh reduction.

  10. Disrupting Mating Behavior of Diaphorina citri (Liviidae).

    PubMed

    Lujo, S; Hartman, E; Norton, K; Pregmon, E A; Rohde, B B; Mankin, R W

    2016-12-01

    Severe economic damage from citrus greening disease, caused by 'Candidatus Liberibacter asiaticus' bacteria, has stimulated development of methods to reduce mating and reproduction in populations of its insect vector, Diaphorina citri (Hemiptera: Liviidae). Male D. citri find mating partners by walking on host plants, intermittently producing vibrational calls that stimulate duetting replies by receptive females. The replies provide orientational feedback, assisting the search process. To test a hypothesis that D. citri mating can be disrupted using vibrational signals that compete with and/or mask female replies, courtship bioassays were conducted in citrus trees with or without interference from female reply mimics produced by a vibrating buzzer. Statistically significant reductions occurred in the rates and proportions of mating when the buzzer produced reply mimics within 0.4 s after male courtship calls compared with undisturbed controls. Observations of courtship behaviors in the two bioassays revealed activity patterns that likely contributed to the reductions. In both disruption and control tests, males reciprocated frequently between structural bifurcations and other transition points where signal amplitudes changed. Males in the disruption bioassay had to select among vibrational signals combined from the buzzer and the female at each transition point. They often turned towards the buzzer instead of the female. There was a statistically significant reduction in the proportion of males mating if they contacted the buzzer, possibly due to its higher vibration amplitude and duration in comparison with female replies. Potential applications of D. citri mating disruption technology in citrus groves are discussed. Published by Oxford University Press on behalf of Entomological Society of America 2016. This work is written by US Government employees and is in the public domain in the US.

  11. Reduction of streamflow monitoring networks by a reference point approach

    NASA Astrophysics Data System (ADS)

    Cetinkaya, Cem P.; Harmancioglu, Nilgun B.

    2014-05-01

    Adoption of an integrated approach to water management strongly forces policy and decision-makers to focus on hydrometric monitoring systems as well. Existing hydrometric networks need to be assessed and revised against the requirements on water quantity data to support integrated management. One of the questions that a network assessment study should resolve is whether a current monitoring system can be consolidated in view of the increased expenditures in time, money and effort imposed on the monitoring activity. Within the last decade, governmental monitoring agencies in Turkey have foreseen an audit on all their basin networks in view of prevailing economic pressures. In particular, they question how they can decide whether monitoring should be continued or terminated at a particular site in a network. The presented study was initiated to address this question by examining the applicability of a method called the “reference point approach” (RPA) for network assessment and reduction purposes. The main objective of the study is to develop an easily applicable and flexible network reduction methodology, focusing mainly on the assessment of the “performance” of existing streamflow monitoring networks in view of variable operational purposes. The methodology is applied to 13 hydrometric stations in the Gediz Basin, along the Aegean coast of Turkey. The results have shown that the simplicity of the method, in contrast to more complicated computational techniques, is an asset that facilitates the involvement of decision makers in application of the methodology for a more interactive assessment procedure between the monitoring agency and the network designer. The method permits ranking of hydrometric stations with regard to multiple objectives of monitoring and the desired attributes of the basin network. Another distinctive feature of the approach is that it also assists decision making in cases with limited data and metadata. These features of the RPA approach highlight its advantages over existing network assessment and reduction methods.
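
The multi-objective ranking step can be illustrated as a distance-to-ideal-point scoring. The criteria, weights, and station values below are entirely hypothetical (the paper's actual RPA formulation and Gediz Basin data are not reproduced here); the sketch only shows how stations could be ordered against a reference point, with the worst-ranked stations becoming candidates for discontinuation.

```python
import math

def rank_stations(stations, criteria, weights):
    """Rank stations by weighted Euclidean distance to the ideal
    (reference) point: each criterion is min-max scaled to [0, 1] with
    1 taken as best, and a smaller distance means a better station."""
    lo = {c: min(s[c] for s in stations.values()) for c in criteria}
    hi = {c: max(s[c] for s in stations.values()) for c in criteria}
    def score(s):
        d2 = 0.0
        for c, w in zip(criteria, weights):
            x = (s[c] - lo[c]) / (hi[c] - lo[c]) if hi[c] > lo[c] else 1.0
            d2 += w * (1.0 - x) ** 2     # squared shortfall from the ideal
        return math.sqrt(d2)
    return sorted(stations, key=lambda name: score(stations[name]))

# Hypothetical criteria, all "higher is better": record length (years),
# basin coverage, and data quality.
stations = {
    "S1": {"years": 40, "coverage": 0.9, "quality": 0.8},
    "S2": {"years": 12, "coverage": 0.3, "quality": 0.6},
    "S3": {"years": 25, "coverage": 0.7, "quality": 0.9},
}
order = rank_stations(stations, ["years", "coverage", "quality"],
                      [0.5, 0.3, 0.2])
```

The last station in `order` is the weakest performer under the chosen weights and would be the first candidate for termination.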

  12. Systems configured to distribute a telephone call, communication systems, communication methods and methods of routing a telephone call to a service representative

    DOEpatents

    Harris, Scott H.; Johnson, Joel A.; Neiswanger, Jeffery R.; Twitchell, Kevin E.

    2004-03-09

    The present invention includes systems configured to distribute a telephone call, communication systems, communication methods and methods of routing a telephone call to a customer service representative. In one embodiment of the invention, a system configured to distribute a telephone call within a network includes a distributor adapted to connect with a telephone system, the distributor being configured to connect a telephone call using the telephone system and output the telephone call and associated data of the telephone call; and a plurality of customer service representative terminals connected with the distributor, a selected customer service representative terminal being configured to receive the telephone call and the associated data, the distributor and the selected customer service representative terminal being configured to synchronize application of the telephone call and associated data from the distributor to the selected customer service representative terminal.

  13. Multimodal observational assessment of quality and productivity benefits from the implementation of wireless technology for out of hours working

    PubMed Central

    Blakey, John D; Guy, Debbie; Simpson, Carl; Fearn, Andrew; Cannaby, Sharon; Wilson, Petra

    2012-01-01

    Objectives The authors investigated if a wireless system of call handling and task management for out of hours care could replace a standard pager-based system and improve markers of efficiency, patient safety and staff satisfaction. Design Prospective assessment using both quantitative and qualitative methods, including interviews with staff, a standard satisfaction questionnaire, independent observation, data extraction from work logs and incident reporting systems and analysis of hospital committee reports. Setting A large teaching hospital in the UK. Participants Hospital at night co-ordinators, clinical support workers and junior doctors handling approximately 10 000 tasks requested out of hours per month. Outcome measures Length of hospital stay, incidents reported, co-ordinator call logging activity, user satisfaction questionnaire, staff interviews. Results Users were more satisfied with the new system (satisfaction score 62/90 vs 82/90, p=0.0080). With the new system over 70 h/week of co-ordinator time was released, and there were fewer untoward incidents related to handover and medical response (OR=0.30, p=0.02). Broad clinical measures (cardiac arrest calls for peri-arrest situations and length of hospital stay) improved significantly in the areas covered by the new system. Conclusions The introduction of call handling software and mobile technology over a medical-grade wireless network improved staff satisfaction with the Hospital at Night system. Improvements in efficiency and information flow have been accompanied by a reduction in untoward incidents, length of stay and peri-arrest calls. PMID:22466035

  14. Modal ring method for the scattering of sound

    NASA Technical Reports Server (NTRS)

    Baumeister, Kenneth J.; Kreider, Kevin L.

    1993-01-01

    The modal element method for acoustic scattering can be simplified when the scattering body is rigid. In this simplified method, called the modal ring method, the scattering body is represented by a ring of triangular finite elements forming the outer surface. The acoustic pressure is calculated at the element nodes. The pressure in the infinite computational region surrounding the body is represented analytically by an eigenfunction expansion. The two solution forms are coupled by the continuity of pressure and velocity on the body surface. The modal ring method effectively reduces the two-dimensional scattering problem to a one-dimensional problem capable of handling very high frequency scattering. In contrast to the boundary element method or the method of moments, which perform a similar reduction in problem dimension, the modal ring method has the added advantage of having a highly banded solution matrix requiring considerably less computer storage. The method shows excellent agreement with analytic results for scattering from rigid circular cylinders over a wide frequency range (1 ≤ ka ≤ 100) in the near and far fields.

  15. A Web-Based Mindfulness Stress Management Program in a Corporate Call Center

    PubMed Central

    Allexandre, Didier; Bernstein, Adam M.; Walker, Esteban; Hunter, Jennifer; Roizen, Michael F.; Morledge, Thomas J.

    2016-01-01

    Objective: The objective of this study is to determine the effectiveness of an 8-week web-based, mindfulness stress management program (WSM) in a corporate call center and added benefit of group support. Methods: One hundred sixty-one participants were randomized to WSM, WSM with group support, WSM with group and expert clinical support, or wait-list control. Perceived stress, burnout, emotional and psychological well-being, mindfulness, and productivity were measured at baseline, weeks 8 and 16, and 1 year. Results: Online usage was low with participants favoring CD use and group practice. All active groups demonstrated significant reductions in perceived stress and increases in emotional and psychological well-being compared with control. Group support improved participation, engagement, and outcomes. Conclusion: A self-directed mindfulness program with group practice and support can provide an affordable, effective, and scalable workplace stress management solution. Engagement may also benefit from combining web-based and traditional CD delivery. PMID:26949875

  16. Linear reduction methods for tag SNP selection.

    PubMed

    He, Jingwu; Zelikovsky, Alex

    2004-01-01

    It is widely hoped that constructing a complete human haplotype map will help to associate complex diseases with certain SNPs. Unfortunately, the number of SNPs is huge and it is very costly to sequence many individuals. Therefore, it is desirable to reduce the number of SNPs that should be sequenced to a considerably smaller number of informative representatives, so-called tag SNPs. In this paper, we propose a new linear-algebra-based method for selecting and using tag SNPs. Our method is purely combinatorial and can be combined with linkage disequilibrium (LD) and block-based methods. We measure the quality of our tag SNP selection algorithm by comparing actual SNPs with SNPs linearly predicted from linearly chosen tag SNPs. We obtain extremely good compression and prediction rates. For example, for long haplotypes (>25000 SNPs), knowing only 0.4% of all SNPs we predict the entire unknown haplotype with 2% accuracy while the prediction method is based on a 10% sample of the population.
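
The linear-algebra idea can be sketched over GF(2), treating haplotypes as 0/1 vectors: a maximal linearly independent set of SNP columns serves as the tags, and every remaining SNP is an XOR combination of them. This is only an illustration of the general principle, not the paper's specific selection algorithm; the toy matrix below is made up.

```python
def select_tag_snps(haplotypes):
    """Gaussian elimination over GF(2) on a sample haplotype matrix
    (rows = individuals, columns = SNPs).  Returns the tag SNP column
    indices plus, for every column, its XOR coefficients over the tags."""
    n_rows, n_cols = len(haplotypes), len(haplotypes[0])
    m = [row[:] for row in haplotypes]
    tags = []
    for col in range(n_cols):
        pr = next((r for r in range(len(tags), n_rows) if m[r][col]), None)
        if pr is None:
            continue                     # column already spanned by the tags
        p = len(tags)
        m[p], m[pr] = m[pr], m[p]        # swap pivot row into place
        for r in range(n_rows):
            if r != p and m[r][col]:
                m[r] = [a ^ b for a, b in zip(m[r], m[p])]
        tags.append(col)
    coeffs = {c: [m[i][c] for i in range(len(tags))] for c in range(n_cols)}
    return tags, coeffs

def predict_snp(tag_values, coeffs, col):
    """Predict one SNP of a new individual from its typed tag SNPs only."""
    out = 0
    for c, t in zip(coeffs[col], tag_values):
        out ^= c & t
    return out

# Toy sample: column 3 is the XOR of columns 0 and 1; column 2 is constant 0.
sample = [[1, 0, 0, 1],
          [0, 1, 0, 1],
          [1, 1, 0, 0]]
tags, coeffs = select_tag_snps(sample)
```

Only the tag columns (here columns 0 and 1) would need to be sequenced in new individuals; the rest are reconstructed by XOR.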

  17. House Calls: California Program For Homebound Patients Reduces Monthly Spending, Delivers Meaningful Care.

    PubMed

    Melnick, Glenn A; Green, Lois; Rich, Jeremy

    2016-01-01

    In 2009 HealthCare Partners Affiliates Medical Group, based in Southern California, launched House Calls, an in-home program that provides, coordinates, and manages care primarily for recently discharged high-risk, frail, and psychosocially compromised patients. Its purpose is to reduce preventable emergency department visits and hospital readmissions. We present data over time from this well-established program to provide an example for other new programs that are being established across the United States to serve this population with complex needs. The findings show that the initial House Calls structure, staffing patterns, and processes differed across the geographic areas that it served, and that they also evolved over time in different ways. In the same time period, all areas experienced a reduction in operating costs per patient and showed substantial reductions in monthly per patient health care spending and hospital utilization after enrollment in the House Calls program, compared to the period before enrollment. Despite more than five years of experience, the program structure continues to evolve and adjust staffing and other features to accommodate the dynamic nature of this complex patient population. Project HOPE—The People-to-People Health Foundation, Inc.

  18. A blind source separation approach for humpback whale song separation.

    PubMed

    Zhang, Zhenbin; White, Paul R

    2017-04-01

    Many marine mammal species are highly social and are frequently encountered in groups or aggregations. When conducting passive acoustic monitoring in such circumstances, recordings commonly contain vocalizations of multiple individuals which overlap in time and frequency. This paper considers the use of blind source separation as a method for processing these recordings to separate the calls of individuals. The example problem considered here is that of the songs of humpback whales. The high levels of noise and long impulse responses can make source separation in underwater contexts a challenging proposition. The approach presented here is based on time-frequency masking, allied to a noise reduction process. The technique is assessed using simulated and measured data sets, and the results demonstrate the effectiveness of the method for separating humpback whale songs.
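
The core of time-frequency masking can be illustrated in a few lines. The sketch below assumes spectrograms have already been computed elsewhere (as plain 2-D lists of magnitudes) and assigns each time-frequency cell wholly to whichever source estimate dominates it, i.e. the usual approximate-disjointness assumption; the paper's actual mask estimation and noise-reduction stages are more involved and are not reproduced here.

```python
def binary_masks(est1, est2):
    """Binary time-frequency masks: each spectrogram cell goes wholly to
    whichever source estimate carries more energy there.  Inputs are
    2-D lists (time x frequency) of non-negative magnitudes."""
    m1 = [[1 if a >= b else 0 for a, b in zip(r1, r2)]
          for r1, r2 in zip(est1, est2)]
    m2 = [[1 - v for v in row] for row in m1]    # complementary mask
    return m1, m2

def apply_mask(spectrogram, mask):
    """Zero out the mixture cells not assigned to this source."""
    return [[s * m for s, m in zip(rs, rm)]
            for rs, rm in zip(spectrogram, mask)]
```

Each masked spectrogram would then be inverted back to a waveform to recover one singer's contribution.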

  19. Evaluation of TCDD biodegradability under different redox conditions.

    PubMed

    Kao, C M; Chen, S C; Liu, J K; Wu, M J

    2001-09-01

    Polychlorinated dibenzo-p-dioxins have been generated as unwanted by-products in many industrial processes. Although their widespread distribution in different environmental compartments has been recognized, little is known about their fate in the ultimate environmental sinks. The highly stable dioxin isomer 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) has been called the most toxic compound known to man. In this laboratory microcosm study, TCDD bioavailability was evaluated under five reduction/oxidation (redox) conditions including aerobic biodegradation, aerobic cometabolism, methanogenesis, iron reduction, and reductive dechlorination. Activated sludge and aquifer sediments from a TCDD and a pentachlorophenol (PCP) contaminated site were used as the inocula. Acetate, sludge cake, and cane molasses were used as the primary substrates (carbon sources) in cometabolism and reductive dechlorination microcosms. After a 90-day incubation period, microcosms constructed under reductive dechlorination conditions were the only treatment showing promising remediation results. The highest TCDD degradation rate [up to 86% of TCDD removal (with an initial concentration of 96 microg/kg of soil)] was observed in the microcosms with anaerobic activated sludge as the microbial inocula and sludge cakes as the primary substrates. Except for reductive dechlorination microcosms, no significant TCDD removal was observed in the microcosms prepared under other conditions. Thus, application of an effective primary substrate to enhance the reductive dechlorination process is a feasible method for TCDD bioremediation. Bioremediation expense can be significantly reduced by supplementing with less expensive alternative substrates (e.g., sludge cakes, cane molasses). Results would be useful in designing a scale-up in situ or on-site bioremediation system such as a bioslurry reactor for field application.

  20. Accurate RNA 5-methylcytosine site prediction based on heuristic physical-chemical properties reduction and classifier ensemble.

    PubMed

    Zhang, Ming; Xu, Yan; Li, Lei; Liu, Zi; Yang, Xibei; Yu, Dong-Jun

    2018-06-01

    RNA 5-methylcytosine (m5C) is an important post-transcriptional modification that plays an indispensable role in biological processes. The accurate identification of m5C sites from primary RNA sequences is especially useful for deeply understanding the mechanisms and functions of m5C. Due to the difficulty and expensive costs of identifying m5C sites with wet-lab techniques, developing fast and accurate machine-learning-based prediction methods is urgently needed. In this study, we proposed a new m5C site predictor, called M5C-HPCR, by introducing a novel heuristic nucleotide physicochemical property reduction (HPCR) algorithm and classifier ensemble. HPCR extracts multiple reducts of physical-chemical properties for encoding discriminative features, while the classifier ensemble is applied to integrate multiple base predictors, each of which is trained based on a separate reduct of the physical-chemical properties obtained from HPCR. Rigorous jackknife tests on two benchmark datasets demonstrate that M5C-HPCR outperforms state-of-the-art m5C site predictors, with the highest values of MCC (0.859) and AUC (0.962). We also implemented the webserver of M5C-HPCR, which is freely available at http://cslab.just.edu.cn:8080/M5C-HPCR/. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. A two-stage linear discriminant analysis via QR-decomposition.

    PubMed

    Ye, Jieping; Li, Qi

    2005-06-01

    Linear Discriminant Analysis (LDA) is a well-known method for feature extraction and dimension reduction. It has been used widely in many applications involving high-dimensional data, such as image and text classification. An intrinsic limitation of classical LDA is the so-called singularity problem; that is, it fails when all scatter matrices are singular. Many LDA extensions were proposed in the past to overcome the singularity problem. Among these extensions, PCA+LDA, a two-stage method, received relatively more attention. In PCA+LDA, the LDA stage is preceded by an intermediate dimension reduction stage using Principal Component Analysis (PCA). Most previous LDA extensions are computationally expensive, and not scalable, due to the use of Singular Value Decomposition or Generalized Singular Value Decomposition. In this paper, we propose a two-stage LDA method, namely LDA/QR, which aims to overcome the singularity problem of classical LDA while achieving efficiency and scalability simultaneously. The key difference between LDA/QR and PCA+LDA lies in the first stage, where LDA/QR applies QR decomposition to a small matrix involving the class centroids, while PCA+LDA applies PCA to the total scatter matrix involving all training data points. We further justify the proposed algorithm by showing the relationship between LDA/QR and previous LDA methods. Extensive experiments on face images and text documents are presented to show the effectiveness of the proposed algorithm.
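
The first stage of LDA/QR is cheap because the QR decomposition is applied only to the d x k centroid matrix (k = number of classes), not to the full data. A minimal pure-Python sketch of that stage is shown below; the toy data are made up, and the second stage (the LDA step carried out inside the reduced space) is omitted.

```python
def class_centroids(X, y):
    """Centroid matrix C (d x k): one column per class mean."""
    classes = sorted(set(y))
    means = []
    for c in classes:
        rows = [x for x, lab in zip(X, y) if lab == c]
        means.append([sum(v) / len(rows) for v in zip(*rows)])
    return [list(col) for col in zip(*means)]      # transpose to d x k

def orthonormal_basis(C):
    """Modified Gram-Schmidt QR of the small centroid matrix, keeping
    only Q: an orthonormal basis for the span of the class centroids."""
    Q = []
    for v in (list(col) for col in zip(*C)):       # iterate over columns
        for q in Q:
            dot = sum(a * b for a, b in zip(q, v))
            v = [a - dot * b for a, b in zip(v, q)]
        norm = sum(a * a for a in v) ** 0.5
        if norm > 1e-12:                           # skip dependent columns
            Q.append([a / norm for a in v])
    return Q

def reduce_dim(x, Q):
    """Project a sample onto the centroid subspace (dimension <= k)."""
    return [sum(a * b for a, b in zip(q, x)) for q in Q]

# Toy 3-D data with two classes.
X = [[1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [3.0, 0.0, 0.0], [3.0, 1.0, 0.0]]
y = [0, 0, 1, 1]
Q = orthonormal_basis(class_centroids(X, y))
```

Because Q has at most k columns, every sample is reduced to a k-dimensional vector before any scatter matrix is formed, which is what sidesteps the singularity problem.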

  2. Spectral properties of thermal fluctuations on simple liquid surfaces below shot-noise levels.

    PubMed

    Aoki, Kenichiro; Mitsui, Takahisa

    2012-07-01

    We study the spectral properties of thermal fluctuations on simple liquid surfaces, sometimes called ripplons. Analytical properties of the spectral function are investigated and are shown to be composed of regions with simple analytic behavior with respect to the frequency or the wave number. The derived expressions are compared to spectral measurements performed orders of magnitude below shot-noise levels, which is achieved using a novel noise reduction method. The agreement between the theory of thermal surface fluctuations and the experiment is found to be excellent, elucidating the spectral properties of the surface fluctuations. The measurement method requires relatively only a small sample both spatially (few μm) and temporally (~20 s). The method also requires relatively weak light power (~0.5 mW) so that it has a broad range of applicability, including local measurements, investigations of time-dependent phenomena, and noninvasive measurements.

  3. Cox-nnet: An artificial neural network method for prognosis prediction of high-throughput omics data.

    PubMed

    Ching, Travers; Zhu, Xun; Garmire, Lana X

    2018-04-01

    Artificial neural networks (ANN) are computing architectures with many interconnections of simple neural-inspired computing elements, and have been applied to biomedical fields such as imaging analysis and diagnosis. We have developed a new ANN framework called Cox-nnet to predict patient prognosis from high throughput transcriptomics data. In 10 TCGA RNA-Seq data sets, Cox-nnet achieves the same or better predictive accuracy compared to other methods, including Cox-proportional hazards regression (with LASSO, ridge, and minimax concave penalty), Random Forests Survival and CoxBoost. Cox-nnet also reveals richer biological information, at both the pathway and gene levels. The outputs from the hidden layer node provide an alternative approach for survival-sensitive dimension reduction. In summary, we have developed a new method for accurate and efficient prognosis prediction on high throughput data, with functional biological insights. The source code is freely available at https://github.com/lanagarmire/cox-nnet.
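
The loss that couples a neural network to survival data in this setting is the Cox partial likelihood, evaluated on the network's output scores. The sketch below implements the standard Breslow form without ties handling as an illustration; it is not Cox-nnet's actual implementation, and the toy data are made up.

```python
import math

def cox_loss(times, events, scores):
    """Negative Cox partial log-likelihood (Breslow form, ignoring ties).
    `times` are follow-up times, `events` are 1 for observed failures and
    0 for censoring, and `scores` are the model's log-hazard predictions."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    ll = 0.0
    for pos, i in enumerate(order):
        if events[i]:                     # an observed (uncensored) failure
            risk_set = order[pos:]        # subjects still at risk at times[i]
            ll += scores[i] - math.log(
                sum(math.exp(scores[j]) for j in risk_set))
    return -ll
```

A model that assigns higher risk scores to subjects who fail earlier achieves a lower loss, which is the gradient signal a survival network trains on.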

  4. Declining trends in injuries and ambulance calls for road traffic crashes in Bahrain post new traffic laws of 2015.

    PubMed

    Awadhalla, Muyssar Sabri; Asokan, Govindaraj Vaithinathan; Matooq, Amina; Kirubakaran, Richard

    2016-06-01

    Road traffic crashes (RTC) are of serious global health concern. To identify whether the number of ambulance calls, injuries, and deaths has declined after the implementation of the new traffic law (NTL) 2015 in Bahrain, de-identified administrative RTC data obtained from the tertiary care center, and the General Directorate of Traffic (GDT) of Bahrain were used. A quasi-experimental design was employed to trend the impact of the NTL on RTC and associated healthcare events. Bahrainis and non-Bahrainis who met with RTC, either in a vehicle or as a pedestrian, between February 8 and May 8 in 2013, 2014 (pre NTL), and 2015 (post NTL) were included in the study. Our results show a reduction in the number of ambulance calls from vehicular and pedestrian RTC victims. The ambulance calls from pedestrian RTC victims were <10% compared to the number of ambulance calls from vehicular RTC victims. There was a significant reduction in minor injuries post 2015, whereas no obvious difference was seen for serious injuries and deaths. A longer follow-up study to confirm the sustained decline in RTC, enforcing a zero tolerance policy toward traffic transgressions, and raising public awareness on the "critical four minutes" and "golden hour" is recommended. Copyright © 2016 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.

  5. A Corresponding Lie Algebra of a Reductive homogeneous Group and Its Applications

    NASA Astrophysics Data System (ADS)

    Zhang, Yu-Feng; Wu, Li-Xin; Rui, Wen-Juan

    2015-05-01

    With the help of a Lie algebra of a reductive homogeneous space G/K, where G is a Lie group and K is the resulting isotropy group, we introduce a Lax pair for which an expanding (2+1)-dimensional integrable hierarchy is obtained by applying the binormial-residue representation (BRR) method. Its Hamiltonian structure is derived from the trace identity for deducing (2+1)-dimensional integrable hierarchies, which was proposed by Tu et al. We further consider some reductions of the expanding integrable hierarchy obtained in the paper. The first reduction is just the (2+1)-dimensional AKNS hierarchy; the second-type reduction reveals an integrable coupling of the (2+1)-dimensional AKNS equation (also called the Davey-Stewartson hierarchy), a kind of (2+1)-dimensional Schrödinger equation, which was once reobtained by Tu, Feng and Zhang. It is interesting that a new (2+1)-dimensional integrable nonlinear coupled equation is generated from the reduction of part of the (2+1)-dimensional integrable coupling, which is further reduced to the standard (2+1)-dimensional diffusion equation along with a parameter. In addition, the well-known (1+1)-dimensional AKNS hierarchy and the (1+1)-dimensional nonlinear Schrödinger equation are both special cases of the (2+1)-dimensional expanding integrable hierarchy. Finally, we discuss a few discrete difference equations of the diffusion equation whose stabilities are analyzed by making use of the von Neumann condition and the Fourier method. Some numerical solutions of a special stationary initial value problem of the (2+1)-dimensional diffusion equation are obtained, and the resulting convergence and estimation formula are investigated. Supported by the Innovation Team of Jiangsu Province hosted by China University of Mining and Technology (2014), the National Natural Science Foundation of China under Grant No. 11371361, the Fundamental Research Funds for the Central Universities (2013XK03), and the Natural Science Foundation of Shandong Province under Grant No. ZR2013AL016

  6. FEAMAC/CARES Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composites

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu; Bhatt, Ramakrishna

    2016-01-01

    Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code, and an example problem of a laminated CMC under off-axis loading, taken from the open literature, is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading, using elastic stiffness reduction of the failed elements.

  7. Stochastic-Strength-Based Damage Simulation Tool for Ceramic Matrix Composite

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Bednarcyk, Brett; Pineda, Evan; Arnold, Steven; Mital, Subodh; Murthy, Pappu

    2015-01-01

    Reported here is a coupling of two NASA-developed codes: CARES (Ceramics Analysis and Reliability Evaluation of Structures) with the MAC/GMC (Micromechanics Analysis Code/Generalized Method of Cells) composite material analysis code. The resulting code is called FEAMAC/CARES and is constructed as an Abaqus finite element analysis UMAT (user defined material). Here we describe the FEAMAC/CARES code, and an example problem of a laminated CMC under off-axis loading, taken from the open literature, is shown. FEAMAC/CARES performs stochastic-strength-based damage simulation of the response of a CMC under multiaxial loading, using elastic stiffness reduction of the failed elements.

  8. Reducing the Carbon Footprint of the USDA-ARS "Soils" Lab in Morris, Minnesota

    USDA-ARS?s Scientific Manuscript database

    The Soils Lab in Morris adopted energy goals originally set forth by Executive Order 13423, which in short, called for a 30% reduction in energy use in federal facilities by 2015 and a 16% reduction in water use in the same time frame. Executive Order 13514 "Federal Leadership in Environmental, Ener...

  9. 76 FR 78924 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-20

    ... Budget (OMB) in compliance with the Paperwork Reduction Act (44 U.S.C. chapter 35). To request a copy of these requests, call the CDC Reports Clearance Officer at (404) 639-5960 or send an email to [email protected] National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention (NCHHSTP), Centers for Disease...

  10. 77 FR 17063 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-23

    ... Budget (OMB) in compliance with the Paperwork Reduction Act (44 U.S.C. Chapter 35). To request a copy of these requests, call the CDC Reports Clearance Officer at (404) 639-7570 or send an email to [email protected] HIV, Hepatitis, STD and TB Prevention (NCHHSTP), Centers for Disease Control and Prevention (CDC...

  11. Public Constructs of Energy Values and Behaviors in Implementing Taiwan's "Energy-Conservation/Carbon-Reduction" Declarations

    ERIC Educational Resources Information Center

    Chiu, Mei-Shiu; Yeh, Huei-Ming; Spangler, Jonathan

    2016-01-01

    The emergent crisis of global warming calls for energy education for people of all ages and social groups. The Taiwanese government has publicized 10 declarations on energy conservation and carbon reduction as public behavior guidelines to mitigate global warming. This study uses interviews with quantitative assessment to explore the values and…

  12. Compressed Speech: Potential Application for Air Force Technical Training. Final Report, August 73-November 73.

    ERIC Educational Resources Information Center

    Dailey, K. Anne

    Time-compressed speech (also called compressed speech, speeded speech, or accelerated speech) is an extension of the normal recording procedure for reproducing the spoken word. Compressed speech can be used to achieve dramatic reductions in listening time without significant loss in comprehension. The implications of such temporal reductions in…

  13. Calibrating passive acoustic monitoring: correcting humpback whale call detections for site-specific and time-dependent environmental characteristics.

    PubMed

    Helble, Tyler A; D'Spain, Gerald L; Campbell, Greg S; Hildebrand, John A

    2013-11-01

    This paper demonstrates the importance of accounting for environmental effects on passive underwater acoustic monitoring results. The situation considered is the reduction in shipping off the California coast between 2008 and 2010 due to the recession and environmental legislation. The resulting variations in ocean noise change the probability of detecting marine mammal vocalizations. An acoustic model was used to calculate the time-varying probability of detecting humpback whale vocalizations under best-guess environmental conditions and varying noise. The uncorrected call counts suggest a diel pattern and an increase in calling over a two-year period; the corrected call counts show minimal evidence of these features.
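
The correction step itself is simple once a detection probability has been modelled for each time bin: divide the detected counts by the probability of detection. The numbers below are invented purely to show how an apparent diel pattern in raw counts can vanish after correction; the paper's acoustic propagation modelling, which produces the probabilities, is the hard part and is not sketched here.

```python
def correct_counts(raw_counts, p_detect):
    """Estimated true call counts per time bin: detected counts divided
    by the modelled probability of detection in that bin."""
    return [c / p if p > 0 else float("nan")
            for c, p in zip(raw_counts, p_detect)]

# Hypothetical hourly bins: the raw counts suggest a mid-day peak, but the
# peak coincides with a bin of quieter ambient noise (higher detectability).
raw = [10, 30, 10]
p_det = [0.2, 0.6, 0.2]
corrected = correct_counts(raw, p_det)   # the apparent pattern vanishes
```

After correction all three bins carry the same estimated calling rate, mirroring the abstract's finding that the diel pattern was largely an artifact of varying noise.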

  14. Multiresolution generalized N dimension PCA for ultrasound image denoising

    PubMed Central

    2014-01-01

    Background Ultrasound images are usually affected by speckle, a type of random multiplicative noise. Reducing speckle and improving visual quality are therefore vital to obtaining a better diagnosis. Method In this paper, a novel noise reduction method for medical ultrasound images, called multiresolution generalized N dimension PCA (MR-GND-PCA), is presented. In this method, a Gaussian pyramid and multiscale image stacks on each level are built first. GND-PCA, a multilinear subspace learning method, is then used for denoising, and the levels are combined via Laplacian pyramid reconstruction to obtain the final denoised image. Results The proposed method is tested with synthetically speckled and real ultrasound images, and quality evaluation metrics, including MSE, SNR and PSNR, are used to evaluate its performance. Conclusion Experimental results show that the proposed method achieved the lowest noise interference and improved image quality by reducing noise and preserving structure. The method is also robust for images with much higher levels of speckle noise. For clinical images, the results show that MR-GND-PCA can reduce speckle and preserve resolvable details. PMID:25096917
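    The pyramid structure underlying MR-GND-PCA can be illustrated in one dimension: a smoothed (Gaussian) level plus a detail (Laplacian) residual, which reconstruct the signal exactly when no denoising is applied. This is only a structural sketch with invented values; the smoothing filter stands in for a Gaussian kernel and the GND-PCA denoising step is omitted:

    ```python
    def smooth(x):
        """3-tap moving average with edge replication (toy Gaussian blur)."""
        padded = [x[0]] + list(x) + [x[-1]]
        return [(padded[i - 1] + padded[i] + padded[i + 1]) / 3
                for i in range(1, len(x) + 1)]

    def decompose(x):
        low = smooth(x)                             # Gaussian level
        residual = [a - b for a, b in zip(x, low)]  # Laplacian (detail) level
        return low, residual

    def reconstruct(low, residual):
        return [a + b for a, b in zip(low, residual)]

    signal = [3.0, 6.0, 3.0, 6.0, 3.0]
    low, res = decompose(signal)
    restored = reconstruct(low, res)
    print(restored == signal)  # -> True (lossless without the denoising step)
    ```

    In the actual method, each pyramid level is denoised (by GND-PCA) before the Laplacian reconstruction, so noise is suppressed at multiple scales while structure carried by the residuals survives.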

  15. Beluga whale, Delphinapterus leucas, vocalizations from the Churchill River, Manitoba, Canada.

    PubMed

    Chmelnitsky, Elly G; Ferguson, Steven H

    2012-06-01

    Classification of animal vocalizations is often done by a human observer using aural and visual analysis, but more efficient, automated methods have also been utilized to reduce bias and increase reproducibility. Beluga whale, Delphinapterus leucas, calls were described from recordings collected in the summers of 2006-2008 in the Churchill River, Manitoba. Calls (n=706) were classified based on aural and visual analysis, and call characteristics were measured; calls were separated into 453 whistles (64.2%; 22 types), 183 pulsed/noisy calls (25.9%; 15 types), and 70 combined calls (9.9%; 7 types). Measured parameters varied within each call type, but less variation existed in pulsed and noisy call types and some combined call types than in whistles. A more efficient and repeatable hierarchical clustering method was applied to 200 randomly chosen whistles using six call characteristics as variables; twelve groups were identified. Call characteristics varied less in cluster analysis groups than in whistle types described by visual and aural analysis, and the results were similar to the whistle contours described. This study provides the first description of beluga calls in Hudson Bay, and the use of two methods provides more robust interpretations and an assessment of appropriate methods for future studies.
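    Hierarchical (agglomerative) clustering of calls by measured characteristics can be sketched in a few lines: start with one cluster per call and repeatedly merge the closest pair. This single-linkage toy uses two invented features rather than the study's six, and is not the authors' implementation:

    ```python
    def euclid(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def agglomerate(points, n_clusters):
        """Single-linkage agglomerative clustering down to n_clusters."""
        clusters = [[p] for p in points]
        while len(clusters) > n_clusters:
            best = None  # (distance, i, j) of the closest cluster pair
            for i in range(len(clusters)):
                for j in range(i + 1, len(clusters)):
                    d = min(euclid(a, b)
                            for a in clusters[i] for b in clusters[j])
                    if best is None or d < best[0]:
                        best = (d, i, j)
            _, i, j = best
            clusters[i] += clusters.pop(j)
        return clusters

    # Hypothetical whistles as (duration, peak frequency) feature pairs
    calls = [(1.0, 1.1), (1.2, 0.9), (5.0, 5.2), (5.1, 4.8)]
    groups = agglomerate(calls, 2)
    print(len(groups))  # -> 2
    ```

    The repeatability claimed for this approach comes from the fact that, unlike aural classification, the merge sequence is fully determined by the measured features and the linkage rule.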

  16. Influence of a New “Call-Out Algorithm” for Management of Postoperative Pain and Its Side Effects on Length of Stay in Hospital: A Two-Centre Prospective Randomized Trial

    PubMed Central

    Dybvik, Lisa; Skraastad, Erlend; Yeltayeva, Aigerim; Konkayev, Aidos; Musaeva, Tatiana; Zabolotskikh, Igor; Dahl, Vegard; Raeder, Johan

    2017-01-01

    Background We recently introduced the efficacy safety score (ESS) as a new “call-out algorithm” for management of postoperative pain and side effects. In this study, we report the influence of ESS, recorded hourly during the first 8 hours after surgery, on degree of mobility, postoperative nonsurgical complications, and length of hospital stay (LOS). Methods We randomized 1152 surgical patients into three groups for postoperative observation: (1) ESS group (n = 409), (2) Verbal Numeric Rating Scale (VNRS) for pain group (n = 417), and (3) an ordinary qualitative observation (Control) group (n = 326). An ESS > 10, a VNRS > 4 at rest, or a nurse's observation of pain or an adverse reaction to analgesic treatment in the Control group served as a “call-out alarm” for an anaesthesiologist. Results We found no significant differences in degree of mobility or number of postoperative nonsurgical complications between the groups. LOS was significantly shorter in the ESS group (12.7 ± 6.3 days, mean ± SD) than in the Control group (14.2 ± 6.2 days; P < 0.001). Conclusion Postoperative ESS recording, combined with the possibility of calling upon an anaesthesiologist when the threshold score is exceeded, might have contributed to the reduction of LOS in this two-centre study. This trial is registered with NCT02143128. PMID:28855800
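    The call-out rule itself is a simple threshold check on the hourly score. A hypothetical sketch (threshold values taken from the alarm criteria quoted above; readings invented):

    ```python
    THRESHOLDS = {"ESS": 10, "VNRS": 4}  # alarm criteria from the study

    def needs_callout(kind, score):
        """True when the hourly score crosses its group's alarm threshold."""
        return score > THRESHOLDS[kind]

    hourly_ess = [3, 6, 12, 8]  # hypothetical readings over four hours
    alarms = [h for h, s in enumerate(hourly_ess) if needs_callout("ESS", s)]
    print(alarms)  # -> [2]
    ```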

  17. Multilinear Graph Embedding: Representation and Regularization for Images.

    PubMed

    Chen, Yi-Lei; Hsu, Chiou-Ting

    2014-02-01

    Given a set of images, finding a compact and discriminative representation is still a big challenge, especially when multiple latent factors underlie the data generation. Although multilinear models are widely used to parameterize multifactor images, most methods are based on high-order singular value decomposition (HOSVD), which preserves global statistics but interprets local variations inadequately. To this end, we propose a novel method, called multilinear graph embedding (MGE), as well as its kernelization, MKGE, to bring manifold learning techniques into multilinear models. Our method theoretically links linear, nonlinear, and multilinear dimensionality reduction. We also show that supervised MGE encodes informative image priors for image regularization, provided that an image is represented as a high-order tensor. In our experiments on face and gait recognition, the superior performance demonstrates that MGE represents multifactor images better than classic methods, including HOSVD and its variants. In addition, the significant improvement in image (or tensor) completion validates the potential of MGE for image regularization.

  18. Automatic abdominal lymph node detection method based on local intensity structure analysis from 3D x-ray CT images

    NASA Astrophysics Data System (ADS)

    Nakamura, Yoshihiko; Nimura, Yukitaka; Kitasaka, Takayuki; Mizuno, Shinji; Furukawa, Kazuhiro; Goto, Hidemi; Fujiwara, Michitaka; Misawa, Kazunari; Ito, Masaaki; Nawano, Shigeru; Mori, Kensaku

    2013-03-01

    This paper presents an automated method of abdominal lymph node detection to aid the preoperative diagnosis for abdominal cancer surgery. In abdominal cancer surgery, surgeons must resect not only tumors and metastases but also lymph nodes that might harbor a metastasis. This procedure is called lymphadenectomy, or lymph node dissection. Insufficient lymphadenectomy carries a high risk of relapse, but excessive resection decreases a patient's quality of life. It is therefore important to identify the location and structure of lymph nodes in order to make a suitable surgical plan. The proposed method consists of candidate lymph node detection and false positive reduction. Candidate lymph nodes are detected using a multi-scale blob-like enhancement filter based on local intensity structure analysis. To reduce false positives, the method uses a support vector machine classifier with texture and shape information. The experimental results reveal that the method detects 70.5% of the lymph nodes with 13.0 false positives per case.
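    The two-stage structure (candidate generation, then classifier-based false-positive reduction) can be sketched with a trivial stand-in classifier replacing the trained SVM; all data, field names, and the cutoff are hypothetical:

    ```python
    def detect_candidates(voxels, blob_threshold=0.5):
        """Stage 1: keep positions whose blob-enhancement score is high."""
        return [v for v in voxels if v["blob_score"] >= blob_threshold]

    def filter_false_positives(candidates, classify):
        """Stage 2: a trained classifier rejects false positives."""
        return [c for c in candidates if classify(c["features"])]

    def classify(features):
        # Stand-in for the trained SVM: accept candidates whose
        # (hypothetical) texture feature exceeds a learned cutoff.
        return features["texture"] > 0.3

    voxels = [
        {"pos": (10, 4, 2), "blob_score": 0.9, "features": {"texture": 0.6}},
        {"pos": (33, 7, 5), "blob_score": 0.8, "features": {"texture": 0.1}},
        {"pos": (50, 1, 9), "blob_score": 0.2, "features": {"texture": 0.9}},
    ]
    hits = filter_false_positives(detect_candidates(voxels), classify)
    print([h["pos"] for h in hits])  # -> [(10, 4, 2)]
    ```

    The design choice is typical of detection pipelines: a cheap, sensitive first stage keeps recall high, and a discriminative second stage buys back precision.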

  19. 14C sample preparation for AMS microdosing studies at Lund University using online combustion and septa-sealed vials

    NASA Astrophysics Data System (ADS)

    Sydoff, Marie; Stenström, Kristina

    2010-04-01

    The Department of Physics at Lund University is participating in a European Union project called EUMAPP (European Union Microdose AMS Partnership Programme), in which sample preparation and accelerator mass spectrometry (AMS) measurements of biological samples from microdosing studies have been made. This paper describes a simplified method of converting biological samples to solid graphite for 14C analysis with AMS. The method is based on online combustion of the samples, and reduction of CO2 in septa-sealed vials. The septa-sealed vials and disposable materials are used to eliminate sample cross-contamination. Measurements of ANU and Ox I standards show deviations of 2% and 3%, respectively, relative to reference values. This level of accuracy is sufficient for biological samples from microdosing studies. Since the method has very few handling steps from sample to graphite, the risk of failure during the sample preparation process is minimized, making the method easy to use in routine preparation of samples.

  20. Efficient experimental design for uncertainty reduction in gene regulatory networks.

    PubMed

    Dehghannasiri, Roozbeh; Yoon, Byung-Jun; Dougherty, Edward R

    2015-01-01

    An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/.
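    The MOCU quantity at the heart of this design strategy has a compact discrete form: the expected excess cost of applying the single robust intervention instead of the per-model optimal one. A toy sketch with invented priors and costs (the full method additionally evaluates, for each candidate experiment, the expected remaining MOCU over its possible outcomes):

    ```python
    def mocu(prior, cost):
        """Mean objective cost of uncertainty over a discrete model class.
        prior: model -> probability; cost: model -> list of action costs."""
        n_actions = len(next(iter(cost.values())))
        # The robust action minimizes expected cost across uncertain models.
        expected = [sum(prior[m] * cost[m][a] for m in prior)
                    for a in range(n_actions)]
        robust = min(range(n_actions), key=expected.__getitem__)
        # Expected excess cost of acting robustly vs. optimally per model.
        return sum(prior[m] * (cost[m][robust] - min(cost[m])) for m in prior)

    prior = {"net_A": 0.5, "net_B": 0.5}               # hypothetical priors
    cost = {"net_A": [1.0, 3.0], "net_B": [4.0, 1.0]}  # cost[model][action]
    print(mocu(prior, cost))  # -> 1.0
    ```

    An experiment is valuable exactly when its outcome prunes models in a way that shrinks this quantity, which is why the optimal experiment is the one minimizing the expected remaining MOCU.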

  1. Efficient experimental design for uncertainty reduction in gene regulatory networks

    PubMed Central

    2015-01-01

    Background An accurate understanding of interactions among genes plays a major role in developing therapeutic intervention methods. Gene regulatory networks often contain a significant amount of uncertainty. The process of prioritizing biological experiments to reduce the uncertainty of gene regulatory networks is called experimental design. Under such a strategy, the experiments with high priority are suggested to be conducted first. Results The authors have already proposed an optimal experimental design method based upon the objective for modeling gene regulatory networks, such as deriving therapeutic interventions. The experimental design method utilizes the concept of mean objective cost of uncertainty (MOCU). MOCU quantifies the expected increase of cost resulting from uncertainty. The optimal experiment to be conducted first is the one which leads to the minimum expected remaining MOCU subsequent to the experiment. In the process, one must find the optimal intervention for every gene regulatory network compatible with the prior knowledge, which can be prohibitively expensive when the size of the network is large. In this paper, we propose a computationally efficient experimental design method. This method incorporates a network reduction scheme by introducing a novel cost function that takes into account the disruption in the ranking of potential experiments. We then estimate the approximate expected remaining MOCU at a lower computational cost using the reduced networks. Conclusions Simulation results based on synthetic and real gene regulatory networks show that the proposed approximate method has close performance to that of the optimal method but at lower computational cost. The proposed approximate method also outperforms the random selection policy significantly. A MATLAB software implementing the proposed experimental design method is available at http://gsp.tamu.edu/Publications/supplementary/roozbeh15a/. PMID:26423515

  2. Research supporting potential modification of the NASA specification for dry heat microbial reduction of spacecraft hardware

    NASA Astrophysics Data System (ADS)

    Spry, James A.; Beaudet, Robert; Schubert, Wayne

    Dry heat microbial reduction (DHMR) is the primary method currently used to reduce the microbial load of spacecraft and component parts to comply with planetary protection requirements. However, manufacturing processes often involve heating flight hardware to high temperatures for purposes other than planetary protection DHMR. At present, the specification in NASA document NPR8020.12, describing the process lethality on B. atrophaeus (ATCC 9372) bacterial spores, does not allow for additional planetary protection bioburden reduction credit for processing outside a narrow temperature, time and humidity window. Our results from a comprehensive multi-year laboratory research effort have generated enhanced data sets on four aspects of the current specification: time and temperature effects in combination, the effect that humidity has on spore lethality, and the lethality for spores with exceptionally high thermal resistance (so-called "hardies"). This paper describes potential modifications to the specification, based on the data set generated in the referenced studies. The proposed modifications are intended to broaden the scope of the current specification while still maintaining confidence in a conservative interpretation of the lethality of the DHMR process on microorganisms.

  3. Adiabatic reduction of a model of stochastic gene expression with jump Markov process.

    PubMed

    Yvinec, Romain; Zhuge, Changjing; Lei, Jinzhi; Mackey, Michael C

    2014-04-01

    This paper considers adiabatic reduction in a model of stochastic gene expression with bursting transcription considered as a jump Markov process. In this model, the process of gene expression with auto-regulation is described by fast/slow dynamics. The production of mRNA is assumed to follow a compound Poisson process occurring at a rate depending on protein levels (the phenomena called bursting in molecular biology) and the production of protein is a linear function of mRNA numbers. When the dynamics of mRNA is assumed to be a fast process (due to faster mRNA degradation than that of protein) we prove that, with appropriate scalings in the burst rate, jump size or translational rate, the bursting phenomena can be transmitted to the slow variable. We show that, depending on the scaling, the reduced equation is either a stochastic differential equation with a jump Poisson process or a deterministic ordinary differential equation. These results are significant because adiabatic reduction techniques seem to have not been rigorously justified for a stochastic differential system containing a jump Markov process. We expect that the results can be generalized to adiabatic methods in more general stochastic hybrid systems.

  4. Aircraft interior noise reduction by alternate resonance tuning

    NASA Technical Reports Server (NTRS)

    Gottwald, James A.; Bliss, Donald B.

    1990-01-01

    The focus is on a noise control method which considers aircraft fuselages lined with panels alternately tuned to frequencies above and below the frequency that must be attenuated. An interior noise reduction method called alternate resonance tuning (ART) is described both theoretically and experimentally. Problems dealing with tuning single-paneled wall structures for optimum noise reduction using the ART methodology are presented, and three theoretical problems are analyzed. The first analysis is a three-dimensional, full acoustic solution for tuning a panel wall composed of repeating sections with four different panel tunings within each section, where the panels are modeled as idealized spring-mass-damper systems. The second analysis is a two-dimensional, full acoustic solution for a panel geometry influenced by the effect of a propagating external pressure field, such as that which might be associated with propeller passage by a fuselage. To reduce the analysis complexity, idealized spring-mass-damper panels are again employed. The final theoretical analysis presents the general four-panel problem with real panel sections, where the effect of higher structural modes is discussed. Results from an experimental program highlight real applications of the ART concept and show the effectiveness of the tuning on real structures.

  5. 77 FR 6802 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... AIDS cases reported to the CDC. Among Latino males, male-to-male sexual contact is the single most... Budget (OMB) in compliance with the Paperwork Reduction Act (44 U.S.C. Chapter 35). To request a copy of these requests, call the CDC Reports Clearance Officer at (404) 639-7570 or send an email to [email protected

  6. Combination of BTrackS and Geri-Fit as a targeted approach for assessing and reducing the postural sway of older adults with high fall risk

    PubMed Central

    Goble, Daniel J; Hearn, Mason C; Baweja, Harsimran S

    2017-01-01

    Atypically high postural sway measured by a force plate is a known risk factor for falls in older adults. Further, it has been shown that small, but significant, reductions in postural sway are possible with various balance exercise interventions. In the present study, a new low-cost force-plate technology called the Balance Tracking System (BTrackS) was utilized to assess postural sway of older adults before and after 90 days of a well-established exercise program called Geri-Fit. Results showed an overall reduction in postural sway across all participants from pre- to post-intervention. However, the magnitude of effects was significantly influenced by the amount of postural sway demonstrated by individuals prior to Geri-Fit training. Specifically, more participants with atypically high postural sway pre-intervention experienced an overall postural sway reduction. These reductions were typically greater than the minimum detectable change statistic for the BTrackS Balance Test. Taken together, these findings suggest that BTrackS is an effective means of identifying older adults with elevated postural sway, who are likely to benefit from Geri-Fit training to mitigate fall risk. PMID:28228655

  7. Combination of BTrackS and Geri-Fit as a targeted approach for assessing and reducing the postural sway of older adults with high fall risk.

    PubMed

    Goble, Daniel J; Hearn, Mason C; Baweja, Harsimran S

    2017-01-01

    Atypically high postural sway measured by a force plate is a known risk factor for falls in older adults. Further, it has been shown that small, but significant, reductions in postural sway are possible with various balance exercise interventions. In the present study, a new low-cost force-plate technology called the Balance Tracking System (BTrackS) was utilized to assess postural sway of older adults before and after 90 days of a well-established exercise program called Geri-Fit. Results showed an overall reduction in postural sway across all participants from pre- to post-intervention. However, the magnitude of effects was significantly influenced by the amount of postural sway demonstrated by individuals prior to Geri-Fit training. Specifically, more participants with atypically high postural sway pre-intervention experienced an overall postural sway reduction. These reductions were typically greater than the minimum detectable change statistic for the BTrackS Balance Test. Taken together, these findings suggest that BTrackS is an effective means of identifying older adults with elevated postural sway, who are likely to benefit from Geri-Fit training to mitigate fall risk.

  8. Improving Upon String Methods for Transition State Discovery.

    PubMed

    Chaffey-Millar, Hugh; Nikodem, Astrid; Matveev, Alexei V; Krüger, Sven; Rösch, Notker

    2012-02-14

    Transition state discovery via application of string methods has been researched on two fronts. The first front involves development of a new string method, named the Searching String method, while the second one aims at estimating transition states from a discretized reaction path. The Searching String method has been benchmarked against a number of previously existing string methods and the Nudged Elastic Band method. The developed methods have led to a reduction in the number of gradient calls required to optimize a transition state, as compared to existing methods. The Searching String method reported here places new beads on a reaction pathway at the midpoint between existing beads, such that the resolution of the path discretization in the region containing the transition state grows exponentially with the number of beads. This approach leads to favorable convergence behavior and generates more accurate estimates of transition states from which convergence to the final transition states occurs more readily. Several techniques for generating improved estimates of transition states from a converged string or nudged elastic band have been developed and benchmarked on 13 chemical test cases. Optimization approaches for string methods, and pitfalls therein, are discussed.
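    The midpoint-insertion idea described above can be illustrated geometrically: repeatedly bisect the interval whose endpoints bracket the current energy maximum, so bead density grows near the transition state. This is a toy 1-D sketch with an invented energy surface, not the authors' implementation, which operates on full reaction pathways in configuration space:

    ```python
    def energy(x):
        """Toy 1-D energy surface with its barrier top at x = 0.6."""
        return -(x - 0.6) ** 2

    def refine(beads, n_insert):
        """Insert beads at midpoints of the interval bracketing the
        current energy maximum, concentrating path resolution there."""
        beads = sorted(beads)
        for _ in range(n_insert):
            i = max(range(len(beads)), key=lambda k: energy(beads[k]))
            if i == 0:
                lo = 0
            elif i == len(beads) - 1:
                lo = i - 1
            else:  # bisect on the side with the higher-energy neighbour
                lo = i - 1 if energy(beads[i - 1]) > energy(beads[i + 1]) else i
            beads.insert(lo + 1, (beads[lo] + beads[lo + 1]) / 2)
        return beads

    path = refine([0.0, 0.5, 1.0], 4)
    # Beads cluster near x = 0.6, the barrier top.
    ```

    Because each insertion halves an interval near the maximum, the local resolution grows roughly exponentially with the number of beads, which is what yields the improved transition state estimates reported above.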

  9. The impact of patient record access on appointments and telephone calls in two English general practices: a population-based study

    PubMed Central

    Fitton, Caroline; Fitton, Richard; Hannan, Amir; Morgan, Lawrie; Halsall, David

    2014-01-01

    Background Government policy expects all patients who wish to have online record access (RA) by 2015. We currently have no knowledge of the impact of patient record access on practice workload. Setting Two urban general practices in Manchester. Question What is the impact of patient RA on telephone calls and appointments in UK general practice? Method We asked patients in two urban general practices who used RA whether it had increased or decreased their use of the practice over the previous year. Using practice data, we calculated the change in appointments, telephone calls and staff cost. We also estimated the reduction in environmental costs and patient time. Results An average of 187 clinical appointments (of which 87 were with doctors and 45 with nurses) and 290 telephone calls were saved. If 30% of patients used RA at least twice a year, these figures suggest that a 10 000-patient practice would save 4747 appointments and 8020 telephone calls per year. Assuming a consultation rate of 5.3% annually, that equates to a release of about 11% of appointments per year, with significant resource savings for patients and the environment. Discussion This is the first such study in the UK. It shows similar results to a study in the USA. We discuss the study limitations, including the issue of patient recall, nature of the practices studied and nature of early adopter patients. Strengths include combining national data, practice data and local reflection. We are confident that the savings observed are the result of RA rather than other factors. We suggest that RA can be part of continuous practice improvement, given its benefits and the support it offers for patient confidence, self-care and shared decision-making. PMID:25949705

  10. Deterrence and WMD Terrorism: Calibrating Its Potential Contributions to Risk Reduction

    DTIC Science & Technology

    2007-06-01

    ideology and aspiration (so-called franchisees) • operational enablers (financiers etc.) • moral legitimizers • state sponsors • passive state...of deterrence.14 One is “deterrence by the threat of punishment,” which compels the adversary to try to calculate whether the potential benefits of

  11. A morphological filter for removing 'Cirrus-like' emission from far-infrared extragalactic IRAS fields

    NASA Technical Reports Server (NTRS)

    Appleton, P. N.; Siqueira, P. R.; Basart, J. P.

    1993-01-01

    The presence of diffuse extended IR emission from the Galaxy in the form of the so-called 'Galactic Cirrus' emission has hampered the exploration of the extragalactic sky at long IR wavelengths. We describe the development of a filter based on mathematical morphology which appears to be a promising approach to the problem of cirrus removal. The method of Greyscale Morphology was applied to a 100 micron IRAS image of the M81 group of galaxies. This is an extragalactic field which suffers from serious contamination from foreground Galactic 'cirrus'. Using a technique called 'sieving', it was found that the cirrus emission has a characteristic behavior which can be quantified in terms of an average spatial structure spectrum or growth function. This function was then used to attempt to remove 'cirrus' from the entire image. The result was a significant reduction of cirrus emission by an intensity factor of 15 compared with the original input image. The method appears to preserve extended emission in the spatially extended IR disks of M81 and M82 as well as distinguishing fainter galaxies within bright regions of galactic cirrus. The techniques may also be applicable to IR databases obtained with the Cosmic Background Explorer.
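    Greyscale morphological 'sieving' applies openings (erosion followed by dilation) at a sequence of increasing scales to remove structure smaller than the structuring element. A 1-D sketch at a single scale, with invented signal values, shows the core operation: a narrow spike is removed while a broad feature survives.

    ```python
    def erode(signal, size):
        r = size // 2
        return [min(signal[max(0, i - r):i + r + 1]) for i in range(len(signal))]

    def dilate(signal, size):
        r = size // 2
        return [max(signal[max(0, i - r):i + r + 1]) for i in range(len(signal))]

    def opening(signal, size):
        """Greyscale opening: removes bright features narrower than size."""
        return dilate(erode(signal, size), size)

    # Narrow spike (small-scale 'cirrus') vs. a broad bump (a galaxy disk)
    sky = [0, 0, 1, 5, 1, 0, 0, 2, 3, 3, 3, 2, 0]
    cleaned = opening(sky, 3)
    print(cleaned)  # spike at index 3 flattened; broad bump preserved
    ```

    Running the opening over a range of structuring-element sizes and recording what each scale removes yields the spatial structure spectrum (growth function) used in the paper to characterize and subtract the cirrus.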

  12. [Effects of 12 sessions of high intensity intermittent training and nutrition counseling on body fat in obese and overweight participants].

    PubMed

    Molina, Catalina; Cifuentes, Gabriela; Martínez, Cristian; Mancilla, Rodrigo; Díaz, Erik

    2016-10-01

    The search for efficient exercise alternatives to treat obesity is worthwhile. To demonstrate the effect of high intensity intermittent exercise on body fat reduction in overweight and obese subjects. A group of 65 overweight and obese adult subjects (25 men), aged 18-65 years, participated in 12 sessions of a high intensity physical exercise program, 3 days/week. Weight, height and body fat were measured before and after the intervention, the latter by bioimpedance. Each session consisted of 1 min of stationary cycling at high intensity, followed by 2 min of inactive rest. This cycle was repeated 10 times; hence the method is called 1*2*10. There was a significant reduction of body fat of -1.88 ± 2.8 and -3.44 ± 2.7 kg in women and men, respectively (p < 0.05). The 1*2*10 training protocol lasting 12 weeks in association with nutrition counseling is effective in reducing body fat in overweight persons.

  13. A combined drama-based and CBT approach to working with self-reported anger aggression.

    PubMed

    Blacker, Janine; Watson, Andy; Beech, Anthony R

    2008-01-01

    A drama-based programme, called 'Insult to Injury', was designed to explore the processes of anger, aggression and violence. The aim of the programme was to enable offenders to identify and generate strategies and skills for dealing with potentially volatile situations, and to provide a safe and supportive environment in which to practice and evaluate these strategies. AIMS An active drama-based approach combined with cognitive-behavioural techniques was used to explore issues such as masculinity, power and control, pride and shame and victim awareness. Reductions in anger were hypothesized. METHOD A single group pre/post design assessed the levels of anger before and after the course. RESULTS Sixty-two adult male offenders from six prison establishments in the UK took part in the nine-day course. As hypothesised, significant reductions in anger were found in pre- to post-course assessment. These results suggest that a drama-based approach may be a promising adjunct to traditional anger management programmes for violent offenders.

  14. Self-enhancement learning: target-creating learning and its application to self-organizing maps.

    PubMed

    Kamimura, Ryotaro

    2011-05-01

    In this article, we propose a new learning method called "self-enhancement learning." In this method, targets for learning are not given from the outside; rather, they are spontaneously created within the neural network. To realize the method, we consider a neural network with two different states, namely, an enhanced and a relaxed state. The enhanced state is one in which the network responds very selectively to input patterns, while in the relaxed state, the network responds almost equally to all input patterns. The gap between the two states can be reduced by minimizing the Kullback-Leibler divergence between them with free energy. To demonstrate the effectiveness of this method, we applied self-enhancement learning to the self-organizing map (SOM), in which lateral interactions were added to the enhanced state. We applied the method to the well-known Iris, wine, housing and cancer machine learning database problems. In addition, we applied the method to real-life data from a student survey. Experimental results showed that the U-matrices obtained were similar to those produced by the conventional SOM. Class boundaries were made clearer in the housing and cancer data. For all the data except the cancer data, better performance could be obtained in terms of quantitative and topological errors. In addition, trustworthiness and continuity, which refer to the quality of neighborhood preservation, could be improved by self-enhancement learning. Finally, we used modern dimensionality reduction methods and compared their results with those obtained by self-enhancement learning. The results obtained by self-enhancement learning were not superior to, but were comparable with, those obtained by the modern dimensionality reduction methods.
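    The gap being minimized is a standard Kullback-Leibler divergence between the two response distributions. A minimal sketch with invented response values (the peaked distribution plays the enhanced state, the near-uniform one the relaxed state):

    ```python
    import math

    def kl_divergence(p, q):
        """KL(p || q) between two discrete probability distributions."""
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    enhanced = [0.7, 0.2, 0.1]    # selective (peaked) responses
    relaxed = [0.34, 0.33, 0.33]  # nearly uniform responses
    gap = kl_divergence(enhanced, relaxed)
    # Learning in this scheme acts to reduce this gap toward zero.
    ```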

  15. Validation of no-reference image quality index for the assessment of digital mammographic images

    NASA Astrophysics Data System (ADS)

    de Oliveira, Helder C. R.; Barufaldi, Bruno; Borges, Lucas R.; Gabarda, Salvador; Bakic, Predrag R.; Maidment, Andrew D. A.; Schiabel, Homero; Vieira, Marcelo A. C.

    2016-03-01

    To ensure optimal clinical performance of digital mammography, it is necessary to obtain images with high spatial resolution and low noise, keeping radiation exposure as low as possible. These requirements directly affect the interpretation of radiologists. The quality of a digital image should be assessed using objective measurements. In general, these methods measure the similarity between a degraded image and an ideal image without degradation (ground-truth), used as a reference. These methods are called Full-Reference Image Quality Assessment (FR-IQA). However, for digital mammography, an image without degradation is not available in clinical practice; thus, an objective method to assess the quality of mammograms must be performed without reference. The purpose of this study is to present a Normalized Anisotropic Quality Index (NAQI), based on the Rényi entropy in the pseudo-Wigner domain, to assess mammography images in terms of spatial resolution and noise without any reference. The method was validated using synthetic images acquired through an anthropomorphic breast software phantom, and the clinical exposures on anthropomorphic breast physical phantoms and patient's mammograms. The results reported by this noreference index follow the same behavior as other well-established full-reference metrics, e.g., the peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM). Reductions of 50% on the radiation dose in phantom images were translated as a decrease of 4dB on the PSNR, 25% on the SSIM and 33% on the NAQI, evidencing that the proposed metric is sensitive to the noise resulted from dose reduction. The clinical results showed that images reduced to 53% and 30% of the standard radiation dose reported reductions of 15% and 25% on the NAQI, respectively. 
Thus, this index may be used in clinical practice as an image quality indicator to improve quality assurance programs in mammography; hence, the proposed method reduces inter-observer subjectivity in the reporting of image quality assessment.
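The full-reference PSNR comparison cited in this abstract can be sketched in a few lines. This is a generic illustration on synthetic arrays, not the study's data, and it does not reproduce the NAQI computation itself (which requires the pseudo-Wigner distribution and Rényi entropy):

```python
import numpy as np

def psnr(reference, degraded, max_val=255.0):
    """Peak signal-to-noise ratio (dB) between a ground-truth and a degraded image."""
    mse = np.mean((np.asarray(reference, float) - np.asarray(degraded, float)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val**2 / mse)

rng = np.random.default_rng(0)
clean = rng.integers(0, 256, size=(64, 64)).astype(float)       # stand-in "ground truth"
noisy = np.clip(clean + rng.normal(0, 5, size=clean.shape), 0, 255)
noisier = np.clip(clean + rng.normal(0, 10, size=clean.shape), 0, 255)
print(psnr(clean, noisy) > psnr(clean, noisier))  # more noise gives a lower PSNR
```

The point of the abstract is precisely that such a metric needs `reference`, which clinical mammography cannot supply; NAQI avoids that requirement.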

  16. New computational methods reveal tRNA identity element divergence between Proteobacteria and Cyanobacteria.

    PubMed

    Freyhult, Eva; Cui, Yuanyuan; Nilsson, Olle; Ardell, David H

    2007-10-01

There are at least 21 subfunctional classes of tRNAs in most cells that, despite a very highly conserved and compact common structure, must interact specifically with different cliques of proteins or cause grave organismal consequences. Protein recognition of specific tRNA substrates is achieved in part through class-restricted tRNA features called tRNA identity determinants. In earlier work we used TFAM, a statistical classifier of tRNA function, to show evidence of unexpectedly large diversity among bacteria in tRNA identity determinants. We also created a data reduction technique called function logos to visualize identity determinants for a given taxon. Here we show evidence that determinants for lysylated isoleucine tRNAs are not the same in Proteobacteria as in other bacterial groups, including the Cyanobacteria. Consistent with this, the lysylating biosynthetic enzyme TilS lacks a C-terminal domain in Cyanobacteria that is present in Proteobacteria. We present here, using function logos, a map estimating all potential identity determinants generally operational in Cyanobacteria and Proteobacteria. To further isolate the differences in potential tRNA identity determinants between Proteobacteria and Cyanobacteria, we created two new data reduction visualizations to contrast sequence and function logos between two taxa. One, called Information Difference logos (ID logos), shows the evolutionary gain or retention of functional information associated with features in one lineage. The other, Kullback-Leibler divergence Difference logos (KLD logos), shows recruitments or shifts in the functional associations of features, especially those informative in both lineages. We used these new logos to specifically isolate and visualize the differences in potential tRNA identity determinants between Proteobacteria and Cyanobacteria. Our graphical results point to numerous differences in potential tRNA identity determinants between these groups. 
Although more differences in general are explained by shifts in functional association rather than gains or losses, the apparent identity differences in lysylated isoleucine tRNAs appear to have evolved through both mechanisms.
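The KLD logos described above are built on the Kullback-Leibler divergence between per-feature functional distributions in the two lineages. A minimal sketch of that core quantity, with made-up class frequencies standing in for real tRNA data:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(P || Q) in bits between two distributions."""
    p = np.asarray(p, float) + eps   # eps guards against log(0)
    q = np.asarray(q, float) + eps
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log2(p / q)))

# Hypothetical distributions of one sequence feature over functional classes
# in two lineages (illustrative only, not measured tRNA frequencies)
proteo = [0.70, 0.20, 0.10]
cyano = [0.10, 0.60, 0.30]
print(round(kl_divergence(proteo, cyano), 3))
```

Note that KL divergence is asymmetric, which is why the direction of the comparison (which lineage plays the role of the reference) matters when contrasting two taxa.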

  17. Patient Navigators Connecting Patients to Community Resources to Improve Diabetes Outcomes.

    PubMed

    Loskutova, Natalia Y; Tsai, Adam G; Fisher, Edwin B; LaCruz, Debby M; Cherrington, Andrea L; Harrington, T Michael; Turner, Tamela J; Pace, Wilson D

    2016-01-01

Despite the recognized importance of lifestyle modification in reducing the risk of developing type 2 diabetes and in diabetes management, the use of available community resources by both patients and their primary care providers (PCPs) remains low. The patient navigator model, widely used in cancer care, may have the potential to link PCPs and community resources for reduction of risk and control of type 2 diabetes. In this study we tested the feasibility and acceptability of telephone-based nonprofessional patient navigation to promote linkages between the PCP office and community programs for patients with or at risk for diabetes. This was a mixed-methods interventional prospective cohort study conducted between November 2012 and August 2013. We included adult patients with or at risk for type 2 diabetes from six primary care practices. Patient-level measures of glycemic control, diabetes care, and self-efficacy from medical records, and qualitative interview data on acceptability and feasibility, were used. A total of 179 patients participated in the study. Two patient navigators provided services over the phone, using motivational interviewing techniques. The patient navigators provided regular feedback to PCPs and followed up with the patients through phone calls. The patient navigators made 1028 calls, with an average of 6 calls per patient. At follow-up, a reduction in HbA1c (7.8 ± 1.9% vs 7.2 ± 1.3%; P = .001) and an improvement in patient self-efficacy (3.1 ± 0.8 vs 3.6 ± 0.7; P < .001) were observed. Qualitative analysis revealed uniformly positive feedback from providers and patients. The patient navigator model is a promising and acceptable strategy to link patient, PCP, and community resources for promoting lifestyle modification in people living with or at risk for type 2 diabetes. © Copyright 2016 by the American Board of Family Medicine.

  18. Composite Material Testing Data Reduction to Adjust for the Systematic 6-DOF Testing Machine Aberrations

    Treesearch

Athanasios Iliopoulos; John G. Michopoulos; John G. C. Hermanson

    2012-01-01

This paper describes a data reduction methodology for eliminating the systematic aberrations that the unwanted behavior of a multiaxial testing machine introduces into the massive amounts of experimental data collected from the testing of composite material coupons. The machine in reference is a custom-made 6-DoF system called NRL66.3 and developed at the NAval...

  19. Latin America Report.

    DTIC Science & Technology

    1985-08-26

    Immediate reduction of central government spending by 72 million quetzals from the outlays called for in the 1985 budget, with the least possible effect...on the investment budget. The measures recommended by the Public Finance Ministry should be implemented, and a ceiling of 1,002,300,000 quetzals ...employment. 2. Reduction of operational spending of decentralized agencies in 1985, by not less than 28 million quetzals . 3. Creation of a Public

  20. Multiview Locally Linear Embedding for Effective Medical Image Retrieval

    PubMed Central

    Shen, Hualei; Tao, Dacheng; Ma, Dianfu

    2013-01-01

Content-based medical image retrieval continues to gain attention for its potential to assist radiological image interpretation and decision making. Many approaches have been proposed to improve the performance of medical image retrieval systems, among which visual features such as SIFT, LBP, and intensity histograms play a critical role. Typically, these features are concatenated into a long vector to represent medical images, and thus traditional dimension reduction techniques such as locally linear embedding (LLE), principal component analysis (PCA), or Laplacian eigenmaps (LE) can be employed to reduce the “curse of dimensionality”. Though these approaches show promising performance for medical image retrieval, the feature-concatenating method ignores the fact that different features have distinct physical meanings. In this paper, we propose a new method called multiview locally linear embedding (MLLE) for medical image retrieval. Following the patch alignment framework, MLLE preserves the geometric structure of the local patch in each feature space according to the LLE criterion. To explore complementary properties among a range of features, MLLE assigns different weights to local patches from different feature spaces. Finally, MLLE employs global coordinate alignment and alternating optimization techniques to learn a smooth low-dimensional embedding from the different features. To justify the effectiveness of MLLE for medical image retrieval, we compare it with conventional spectral embedding methods. We conduct experiments on a subset of the IRMA medical image data set. Evaluation results show that MLLE outperforms state-of-the-art dimension reduction methods. PMID:24349277
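The feature-concatenation baseline that this paper argues against is easy to sketch. Below is a minimal numpy version using PCA (one of the traditional reductions listed above) on randomly generated stand-ins for SIFT/LBP/histogram descriptors; MLLE itself, with its per-view patch weights and alternating optimization, is not reproduced here:

```python
import numpy as np

def pca_embed(features, n_components):
    """Project feature vectors onto the top principal components via SVD."""
    X = features - features.mean(axis=0)          # center each feature dimension
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T                # coordinates in the top components

rng = np.random.default_rng(1)
sift = rng.normal(size=(100, 128))   # stand-ins for per-image SIFT, LBP and
lbp = rng.normal(size=(100, 59))     # intensity-histogram descriptors
hist = rng.normal(size=(100, 32))
concat = np.hstack([sift, lbp, hist])          # the long concatenated vector
embedded = pca_embed(concat, n_components=10)  # reduced representation
print(embedded.shape)  # (100, 10)
```

Concatenating before reduction treats all dimensions as commensurable, which is exactly the assumption MLLE drops by handling each feature space as a separate view.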

  1. Meta-modelling, visualization and emulation of multi-dimensional data for virtual production intelligence

    NASA Astrophysics Data System (ADS)

    Schulz, Wolfgang; Hermanns, Torsten; Al Khawli, Toufik

    2017-07-01

Decision making for competitive production in high-wage countries is a daily challenge in which rational and irrational methods are used. The design of decision making processes is an intriguing, discipline-spanning science. However, there are gaps in understanding the impact of the known mathematical and procedural methods on the usage of rational choice theory. Following Benjamin Franklin's rule for decision making, formulated in London in 1772 and called by him "Prudential Algebra" in the sense of prudential reasons, one of the major ingredients of Meta-Modelling can be identified, finally leading to one algebraic value labelling the results (criteria settings) of alternative decisions (parameter settings). This work describes advances in Meta-Modelling techniques applied to multi-dimensional and multi-criterial optimization by identifying the persistence level of the corresponding Morse-Smale complex. Implementations for laser cutting and laser drilling are presented, including the generation of fast and frugal Meta-Models with controlled error based on mathematical model reduction. Reduced models are derived to avoid any unnecessary complexity. Both model reduction and analysis of the multi-dimensional parameter space are used to enable interactive communication between Discovery Finders and Invention Makers. Emulators and visualizations of a metamodel are introduced as components of Virtual Production Intelligence, making the methods of Scientific Design Thinking applicable and improving the skills of both developers and operators.

2. Numerical modelling of a peripheral arterial stenosis using dimensionally reduced models and kernel methods.

    PubMed

    Köppl, Tobias; Santin, Gabriele; Haasdonk, Bernard; Helmig, Rainer

    2018-05-06

In this work, we consider two kinds of model reduction techniques to simulate blood flow through the largest systemic arteries, where a stenosis is located in a peripheral artery, i.e. an artery that is located far away from the heart. For our simulations we place the stenosis in one of the tibial arteries belonging to the right lower leg (right posterior tibial artery). The model reduction techniques that are used are, on the one hand, dimensionally reduced models (1-D and 0-D models, the so-called mixed-dimension model) and, on the other hand, surrogate models produced by kernel methods. Both methods are combined in such a way that the mixed-dimension models yield training data for the surrogate model, where the surrogate model is parametrised by the degree of narrowing of the peripheral stenosis. By means of a well-trained surrogate model, we show that simulation data can be reproduced with satisfactory accuracy and that parameter optimisation or state estimation problems can be solved in a very efficient way. Furthermore, it is demonstrated that a surrogate model makes it possible to present, after a very short simulation time, the impact of a varying degree of stenosis on blood flow, obtaining a speedup of several orders of magnitude over the full model. This article is protected by copyright. All rights reserved.
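The kernel surrogate idea can be illustrated with a small radial-basis-function interpolant: train on (stenosis degree, model output) pairs and then evaluate cheaply at new degrees. The training data and the Gaussian kernel below are placeholders, not the paper's mixed-dimension model or its particular kernel choice:

```python
import numpy as np

def rbf_fit(x_train, y_train, eps=50.0):
    """Fit a Gaussian-kernel interpolant: solve K c = y for the coefficients."""
    K = np.exp(-eps * (x_train[:, None] - x_train[None, :]) ** 2)
    return np.linalg.solve(K, y_train)

def rbf_eval(x, x_train, coef, eps=50.0):
    """Evaluate the surrogate at new parameter values x."""
    K = np.exp(-eps * (x[:, None] - x_train[None, :]) ** 2)
    return K @ coef

# Placeholder training data: stenosis degree -> some scalar flow quantity,
# standing in for outputs of the expensive 1-D/0-D simulations
deg = np.linspace(0.0, 0.9, 10)
flow = 1.0 / (1.0 + 5.0 * deg**2)
coef = rbf_fit(deg, flow)
approx = rbf_eval(np.array([0.45]), deg, coef)  # cheap prediction at a new degree
print(round(float(approx[0]), 3))
```

Once the coefficients are computed, each new evaluation costs only one kernel-vector product, which is the source of the large speedup over rerunning the flow model.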

  3. Collective phase description of oscillatory convection

    NASA Astrophysics Data System (ADS)

    Kawamura, Yoji; Nakao, Hiroya

    2013-12-01

    We formulate a theory for the collective phase description of oscillatory convection in Hele-Shaw cells. It enables us to describe the dynamics of the oscillatory convection by a single degree of freedom which we call the collective phase. The theory can be considered as a phase reduction method for limit-cycle solutions in infinite-dimensional dynamical systems, namely, stable time-periodic solutions to partial differential equations, representing the oscillatory convection. We derive the phase sensitivity function, which quantifies the phase response of the oscillatory convection to weak perturbations applied at each spatial point, and analyze the phase synchronization between two weakly coupled Hele-Shaw cells exhibiting oscillatory convection on the basis of the derived phase equations.
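Schematically, the phase reduction described above collapses the infinite-dimensional dynamics to a single evolution equation for the collective phase; the notation below is assumed for illustration and is not quoted from the abstract:

```latex
\dot{\Theta}(t) = \Omega + \int_{V} \zeta(\boldsymbol{r}) \cdot \boldsymbol{p}(\boldsymbol{r}, t)\, d\boldsymbol{r}
```

Here \(\Theta\) is the collective phase, \(\Omega\) the natural frequency of the oscillatory convection, \(\zeta(\boldsymbol{r})\) the phase sensitivity function evaluated at each spatial point \(\boldsymbol{r}\) of the cell, and \(\boldsymbol{p}(\boldsymbol{r}, t)\) a weak perturbation field; the integral expresses that the phase response aggregates the perturbation weighted by the sensitivity at every point.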

  4. High-Speed Jet Noise Reduction NASA Perspective

    NASA Technical Reports Server (NTRS)

    Huff, Dennis L.; Handy, J. (Technical Monitor)

    2001-01-01

History shows that the problem of high-speed jet noise reduction is difficult to solve. The good news is that high-performance military aircraft noise is dominated by a single source called 'jet noise' (commercial aircraft have several sources). The bad news is that this source has been the subject of research for the past 50 years and progress has been incremental. Major jet noise reduction has been achieved by changing the engine cycle to reduce the jet exit velocity. Smaller reductions have been achieved using suppression devices such as mixing enhancement and acoustic liners. Significant jet noise reduction without any performance loss is probably not possible! Recent NASA noise reduction research programs include the High Speed Research Program, the Advanced Subsonic Technology Noise Reduction Program, the Aerospace Propulsion and Power Program - Fundamental Noise, and the Quiet Aircraft Technology Program.

  5. Consensus building for interlaboratory studies, key comparisons, and meta-analysis

    NASA Astrophysics Data System (ADS)

    Koepke, Amanda; Lafarge, Thomas; Possolo, Antonio; Toman, Blaza

    2017-06-01

    Interlaboratory studies in measurement science, including key comparisons, and meta-analyses in several fields, including medicine, serve to intercompare measurement results obtained independently, and typically produce a consensus value for the common measurand that blends the values measured by the participants. Since interlaboratory studies and meta-analyses reveal and quantify differences between measured values, regardless of the underlying causes for such differences, they also provide so-called ‘top-down’ evaluations of measurement uncertainty. Measured values are often substantially over-dispersed by comparison with their individual, stated uncertainties, thus suggesting the existence of yet unrecognized sources of uncertainty (dark uncertainty). We contrast two different approaches to take dark uncertainty into account both in the computation of consensus values and in the evaluation of the associated uncertainty, which have traditionally been preferred by different scientific communities. One inflates the stated uncertainties by a multiplicative factor. The other adds laboratory-specific ‘effects’ to the value of the measurand. After distinguishing what we call recipe-based and model-based approaches to data reductions in interlaboratory studies, we state six guiding principles that should inform such reductions. These principles favor model-based approaches that expose and facilitate the critical assessment of validating assumptions, and give preeminence to substantive criteria to determine which measurement results to include, and which to exclude, as opposed to purely statistical considerations, and also how to weigh them. 
Following an overview of maximum likelihood methods, three general purpose procedures for data reduction are described in detail, including explanations of how the consensus value and degrees of equivalence are computed, and the associated uncertainty evaluated: the DerSimonian-Laird procedure; a hierarchical Bayesian procedure; and the Linear Pool. These three procedures have been implemented and made widely accessible in a Web-based application (NIST Consensus Builder). We illustrate principles, statistical models, and data reduction procedures in four examples: (i) the measurement of the Newtonian constant of gravitation; (ii) the measurement of the half-lives of radioactive isotopes of caesium and strontium; (iii) the comparison of two alternative treatments for carotid artery stenosis; and (iv) a key comparison where the measurand was the calibration factor of a radio-frequency power sensor.
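Of the three procedures, the DerSimonian-Laird estimator is compact enough to sketch directly: estimate the between-laboratory variance ("dark uncertainty") from Cochran's Q, then recompute the weighted mean with inflated weights. The laboratory values below are invented for illustration and are not from any key comparison:

```python
import numpy as np

def dersimonian_laird(x, s):
    """DerSimonian-Laird consensus value with dark-uncertainty (tau^2) estimate."""
    x, s = np.asarray(x, float), np.asarray(s, float)
    w = 1.0 / s**2
    xbar = np.sum(w * x) / np.sum(w)           # fixed-effects weighted mean
    q = np.sum(w * (x - xbar) ** 2)            # Cochran's Q heterogeneity statistic
    k = len(x)
    denom = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / denom)     # between-laboratory variance estimate
    w_star = 1.0 / (s**2 + tau2)               # weights inflated by dark uncertainty
    consensus = np.sum(w_star * x) / np.sum(w_star)
    u = np.sqrt(1.0 / np.sum(w_star))          # standard uncertainty of the consensus
    return consensus, u, tau2

x = [10.1, 9.8, 10.6, 9.5]   # illustrative laboratory results (made up)
s = [0.2, 0.3, 0.2, 0.4]     # their stated standard uncertainties
c, u, tau2 = dersimonian_laird(x, s)
print(round(c, 3), tau2 > 0)
```

When the measured values are over-dispersed relative to the stated uncertainties (Q exceeds k - 1), tau² comes out positive and the weights are correspondingly flattened, which is the additive-effects counterpart to the multiplicative inflation approach mentioned above.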

  6. Comparison of theory and direct numerical simulations of drag reduction by rodlike polymers in turbulent channel flows.

    PubMed

    Benzi, Roberto; Ching, Emily S C; De Angelis, Elisabetta; Procaccia, Itamar

    2008-04-01

    Numerical simulations of turbulent channel flows, with or without additives, are limited in the extent of the Reynolds number (Re) and Deborah number (De). The comparison of such simulations to theories of drag reduction, which are usually derived for asymptotically high Re and De, calls for some care. In this paper we present a study of drag reduction by rodlike polymers in a turbulent channel flow using direct numerical simulation and illustrate how these numerical results should be related to the recently developed theory.

  7. Method to make accurate concentration and isotopic measurements for small gas samples

    NASA Astrophysics Data System (ADS)

    Palmer, M. R.; Wahl, E.; Cunningham, K. L.

    2013-12-01

Carbon isotopic ratio measurements of CO2 and CH4 provide valuable insight into carbon cycle processes. However, many of these studies, like soil gas, soil flux, and water head space experiments, provide very small gas sample volumes, too small for direct measurement by current constant-flow Cavity Ring-Down Spectroscopy (CRDS) isotopic analyzers. Previously, we addressed this issue by developing a sample introduction module which enabled the isotopic ratio measurement of samples of 40 ml or smaller. However, the system, called the Small Sample Isotope Module (SSIM), dilutes the sample with inert carrier gas during delivery, which causes a ~5% reduction in concentration. The isotopic ratio measurements are not affected by this small dilution, but researchers are naturally interested in accurate concentration measurements. We present the accuracy and precision of a new method of using this delivery module, which we call 'double injection': two 20 ml portions of the 40 ml sample are introduced to the analyzer; the first injection flushes out the diluting gas, and the second injection is measured. The accuracy of this new method is demonstrated by comparing the concentration and isotopic ratio measurements for a gas sampled directly and the same gas measured through the SSIM. The data show that the CO2 concentration measurements were the same within instrument precision. The isotopic ratio precision (1σ) of repeated measurements was 0.16 permil for CO2 and 1.15 permil for CH4 at ambient concentrations. This new method provides a significant enhancement in the information provided by small samples.

  8. Application of the wavelet packet transform to vibration signals for surface roughness monitoring in CNC turning operations

    NASA Astrophysics Data System (ADS)

    García Plaza, E.; Núñez López, P. J.

    2018-01-01

The wavelet packet transform (WPT) method decomposes a time signal into several independent time-frequency signals called packets. This enables the temporal location of transient events occurring during the monitoring of cutting processes, which is advantageous in condition monitoring and fault diagnosis. This paper proposes the monitoring of surface roughness using a single low-cost sensor that is easily implemented in numerical control machine tools in order to make on-line decisions on workpiece surface finish quality. Packet feature extraction from vibration signals was applied to correlate the sensor signals with measured surface roughness. For the successful application of the WPT method, mother wavelets, packet decomposition level, and appropriate packet selection methods should be considered, but these are poorly understood aspects in the literature. In this novel contribution, forty mother wavelets, the optimal decomposition level, and packet reduction methods were analysed, as well as the effective frequency range providing the best packet feature extraction for monitoring surface finish. The results show that the mother wavelet biorthogonal 4.4 at decomposition level L3, with the fusion of the orthogonal vibration components (ax + ay + az), was the best option for correlating the vibration signal with surface roughness. The best packets were found in the medium-high frequency DDA (6250-9375 Hz) and high frequency ADA (9375-12500 Hz) ranges, and the feed acceleration component ay was the primary source of information. The packet reduction methods discarded packets with features relevant to the signal, leading to poor results for the prediction of surface roughness. WPT is a robust vibration signal processing method for the monitoring of surface roughness using a single sensor without other information sources; satisfactory results were obtained in comparison with other processing methods, at a low computational cost.
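The packet-energy features described above can be sketched with a hand-rolled wavelet packet decomposition. The study used the biorthogonal 4.4 mother wavelet; the Haar wavelet is substituted here purely to keep the example self-contained, and the packet labels ('aaa', 'dda', ...) mirror the lowpass/highpass path convention:

```python
import numpy as np

def haar_split(x):
    """One packet split: approximation (lowpass) and detail (highpass) halves."""
    x = x[: len(x) // 2 * 2]
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def packet_energies(signal, level=3):
    """Decompose a signal into 2**level packets and return each packet's energy.
    Packets are labelled by their lowpass/highpass split path, e.g. 'aaa', 'dda'."""
    packets = {"": np.asarray(signal, float)}
    for _ in range(level):
        nxt = {}
        for name, data in packets.items():
            a, d = haar_split(data)
            nxt[name + "a"], nxt[name + "d"] = a, d
        packets = nxt
    return {name: float(np.sum(p**2)) for name, p in packets.items()}

# Synthetic "vibration" signal: a strong low-frequency tone plus a weak high one
t = np.linspace(0, 1, 1024, endpoint=False)
vib = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 200 * t)
energies = packet_energies(vib, level=3)
print(len(energies), max(energies, key=energies.get))  # 8 packets; 'aaa' dominates
```

Because the split is orthonormal, the packet energies sum to the signal energy, so comparing per-packet energies is a legitimate way to localize which frequency band carries the roughness-related information.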

  9. Who Are We Not Calling On? A Study of Classroom Participation and the Implementation of the Name Card Method.

    ERIC Educational Resources Information Center

    Carter, Angela

    This study involved observing a second-grade classroom to investigate how the teacher called on students, noting whether the teacher gave enough attention to students who raised their hands frequently by calling on them and examining students' responses when called on. Researchers implemented a new method of calling on students using name cards,…

  10. EFFECTS OF RESPONDING TO A NAME AND GROUP CALL ON PRESCHOOLERS' COMPLIANCE

    PubMed Central

    Beaulieu, Lauren; Hanley, Gregory P.; Roberson, Aleasha A.

    2012-01-01

    We assessed teacher–child relations with respect to children's name calls, instructions, and compliance in a preschool classroom. The most frequent consequence to a child's name being called was the provision of instructions. We also observed a higher probability of compliance when children attended to a name call. Next, we evaluated the effects of teaching preschoolers to attend to their names and a group call on their compliance with typical instructions. We used a multiple baseline design across subjects and a control-group design to evaluate whether gains in compliance were a function of treatment or routine experience in preschool. Results showed that compliance increased as a function of teaching precursors for all children in the experimental group, and the effects on compliance were maintained despite a reduction of the occurrence of precursors. Moreover, it appeared that precursor teaching, not routine preschool experience, was responsible for the changes in compliance. PMID:23322926

  11. Effects of responding to a name and group call on preschoolers' compliance.

    PubMed

    Beaulieu, Lauren; Hanley, Gregory P; Roberson, Aleasha A

    2012-01-01

    We assessed teacher-child relations with respect to children's name calls, instructions, and compliance in a preschool classroom. The most frequent consequence to a child's name being called was the provision of instructions. We also observed a higher probability of compliance when children attended to a name call. Next, we evaluated the effects of teaching preschoolers to attend to their names and a group call on their compliance with typical instructions. We used a multiple baseline design across subjects and a control-group design to evaluate whether gains in compliance were a function of treatment or routine experience in preschool. Results showed that compliance increased as a function of teaching precursors for all children in the experimental group, and the effects on compliance were maintained despite a reduction of the occurrence of precursors. Moreover, it appeared that precursor teaching, not routine preschool experience, was responsible for the changes in compliance.

  12. An EGO-like optimization framework for sensor placement optimization in modal analysis

    NASA Astrophysics Data System (ADS)

    Morlier, Joseph; Basile, Aniello; Chiplunkar, Ankit; Charlotte, Miguel

    2018-07-01

In aircraft design, ground/flight vibration tests are conducted to extract an aircraft's modal parameters (natural frequencies, damping ratios, and mode shapes), also known as the modal basis. The main problem in aircraft modal identification is the large number of sensors needed, which increases operational time and costs. The goal of this paper is to minimize the number of sensors by optimizing their locations in order to reconstruct a truncated modal basis of N mode shapes with a high level of reconstruction accuracy. There are several methods to solve sensor placement optimization (SPO) problems, but for this case an original approach has been established, based on an iterative process for mode shape reconstruction through an adaptive Kriging metamodeling approach, so-called efficient global optimization (EGO)-SPO. The main idea in this publication is to solve an optimization problem in which the sensor locations are the variables and the objective function is defined by maximizing the trace of a criterion called the AutoMAC. The results on a 2D wing demonstrate a 30% reduction in sensors using our EGO-SPO strategy.
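The AutoMAC criterion referenced above is the Modal Assurance Criterion of a mode-shape matrix with itself, evaluated at the rows picked out by a candidate sensor placement. A minimal sketch with random orthonormal stand-in mode shapes (the optimization loop over placements is omitted):

```python
import numpy as np

def auto_mac(phi):
    """AutoMAC matrix of a mode-shape matrix phi (rows: sensor dofs, cols: modes)."""
    g = phi.T @ phi                     # Gram matrix of the candidate-sensor mode shapes
    d = np.diag(g)
    return np.abs(g) ** 2 / np.outer(d, d)

rng = np.random.default_rng(3)
phi_full = np.linalg.qr(rng.normal(size=(50, 4)))[0]  # 4 orthonormal mode shapes, 50 dofs
candidate = rng.choice(50, size=12, replace=False)    # one candidate 12-sensor placement
mac = auto_mac(phi_full[candidate])
print(mac.shape)  # (4, 4): diagonal is 1; off-diagonals measure mode confusion
```

A good placement keeps the off-diagonal AutoMAC entries small, so the N modes remain distinguishable from the reduced set of sensor readings; an SPO search scores candidate placements with this matrix.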

  13. 76 FR 66913 - Caribbean Fishery Management Council; Catch Share Panel Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-28

    ... discuss the following agenda items: --Call To Order --Trap Reduction Program Discussion --Other Issues... interpretation will be provided (English-Spanish). For more information or request for sign language...

  14. Accounting for GC-content bias reduces systematic errors and batch effects in ChIP-seq data.

    PubMed

    Teng, Mingxiang; Irizarry, Rafael A

    2017-11-01

The main application of ChIP-seq technology is the detection of genomic regions that bind to a protein of interest. A large part of functional genomics' public catalogs is based on ChIP-seq data. These catalogs rely on peak calling algorithms that infer protein-binding sites by detecting genomic regions associated with more mapped reads (coverage) than expected by chance, as a result of the experimental protocol's lack of perfect specificity. We find that GC-content bias accounts for substantial variability in the observed coverage for ChIP-seq experiments and that this variability leads to false-positive peak calls. More concerning is that the GC effect varies across experiments, with the effect strong enough to result in a substantial number of peaks called differently when different laboratories perform experiments on the same cell line. However, accounting for GC-content bias in ChIP-seq is challenging because the binding sites of interest tend to be more common in high GC-content regions, which confounds real biological signals with unwanted variability. To address this challenge, we introduce a statistical approach that accounts for GC effects on both nonspecific noise and signal induced by the binding site. The method can be used to account for this bias in binding quantification as well as to improve existing peak calling algorithms. We use this approach to show a reduction in false-positive peaks as well as improved consistency across laboratories. © 2017 Teng and Irizarry; Published by Cold Spring Harbor Laboratory Press.
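A crude stand-in for the GC-correction idea (stratify regions by GC content and rescale coverage so each stratum has the same mean) can be sketched as follows. The paper's actual method is model-based and handles signal and background jointly, which this deliberately does not:

```python
import numpy as np

def gc_normalize(coverage, gc, n_bins=10):
    """Rescale per-region coverage so every GC-content stratum has the same mean."""
    coverage = np.asarray(coverage, float)
    bins = np.minimum((np.asarray(gc, float) * n_bins).astype(int), n_bins - 1)
    corrected = coverage.copy()
    overall = coverage.mean()
    for b in range(n_bins):
        mask = bins == b
        if mask.any() and coverage[mask].mean() > 0:
            corrected[mask] *= overall / coverage[mask].mean()
    return corrected

# Simulated background coverage whose rate is inflated in high-GC regions
rng = np.random.default_rng(2)
gc = rng.uniform(0.2, 0.8, size=5000)                # GC fraction per region
cov = rng.poisson(5.0 * np.exp(2.0 * (gc - 0.5)))    # GC-biased read counts
corr = gc_normalize(cov, gc)
before = cov[gc > 0.7].mean() / cov[gc < 0.3].mean()
after = corr[gc > 0.7].mean() / corr[gc < 0.3].mean()
print(round(before, 2), round(after, 2))  # GC-driven coverage ratio shrinks toward 1
```

This kind of marginal rescaling illustrates the confounding problem the abstract raises: if real binding sites concentrate in high-GC regions, flattening coverage by GC strata also attenuates true signal, which is why a joint signal/noise model is needed.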

  15. Paramedic-Initiated Home Care Referrals and Use of Home Care and Emergency Medical Services.

    PubMed

    Verma, Amol A; Klich, John; Thurston, Adam; Scantlebury, Jordan; Kiss, Alex; Seddon, Gayle; Sinha, Samir K

    2018-01-01

    We examined the association between paramedic-initiated home care referrals and utilization of home care, 9-1-1, and Emergency Department (ED) services. This was a retrospective cohort study of individuals who received a paramedic-initiated home care referral after a 9-1-1 call between January 1, 2011 and December 31, 2012 in Toronto, Ontario, Canada. Home care, 9-1-1, and ED utilization were compared in the 6 months before and after home care referral. Nonparametric longitudinal regression was performed to assess changes in hours of home care service use and zero-inflated Poisson regression was performed to assess changes in the number of 9-1-1 calls and ambulance transports to ED. During the 24-month study period, 2,382 individuals received a paramedic-initiated home care referral. After excluding individuals who died, were hospitalized, or were admitted to a nursing home, the final study cohort was 1,851. The proportion of the study population receiving home care services increased from 18.2% to 42.5% after referral, representing 450 additional people receiving services. In longitudinal regression analysis, there was an increase of 17.4 hours in total services per person in the six months after referral (95% CI: 1.7-33.1, p = 0.03). The mean number of 9-1-1 calls per person was 1.44 (SD 9.58) before home care referral and 1.20 (SD 7.04) after home care referral in the overall study cohort. This represented a 10% reduction in 9-1-1 calls (95% CI: 7-13%, p < 0.001) in Poisson regression analysis. The mean number of ambulance transports to ED per person was 0.91 (SD 8.90) before home care referral and 0.79 (SD 6.27) after home care referral, representing a 7% reduction (95% CI: 3-11%, p < 0.001) in Poisson regression analysis. When only the participants with complete paramedic and home care records were included in the analysis, the reductions in 9-1-1 calls and ambulance transports to ED were attenuated but remained statistically significant. 
Paramedic-initiated home care referrals in Toronto were associated with improved access to and use of home care services and may have been associated with reduced 9-1-1 calls and ambulance transports to ED.

  16. Sound imaging of nocturnal animal calls in their natural habitat.

    PubMed

    Mizumoto, Takeshi; Aihara, Ikkyu; Otsuka, Takuma; Takeda, Ryu; Aihara, Kazuyuki; Okuno, Hiroshi G

    2011-09-01

    We present a novel method for imaging acoustic communication between nocturnal animals. Investigating the spatio-temporal calling behavior of nocturnal animals, e.g., frogs and crickets, has been difficult because of the need to distinguish many animals' calls in noisy environments without being able to see them. Our method visualizes the spatial and temporal dynamics using dozens of sound-to-light conversion devices (called "Firefly") and an off-the-shelf video camera. The Firefly, which consists of a microphone and a light emitting diode, emits light when it captures nearby sound. Deploying dozens of Fireflies in a target area, we record calls of multiple individuals through the video camera. We conduct two experiments, one indoors and the other in the field, using Japanese tree frogs (Hyla japonica). The indoor experiment demonstrates that our method correctly visualizes Japanese tree frogs' calling behavior. It has confirmed the known behavior; two frogs call synchronously or in anti-phase synchronization. The field experiment (in a rice paddy where Japanese tree frogs live) also visualizes the same calling behavior to confirm anti-phase synchronization in the field. Experimental results confirm that our method can visualize the calling behavior of nocturnal animals in their natural habitat.

  17. [Cancellations in pediatric surgery].

    PubMed

    González Landa, G; Sánchez-Ruiz, I; San Sebastián, J A; Busturia, P; Cuesta, E; Prado, C; Azcona, I

    1998-07-01

Cancellations play an important role in the effectiveness of the surgical schedule. The objective was to analyze the evolution of cancellations during the years 1994-1997, after the institution of several corrective measures. In 1994, together with the Quality Unit of the Hospital, we started a program to reduce surgical cancellations. It consisted of improving the information given to parents, centralizing the surgical schedule, and increasing awareness of the importance of continuous improvement. Cancellations were divided into avoidable and unavoidable, and analyzed for the entire Service and for each specialty. A significant reduction in cancellations was obtained (from 12.38% in 1992 to 3.35% in 1997). The avoidable causes (non-presentation, inadequate preparation, and lack of operating-room time) showed the greatest improvement. The unavoidable causes (intercurrent disease) also improved significantly, after obtaining prior information on the child's health by telephone call or parental report. ENT is the specialty with the greatest improvement, owing to significant reductions in non-presentation and intercurrent disease. Corrective measures, together with a realistic surgical schedule, prior telephone calls, and a quality-improvement culture, have permitted a significant reduction in cancellations.

  18. The anxiolytic effects of BTG1640 and BTG1675A on ultrasonic isolation calls and locomotor activity of rat pups.

    PubMed

    Niculescu, M; Cagiano, R; Caprio, M; Damian, S; Boia, E; Vermesan, D; Tattoli, M; Haragus, H

    2016-12-01

    The aim of the present study was to evaluate the anxiolytic properties of the new isoxazoline compounds BTG1640 and BTG1675A in comparison with diazepam. We evaluated ultrasonic distress emission in neonatal rat pups of both sexes (which seems to be a sensitive indicator of rat emotional reactivity and represents a valuable tool to screen compounds with expected anxiolytic properties) and locomotor activity in 30-day-old rat pups. We found a significant reduction in the number of emitted ultrasonic calls only after i.p. administration of diazepam 1 mg/kg, while no significant reduction was detected after i.p. administration of BTG1640 and BTG1675A. Furthermore, we found a significant reduction of locomotor activity in the first 10 min of the test only in the group treated with diazepam 0.1 mg. The tests intended to validate the supposed anxiolytic properties of the new isoxazoline compounds BTG1640 and BTG1675A, in comparison with diazepam, gave negative results.

  19. Approximate Genealogies Under Genetic Hitchhiking

    PubMed Central

    Pfaffelhuber, P.; Haubold, B.; Wakolbinger, A.

    2006-01-01

    The rapid fixation of an advantageous allele leads to a reduction in linked neutral variation around the target of selection. The genealogy at a neutral locus in such a selective sweep can be simulated by first generating a random path of the advantageous allele's frequency and then a structured coalescent in this background. Usually the frequency path is approximated by a logistic growth curve. We discuss an alternative method that approximates the genealogy by a random binary splitting tree, a so-called Yule tree that does not require first constructing a frequency path. Compared to the coalescent in a logistic background, this method gives a slightly better approximation for identity by descent during the selective phase and a much better approximation for the number of lineages that stem from the founder of the selective sweep. In applications such as the approximation of the distribution of Tajima's D, the two approximation methods perform equally well. For relevant parameter ranges, the Yule approximation is faster. PMID:17182733

  20. Extremophiles as sources of inorganic bio-nanoparticles.

    PubMed

    Beeler, Erik; Singh, Om V

    2016-09-01

    Industrial use of nanotechnology in daily life has placed an emphasis on the safe and efficient production of nanoparticles (NPs). Traditional chemical oxidation and reduction methods are seen as inefficient, environmentally unsound, and often dangerous to those exposed and involved in NP manufacturing. However, utilizing microorganisms for the biosynthesis of NPs allows efficient, green production of a range of inorganic NPs while maintaining specific size, shape, stability, and dispersity. Microorganisms living under harsh environmental conditions, called "extremophiles," are one group of microorganisms being utilized for this biosynthesis. Extremophiles' unique living conditions have endowed them with various processes that enable NP biosynthesis. They include a range of groups: thermophiles, acidophiles, halophiles, psychrophiles, anaerobes, and others. Fungi, bacteria, yeasts, and archaea, e.g., Ureibacillus thermosphaericus and Geobacillus stearothermophilus, among others, have been established for NP biosynthesis. This article highlights the extremophiles and methods found to be viable candidates for the production of varying types of NPs, and interprets the selective methods used by the organisms to synthesize NPs.

  1. Cox-nnet: An artificial neural network method for prognosis prediction of high-throughput omics data

    PubMed Central

    Ching, Travers; Zhu, Xun

    2018-01-01

    Artificial neural networks (ANN) are computing architectures with many interconnections of simple neural-inspired computing elements, and have been applied to biomedical fields such as imaging analysis and diagnosis. We have developed a new ANN framework called Cox-nnet to predict patient prognosis from high-throughput transcriptomics data. In 10 TCGA RNA-Seq data sets, Cox-nnet achieves the same or better predictive accuracy compared to other methods, including Cox proportional hazards regression (with LASSO, ridge, and minimax concave penalty), Random Forests Survival and CoxBoost. Cox-nnet also reveals richer biological information, at both the pathway and gene levels. The outputs from the hidden layer nodes provide an alternative approach for survival-sensitive dimension reduction. In summary, we have developed a new method for accurate and efficient prognosis prediction on high-throughput data, with functional biological insights. The source code is freely available at https://github.com/lanagarmire/cox-nnet. PMID:29634719
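
    Models of this kind are trained under the Cox partial likelihood, with the network's output playing the role of the log-risk score. A minimal numpy sketch of that loss (an illustration, not the authors' implementation; it assumes no tied event times):

```python
import numpy as np

def cox_neg_log_partial_likelihood(risk, time, event):
    """Negative Cox log partial likelihood.

    risk  : predicted log-risk scores (e.g. a neural net's output)
    time  : survival/censoring times
    event : 1 if the event was observed, 0 if censored
    Assumes no tied event times (Breslow handling omitted for brevity).
    """
    order = np.argsort(-np.asarray(time))   # sort by descending time
    risk = np.asarray(risk, dtype=float)[order]
    event = np.asarray(event)[order]
    # log of the sum of exp(risk) over each subject's risk set
    # (everyone with time >= that subject's time)
    log_risk_set = np.logaddexp.accumulate(risk)
    return -np.sum((risk - log_risk_set)[event.astype(bool)])
```

    For two subjects with equal risk scores and both events observed, the loss reduces to log 2, the partial likelihood of picking the earlier-failing subject from a two-member risk set.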

  2. Variance based joint sparsity reconstruction of synthetic aperture radar data for speckle reduction

    NASA Astrophysics Data System (ADS)

    Scarnati, Theresa; Gelb, Anne

    2018-04-01

    In observing multiple synthetic aperture radar (SAR) images of the same scene, it is apparent that the brightness distributions of the images are not smooth, but rather composed of complicated granular patterns of bright and dark spots. Further, these brightness distributions vary from image to image. This salt and pepper like feature of SAR images, called speckle, reduces the contrast in the images and negatively affects texture based image analysis. This investigation uses the variance based joint sparsity reconstruction method for forming SAR images from the multiple SAR images. In addition to reducing speckle, the method has the advantage of being non-parametric, and can therefore be used in a variety of autonomous applications. Numerical examples include reconstructions of both simulated phase history data that result in speckled images as well as the images from the MSTAR T-72 database.

  3. Functional feature embedded space mapping of fMRI data.

    PubMed

    Hu, Jin; Tian, Jie; Yang, Lei

    2006-01-01

    We have proposed a new method for fMRI data analysis called Functional Feature Embedded Space Mapping (FFESM). Our work mainly focuses on experimental designs with periodic stimuli, which can be described by a number of Fourier coefficients in the frequency domain. A nonlinear dimension reduction technique, Isomap, is applied for the first time to the high-dimensional features obtained from the frequency domain of the fMRI data. Finally, the presence of activated time series is identified by a clustering method in which the information-theoretic criterion of minimum description length (MDL) is used to estimate the number of clusters. The feasibility of our algorithm is demonstrated on real human experiments. Although we focus on analyzing periodic fMRI data, the approach can be extended to analyze non-periodic (event-related) fMRI data by replacing the Fourier analysis with a wavelet analysis.
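
    The Fourier-features-then-Isomap step can be sketched as follows (a toy stand-in with synthetic data and hypothetical sizes, not the FFESM pipeline itself):

```python
import numpy as np
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)
# Hypothetical stand-in for fMRI voxel time series: 200 voxels x 128 time points
ts = rng.standard_normal((200, 128))
# "Activated" voxels share a periodic component locked to the stimulus
ts[:100] += np.sin(2 * np.pi * 8 * np.arange(128) / 128)

# Frequency-domain features: magnitudes of the low-order Fourier coefficients
feats = np.abs(np.fft.rfft(ts, axis=1))[:, 1:16]

# Nonlinear dimension reduction of the high-dimensional frequency features
emb = Isomap(n_neighbors=10, n_components=2).fit_transform(feats)
print(emb.shape)  # (200, 2)
```

    The 2-D embedding would then be handed to an MDL-guided clustering step to separate activated from non-activated time series.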

  4. Analysis and Assessment of Environmental Load of Vending Machines by a LCA Method, and Eco-Improvement Effect

    NASA Astrophysics Data System (ADS)

    Kimura, Yukio; Sadamichi, Yucho; Maruyama, Naoki; Kato, Seizo

    The environmental impact of the widespread diffusion of vending machines (VMs) has been much discussed. This paper describes a numerical evaluation of that impact using the LCA (Life Cycle Assessment) scheme and then proposes an eco-improvement strategy toward environmentally conscious products (ECP). A new objective and universal consolidated method for LCA evaluation, the so-called LCA-NETS (Numerical Eco-load Standardization) developed by the authors, is applied to the present issue. As a result, the environmental loads at the five-year operation and material procurement stages are found to dominate the others over the life cycle. Further eco-improvement is realized by following the order of LCA-NETS magnitude: energy saving, materials reduction, parts reuse, and replacement with low environmental load materials. Above all, parts reuse is especially recommended for significant reduction of environmental loads toward ECP.

  5. Improving the efficiency of a chemotherapy day unit: applying a business approach to oncology.

    PubMed

    van Lent, Wineke A M; Goedbloed, N; van Harten, W H

    2009-03-01

    To improve the efficiency of a hospital-based chemotherapy day unit (CDU). The CDU was benchmarked against two other CDUs to identify their attainable performance levels for efficiency, and causes for differences. Furthermore, an in-depth analysis using a business approach called lean thinking was performed. An integrated set of interventions was implemented, among them a new planning system. The results were evaluated using pre- and post-measurements. We observed 24% growth in treatments and bed utilisation, a 12% increase in staff member productivity and an 81% reduction in overtime. The method used improved process design and led to increased efficiency and more timely delivery of care. Thus, the business approaches, which were adapted for healthcare, were successfully applied. The method may serve as an example for other oncology settings with problems concerning waiting times, patient flow or lack of beds.

  6. 53rd Course Molecular Physics and Plasmas in Hypersonics 2

    DTIC Science & Technology

    2013-09-09

    between CO2 symmetric and bending modes ( 11 ) proceeds fast due to the Fermi resonance between the frequencies of these modes and can be considered as...of local maximization of the collision frequency given by Eq. ( 11 ) allows a strong reduction of the computational cost and it is verified a...called arc-jets or DC-Plasmatron [25, 26]. PWTs using Inductively Coupled Plasma (ICP) torch, based on Radio - Frequency (RF) discharge, are so- called

  7. CEE/CA: Report calls for decriminalization of sex work.

    PubMed

    Betteridge, Glenn

    2006-04-01

    In December 2005, the Central and Eastern European Harm Reduction Network (CEEHRN) released a report calling for the decriminalization of sex work in the 27 countries of Central and Eastern Europe and Central Asia (CEE/CA). The report brings together a wealth of published and original information concerning sex work, laws regulating sex work, epidemiological data regarding HIV and other sexually transmitted infections (STIs), services available to sex workers, and human rights abuses faced by sex workers.

  8. Improved variance estimation of classification performance via reduction of bias caused by small sample size.

    PubMed

    Wickenberg-Bolin, Ulrika; Göransson, Hanna; Fryknäs, Mårten; Gustafsson, Mats G; Isaksson, Anders

    2006-03-13

    Supervised learning for classification of cancer employs a set of design examples to learn how to discriminate between tumors. In practice it is crucial to confirm that the classifier is robust with good generalization performance on new examples, or at least that it performs better than random guessing. A suggested alternative is to obtain a confidence interval of the error rate using repeated design and test sets selected from the available examples. However, it is known that even in the ideal situation of repeated designs and tests with completely novel samples in each cycle, a small test set size leads to a large bias in the estimate of the true variance between design sets. Therefore, methods for small-sample performance estimation, such as a recently proposed procedure called Repeated Random Sampling (RSS), are also expected to result in heavily biased estimates, which in turn translates into biased confidence intervals. Here we explore such biases and develop a refined algorithm called Repeated Independent Design and Test (RIDT). Our simulations reveal that repeated designs and tests based on resampling in a fixed bag of samples yield a biased variance estimate. We also demonstrate that it is possible to obtain an improved variance estimate by means of a procedure that explicitly models how this bias depends on the number of samples used for testing. For the special case of repeated designs and tests using new samples for each design and test, we present an exact analytical expression for how the expected value of the bias decreases with the size of the test set. We show that via modeling and subsequent reduction of the small sample bias, it is possible to obtain an improved estimate of the variance of classifier performance between design sets. However, the uncertainty of the variance estimate is large in the simulations performed, indicating that the method in its present form cannot be directly applied to small data sets.
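
    The core idea, that the naive between-design variance estimate is inflated by within-test-set binomial noise which can be modeled and subtracted, can be illustrated with a toy Monte Carlo (an illustrative sketch, not the RIDT algorithm; all numbers are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n_designs, n_test = 2000, 25
# Hypothetical true error rate of each design set; its variance is the
# between-design variance we actually want to estimate
true_err = rng.normal(0.20, 0.03, n_designs).clip(0.01, 0.99)
true_var = true_err.var()

# Each design is evaluated on a small test set -> binomially noisy estimates
est_err = rng.binomial(n_test, true_err) / n_test

naive_var = est_err.var()  # inflated by the test-set sampling noise
# Unbiased plug-in estimate of the binomial inflation term E[p(1-p)]/n_test
bias = np.mean(est_err * (1 - est_err)) / (n_test - 1)
corrected_var = naive_var - bias

# The corrected estimate lands much closer to the true between-design variance
print(true_var, naive_var, corrected_var)
```

    With a test set of only 25 samples the binomial term dominates the naive estimate, which is why small test sets make raw resampling-based variance estimates so misleading.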

  9. Cure-WISE: HETDEX Data Reduction with Astro-WISE

    NASA Astrophysics Data System (ADS)

    Snigula, J. M.; Drory, N.; Fabricius, M.; Landriau, M.; Montesano, F.; Hill, G. J.; Gebhardt, K.; Cornell, M. E.

    2014-05-01

    The Hobby-Eberly Telescope Dark Energy Experiment (HETDEX, Hill et al. 2012b) is a blind spectroscopic survey to map the evolution of dark energy using Lyman-alpha emitting galaxies at redshifts 1.9 < z < 3.5 as tracers. The survey will use an array of 75 integral field spectrographs called the Visible Integral field Replicable Unit (IFU) Spectrograph (VIRUS, Hill et al. 2012c). The 10 m HET (Ramsey et al. 1998) is currently receiving a wide-field upgrade (Hill et al. 2012a) to accommodate the spectrographs and to provide the needed field of view. Over the projected five-year run of the survey we expect to obtain approximately 170 GB of data each night. For the data reduction we developed the Cure pipeline to automatically find and calibrate the observed spectra, subtract the sky background, and detect and classify different types of sources. Cure employs rigorous statistical methods and complete pixel-level error propagation throughout the reduction process to ensure Poisson-limited performance and meaningful significance values. To automate the reduction of the whole dataset we implemented the Cure pipeline in the Astro-WISE framework. This integration provides HETDEX with a database backend offering complete dependency tracking of the various reduction steps, automated checks, a searchable interface to the detected sources, and user management. It can be used to create various web interfaces for data access and quality control. Astro-WISE allows us to reduce the data from all the IFUs in parallel on a compute cluster. This cluster allows us to reduce the observed data in quasi real time and still have excess capacity for rerunning parts of the reduction. Finally, the Astro-WISE interface will be used to provide access to reduced data products to the general community.

  10. Ultrafast Scavenging of the Precursor of H(•) Atom, (e(-), H3O(+)), in Aqueous Solutions.

    PubMed

    Balcerzyk, Anna; Schmidhammer, Uli; Wang, Furong; de la Lande, Aurélien; Mostafavi, Mehran

    2016-09-01

    Picosecond pulse radiolysis measurements have been performed in several highly concentrated HClO4 and H3PO4 aqueous solutions containing silver ions at different concentrations. Silver ion reduction is used to unravel the ultrafast reduction reactions observed at the end of a 7 ps electron pulse. Solvated electrons and silver atoms are observed by the pulse (electron beam)-probe (supercontinuum light) method. In highly acidic solutions, ultrafast reduction of silver ions is observed, a finding that is not compatible with a reaction between the H(•) atom and silver ions, which is known to be thermally activated. In addition, silver ion reduction is found to be even more efficient in phosphoric acid solution than in neutral solution. In the acidic solutions investigated here, the species responsible for the reduction of silver ions is considered to be the precursor of the H(•) atom. This precursor, denoted (e(-), H3O(+)), is a pair consisting of an electron (not fully solvated) and H3O(+). Its structure differs from that of the pair of a solvated electron and a hydronium ion, (es(-), H3O(+)), which absorbs in the visible region. The (e(-), H3O(+)) pair, called the pre-H(•) atom here, undergoes ultrafast electron transfer and can, like the presolvated electron, reduce silver ions much faster than the H(•) atom. Moreover, it is found that with the same concentration of H3O(+), the reduction reaction is favored in the phosphoric acid solution over the perchloric acid solution because of the less-efficient electron solvation process. The kinetics show that among the three reducing species, (e(-), H3O(+)), (es(-), H3O(+)), and the H(•) atom, the first is the most efficient.

  11. Correcting for Sample Contamination in Genotype Calling of DNA Sequence Data

    PubMed Central

    Flickinger, Matthew; Jun, Goo; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2015-01-01

    DNA sample contamination is a frequent problem in DNA sequencing studies and can result in genotyping errors and reduced power for association testing. We recently described methods to identify within-species DNA sample contamination based on sequencing read data, showed that our methods can reliably detect and estimate contamination levels as low as 1%, and suggested strategies to identify and remove contaminated samples from sequencing studies. Here we propose methods to model contamination during genotype calling as an alternative to removal of contaminated samples from further analyses. We compare our contamination-adjusted calls to calls that ignore contamination and to calls based on uncontaminated data. We demonstrate that, for moderate contamination levels (5%–20%), contamination-adjusted calls eliminate 48%–77% of the genotyping errors. For lower levels of contamination, our contamination correction methods produce genotypes nearly as accurate as those based on uncontaminated data. Our contamination correction methods are useful generally, but are particularly helpful for sample contamination levels from 2% to 20%. PMID:26235984
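
    The underlying idea, modeling each read as drawn from a mixture of the sample's and a contaminant's genotypes, can be sketched as follows (an illustrative simplification of such likelihood models, not the authors' code; the error rate, allele frequency, and Hardy-Weinberg contaminant prior are assumptions):

```python
import numpy as np

ERR = 0.01  # assumed per-base sequencing error rate

def p_alt(g):
    """P(a read shows the ALT allele | genotype g in {0,1,2} ALT copies)."""
    f = g / 2.0
    return f * (1 - ERR) + (1 - f) * ERR

def genotype_loglik(alt_reads, ref_reads, g_self, g_cont, alpha):
    """Log-likelihood of read counts when a fraction alpha of reads
    comes from a contaminating sample with genotype g_cont."""
    p = (1 - alpha) * p_alt(g_self) + alpha * p_alt(g_cont)
    return alt_reads * np.log(p) + ref_reads * np.log(1 - p)

def call_genotype(alt_reads, ref_reads, alpha, af=0.3):
    """Pick the sample genotype maximizing the likelihood, summing the
    contaminant genotype out under Hardy-Weinberg with allele freq af."""
    hw = {0: (1 - af) ** 2, 1: 2 * af * (1 - af), 2: af ** 2}
    return max(range(3), key=lambda gs: np.logaddexp.reduce(
        [genotype_loglik(alt_reads, ref_reads, gs, gc, alpha) + np.log(hw[gc])
         for gc in range(3)]))
```

    For example, 15 ALT reads out of 100 from a homozygous-reference sample contaminated at 30% by a heterozygote look marginally heterozygous when contamination is ignored (`alpha=0`), while the contamination-aware call (`alpha=0.3`) recovers the homozygous-reference genotype.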

  12. Mode Reduction and Upscaling of Reactive Transport Under Incomplete Mixing

    NASA Astrophysics Data System (ADS)

    Lester, D. R.; Bandopadhyay, A.; Dentz, M.; Le Borgne, T.

    2016-12-01

    Upscaling of chemical reactions in partially mixed fluid environments is a challenging problem due to the detailed interactions between inherently nonlinear reaction kinetics and complex spatio-temporal concentration distributions under incomplete mixing. We address this challenge via the development of an order reduction method for the advection-diffusion-reaction equation (ADRE) via projection of the reaction kinetics onto a small number N of leading eigenmodes of the advection-diffusion operator (the so-called "strange eigenmodes" of the flow), yielding an N-by-N nonlinear system, whilst the mixing dynamics alone are projected onto the remaining modes. For simple kinetics and moderate Péclet and Damköhler numbers, this approach yields analytic solutions for the concentration mean, the evolving spatio-temporal distribution and the PDF in terms of the well-mixed reaction kinetics and mixing dynamics. For more complex kinetics or large Péclet or Damköhler numbers, only a small number of modes is required to accurately quantify the mixing and reaction dynamics in terms of the concentration field and PDF, facilitating greatly simplified approximation and analysis of reactive transport. Approximate solutions of this low-order nonlinear system provide quantitative predictions of the evolving concentration PDF. We demonstrate application of this method to a simple random flow and various mass-action reaction kinetics.
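
    As a toy illustration of projecting transport dynamics onto a few leading eigenmodes (a pure-diffusion stand-in with hypothetical sizes, without the advective flow or the nonlinear reaction coupling of the ADRE):

```python
import numpy as np

# Discrete 1D diffusion operator on (0,1) with absorbing ends, standing in
# for the advection-diffusion operator whose eigenmodes are projected onto
m = 100
x = np.linspace(0, 1, m + 2)[1:-1]
h = x[1] - x[0]
L = (np.diag(-2 * np.ones(m)) + np.diag(np.ones(m - 1), 1)
     + np.diag(np.ones(m - 1), -1)) / h**2

evals, evecs = np.linalg.eigh(L)
idx = np.argsort(-evals)              # slowest-decaying modes first
evals, evecs = evals[idx], evecs[:, idx]

c0 = np.exp(-((x - 0.5) / 0.1) ** 2)  # initial concentration blob
t = 0.01
full = evecs @ (np.exp(evals * t) * (evecs.T @ c0))  # all m modes

N = 5                                 # keep only N leading eigenmodes
VN = evecs[:, :N]
reduced = VN @ (np.exp(evals[:N] * t) * (VN.T @ c0))

# A handful of modes already reproduces the evolved concentration field
err = np.linalg.norm(full - reduced) / np.linalg.norm(full)
print(err)
```

    Because the discarded modes decay exponentially faster, the relative error of the 5-mode projection is already well below a percent at this time, which is the property the order reduction exploits.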

  13. Local Context Finder (LCF) reveals multidimensional relationships among mRNA expression profiles of Arabidopsis responding to pathogen infection

    PubMed Central

    Katagiri, Fumiaki; Glazebrook, Jane

    2003-01-01

    A major task in computational analysis of mRNA expression profiles is definition of relationships among profiles on the basis of similarities among them. This is generally achieved by pattern recognition in the distribution of data points representing each profile in a high-dimensional space. Some drawbacks of commonly used pattern recognition algorithms stem from their use of a globally linear space and/or limited degrees of freedom. A pattern recognition method called Local Context Finder (LCF) is described here. LCF uses nonlinear dimensionality reduction for pattern recognition. Then it builds a network of profiles based on the nonlinear dimensionality reduction results. LCF was used to analyze mRNA expression profiles of the plant host Arabidopsis interacting with the bacterial pathogen Pseudomonas syringae. In one case, LCF revealed two dimensions essential to explain the effects of the NahG transgene and the ndr1 mutation on resistant and susceptible responses. In another case, plant mutants deficient in responses to pathogen infection were classified on the basis of LCF analysis of their profiles. The classification by LCF was consistent with the results of biological characterization of the mutants. Thus, LCF is a powerful method for extracting information from expression profile data. PMID:12960373

  14. Complete reduction of high-density UO2 to metallic U in molten Li2O-LiCl

    NASA Astrophysics Data System (ADS)

    Choi, Eun-Young; Lee, Jeong

    2017-10-01

    The large size and high density of spent fuel pellets make it difficult to use the pellets directly in electrolytic reduction (also called oxide reduction, OR) for pyroprocessing, owing to the slow diffusion of the molten Li2O-LiCl salt electrolyte into the pellets. In this study, we investigated the complete OR of high-density UO2 to metallic U without any remaining UO2. Only partial reduction near the surface of high-density UO2 pellets was observed under operating conditions employing the fast electrolysis rate that had previously allowed complete reduction of low-density UO2 pellets. Complete reduction of high-density UO2 pellets was observed at the fast electrolysis rate when the pellet size was reduced. Complete reduction of high-density UO2 pellets without size reduction was achieved at a slow electrolysis rate, which allowed sufficient chemical reduction of the UO2 by the lithium metal generated by the cathode reaction.

  15. A machine learning heuristic to identify biologically relevant and minimal biomarker panels from omics data

    PubMed Central

    2015-01-01

    Background: Investigations into novel biomarkers using omics techniques generate large amounts of data. Due to their size and numbers of attributes, these data are suitable for analysis with machine learning methods. A key component of typical machine learning pipelines for omics data is feature selection, which is used to reduce the raw high-dimensional data to a tractable number of features. Feature selection needs to balance the objective of using as few features as possible with maintaining high predictive power. This balance is crucial when the goal of data analysis is the identification of highly accurate but small panels of biomarkers with potential clinical utility. In this paper we propose a heuristic for the selection of very small feature subsets, via an iterative feature elimination process that is guided by rule-based machine learning, called RGIFE (Rule-guided Iterative Feature Elimination). We use this heuristic to identify putative biomarkers of osteoarthritis (OA), articular cartilage degradation and synovial inflammation, using both proteomic and transcriptomic datasets. Results and discussion: Our RGIFE heuristic increased the classification accuracies achieved for all datasets compared with using no feature selection, and performed well in a comparison with other feature selection methods. Using this method the datasets were reduced to a smaller number of genes or proteins, including those known to be relevant to OA, cartilage degradation and joint inflammation. The results show the RGIFE feature reduction method to be suitable for analysing both proteomic and transcriptomic data. Methods that generate large ‘omics’ datasets are increasingly being used in the area of rheumatology. Conclusions: Feature reduction methods are advantageous for the analysis of omics data in the field of rheumatology, as the application of such techniques is likely to result in improvements in diagnosis, treatment and drug discovery. PMID:25923811

  16. A method for reduction of Acoustic Emission (AE) data with application in machine failure detection and diagnosis

    NASA Astrophysics Data System (ADS)

    Vicuña, Cristián Molina; Höweler, Christoph

    2017-12-01

    The use of AE in machine failure diagnosis has increased over the last years. Most AE-based failure diagnosis strategies use digital signal processing and thus require the sampling of AE signals. High sampling rates are required for this purpose (e.g. 2 MHz or higher), leading to streams of large amounts of data. This situation is aggravated if fine resolution and/or multiple sensors are required. These factors combine to produce bulky data, typically in the range of gigabytes, for which sufficient storage space and efficient signal processing algorithms are required. This situation probably explains why, in practice, AE-based methods consist mostly of the calculation of scalar quantities such as RMS and kurtosis, and the analysis of their evolution in time. While the scalar-based approach offers the advantage of maximum data reduction, it has the disadvantage that most of the information contained in the raw AE signal is lost irrecoverably. This work presents a method offering large data reduction while keeping the most important information conveyed by the raw AE signal, useful for failure detection and diagnosis. The proposed method consists of the construction of a synthetic, unevenly sampled signal which envelops the AE bursts present in the raw AE signal in a triangular shape. The constructed signal - which we call TriSignal - also permits the estimation of most scalar quantities typically used for failure detection. More importantly, it contains the information on the time of occurrence of the bursts, which is key for failure diagnosis. The Lomb-Scargle normalized periodogram is used to construct the TriSignal spectrum, which reveals the frequency content of the TriSignal and provides the same information as the classic AE envelope. The paper includes application examples for a planetary gearbox and a low-speed rolling element bearing.
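
    The reason a Lomb-Scargle periodogram is needed is that the TriSignal is unevenly sampled, so an ordinary FFT does not apply. A generic sketch of that spectral step on synthetic data (using scipy's implementation; the 5 Hz "fault rate" and all sizes are hypothetical, and the TriSignal construction itself is not reproduced):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
# Irregular sample times, as produced by envelope points placed at AE bursts
t = np.sort(rng.uniform(0, 10, 400))                    # seconds
# Synthetic envelope oscillating at a hypothetical 5 Hz fault frequency
y = np.sin(2 * np.pi * 5.0 * t) + 0.3 * rng.standard_normal(t.size)

freqs_hz = np.linspace(0.5, 20, 1000)
# lombscargle expects angular frequencies and benefits from a centered signal
pgram = lombscargle(t, y - y.mean(), 2 * np.pi * freqs_hz)

peak_hz = freqs_hz[np.argmax(pgram)]
print(peak_hz)  # peak near the 5 Hz drive
```

    The periodogram recovers the burst repetition frequency despite the uneven sampling, which is exactly the diagnostic content the TriSignal spectrum is meant to expose.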

  17. Direct Telephonic Communication in a Heart Failure Transitional Care Program: An observational study.

    PubMed

    Ota, Ken S; Beutler, David S; Sheikh, Hassam; Weiss, Jessica L; Parkinson, Dallin; Nguyen, Peter; Gerkin, Richard D; Loli, Akil I

    2013-10-01

    This study investigated the trend of phone calls in the Banner Good Samaritan Medical Center (BGSMC) Heart Failure Transitional Care Program (HFTCP). The primary goal of the HFTCP is to reduce 30-day readmissions for heart failure patients by using a multi-pronged approach. This study included 104 patients in the HFTCP discharged over a 51-week period who had around-the-clock telephone access to the Transitionalist. Cellular phone records were reviewed, and the study evaluated the length and timing of calls. A total of 4398 telephone calls were recorded, of which 39% were inbound and 61% were outbound, averaging 86 calls per week. Eighty-five percent of the total calls were made during the "Weekday Daytime" period. There were 229 calls during the "Weekday Nights" period, with 1.5 inbound calls per week. The "Total Weekend" calls were 10.2% of the total calls, which equated to a weekly average of 8.8. Our experience is that direct physician-patient telephone contact is feasible with a panel of around 100 HF patients for one provider. If proper financial reimbursement is provided, physicians may be apt to participate in similar transitional care programs. Likewise, third-party payers will benefit from the reduction in unnecessary emergency room visits and hospitalizations.

  18. Reliable vision-guided grasping

    NASA Technical Reports Server (NTRS)

    Nicewarner, Keith E.; Kelley, Robert B.

    1992-01-01

    Automated assembly of truss structures in space requires vision-guided servoing for grasping a strut when its position and orientation are uncertain. This paper presents a methodology for efficient and robust vision-guided robot grasping alignment. The vision-guided grasping problem is related to vision-guided 'docking' problems. It differs from other hand-in-eye visual servoing problems, such as tracking, in that the distance from the target is a relevant servo parameter. The methodology described in this paper is a hierarchy of levels in which the vision/robot interface is decreasingly 'intelligent' and increasingly fast. Speed is achieved primarily by information reduction. This reduction exploits the use of region-of-interest windows in the image plane and feature motion prediction. These reductions invariably require stringent assumptions about the image. Therefore, at a higher level, these assumptions are verified using slower, more reliable methods. This hierarchy provides for robust error recovery in that when a lower-level routine fails, the next-higher routine is called, and so on. A working system is described which visually aligns a robot to grasp a cylindrical strut. The system uses a single camera mounted on the end effector of a robot and requires only crude calibration parameters. The grasping procedure is fast and reliable, with a multi-level error recovery system.

  19. Projection methods for line radiative transfer in spherical media.

    NASA Astrophysics Data System (ADS)

    Anusha, L. S.; Nagendra, K. N.

    An efficient numerical method called the Preconditioned Bi-Conjugate Gradient (Pre-BiCG) method is presented for the solution of the radiative transfer equation in spherical geometry. A variant of this method, called the Stabilized Preconditioned Bi-Conjugate Gradient (Pre-BiCG-STAB) method, is also presented. These methods are based on projections onto subspaces of the n-dimensional Euclidean space R^n called Krylov subspaces. The methods are shown to be faster in terms of convergence rate than contemporary iterative methods such as Jacobi, Gauss-Seidel, and Successive Over-Relaxation (SOR).
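
    The paper's radiative transfer operator is not reproduced here, but the general shape of a preconditioned BiCGSTAB solve can be illustrated with scipy on a hypothetical sparse non-symmetric system and a simple Jacobi (diagonal) preconditioner:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import bicgstab, LinearOperator

# Sparse, non-symmetric tridiagonal test system (a stand-in for a
# discretized transfer operator; sizes and entries are illustrative)
n = 200
A = diags([-1.0, 2.5, -1.2], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Jacobi preconditioner: approximate A^{-1} by the inverse of its diagonal
M = LinearOperator((n, n), matvec=lambda v: v / A.diagonal())

x, info = bicgstab(A, b, M=M)
print(info, np.linalg.norm(A @ x - b))  # info == 0 means converged
```

    Even this crude preconditioner typically cuts the iteration count relative to unpreconditioned BiCGSTAB, which is the "Pre-" ingredient the abstract refers to; the stationary methods it compares against (Jacobi, Gauss-Seidel, SOR) converge far more slowly on such systems.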

  20. Developing Louisiana crash reduction factors.

    DOT National Transportation Integrated Search

    2013-10-01

    The Louisiana Strategic Highway Safety Plan is to reach the goal of Destination Zero Death on Louisiana roadways. This tall order calls for implementing all feasible crash countermeasures. A great number of crash countermeasures have been identif...

  1. MagneMotion urban maglev : final report

    DOT National Transportation Integrated Search

    2004-11-01

    The MagneMotion Urban Maglev System, called M3, is designed as an alternative to all conventional guided transportation systems. Advantages include major reductions in travel time, operating cost, capital cost, noise, and energy consumption. Small ve...

  2. 48 CFR 352.201-70 - Paperwork Reduction Act.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... record information calling either for answers to identical questions from 10 or more persons other than... Act of 1995 (44 U.S.C. 3501 et seq.) shall apply to this contract. No plan, questionnaire, interview...

  3. 48 CFR 352.201-70 - Paperwork Reduction Act.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... record information calling either for answers to identical questions from 10 or more persons other than... Act of 1995 (44 U.S.C. 3501 et seq.) shall apply to this contract. No plan, questionnaire, interview...

  4. Economic evaluation of the differential benefits of home visits with telephone calls and telephone calls only in transitional discharge support.

    PubMed

    Wong, Frances Kam Yuet; So, Ching; Chau, June; Law, Antony Kwan Pui; Tam, Stanley Ku Fu; McGhee, Sarah

    2015-01-01

    Home visits and telephone calls are two frequently used approaches in transitional care, but their differential economic effects are unknown. This study examines the differential economic benefits of home visits with telephone calls versus telephone calls only in transitional discharge support, using a cost-effectiveness analysis conducted alongside a randomised controlled trial (RCT). Patients discharged from medical units were randomly assigned to control (control, N = 210), home visits with calls (home, N = 196) and calls only (call, N = 204). Cost-effectiveness analyses were conducted from the societal perspective comparing monetary benefits and quality-adjusted life years (QALYs) gained. The home arm was less costly but less effective at 28 days and was dominating (less costly and more effective) at 84 days. The call arm was dominating at both 28 and 84 days. The incremental QALY for the home arm was -0.0002/0.0008 (28/84 days), and for the call arm was 0.0022/0.0104 (28/84 days). When the three groups were compared, the call arm had a higher probability of being cost-effective at 84 days but not at 28 days (home: 53%, call: 35% (28 days) versus home: 22%, call: 73% (84 days)), measured against the NICE threshold of £20,000. The original RCT showed that the bundled intervention involving home visits and calls was more effective than calls only in reducing hospital readmissions. This study adds a cost perspective to inform policymakers that both home visits and calls only are cost-effective for transitional care support, but calls only have a higher chance of being cost-effective for a sustained period after intervention. © The Author 2014. Published by Oxford University Press on behalf of the British Geriatrics Society.

  5. Detecting Damage in Composite Material Using Nonlinear Elastic Wave Spectroscopy Methods

    NASA Astrophysics Data System (ADS)

    Meo, Michele; Polimeno, Umberto; Zumpano, Giuseppe

    2008-05-01

    Modern aerospace structures make increasing use of fibre reinforced plastic composites, due to their high specific mechanical properties. However, due to their brittleness, low velocity impact can cause delaminations beneath the surface, while the surface may appear to be undamaged upon visual inspection. Such damage is called barely visible impact damage (BVID). Such internal damage leads to significant reduction in local strengths and ultimately could lead to catastrophic failures. It is therefore important to detect and monitor damage in highly loaded composite components to receive an early warning for well-timed maintenance of the aircraft. Non-linear ultrasonic spectroscopy methods are promising damage detection and material characterization tools. In this paper, two different non-linear elastic wave spectroscopy (NEWS) methods are presented: single mode nonlinear resonance ultrasound (NRUS) and nonlinear wave modulation technique (NWMS). The NEWS methods were applied to detect delamination damage due to low velocity impact (<12 J) on various composite plates. The results showed that the proposed methodology appears to be highly sensitive to the presence of damage, with very promising future NDT and structural health monitoring applications.

  6. AIS-2 radiometry and a comparison of methods for the recovery of ground reflectance

    NASA Technical Reports Server (NTRS)

    Conel, James E.; Green, Robert O.; Vane, Gregg; Bruegge, Carol J.; Alley, Ronald E.; Curtiss, Brian J.

    1987-01-01

    A field experiment and its results involving Airborne Imaging Spectrometer-2 data are described. The radiometry and spectral calibration of the instrument are critically examined in light of laboratory and field measurements. Three methods of compensating for the atmosphere in the search for ground reflectance are compared. It was found that laboratory determined responsivities are 30 to 50 percent less than expected for conditions of the flight for both short and long wavelength observations. The combined system atmosphere surface signal to noise ratio, as indexed by the mean response divided by the standard deviation for selected areas, lies between 40 and 110, depending upon how scene averages are taken, and is 30 percent less for flight conditions than in the laboratory. Atmospheric and surface variations may contribute to this difference. It is not possible to isolate instrument performance from the present data. As for methods of data reduction, the so-called scene average or log-residual method fails to recover any feature present in the surface reflectance, probably because of the extreme homogeneity of the scene.

  7. USSR Report: Political and Sociological Affairs.

    DTIC Science & Technology

    1983-08-17

    percent; rubber articles, 55 percent; and mineral fertilizers, 50 percent. Naturally, this results in a reduction in the return on investments ...planned program will require the fundamental reorganization of all the work of the party's oblast organization. And first of all — First Secretary of...fact that the plans for the 1980s call for a relative reduction in the growth in capital investments in farming. In the opinion of the German

  8. Compressed sensing and the reconstruction of ultrafast 2D NMR data: Principles and biomolecular applications.

    PubMed

    Shrot, Yoav; Frydman, Lucio

    2011-04-01

    A topic of active investigation in 2D NMR relates to the minimum number of scans required for acquiring such spectra, particularly when these are dictated by sampling rather than by sensitivity considerations. Reductions in this minimum number of scans have been achieved by departing from the regular sampling used to monitor the indirect domain, and relying instead on non-uniform sampling and iterative reconstruction algorithms. Alternatively, so-called "ultrafast" methods can compress the minimum number of scans involved in 2D NMR all the way down to one, by spatially encoding the indirect domain information and subsequently recovering it via oscillating field gradients. Given ultrafast NMR's simultaneous recording of the indirect- and direct-domain data, this experiment couples the spectral constraints of these orthogonal domains - often calling for the use of strong acquisition gradients and large filter widths to fulfill the desired bandwidth and resolution demands along all spectral dimensions. This study discusses a way to alleviate these demands, and thereby enhance the method's performance and applicability, by combining spatial encoding with iterative reconstruction approaches. Examples of these new principles are given based on the compressed-sensed reconstruction of biomolecular 2D HSQC ultrafast NMR data, an approach that we show enables a decrease in the gradient strengths demanded in this type of experiment by up to 80%. Copyright © 2011 Elsevier Inc. All rights reserved.
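    The iterative reconstruction idea can be sketched with a generic compressed-sensing recovery: iterative soft-thresholding (ISTA) applied to a toy under-determined linear system. This is an assumption-laden stand-in to show the principle, not the reconstruction actually used for the ultrafast HSQC data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Under-determined sensing problem y = A x with a k-sparse x: a toy
# stand-in for recovering under-sampled indirect-domain data.
m, n, k = 40, 100, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true

def ista(A, y, lam=0.01, iters=2000):
    """Iterative soft-thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x - step * (A.T @ (A @ x - y))                         # gradient
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)   # shrink
    return x

x_hat = ista(A, y)
print(float(np.max(np.abs(x_hat - x_true))))  # small reconstruction error
```

With far fewer measurements than unknowns (40 versus 100), the l1 penalty recovers the sparse signal that plain least squares could not pin down; this is the same trade the abstract exploits to relax gradient-strength demands.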

  9. Beamspace fast fully adaptive brain source localization for limited data sequences

    NASA Astrophysics Data System (ADS)

    Ravan, Maryam

    2017-05-01

    In the electroencephalogram (EEG) or magnetoencephalogram (MEG) context, brain source localization methods that rely on estimating second order statistics often fail when the observations are taken over a short time interval, especially when the number of electrodes is large. To address this issue, in a previous study, we developed a multistage adaptive processing approach called fast fully adaptive (FFA) that can significantly reduce the required sample support while still processing all available degrees of freedom (DOFs). This approach processes the observed data in stages through a decimation procedure. In this study, we introduce a new form of the FFA approach called beamspace FFA. We first divide the brain into smaller regions and transform the measured data from the source space to the beamspace in each region. The FFA approach is then applied to the beamspaced data of each region. The goal of this modification is to reduce the correlation sensitivity between sources in different brain regions. To demonstrate the performance of the beamspace FFA approach in the limited data scenario, simulation results with multiple deep and cortical sources as well as experimental results are compared with the regular FFA and widely used FINE approaches. Both simulation and experimental results demonstrate that the beamspace FFA method can localize different types of multiple correlated brain sources at low signal-to-noise ratios more accurately with limited data.

  10. Collective phase description of oscillatory convection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawamura, Yoji, E-mail: ykawamura@jamstec.go.jp; Nakao, Hiroya

    We formulate a theory for the collective phase description of oscillatory convection in Hele-Shaw cells. It enables us to describe the dynamics of the oscillatory convection by a single degree of freedom which we call the collective phase. The theory can be considered as a phase reduction method for limit-cycle solutions in infinite-dimensional dynamical systems, namely, stable time-periodic solutions to partial differential equations, representing the oscillatory convection. We derive the phase sensitivity function, which quantifies the phase response of the oscillatory convection to weak perturbations applied at each spatial point, and analyze the phase synchronization between two weakly coupled Hele-Shaw cells exhibiting oscillatory convection on the basis of the derived phase equations.

  11. Entropy reduction via simplified image contourization

    NASA Technical Reports Server (NTRS)

    Turner, Martin J.

    1993-01-01

    The process of contourization is presented which converts a raster image into a set of plateaux or contours. These contours can be grouped into a hierarchical structure, defining total spatial inclusion, called a contour tree. A contour coder has been developed which fully describes these contours in a compact and efficient manner and is the basis for an image compression method. Simplification of the contour tree has been undertaken by merging contour tree nodes thus lowering the contour tree's entropy. This can be exploited by the contour coder to increase the image compression ratio. By applying general and simple rules derived from physiological experiments on the human vision system, lossy image compression can be achieved which minimizes noticeable artifacts in the simplified image.

  12. Competency-based education: a new model for teaching orthopaedics.

    PubMed

    Alman, Benjamin A; Ferguson, Peter; Kraemer, William; Nousiainen, Markku T; Reznick, Richard K

    2013-01-01

    The current methods used to train residents to become orthopaedic surgeons are based on tradition, not evidence-based models. Educators have only a limited ability to assess trainees for competency using validated tests in various domains. The reduction in resident work hours limits the time available for clinical training, which has resulted in some calls for lengthening the training process. Another approach to address limited training hours is to focus training in a program that allows residents to graduate from a rotation based on demonstrated competency rather than on time on a service. A pilot orthopaedic residency curriculum, which uses a competency-based framework of resident training and maximizes the use of available training hours, has been designed and is being implemented.

  13. The Role of Qualitative Approaches to Research in CALL Contexts: Closing in on the Learner's Experience

    ERIC Educational Resources Information Center

    Levy, Mike

    2015-01-01

    The article considers the role of qualitative research methods in CALL through describing a series of examples. These examples are used to highlight the importance and value of qualitative data in relation to a specific research objective in CALL. The use of qualitative methods in conjunction with other approaches as in mixed method research…

  14. Genetics Home Reference: NGLY1-congenital disorder of deglycosylation

    MedlinePlus

    ... gene. The enzyme produced from this gene, called N -glycanase 1, helps cells get rid of abnormal ... that cause NGLY1 -CDDG impair production of the N -glycanase 1 enzyme, resulting in a severe reduction ...

  15. Functional traits determine heterospecific use of risk-related social information in forest birds of tropical South-East Asia.

    PubMed

    Hua, Fangyuan; Yong, Ding Li; Janra, Muhammad Nazri; Fitri, Liza M; Prawiradilaga, Dewi; Sieving, Kathryn E

    2016-12-01

    In birds and mammals, mobbing calls constitute an important form of social information that can attract numerous sympatric species to localized mobbing aggregations. While such a response is thought to reduce the future predation risk for responding species, there is surprisingly little empirical evidence to support this hypothesis. One way to test the link between predation risk reduction and mobbing attraction involves testing the relationship between species' attraction to mobbing calls and the functional traits that define their vulnerability to predation risk. Two important traits known to influence prey vulnerability include relative prey-to-predator body size ratio and the overlap in space use between predator and prey; in combination, these measures strongly influence prey accessibility, and therefore their vulnerability, to predators. Here, we combine community surveys with behavioral experiments of a diverse bird assemblage in the lowland rainforest of Sumatra to test whether the functional traits of body mass (representing body size) and foraging height (representing space use) can predict species' attraction to heterospecific mobbing calls. At four forest sites along a gradient of forest degradation, we characterized the resident bird communities using point count and mist-netting surveys, and determined the species groups attracted to standardized playbacks of mobbing calls produced by five resident bird species of roughly similar body size and foraging height. We found that (1) a large, diverse subcommunity of bird species was attracted to the mobbing calls and (2) responding species (especially the most vigorous respondents) tended to be (a) small, (b) mid-storey foragers, and (c) similar in trait values to the species producing the mobbing calls. 
Our findings from the relatively lesser known bird assemblages of tropical Asia add to the growing evidence for the ubiquity of heterospecific information networks in animal communities, and provide empirical support for the long-standing hypothesis that predation risk reduction is a major benefit of mobbing information networks.

  16. Microwave-Driven Air Plasma Studies for Drag Reduction and Power Extraction in Supersonic Air

    DTIC Science & Technology

    2004-10-15

    called spillage occurs, and the air mass capture decreases (Fig. 3). To avoid performance penalties at off-design Mach numbers, a variable geometry inlet...

  17. Effective normalization for copy number variation detection from whole genome sequencing.

    PubMed

    Janevski, Angel; Varadan, Vinay; Kamalakaran, Sitharthan; Banerjee, Nilanjana; Dimitrova, Nevenka

    2012-01-01

    Whole genome sequencing enables a high resolution view of the human genome and provides unique insights into genome structure at an unprecedented scale. There have been a number of tools to infer copy number variation in the genome. These tools, while validated, also include a number of parameters that are configurable to the genome data being analyzed. These algorithms allow for normalization to account for individual and population-specific effects on individual genome CNV estimates, but the impact of these changes on the estimated CNVs is not well characterized. We evaluate in detail the effect of normalization methodologies in two CNV algorithms, FREEC and CNV-seq, using whole genome sequencing data from 8 individuals spanning four populations. We apply FREEC and CNV-seq to a sequencing data set consisting of 8 genomes. We use multiple configurations corresponding to different read-count normalization methodologies in FREEC, and statistically characterize the concordance of the CNV calls between FREEC configurations and the analogous output from CNV-seq. The normalization methodologies evaluated in FREEC are: GC content, mappability and control genome. We further stratify the concordance analysis within genic, non-genic, and a collection of validated variant regions. The GC content normalization methodology generates the highest number of altered copy number regions. Both mappability and control genome normalization reduce the total number and length of copy number regions. Mappability normalization yields Jaccard indices in the 0.07 - 0.3 range, whereas using a control genome normalization yields Jaccard index values around 0.4 with normalization based on GC content. The most critical impact of using mappability as a normalization factor is a substantial reduction in deletion CNV calls. 
The output of another method based on control genome normalization, CNV-seq, resulted in comparable CNV call profiles, and substantial agreement in variable gene and CNV region calls. Choice of read-count normalization methodology has a substantial effect on CNV calls and the use of genomic mappability or an appropriately chosen control genome can optimize the output of CNV analysis.
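    A toy version of read-count normalization by GC content can illustrate the step being evaluated. The median-per-GC-bin scheme below is a deliberate simplification for illustration, not FREEC's actual fitting procedure, and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated per-window read counts with a GC-dependent bias, the kind
# of systematic effect GC normalization removes before CNV calling.
n = 5000
gc = rng.uniform(0.3, 0.6, n)
copy_ratio = np.ones(n)
copy_ratio[1000:1200] = 2.0                         # hypothetical duplication
gc_bias = 1.0 + 30.0 * (gc - 0.45) ** 2             # U-shaped bias curve
counts = rng.poisson(100 * copy_ratio * gc_bias)

# Normalize: divide each window by the median count of windows with
# similar GC content, so the remaining signal reflects copy number.
bins = np.clip(((gc - 0.3) / 0.3 * 30).astype(int), 0, 29)
bin_median = np.array([np.median(counts[bins == b]) for b in range(30)])
normalized = counts / bin_median[bins]

# The duplicated region now stands out near ratio 2, the rest near 1.
print(round(float(np.median(normalized[1000:1200])), 1))
```

The same windows fed to a caller without this step would mix GC bias into the copy-number signal, which is why the choice of normalization changes the CNV calls so markedly in the study above.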

  18. Origins of Line Defects in Self-Reacting Friction Stir Welds and Their Impact on Weld Quality

    NASA Technical Reports Server (NTRS)

    Schneider, Judy; Nunes, Arthur C., Jr.

    2016-01-01

    Friction stir welding (FSWing) is a solid state joining technique which reduces the occurrence of typical defects formed in fusion welds, especially of highly alloyed metals. Although the process is robust for aluminum alloys, occasional reductions in the strength of FSWs have been observed. Shortly after NASA-MSFC implemented a variation of FSW called self-reacting (SR), low strength properties were observed. At that time, this reduction in strength was attributed to a line defect, and the limited data suggested that the line defect was related to the accumulation of native oxides that form on the weld lands and faying surfaces. Through a series of improved cleaning methods, tool redesign, and process parameter modifications, the reduction in the strength of the SR-FSWs was eliminated. As more data has been collected, the occasional reduction in the strength of SR-FSWs still occurs. These occasional reductions indicate a need to reexamine the underlying causes. This study builds on a series of SR-FSWs that were made in 3 different thickness panels of AA2219 (0.95, 1.27 and 1.56 cm) at 2 different weld pitches. A bead on plate SR-FSW was also made in the 1.56 cm thick panel to understand the contribution of the former faying surfaces. Copper tracer studies were used to understand the flow lines associated with the weld tool used. The quality of the SR-FSWs was evaluated from tensile testing at room temperature. Reductions in the tensile strength were observed in some weldments, primarily at higher weld pitch or tool rotations. This study explores possible correlations between line defects and the reduction of strength in SR-FSWs. Results from this study will assist in a better understanding of the mechanisms responsible for reduced tensile strength and provide a methodology for minimizing their occurrence.

  19. Integrated Analysis of Greenhouse Gas Mitigation Options and Related Impacts

    EPA Science Inventory

    Increased concerns over air pollution (combined with detrimental health effects) and climate change have called for more stringent emission reduction strategies for criteria air pollutants and greenhouse gas emissions. However, stringent regulatory policies can possibly have a...

  20. 5-D interpolation with wave-front attributes

    NASA Astrophysics Data System (ADS)

    Xie, Yujiang; Gajewski, Dirk

    2017-11-01

    Most 5-D interpolation and regularization techniques reconstruct the missing data in the frequency domain by using mathematical transforms. An alternative type of interpolation method uses wave-front attributes, that is, quantities with a specific physical meaning like the angle of emergence and wave-front curvatures. These attributes include structural information about subsurface features such as the dip and strike of a reflector. These wave-front attributes work in 5-D data space (e.g. common-midpoint coordinates in x and y, offset, azimuth and time), leading to a 5-D interpolation technique. Since the process is based on stacking, pre-stack data enhancement is achieved in addition to the interpolation, improving the signal-to-noise ratio (S/N) of interpolated and recorded traces. The wave-front attributes are determined in a data-driven fashion, for example, with the Common Reflection Surface (CRS) method. As one of the wave-front-attribute-based interpolation techniques, the 3-D partial CRS method was proposed to enhance the quality of 3-D pre-stack data with low S/N. In past work on 3-D partial stacks, two potential problems remained unsolved. For high-quality wave-front attributes, we suggest a global optimization strategy instead of the pragmatic search approach used so far. In previous works, the interpolation of 3-D data was performed along a specific azimuth, which is acceptable for narrow azimuth acquisition but does not exploit the potential of wide-, rich- or full-azimuth acquisitions. The conventional 3-D partial CRS method is improved in this work and we call it wave-front-attribute-based 5-D interpolation (5-D WABI), as the two problems mentioned above are addressed. Data examples demonstrate the improved performance of the 5-D WABI method when compared with the conventional 3-D partial CRS approach. A comparison of the rank-reduction-based 5-D seismic interpolation technique with the proposed 5-D WABI method is given. 
    The comparison reveals that there are significant advantages for steeply dipping events using the 5-D WABI method when compared to the rank-reduction-based 5-D interpolation technique. Diffraction tails substantially benefit from this improved performance of the partial CRS stacking approach, while the CPU time is comparable to that consumed by the rank-reduction-based method.

  1. Outcome and Efficacy of Interventions by a Public Figure Threat Assessment and Management Unit: A Mirrored Study of Concerning Behaviors and Police Contacts Before and After Intervention.

    PubMed

    James, David V; Farnham, Frank R

    2016-09-01

    Specialized units for the assessment and management of concerning behaviors towards public figures have been set up in various jurisdictions. Their efficacy has been demonstrated descriptively and in terms of reduction in concern rates. This study of 100 consecutive cases from the Fixated Threat Assessment Centre (FTAC) in the UK uses a novel measure of outcome in the form of reduction in behaviors of concern and in police call-outs/stops, using data culled from police and health service records. It adopts a mirrored design, comparing individuals over 12-month and 2-year periods before and after FTAC intervention. It demonstrates significant reductions in both numbers of individuals involved in, and number of actual incidents of, concerning communication and problematic approach, as well as police call-outs/stops. Most results are consistent across subgroups with regard to gender, previous convictions, concern level, compulsory hospitalization and grievance-driven behavior. Such threat assessment units reduce risky behavior and save police time and, possibly, costs. Copyright © 2016 John Wiley & Sons, Ltd.

  2. The impact of uncertainty on optimal emission policies

    NASA Astrophysics Data System (ADS)

    Botta, Nicola; Jansson, Patrik; Ionescu, Cezar

    2018-05-01

    We apply a computational framework for specifying and solving sequential decision problems to study the impact of three kinds of uncertainties on optimal emission policies in a stylized sequential emission problem. We find that uncertainties about the implementability of decisions on emission reductions (or increases) have a greater impact on optimal policies than uncertainties about the availability of effective emission reduction technologies and uncertainties about the implications of trespassing critical cumulated emission thresholds. The results show that uncertainties about the implementability of decisions on emission reductions (or increases) call for more precautionary policies. In other words, delaying emission reductions to the point in time when effective technologies will become available is suboptimal when these uncertainties are accounted for rigorously. By contrast, uncertainties about the implications of exceeding critical cumulated emission thresholds tend to make early emission reductions less rewarding.

  3. A novel curvilinear approach for prostate seed implantation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Podder, Tarun K.; Dicker, Adam P.; Hutapea, Parsaoran

    Purpose: A new technique called the "curvilinear approach" for prostate seed implantation has been proposed. The purpose of this study is to evaluate the dosimetric benefit of curvilinear distribution of seeds for low-dose-rate (LDR) prostate brachytherapy. Methods: Twenty LDR prostate brachytherapy cases planned intraoperatively with the VariSeed planning system and I-125 seeds were randomly selected as reference rectilinear cases. All the cases were replanned by using the curved-needle approach, keeping the same individual source strength and the volume receiving 100% of the prescribed dose of 145 Gy (V100). Parameters such as the number of needles and seeds, and the dose coverage of the prostate (D90, V150, V200), urethra (D30, D10) and rectum (D5, V100) were compared for the rectilinear and the curvilinear methods. Statistical significance was assessed using a two-tailed Student's t-test. Results: Reductions in the required numbers of needles and seeds with the curvilinear method were 30.5% (p < 0.001) and 11.8% (p < 0.49), respectively. Dose to the urethra was reduced significantly; D30 was reduced by 10.1% (p < 0.01) and D10 by 9.9% (p < 0.02). The reduction in rectum dose D5 was 18.5% (p < 0.03), and V100 was also reduced from 0.93 cc in the rectilinear plan to 0.21 cc in the curvilinear plan (p < 0.001). Also, the V150 and V200 coverage of the prostate was reduced by 18.8% (p < 0.01) and 33.9% (p < 0.001), respectively. Conclusions: Significant improvement in the relevant dosimetric parameters was observed with the curvilinear needle approach. Prostate dose homogeneity (V150, V200) improved while urethral dose was reduced, which might potentially result in better treatment outcome. Reduction in rectal dose could potentially reduce rectal toxicity and complications. Reduction in the number of needles would minimize edema and thereby could improve postimplant urinary incontinence. 
    This study indicates that the curvilinear implantation approach is dosimetrically superior to the conventional rectilinear implantation technique.

  4. Dual energy CT: How well can pseudo-monochromatic imaging reduce metal artifacts?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuchenbecker, Stefan, E-mail: stefan.kuchenbecker@dkfz.de; Faby, Sebastian; Sawall, Stefan

    2015-02-15

    Purpose: Dual Energy CT (DECT) provides so-called monoenergetic images based on a linear combination of the original polychromatic images. At certain patient-specific energy levels, corresponding to certain patient- and slice-dependent linear combination weights, e.g., E = 160 keV corresponds to α = 1.57, a significant reduction of metal artifacts may be observed. The authors aimed at analyzing the method for its artifact reduction capabilities to identify its limitations. The results are compared with raw data-based processing. Methods: Clinical DECT uses a simplified version of monochromatic imaging by linearly combining the low and the high kV images and by assigning an energy to that linear combination. Those pseudo-monochromatic images can be used by radiologists to obtain images with reduced metal artifacts. The authors analyzed the underlying physics and carried out a series expansion of the polychromatic attenuation equations. The resulting nonlinear terms are responsible for the artifacts, but they are not linearly related between the low and the high kV scan: A linear combination of both images cannot eliminate the nonlinearities, it can only reduce their impact. Scattered radiation yields additional noncanceling nonlinearities. This method is compared to raw data-based artifact correction methods. To quantify the artifact reduction potential of pseudo-monochromatic images, they simulated the FORBILD abdomen phantom with metal implants, and they assessed patient data sets of a clinical dual source CT system (100, 140 kV Sn) containing artifacts induced by a highly concentrated contrast agent bolus and by metal. 
    In each case, they manually selected an optimal α and compared it to a raw data-based material decomposition in case of simulation, to raw data-based material decomposition of inconsistent rays in case of the patient data set containing contrast agent, and to the frequency split normalized metal artifact reduction in case of the metal implant. For each case, the contrast-to-noise ratio (CNR) was assessed. Results: In the simulation, the pseudo-monochromatic images yielded acceptable artifact reduction results. However, the CNR in the artifact-reduced images was more than 60% lower than in the original polychromatic images. In contrast, the raw data-based material decomposition did not significantly reduce the CNR in the virtual monochromatic images. Regarding the patient data with beam hardening artifacts and with metal artifacts from small implants, the pseudo-monochromatic method was able to reduce the artifacts, again with the downside of a significant CNR reduction. More intense metal artifacts, e.g., as those caused by an artificial hip joint, could not be suppressed. Conclusions: Pseudo-monochromatic imaging is able to reduce beam hardening, scatter, and metal artifacts in some cases but it cannot remove them. In all cases, the CNR is significantly reduced, thereby rendering the method questionable, unless special post processing algorithms are implemented to restore the high CNR from the original images (e.g., by using a frequency split technique). Raw data-based dual energy decomposition methods should be preferred, in particular, because the CNR penalty is almost negligible.

  5. An Energy-Efficient Multi-Tier Architecture for Fall Detection Using Smartphones.

    PubMed

    Guvensan, M Amac; Kansiz, A Oguz; Camgoz, N Cihan; Turkmen, H Irem; Yavuz, A Gokhan; Karsligil, M Elif

    2017-06-23

    Automatic detection of fall events is vital to providing fast medical assistance to the casualty, particularly when the injury causes loss of consciousness. Optimization of the energy consumption of mobile applications, especially those which run 24/7 in the background, is essential for longer use of smartphones. In order to improve energy-efficiency without compromising on the fall detection performance, we propose a novel 3-tier architecture that combines simple thresholding methods with machine learning algorithms. The proposed method is implemented on a mobile application, called uSurvive, for Android smartphones. It runs as a background service and monitors the activities of a person in daily life and automatically sends a notification to the appropriate authorities and/or user defined contacts when it detects a fall. The performance of the proposed method was evaluated in terms of fall detection performance and energy consumption. Real-life performance tests conducted on two different models of smartphone demonstrate that our 3-tier architecture with feature reduction could save up to 62% of energy compared to machine-learning-only solutions. In addition to this energy saving, the hybrid method achieves 93% accuracy, which is superior to thresholding methods and better than machine-learning-only solutions.
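    The cheapest tier of such an architecture can be sketched as a simple magnitude threshold that gates the more expensive machine-learning tiers. The threshold value and function name below are illustrative assumptions, not uSurvive's actual parameters.

```python
import math

# Tier 1 of a hypothetical 3-tier fall detector: a cheap magnitude
# threshold that wakes the costlier ML tiers only on candidate impacts.
IMPACT_G = 2.5  # total acceleration (in g) suggesting a possible impact

def tier1_candidate(ax, ay, az):
    """Return True if the sample's magnitude warrants further analysis."""
    return math.sqrt(ax * ax + ay * ay + az * az) >= IMPACT_G

# A quiet sample (about 1 g of gravity) is filtered out without running
# any feature extraction or classifier, which is where energy is saved.
print(tier1_candidate(0.0, 0.0, 1.0), tier1_candidate(2.0, 2.0, 1.0))
```

Because the vast majority of samples in daily life fall below the threshold, the classifier tiers run only rarely, which is the mechanism behind the energy savings reported above.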

  6. ParticleCall: A particle filter for base calling in next-generation sequencing systems

    PubMed Central

    2012-01-01

Background Next-generation sequencing systems are capable of rapid and cost-effective DNA sequencing, thus enabling routine sequencing tasks and taking us one step closer to personalized medicine. Accuracy and lengths of their reads, however, are yet to surpass those provided by the conventional Sanger sequencing method. This motivates the search for computationally efficient algorithms capable of reliable and accurate detection of the order of nucleotides in short DNA fragments from the acquired data. Results In this paper, we consider Illumina's sequencing-by-synthesis platform, which relies on reversible terminator chemistry, and describe the acquired signal by reformulating its mathematical model as a Hidden Markov Model. Relying on this model and sequential Monte Carlo methods, we develop a parameter estimation and base calling scheme called ParticleCall. ParticleCall is tested on a data set obtained by sequencing phiX174 bacteriophage using Illumina's Genome Analyzer II. The results show that the developed base calling scheme is significantly more computationally efficient than the best performing unsupervised method currently available, while achieving the same accuracy. Conclusions The proposed ParticleCall provides more accurate calls than Illumina's base calling algorithm, Bustard. At the same time, ParticleCall is significantly more computationally efficient than other recent schemes with similar performance, rendering it more feasible for high-throughput sequencing data analysis. Improvement of base calling accuracy will have immediate beneficial effects on the performance of downstream applications such as SNP and genotype calling. ParticleCall is freely available at https://sourceforge.net/projects/particlecall. PMID:22776067
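The sequential Monte Carlo machinery behind schemes like ParticleCall can be illustrated with a generic bootstrap particle filter on a toy hidden Markov model. This is only a sketch of the technique, not ParticleCall's actual signal model (reversible terminator chemistry is far richer); the two-state chain, Gaussian emissions, and all constants below are made up.

```python
import math
import random

P_FLIP = 0.1  # hidden state transition probability (illustrative)
NOISE = 0.5   # emission noise standard deviation (illustrative)

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def particle_filter(observations, n_particles=500, seed=1):
    """Bootstrap particle filter: propagate, weight, resample at each step."""
    rng = random.Random(seed)
    particles = [rng.choice([0, 1]) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # Propagate each particle through the transition model.
        particles = [p ^ 1 if rng.random() < P_FLIP else p for p in particles]
        # Weight each particle by the emission likelihood of the observation.
        weights = [gauss_pdf(y, p, NOISE) for p in particles]
        # Resample (multinomial) to concentrate particles in likely states.
        particles = rng.choices(particles, weights=weights, k=n_particles)
        estimates.append(sum(particles) / n_particles)  # posterior P(state = 1)
    return estimates
```

Feeding it observations near 0 followed by observations near 1 shows the filtered posterior tracking the hidden state; in a base caller the hidden state would instead range over nucleotide sequences.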

  7. Selective Reduction: "A Soft Cover for Hard Choices" or Another Name for Abortion?

    PubMed

    Rao, Radhika

    2015-01-01

Selective reduction and abortion both involve the termination of fetal life, but they are given different designations to underscore the notion that they are regarded as fundamentally different medical procedures: the two are performed using distinct techniques by different types of physicians, upon women in very different circumstances, in order to further dramatically different objectives. Hence, the two procedures appear to call for a distinct moral calculus, and they have traditionally evoked contradictory reactions from society. This essay posits that despite their different appellations, selective reduction and abortion are essentially equivalent. © 2015 American Society of Law, Medicine & Ethics, Inc.

  8. Applying a Consumer Behavior Lens to Salt Reduction Initiatives.

    PubMed

    Regan, Áine; Kent, Monique Potvin; Raats, Monique M; McConnon, Áine; Wall, Patrick; Dubois, Lise

    2017-08-18

    Reformulation of food products to reduce salt content has been a central strategy for achieving population level salt reduction. In this paper, we reflect on current reformulation strategies and consider how consumer behavior determines the ultimate success of these strategies. We consider the merits of adopting a 'health by stealth', silent approach to reformulation compared to implementing a communications strategy which draws on labeling initiatives in tandem with reformulation efforts. We end this paper by calling for a multi-actor approach which utilizes co-design, participatory tools to facilitate the involvement of all stakeholders, including, and especially, consumers, in making decisions around how best to achieve population-level salt reduction.

  9. Differential correlation for sequencing data.

    PubMed

    Siska, Charlotte; Kechris, Katerina

    2017-01-19

Several methods have been developed to identify differential correlation (DC) between pairs of molecular features from -omics studies. Most DC methods have only been tested with microarrays and other platforms producing continuous and Gaussian-like data. Sequencing data is in the form of counts, often modeled with a negative binomial distribution, making it difficult to apply standard correlation metrics. We have developed an R package for identifying DC called Discordant, which uses mixture models for correlations between features and the Expectation Maximization (EM) algorithm for fitting parameters of the mixture model. Several correlation metrics for sequencing data are provided and tested using simulations. Other extensions in the Discordant package include additional modeling for different types of differential correlation, and faster implementation, using a subsampling routine to reduce run-time and address the assumption of independence between molecular feature pairs. With simulations and breast cancer miRNA-Seq and RNA-Seq data, we find that Spearman's correlation has the best performance among the tested correlation methods for identifying differential correlation. Application of Spearman's correlation in the Discordant method demonstrated the most power in ROC curves and sensitivity/specificity plots, and improved ability to identify experimentally validated breast cancer miRNA. We also considered including additional types of differential correlation, which showed a slight reduction in power due to the additional parameters that need to be estimated, but more versatility in applications. Finally, subsampling within the EM algorithm considerably decreased run-time with negligible effect on performance. A new method and R package called Discordant is presented for identifying differential correlation with sequencing data.
Based on comparisons with different correlation metrics, this study suggests Spearman's correlation is appropriate for sequencing data, but other correlation metrics are available to the user depending on the application and data type. The Discordant method can also be extended to investigate additional DC types and subsampling with the EM algorithm is now available for reduced run-time. These extensions to the R package make Discordant more robust and versatile for multiple -omics studies.
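Discordant itself is an R package built on mixture models and the EM algorithm; the sketch below only illustrates the core metric choice the study recommends: Spearman correlation (Pearson on ranks) for count data, compared across two conditions via a Fisher-z difference. The toy counts in the test are invented.

```python
import math

def rank(values):
    """Average ranks, 1-based; tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman = Pearson on ranks; robust to the skew of count data."""
    return pearson(rank(x), rank(y))

def fisher_z(r):
    return 0.5 * math.log((1 + r) / (1 - r))

def differential_correlation(x1, y1, x2, y2):
    """Change in a feature pair's correlation between two conditions."""
    return fisher_z(spearman(x2, y2)) - fisher_z(spearman(x1, y1))
```

A large positive or negative Fisher-z difference flags a feature pair whose co-expression changes between conditions, which is the signal DC methods model.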

  10. Effects of band selection on endmember extraction for forestry applications

    NASA Astrophysics Data System (ADS)

    Karathanassi, Vassilia; Andreou, Charoula; Andronis, Vassilis; Kolokoussis, Polychronis

    2014-10-01

In spectral unmixing theory, data reduction techniques play an important role as hyperspectral imagery contains an immense amount of data, posing many challenging problems such as data storage, computational efficiency, and the so-called "curse of dimensionality". Feature extraction and feature selection are the two main approaches for dimensionality reduction. Feature extraction techniques reduce the dimensionality of the hyperspectral data by applying transforms to it. Feature selection techniques retain the physical meaning of the data by selecting a set of bands from the input hyperspectral dataset which mainly contain the information needed for spectral unmixing. Although feature selection techniques are well known for their dimensionality reduction potential, they are rarely used in the unmixing process. The majority of the existing state-of-the-art dimensionality reduction methods set criteria on the spectral information derived from the full wavelength range in order to define the optimum spectral subspace. These criteria are not associated with any particular application but with the data statistics, such as correlation and entropy values. However, each application is associated with specific land cover materials, whose spectral characteristics present variations in specific wavelengths. In forestry, for example, many applications focus on tree leaves, in which specific pigments such as chlorophyll, xanthophyll, etc. determine the wavelengths where tree species, diseases, etc., can be detected. For such applications, when the unmixing process is applied, the tree species, diseases, etc., are considered as the endmembers of interest. This paper focuses on investigating the effects of band selection on endmember extraction by exploiting the information of the vegetation absorbance spectral zones.
More precisely, it is explored whether endmember extraction can be optimized when specific sets of initial bands related to leaf spectral characteristics are selected. Experiments comprise application of well-known signal subspace estimation and endmember extraction methods on a hyperspectral imagery that presents a forest area. Evaluation of the extracted endmembers showed that more forest species can be extracted as endmembers using selected bands.

  11. Universal Industrial Solution and Industrial Sectors Module for Pulp and Paper Sector

    EPA Science Inventory

    Increased concerns over air pollution and its detrimental effects on health have called for more stringent emission reduction strategies in the industrial sector. However, stringent regulatory policies can potentially adversely affect domestic and international trade. Therefore E...

  12. Evaluation of safe performance secondary school driver education curriculum demonstration project

    DOT National Transportation Integrated Search

    1983-06-01

    The primary objective of this Project was to determine the crash reduction potential of a quality, competency-based driver training program known as the Safe Performance Curriculum (SPC). The experimental design called for the random assignment of 18...

  13. Genetics Home Reference: dopa-responsive dystonia

    MedlinePlus

    ... neurotransmitters called dopamine and serotonin. Among their many functions, dopamine transmits signals within the brain to produce smooth ... production of a tyrosine hydroxylase enzyme with reduced function, which leads to a decrease in dopamine production. A reduction in the amount of dopamine ...

  14. Efficient computation of parameter sensitivities of discrete stochastic chemical reaction networks.

    PubMed

    Rathinam, Muruhan; Sheppard, Patrick W; Khammash, Mustafa

    2010-01-21

    Parametric sensitivity of biochemical networks is an indispensable tool for studying system robustness properties, estimating network parameters, and identifying targets for drug therapy. For discrete stochastic representations of biochemical networks where Monte Carlo methods are commonly used, sensitivity analysis can be particularly challenging, as accurate finite difference computations of sensitivity require a large number of simulations for both nominal and perturbed values of the parameters. In this paper we introduce the common random number (CRN) method in conjunction with Gillespie's stochastic simulation algorithm, which exploits positive correlations obtained by using CRNs for nominal and perturbed parameters. We also propose a new method called the common reaction path (CRP) method, which uses CRNs together with the random time change representation of discrete state Markov processes due to Kurtz to estimate the sensitivity via a finite difference approximation applied to coupled reaction paths that emerge naturally in this representation. While both methods reduce the variance of the estimator significantly compared to independent random number finite difference implementations, numerical evidence suggests that the CRP method achieves a greater variance reduction. We also provide some theoretical basis for the superior performance of CRP. The improved accuracy of these methods allows for much more efficient sensitivity estimation. In two example systems reported in this work, speedup factors greater than 300 and 10,000 are demonstrated.
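The common random number idea above can be sketched for a minimal birth-death network: nominal and perturbed Gillespie runs share a seed, so much of the simulation noise cancels in the finite difference. This is an illustrative sketch, not the authors' code; the reaction network, rates, and run counts are invented choices.

```python
import random

def gillespie_final_count(k_birth, k_death, x0=10, t_end=5.0, seed=0):
    """Gillespie SSA for births at rate k_birth and deaths at rate k_death*X;
    returns the population X(t_end)."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while True:
        a_birth = k_birth
        a_death = k_death * x
        a_total = a_birth + a_death
        if a_total == 0:
            return x
        t += rng.expovariate(a_total)          # time to the next reaction
        if t > t_end:
            return x
        if rng.random() * a_total < a_birth:   # pick which reaction fires
            x += 1
        else:
            x -= 1

def crn_sensitivity(k_birth, k_death, h=1.0, n_runs=500):
    """Estimate d E[X(T)] / d k_birth by finite differences. Nominal and
    perturbed runs share a seed (common random numbers), so the two
    trajectories are positively correlated and the estimator variance drops
    compared to independent-seed finite differences."""
    diffs = []
    for run in range(n_runs):
        x_nom = gillespie_final_count(k_birth, k_death, seed=run)
        x_pert = gillespie_final_count(k_birth + h, k_death, seed=run)
        diffs.append((x_pert - x_nom) / h)
    return sum(diffs) / n_runs
```

For this linear network the exact sensitivity is (1 − e^(−k_death·T))/k_death, about 0.99 at the defaults, so the estimate can be checked directly.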

  15. Recent advances in reduction methods for nonlinear problems. [in structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1981-01-01

    Status and some recent developments in the application of reduction methods to nonlinear structural mechanics problems are summarized. The aspects of reduction methods discussed herein include: (1) selection of basis vectors in nonlinear static and dynamic problems, (2) application of reduction methods in nonlinear static analysis of structures subjected to prescribed edge displacements, and (3) use of reduction methods in conjunction with mixed finite element models. Numerical examples are presented to demonstrate the effectiveness of reduction methods in nonlinear problems. Also, a number of research areas which have high potential for application of reduction methods are identified.
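The basis-vector idea underlying such reduction methods can be shown on a linear toy problem: approximate the full solution in the span of a few basis vectors and solve only the small projected (Galerkin) system. This is a sketch of the general projection technique, not Noor's nonlinear procedure; the 3-DOF stiffness matrix and basis below are invented.

```python
# Full system: K u = f. Approximate u ≈ Phi q with a few basis vectors,
# then solve the much smaller reduced system (Phi^T K Phi) q = Phi^T f.

def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

def solve_2x2(A, b):
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

def reduced_solution(K, f, Phi):
    """Galerkin-project K u = f onto span(Phi) and lift back to full space."""
    Kr = matmul(transpose(Phi), matmul(K, Phi))   # reduced stiffness (2x2)
    fr = matvec(transpose(Phi), f)                # reduced load
    q = solve_2x2(Kr, fr)                         # small solve
    return matvec(Phi, q)                         # u ≈ Phi q

# 3-DOF stiffness system reduced with 2 basis vectors.
K = [[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]]
f = [1.0, 1.0, 1.0]
Phi = [[1.0, 1.0], [1.0, -1.0], [1.0, 1.0]]
```

Because the exact solution [1.5, 2.0, 1.5] happens to lie in span(Phi) here, the reduced model recovers it exactly; in practice the quality of the basis vectors governs the accuracy, which is the selection problem the paper discusses.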

  16. Crowd Sourcing to Improve Urban Stormwater Management

    NASA Astrophysics Data System (ADS)

    Minsker, B. S.; Band, L. E.; Heidari Haratmeh, B.; Law, N. L.; Leonard, L. N.; Rai, A.

    2017-12-01

    Over half of the world's population currently lives in urban areas, a number predicted to grow to 60 percent by 2030. Urban areas face unprecedented and growing challenges that threaten society's long-term wellbeing, including poverty; chronic health problems; widespread pollution and resource degradation; and increased natural disasters. These are "wicked" problems involving "systems of systems" that require unprecedented information sharing and collaboration across disciplines and organizational boundaries. Cities are recognizing that the increasing stream of data and information ("Big Data"), informatics, and modeling can support rapid advances on these challenges. Nonetheless, information technology solutions can only be effective in addressing these challenges through deeply human and systems perspectives. A stakeholder-driven approach ("crowd sourcing") is needed to develop urban systems that address multiple needs, such as parks that capture and treat stormwater while improving human and ecosystem health and wellbeing. We have developed informatics- and Cloud-based collaborative methods that enable crowd sourcing of green stormwater infrastructure (GSI: rain gardens, bioswales, trees, etc.) design and management. The methods use machine learning, social media data, and interactive design tools (called IDEAS-GI) to identify locations and features of GSI that perform best on a suite of objectives, including life cycle cost, stormwater volume reduction, and air pollution reduction. Insights will be presented on GI features that best meet stakeholder needs and are therefore most likely to improve human wellbeing and be well maintained.

  17. Best practices for evaluating single nucleotide variant calling methods for microbial genomics

    PubMed Central

    Olson, Nathan D.; Lund, Steven P.; Colman, Rebecca E.; Foster, Jeffrey T.; Sahl, Jason W.; Schupp, James M.; Keim, Paul; Morrow, Jayne B.; Salit, Marc L.; Zook, Justin M.

    2015-01-01

Innovations in sequencing technologies have allowed biologists to make incredible advances in understanding biological systems. As experience grows, researchers increasingly recognize that analyzing the wealth of data provided by these new sequencing platforms requires careful attention to detail for robust results. Thus far, much of the scientific community's focus in bacterial genomics has been on evaluating genome assembly algorithms and rigorously validating assembly program performance. Missing, however, is a focus on critical evaluation of variant callers for these genomes. Variant calling is essential for comparative genomics as it yields insights into nucleotide-level organismal differences. Variant calling is a multistep process with a host of potential error sources that may lead to incorrect variant calls. Identifying and resolving these incorrect calls is critical for bacterial genomics to advance. The goal of this review is to provide guidance on validating algorithms and pipelines used in variant calling for bacterial genomics. First, we will provide an overview of the variant calling procedures and the potential sources of error associated with the methods. We will then identify appropriate datasets for use in evaluating algorithms and describe statistical methods for evaluating algorithm performance. As variant calling moves from basic research to the applied setting, standardized methods for performance evaluation and reporting are required; it is our hope that this review provides the groundwork for the development of these standards. PMID:26217378

  18. The overexpression of RXam1, a cassava gene coding for an RLK, confers disease resistance to Xanthomonas axonopodis pv. manihotis.

    PubMed

    Díaz Tatis, Paula A; Herrera Corzo, Mariana; Ochoa Cabezas, Juan C; Medina Cipagauta, Adriana; Prías, Mónica A; Verdier, Valerie; Chavarriaga Aguirre, Paul; López Carrascal, Camilo E

    2018-04-01

The overexpression of RXam1 leads to a reduction in bacterial growth of XamCIO136, suggesting that RXam1 might be implicated in strain-specific resistance. Cassava bacterial blight (CBB), caused by Xanthomonas axonopodis pv. manihotis (Xam), is a prevalent disease in all regions where cassava is cultivated. CBB is a foliar and vascular disease usually controlled through host resistance. Previous studies have found QTLs explaining resistance to several Xam strains. Interestingly, one QTL called XM5, which explained 13% of resistance to XamCIO136, was associated with a fragment similar to the rice Xa21-resistance gene, called PCR250. In this study, we aimed to further identify and characterize this fragment and its role in resistance to CBB. Screening and hybridization of a BAC library using the molecular marker PCR250 as a probe led to the identification of a receptor-like kinase similar to Xa21, which was named RXam1 (Resistance to Xam 1). Here, we report the functional characterization of susceptible cassava plants overexpressing RXam1. Our results indicated that the overexpression of RXam1 leads to a reduction in bacterial growth of XamCIO136. This suggests that RXAM1 might be implicated in strain-specific resistance to XamCIO136.

  19. Modeling Complex Chemical Systems: Problems and Solutions

    NASA Astrophysics Data System (ADS)

    van Dijk, Jan

    2016-09-01

Non-equilibrium plasmas in complex gas mixtures are at the heart of numerous contemporary technologies. They typically contain dozens to hundreds of species, involved in hundreds to thousands of reactions. Chemists and physicists have always been interested in what are now called chemical reduction techniques (CRTs). The idea of such CRTs is that they reduce the number of species that need to be considered explicitly without compromising the validity of the model. This is usually achieved on the basis of an analysis of the reaction time scales of the system under study, which identifies species that are in partial equilibrium after a given time span. The first such CRT that was widely used in plasma physics was developed in the 1960's and resulted in the concept of effective ionization and recombination rates. It was later generalized to systems in which multiple levels are affected by transport. In recent years there has been a renewed interest in tools for chemical reduction and reaction pathway analysis. An example of the latter is the PumpKin tool. Another trend is that techniques previously developed in other fields of science are adapted to handle the plasma state of matter. Examples are the Intrinsic Low-Dimensional Manifold (ILDM) method and its derivatives, which originate from combustion engineering, and the general-purpose Principal Component Analysis (PCA) technique. In this contribution we will provide an overview of the most common reduction techniques, then critically assess the pros and cons of the methods that have gained the most popularity in recent years. Examples will be provided for plasmas in argon and carbon dioxide.

  20. Similarity-balanced discriminant neighbor embedding and its application to cancer classification based on gene expression data.

    PubMed

    Zhang, Li; Qian, Liqiang; Ding, Chuntao; Zhou, Weida; Li, Fanzhang

    2015-09-01

The family of discriminant neighborhood embedding (DNE) methods comprises typical graph-based methods for dimension reduction and has been successfully applied to face recognition. This paper proposes a new variant of DNE, called similarity-balanced discriminant neighborhood embedding (SBDNE), and applies it to cancer classification using gene expression data. By introducing a novel similarity function, SBDNE treats pairs of data points from the same class and from different classes in different ways. The homogeneous and heterogeneous neighbors are selected according to the new similarity function instead of the Euclidean distance. SBDNE constructs two adjacency graphs, a between-class graph and a within-class graph, using the new similarity function. From these two graphs, we can generate the local between-class scatter and the local within-class scatter, respectively. Thus, SBDNE can maximize the between-class scatter and simultaneously minimize the within-class scatter to find the optimal projection matrix. Experimental results on six microarray datasets show that SBDNE is a promising method for cancer classification. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Deciphering the complex: methodological overview of statistical models to derive OMICS-based biomarkers.

    PubMed

    Chadeau-Hyam, Marc; Campanella, Gianluca; Jombart, Thibaut; Bottolo, Leonardo; Portengen, Lutzen; Vineis, Paolo; Liquet, Benoit; Vermeulen, Roel C H

    2013-08-01

    Recent technological advances in molecular biology have given rise to numerous large-scale datasets whose analysis imposes serious methodological challenges mainly relating to the size and complex structure of the data. Considerable experience in analyzing such data has been gained over the past decade, mainly in genetics, from the Genome-Wide Association Study era, and more recently in transcriptomics and metabolomics. Building upon the corresponding literature, we provide here a nontechnical overview of well-established methods used to analyze OMICS data within three main types of regression-based approaches: univariate models including multiple testing correction strategies, dimension reduction techniques, and variable selection models. Our methodological description focuses on methods for which ready-to-use implementations are available. We describe the main underlying assumptions, the main features, and advantages and limitations of each of the models. This descriptive summary constitutes a useful tool for driving methodological choices while analyzing OMICS data, especially in environmental epidemiology, where the emergence of the exposome concept clearly calls for unified methods to analyze marginally and jointly complex exposure and OMICS datasets. Copyright © 2013 Wiley Periodicals, Inc.

  2. Analysis of Big Data in Gait Biomechanics: Current Trends and Future Directions.

    PubMed

    Phinyomark, Angkoon; Petri, Giovanni; Ibáñez-Marcelo, Esther; Osis, Sean T; Ferber, Reed

    2018-01-01

    The increasing amount of data in biomechanics research has greatly increased the importance of developing advanced multivariate analysis and machine learning techniques, which are better able to handle "big data". Consequently, advances in data science methods will expand the knowledge for testing new hypotheses about biomechanical risk factors associated with walking and running gait-related musculoskeletal injury. This paper begins with a brief introduction to an automated three-dimensional (3D) biomechanical gait data collection system: 3D GAIT, followed by how the studies in the field of gait biomechanics fit the quantities in the 5 V's definition of big data: volume, velocity, variety, veracity, and value. Next, we provide a review of recent research and development in multivariate and machine learning methods-based gait analysis that can be applied to big data analytics. These modern biomechanical gait analysis methods include several main modules such as initial input features, dimensionality reduction (feature selection and extraction), and learning algorithms (classification and clustering). Finally, a promising big data exploration tool called "topological data analysis" and directions for future research are outlined and discussed.

  3. Sequence-independent construction of ordered combinatorial libraries with predefined crossover points.

    PubMed

    Jézéquel, Laetitia; Loeper, Jacqueline; Pompon, Denis

    2008-11-01

    Combinatorial libraries coding for mosaic enzymes with predefined crossover points constitute useful tools to address and model structure-function relationships and for functional optimization of enzymes based on multivariate statistics. The presented method, called sequence-independent generation of a chimera-ordered library (SIGNAL), allows easy shuffling of any predefined amino acid segment between two or more proteins. This method is particularly well adapted to the exchange of protein structural modules. The procedure could also be well suited to generate ordered combinatorial libraries independent of sequence similarities in a robotized manner. Sequence segments to be recombined are first extracted by PCR from a single-stranded template coding for an enzyme of interest using a biotin-avidin-based method. This technique allows the reduction of parental template contamination in the final library. Specific PCR primers allow amplification of two complementary mosaic DNA fragments, overlapping in the region to be exchanged. Fragments are finally reassembled using a fusion PCR. The process is illustrated via the construction of a set of mosaic CYP2B enzymes using this highly modular approach.

  4. On-Orbit Demonstration of a Lithium-Ion Capacitor and Thin-Film Multijunction Solar Cells

    NASA Astrophysics Data System (ADS)

    Kukita, Akio; Takahashi, Masato; Shimazaki, Kazunori; Kobayashi, Yuki; Sakai, Tomohiko; Toyota, Hiroyuki; Takahashi, Yu; Murashima, Mio; Uno, Masatoshi; Imaizumi, Mitsuru

    2014-08-01

This paper describes an on-orbit demonstration of the Next-generation Small Satellite Instrument for Electric power systems (NESSIE), on which an aluminum-laminated lithium-ion capacitor (LIC) and a lightweight solar panel called KKM-PNL, which has space solar sheets using thin-film multijunction solar cells, were installed. The flight data examined in this paper cover a period of 143 days from launch. We verified the integrity of an LIC constructed using a simple and lightweight mounting method: no significant capacitance reduction was observed. We also confirmed that the inverted metamorphic multijunction triple-junction thin-film solar cells used for evaluation were healthy at 143 days after launch, because their degradation almost matched the degradation predictions for dual-junction thin-film solar cells.

  5. Photobiomodulation (PBM) with 20 W at 640 nm: pre-clinical results and propagation model

    NASA Astrophysics Data System (ADS)

    Gendron, Denis J.; Ménage, Alexander R.

    2017-02-01

A novel treatment modality for photobiomodulation (PBM) is introduced, called High Intensity Physio Light (HIPL) Therapy, with a light source at 640 nm wavelength, 20 nm bandwidth, and up to 20 W in a large 10 cm flat beam. This report exemplifies the efficacy of this method with three pre-clinical cases: (i) ankle: sport injury, (ii) foot: bone fractures, and (iii) shoulder: musculoskeletal disorder (MSD). In all cases, the patients systematically experienced a significant pain reduction (by 2/10 to 4/10) on a visual pain scale. In cases (ii) and (iii), steady improvement and complete recovery of the patient were respectively obtained. This report describes the experimental treatment conditions for each case and introduces an intensity-dependent propagation model to explain our observations.

  6. Multi-dimensional photonic states from a quantum dot

    NASA Astrophysics Data System (ADS)

    Lee, J. P.; Bennett, A. J.; Stevenson, R. M.; Ellis, D. J. P.; Farrer, I.; Ritchie, D. A.; Shields, A. J.

    2018-04-01

Quantum states superposed across multiple particles or degrees of freedom offer an advantage in the development of quantum technologies. Creating these states deterministically and with high efficiency is an ongoing challenge. A promising approach is the repeated excitation of multi-level quantum emitters, which have been shown to naturally generate light with quantum statistics. Here we describe how to create one class of higher-dimensional quantum state, a so-called W-state, which is superposed across multiple time bins. We do this by repeated Raman scattering of photons from a charged quantum dot in a pillar microcavity. We show this method can be scaled to larger dimensions with no reduction in coherence or single-photon character. We explain how to extend this work to enable the deterministic creation of arbitrary time-bin encoded qudits.

  7. RCT of a brief phone-based CBT intervention to improve PTSD treatment utilization by returning service members.

    PubMed

    Stecker, Tracy; McHugo, Gregory; Xie, Haiyi; Whyman, Katrina; Jones, Meissa

    2014-10-01

    Many service members do not seek care for mental health and addiction problems, often with serious consequences for them, their families, and their communities. This study tested the effectiveness of a brief, telephone-based, cognitive-behavioral intervention designed to improve treatment engagement among returning service members who screened positive for posttraumatic stress disorder (PTSD). Service members who had served in Operation Enduring Freedom or Operation Iraqi Freedom who screened positive for PTSD but had not engaged in PTSD treatment were recruited (N=300), randomly assigned to either control or intervention conditions, and administered a baseline interview. Intervention participants received a brief cognitive-behavioral therapy intervention; participants in the control condition had access to usual services. All participants received follow-up phone calls at months 1, 3, and 6 to assess symptoms and service utilization. Participants in both conditions had comparable rates of treatment engagement and PTSD symptom reduction over the course of the six-month trial, but receiving the telephone-based intervention accelerated service utilization (treatment engagement and number of sessions) and PTSD symptom reduction. A one-time brief telephone intervention can engage service members in PTSD treatment earlier than conventional methods and can lead to immediate symptom reduction. There were no differences at longer-term follow-up, suggesting the need for additional intervention to build upon initial gains.

  8. Tracking fin whales in the northeast Pacific Ocean with a seafloor seismic network.

    PubMed

    Wilcock, William S D

    2012-10-01

    Ocean bottom seismometer (OBS) networks represent a tool of opportunity to study fin and blue whales. A small OBS network on the Juan de Fuca Ridge in the northeast Pacific Ocean in ~2.3 km of water recorded an extensive data set of 20-Hz fin whale calls. An automated method has been developed to identify arrival times based on instantaneous frequency and amplitude and to locate calls using a grid search even in the presence of a few bad arrival times. When only one whale is calling near the network, tracks can generally be obtained up to distances of ~15 km from the network. When the calls from multiple whales overlap, user supervision is required to identify tracks. The absolute and relative amplitudes of arrivals and their three-component particle motions provide additional constraints on call location but are not useful for extending the distance to which calls can be located. The double-difference method inverts for changes in relative call locations using differences in residuals for pairs of nearby calls recorded on a common station. The method significantly reduces the unsystematic component of the location error, especially when inconsistencies in arrival time observations are minimized by cross-correlation.
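The grid-search localization step can be illustrated in miniature: at each candidate position, subtract the predicted travel times from the observed arrival times and keep the point where the implied origin times agree best, which removes the unknown emission time from the problem. This is only a sketch of the technique, not the paper's implementation: positions are 2-D here, and the sound speed, station layout, and grid extents are invented (real data also need the bad-pick handling the abstract mentions).

```python
import math

SOUND_SPEED = 1.5  # km/s, a nominal speed of sound in seawater

def predicted_arrivals(source, stations, t0):
    """Arrival time at each station for a call emitted at time t0."""
    return [t0 + math.dist(source, s) / SOUND_SPEED for s in stations]

def locate_call(arrivals, stations, grid_step=0.1, extent=10.0):
    """Search an (x, y) grid of candidate source positions, minimizing the
    scatter of implied origin times across stations."""
    best, best_misfit = None, float("inf")
    n = int(extent / grid_step)
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            cand = (i * grid_step, j * grid_step)
            # Implied origin time at each station if the call came from cand.
            origins = [t - math.dist(cand, s) / SOUND_SPEED
                       for t, s in zip(arrivals, stations)]
            mean = sum(origins) / len(origins)
            misfit = sum((o - mean) ** 2 for o in origins)
            if misfit < best_misfit:
                best, best_misfit = cand, misfit
    return best
```

With four stations and noise-free synthetic arrivals, the search recovers the true source to within the grid spacing.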

  9. Gene variant linked to lung cancer risk

    Cancer.gov

    A variation of the gene NFKB1, called rs4648127, is associated with an estimated 44 percent reduction in lung cancer risk. When this information, derived from samples obtained as part of a large NCI-sponsored prevention clinical trial, was compared with d

  10. Climate change : U.S. federal laws and policies related to greenhouse gas reductions

    DOT National Transportation Integrated Search

    2006-02-22

    Climate change is generally viewed as a global issue, but proposed responses generally require action at the national level. In 1992, the United States ratified the United Nations Framework Convention on Climate Change (UNFCCC), which called on in...

  11. 75 FR 39573 - Notice of Proposed Information Collection; Comment Request (Economic Opportunities for Low- and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-09

    ... number). Hearing or speech-impaired individuals may access this number TTY by calling the toll-free.... Authority: The Paperwork Reduction Act of 1995, 44 U.S.C. Chapter 35, as amended. Dated: July 1, 2010. Staci...

  12. 47 CFR 80.225 - Requirements for selective calling equipment.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... selective calling (DSC) equipment and selective calling equipment installed in ship and coast stations, and...-STD, “RTCM Recommended Minimum Standards for Digital Selective Calling (DSC) Equipment Providing... Class ‘D’ Digital Selective Calling (DSC)—Methods of testing and required test results,” March 2003. ITU...

  13. Physics of Traffic Flow

    NASA Astrophysics Data System (ADS)

    Davis, L. C.

    2015-03-01

    The Texas A&M Transportation Institute estimated that traffic congestion cost the United States $121 billion in 2011 (the latest data available). The cost is due to wasted time and fuel. In addition to accidents and road construction, factors contributing to congestion include large demand, instability of high-density free flow and selfish behavior of drivers, which produces self-organized traffic bottlenecks. Extensive data collected on instrumented highways in various countries have led to a better understanding of traffic dynamics. From these measurements, Boris Kerner and colleagues developed a new theory called three-phase theory. They identified three major phases of flow observed in the data: free flow, synchronous flow and wide moving jams. The intermediate phase is called synchronous because vehicles in different lanes tend to have similar velocities. This congested phase, characterized by lower velocities yet modestly high throughput, frequently occurs near on-ramps and lane reductions. At present there are only two widely used methods of congestion mitigation: ramp metering and the display of current travel-time information to drivers. To find more effective methods to reduce congestion, researchers perform large-scale simulations using models based on the new theories. An algorithm has been proposed to realize Wardrop equilibria with real-time route information. Such equilibria have equal travel time on alternative routes between a given origin and destination. An active area of current research is the dynamics of connected vehicles, which communicate wirelessly with other vehicles and the surrounding infrastructure. These systems show great promise for improving traffic flow and safety.

  14. Closed-loop bird-computer interactions: a new method to study the role of bird calls.

    PubMed

    Lerch, Alexandre; Roy, Pierre; Pachet, François; Nagle, Laurent

    2011-03-01

    In the field of songbird research, many studies have shown the role of male songs in territorial defense and courtship. Calling, another important acoustic communication signal, has received much less attention, however, because calls are assumed to contain less information about the emitter than songs do. Birdcall repertoire is diverse, and the role of calls has been found to be significant in the area of social interaction, for example, in pair, family, and group cohesion. However, standard methods for studying calls do not allow precise and systematic study of their role in communication. We propose herein a new method to study bird vocal interaction. A closed-loop computer system interacts with canaries, Serinus canaria, by (1) automatically classifying two basic types of canary vocalization, single versus repeated calls, as they are produced by the subject, and (2) responding with a preprogrammed call type recorded from another bird. This computerized animal-machine interaction requires no human interference. We show first that the birds do engage in sustained interactions with the system, by studying the rate of single and repeated calls for various programmed protocols. We then show that female canaries differentially use single and repeated calls. First, they produce significantly more single than repeated calls, and second, the rate of single calls is associated with the context in which they interact, whereas repeated calls are context independent. This experiment is the first illustration of how closed-loop bird-computer interaction can be used productively to study social relationships. © Springer-Verlag 2010

  15. Effects of night-time on-call work on heart rate variability before bed and sleep quality in visiting nurses.

    PubMed

    Kikuchi, Yukiko; Ishii, Noriko; Kodama, Hideya

    2018-05-28

    In Japan, many visiting nurses work carrying cell phones to respond to calls from users even at night (on-call work). The purpose of this study was to investigate whether on-call work affected heart rate variability (HRV) before bed and decreased sleep quality in visiting nurses even if their sleep was not interrupted due to actual calls. Thirty-one visiting nurses (mean age, 49.8 years; standard deviation, 6.3 years) were asked to record their 2.5-min resting HRV before bed, and to undergo one-channel sleep electroencephalography (EEG) and subjective sleep evaluations upon waking (Oguri, Shirakawa, and Azumi Sleep Inventory) at home for 4-5 consecutive days, including both on-call and non-on-call days. Paired data sets of outcome measures, including HRV parameters, sleep macrostructure variables, and subjective sleep quality scores between on-call and non-on-call days were compared; the most recent measurements for each category were used for each subject. There were no differences in HRV measures and objective sleep EEG variables. A significant increase in "sleepiness on rising" and a decrease in "feeling refreshed" were observed on on-call days (P = 0.019 and 0.021, respectively), and younger subjects (≤ 51 years old) demonstrated a significant reduction in "sleepiness on rising" (significant interaction effect, P = 0.029). Adverse effects of on-call work on sleep quality in most visiting nurses are thought to be subjective, and relatively young nurses tend to notice a decrease in sleep quality. On-call work itself does not appear to be a substantial stressor that could affect HRV and sleep structure.

  16. Use of mobile and cordless phones and change in cognitive function: a prospective cohort analysis of Australian primary school children.

    PubMed

    Bhatt, Chhavi Raj; Benke, Geza; Smith, Catherine L; Redmayne, Mary; Dimitriadis, Christina; Dalecki, Anna; Macleod, Skye; Sim, Malcolm R; Croft, Rodney J; Wolfe, Rory; Kaufman, Jordy; Abramson, Michael J

    2017-06-19

    Some previous studies have suggested an association between children's use of mobile phones (MPs)/cordless phones (CPs) and development of cognitive function. We evaluated possible longitudinal associations between the use of MPs and CPs in a cohort of primary school children and effects on their cognitive function. Data on children's socio-demographics, use of MPs and CPs, and cognitive function were collected at baseline (2010-2012) and follow-up (2012-2013). Cognitive outcomes were evaluated with the CogHealth™ test battery and Stroop Color-Word test. The change in the number of MP/CP voice calls weekly from baseline to follow-up was dichotomized: "an increase in calls" or a "decrease/no change in calls". Multiple linear regression analyses, adjusting for confounders and clustering by school, were performed to evaluate the associations between the change in cognitive outcomes and change in MP and CP exposures. Of 412 children, a larger proportion of them used a CP (76% at baseline and follow-up), compared to an MP (31% at baseline and 43% at follow-up). Of 26 comparisons of changes in cognitive outcomes, four demonstrated significant associations. The increase in MP usage was associated with larger reduction in response time for response inhibition, smaller reduction in the number of total errors for spatial problem solving and larger increase in response time for a Stroop interference task. Except for the smaller reduction in detection task accuracy, the increase in CP usage had no effect on the changes in cognitive outcomes. Our study shows that a larger proportion of children used CPs compared to MPs. We found limited evidence that change in the use of MPs or CPs in primary school children was associated with change in cognitive function.

  17. American fuel cell market development

    NASA Astrophysics Data System (ADS)

    Gillis, E. A.

    1992-01-01

    Over the past three decades, several attempts have been made to introduce fuel cells into commercial markets. Prospective users recognized the attractive features of fuel cells; however, they were unwilling to pay a premium for any feature other than the easily calculated fuel cost savings, and there was no accepted method for a user to calculate and accrue the economic value of the other features. The situation is changing. The Clean Air Act signed into law by President Bush on November 15, 1990, mandates a nationwide reduction in SO2, NOx and ozone emissions. The law affects specific utilities for SO2 reduction, and specific regions of the country for NOx and ozone reductions, the latter affecting the utility, industrial and transportation sectors in those regions. The Act does not direct how the reductions are to be achieved; rather, it establishes a trading market for emission allowances whereby an organization that reduces emissions below its target can sell its unused allowance to another organization. Beyond the Clean Air Act, other environmental issues are emerging, such as controls on CO2 emissions, possible expansion of the list of controlled emissions, mandated use of alternative fuels in specific transportation districts, and restrictions on electrical transmission systems. All of these so-called 'environmental externalities' are now recognized as having a real cost that can be quantified and factored into calculations of the relative economic standing of various technologies. This in turn justifies a premium price for fuel cells, hence the renewed interest in the technology among the utility and transportation market segments.

  18. Fast space-varying convolution using matrix source coding with applications to camera stray light reduction.

    PubMed

    Wei, Jianing; Bouman, Charles A; Allebach, Jan P

    2014-05-01

    Many imaging applications require the implementation of space-varying convolution for accurate restoration and reconstruction of images. Here, we use the term space-varying convolution to refer to linear operators whose impulse response has slow spatial variation. In addition, these space-varying convolution operators are often dense, so direct implementation of the convolution operator is typically computationally impractical. One such example is the problem of stray light reduction in digital cameras, which requires the implementation of a dense space-varying deconvolution operator. However, other inverse problems, such as iterative tomographic reconstruction, can also depend on the implementation of dense space-varying convolution. While space-invariant convolution can be efficiently implemented with the fast Fourier transform, this approach does not work for space-varying operators. So direct convolution is often the only option for implementing space-varying convolution. In this paper, we develop a general approach to the efficient implementation of space-varying convolution, and demonstrate its use in the application of stray light reduction. Our approach, which we call matrix source coding, is based on lossy source coding of the dense space-varying convolution matrix. Importantly, by coding the transformation matrix, we not only reduce the memory required to store it; we also dramatically reduce the computation required to implement matrix-vector products. Our algorithm is able to reduce computation by approximately factoring the dense space-varying convolution operator into a product of sparse transforms. Experimental results show that our method can dramatically reduce the computation required for stray light reduction while maintaining high accuracy.
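    The core idea, lossy coding of the dense operator so that matrix-vector products touch far fewer entries, can be illustrated with a toy sketch. This is not the authors' algorithm (which codes the matrix in a transformed domain); it simply hard-thresholds a made-up space-varying blur matrix, with all sizes and thresholds chosen purely for illustration:

```python
import math

def make_operator(n):
    """A toy dense space-varying blur: row i is a normalized Gaussian whose
    width grows slowly with i, so no single FFT kernel applies."""
    A = []
    for i in range(n):
        width = 1.0 + 2.0 * i / (n - 1)
        row = [math.exp(-((i - j) / width) ** 2) for j in range(n)]
        s = sum(row)
        A.append([v / s for v in row])
    return A

def code_matrix(A, threshold=0.01):
    """Lossy 'coding' of the operator: keep only entries above the
    threshold, stored as (column, value) pairs per row."""
    return [[(j, v) for j, v in enumerate(row) if v > threshold] for row in A]

def apply_coded(S, x):
    """Matrix-vector product touching only the retained nonzeros."""
    return [sum(v * x[j] for j, v in row) for row in S]

n = 64
A = make_operator(n)
S = code_matrix(A)
x = [math.sin(0.3 * j) for j in range(n)]
dense = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
sparse = apply_coded(S, x)
err = max(abs(d - s) for d, s in zip(dense, sparse))
kept = sum(len(row) for row in S)
# err stays small while kept << n*n, so each product is much cheaper.
```

    The paper's contribution is making this trade favorable for operators that are not sparse in the pixel domain, by coding the matrix after a transformation that concentrates its energy.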

  19. Four distinct types of E.C. 1.2.1.30 enzymes can catalyze the reduction of carboxylic acids to aldehydes.

    PubMed

    Stolterfoht, Holly; Schwendenwein, Daniel; Sensen, Christoph W; Rudroff, Florian; Winkler, Margit

    2017-09-10

    Increasing demand for chemicals from renewable resources calls for the development of new biotechnological methods for the reduction of oxidized bio-based compounds. Enzymatic carboxylate reduction is highly selective, both in terms of chemo- and product selectivity, but not many carboxylate reductase enzymes (CARs) have been identified on the sequence level to date. Thus far, their phylogeny is unexplored and very little is known about their structure-function-relationship. CARs minimally contain an adenylation domain, a phosphopantetheinylation domain and a reductase domain. We have recently identified new enzymes of fungal origin, using similarity searches against genomic sequences from organisms in which aldehydes were detected upon incubation with carboxylic acids. Analysis of sequences with known CAR functionality and CAR enzymes recently identified in our laboratory suggests that the three-domain architecture mentioned above is modular. The construction of a distance tree with a subsequent 1000-replicate bootstrap analysis showed that the CAR sequences included in our study fall into four distinct subgroups (one of bacterial origin and three of fungal origin, respectively), each with a bootstrap value of 100%. The multiple sequence alignment of all experimentally confirmed CAR protein sequences revealed fingerprint sequences of residues which are likely to be involved in substrate and co-substrate binding and one of the three catalytic substeps, respectively. The fingerprint sequences broaden our understanding of the amino acids that might be essential for the reduction of organic acids to the corresponding aldehydes in CAR proteins. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. A dual tracer ratio method for comparative emission measurements in an experimental dairy housing

    NASA Astrophysics Data System (ADS)

    Mohn, Joachim; Zeyer, Kerstin; Keck, Margret; Keller, Markus; Zähner, Michael; Poteko, Jernej; Emmenegger, Lukas; Schrade, Sabine

    2018-04-01

    Agriculture, and in particular dairy farming, is an important source of ammonia (NH3) and non-carbon dioxide greenhouse gas (GHG) emissions. This calls for the development and quantification of effective mitigation strategies. Our study presents the implementation of a dual tracer ratio method in a novel experimental dairy housing with two identical, but spatially separated housing areas. Modular design and flexible floor elements allow the assessment of structural, process engineering and organisational abatement measures at practical scale. Thereby, the emission reduction potential of specific abatement measures can be quantified in relation to a reference system. Emissions in the naturally ventilated housing are determined by continuous dosing of two artificial tracers (sulphur hexafluoride SF6, trifluoromethylsulphur pentafluoride SF5CF3) and their real-time detection in the ppt range with an optimized GC-ECD method. The two tracers are dosed into different experimental sections, which enables the independent assessment of both housing areas. Mass flow emissions of NH3 and GHGs are quantified by areal dosing of tracer gases and multipoint sampling as well as real-time analysis of both tracer and target gases. Validation experiments demonstrate that the technique is suitable for both areal and point emission sources and achieves an uncertainty of less than 10% for the mass emissions of NH3, methane (CH4) and carbon dioxide (CO2), which is superior to other currently available methods. Comparative emission measurements in this experimental dairy housing will provide reliable, currently unavailable information on emissions for Swiss dairy farming and demonstrate the reduction potential of mitigation measures for NH3, GHGs and potentially other pollutants.
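    The tracer ratio principle underlying the method can be stated in one line: if the tracer and the target gas are transported and diluted identically, the unknown emission rate equals the known tracer dosing rate scaled by the background-corrected concentration ratio. A minimal sketch, with purely illustrative numbers in consistent units:

```python
def tracer_ratio_emission(q_tracer, c_gas, c_gas_bg, c_tracer, c_tracer_bg):
    """Tracer ratio method: the unknown emission rate equals the known
    tracer dosing rate times the background-corrected concentration ratio,
    assuming both gases disperse identically."""
    return q_tracer * (c_gas - c_gas_bg) / (c_tracer - c_tracer_bg)

# Purely illustrative numbers in consistent units: dosing 2.0 units/s of
# tracer, target gas measured at 800 above a 400 background, tracer at 50
# above a 10 background, giving an emission rate of 2.0 * 400 / 40.
print(tracer_ratio_emission(2.0, 800.0, 400.0, 50.0, 10.0))  # → 20.0
```

    The dual-tracer design in the study applies this ratio separately for each housing area by dosing a different tracer (SF6 or SF5CF3) into each section.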

  1. Generalized Predictive Control of Dynamic Systems with Rigid-Body Modes

    NASA Technical Reports Server (NTRS)

    Kvaternik, Raymond G.

    2013-01-01

    Numerical simulations to assess the effectiveness of Generalized Predictive Control (GPC) for active control of dynamic systems having rigid-body modes are presented. GPC is a linear, time-invariant, multi-input/multi-output predictive control method that uses an ARX model to characterize the system and to design the controller. Although the method can accommodate both embedded (implicit) and explicit feedforward paths for incorporation of disturbance effects, only the case of embedded feedforward in which the disturbances are assumed to be unknown is considered here. Results from numerical simulations using mathematical models of both a free-free three-degree-of-freedom mass-spring-dashpot system and the XV-15 tiltrotor research aircraft are presented. In regulation mode operation, which calls for zero system response in the presence of disturbances, the simulations showed response reductions of nearly 100%. In tracking mode operation, where the system is commanded to follow a specified path, the GPC controllers produced the desired responses, even in the presence of disturbances.
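    The ARX identification step that GPC builds on can be sketched as a least-squares fit. The example below is a hypothetical second-order, single-input system, not the XV-15 model; the coefficients and input sequence are invented for illustration:

```python
def solve3(A, b):
    """Gauss-Jordan elimination for a 3x3 system (sufficient here)."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_arx(u, y):
    """Least-squares fit of y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1]
    via the normal equations."""
    rows = [(y[k - 1], y[k - 2], u[k - 1]) for k in range(2, len(y))]
    t = [y[k] for k in range(2, len(y))]
    A = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    b = [sum(r[i] * tk for r, tk in zip(rows, t)) for i in range(3)]
    return solve3(A, b)

# Simulate an invented stable system, then check the fit recovers it.
a1, a2, b1 = 1.2, -0.5, 0.3
u = [((2 * k) % 13) - 6 for k in range(50)]  # a deterministic, varied input
y = [0.0, 0.0]
for k in range(2, 50):
    y.append(a1 * y[k - 1] + a2 * y[k - 2] + b1 * u[k - 1])
print(fit_arx(u, y))  # ≈ [1.2, -0.5, 0.3]
```

    In GPC proper, the identified ARX model is then used to predict the response over a horizon and to compute the control sequence minimizing a quadratic cost; the sketch covers only the modeling step.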

  2. Using Virtual Social Networks for Case Finding in Clinical Studies: An Experiment from Adolescence, Brain, Cognition, and Diabetes Study.

    PubMed

    Pourabbasi, Ata; Farzami, Jalal; Shirvani, Mahbubeh-Sadat Ebrahimnegad; Shams, Amir Hossein; Larijani, Bagher

    2017-01-01

    One of the main uses of social networks in clinical studies is facilitating sampling and case finding for scientists. The main focus of this study is a comparison of two sampling methods: phone calls and a social network. One of the researchers called 214 families of children with diabetes over 90 days. After this period, phone calls stopped, and the team communicated with families through Telegram, a virtual social network, for 30 days. The number of children who participated in the study was then evaluated. Although the Telegram period was 60 days shorter than the phone call period, the researchers found that the participation rate from Telegram (17.6%) did not differ significantly from that of the families contacted by phone (12.9%). Social networks can therefore be suggested as a beneficial tool for local researchers seeking easier sampling, participants' trust, straightforward follow-up, and an easily accessed database.

  3. Quantifying Vocal Mimicry in the Greater Racket-Tailed Drongo: A Comparison of Automated Methods and Human Assessment

    PubMed Central

    Agnihotri, Samira; Sundeep, P. V. D. S.; Seelamantula, Chandra Sekhar; Balakrishnan, Rohini

    2014-01-01

    Objective identification and description of mimicked calls is a primary component of any study on avian vocal mimicry but few studies have adopted a quantitative approach. We used spectral feature representations commonly used in human speech analysis in combination with various distance metrics to distinguish between mimicked and non-mimicked calls of the greater racket-tailed drongo, Dicrurus paradiseus and cross-validated the results with human assessment of spectral similarity. We found that the automated method and human subjects performed similarly in terms of the overall number of correct matches of mimicked calls to putative model calls. However, the two methods also misclassified different subsets of calls and we achieved a maximum accuracy of ninety-five per cent only when we combined the results of both methods. This study is the first to use Mel-frequency Cepstral Coefficients and Relative Spectral Amplitude-filtered Linear Predictive Coding coefficients to quantify vocal mimicry. Our findings also suggest that in spite of several advances in automated methods of song analysis, corresponding cross-validation by humans remains essential. PMID:24603717
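    The automated matching step reduces to a nearest-neighbor search in a spectral feature space. The sketch below uses invented 4-dimensional feature vectors and model-call labels as stand-ins for the study's MFCC and RASTA-LPC features:

```python
import math

def euclid(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_call(mimic, models):
    """Return the label of the model call whose features are nearest."""
    return min(models, key=lambda name: euclid(mimic, models[name]))

# Hypothetical feature vectors for two putative model calls; real feature
# vectors would be MFCC or RASTA-LPC coefficients extracted from audio.
models = {"shikra": [1.0, 0.2, 0.0, 0.5],
          "treepie": [0.1, 0.9, 0.8, 0.2]}
print(match_call([0.9, 0.3, 0.1, 0.4], models))  # → shikra
```

    The study's finding that automated and human assessments misclassify different subsets suggests combining the automated distance ranking with human judgment rather than relying on either alone.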

  4. The impact of standalone call centres and GP cooperatives on access to after hours GP care: a before and after study adjusted for secular trend.

    PubMed

    Dunt, David; Day, Susan E; Kelaher, Margaret; Montalto, Michael

    2006-08-01

    The After Hours Primary Medical Trials were initiated by the Australian government to redress difficulties in after hours (AH) GP care in areas of high need. The study's objective is to study the impact of two standalone call centres and one GP cooperative offering comprehensive services, in improving consumer access to services for residents of a defined geographic area. A pre-post design was used to evaluate their impact after adjusting for secular trend at a national level. Access was considered in terms of availability, accessibility, affordability, acceptability and responsiveness of care. Unmet need and ease of obtaining AH telephone professional medical advice were also considered. Pre-trial and post-trial telephone surveys of two separate random samples of approximately 350 households using AH services in each trial area as well as in a national sample outside the trial areas. Consumer acceptability and affordability increased in residents in the area served by the GP cooperative. Access, however measured, did not improve in either of the standalone call centre areas. Reduction in unmet need approached but did not achieve statistical significance in most but not all trial areas. Improvements in access in the GP cooperative conformed to expectations based on current and pre-existing AH care arrangements put in place. Absence of improvements in access in the standalone call centres did not conform to expectations but may be partly explained by the reductions in consumer acceptability, following introduction of telephone triage systems reported elsewhere.

  5. Comparison of four different reduction methods for anterior dislocation of the shoulder.

    PubMed

    Guler, Olcay; Ekinci, Safak; Akyildiz, Faruk; Tirmik, Uzeyir; Cakmak, Selami; Ugras, Akin; Piskin, Ahmet; Mahirogullari, Mahir

    2015-05-28

    Shoulder dislocations account for almost 50% of all major joint dislocations and are mainly anterior. The aim of this comparative retrospective study was to evaluate different maneuvers for reducing the dislocated shoulder without anesthesia. Patients were treated with different reduction maneuvers, including various forms of traction and external rotation, in the emergency departments of four training hospitals between 2009 and 2012. Each of the four hospitals had a different treatment protocol, applying one of four maneuvers: the Spaso, Chair, Kocher, or Matsen method. Thirty-nine patients were treated by the Spaso method, 47 by the Chair method, 40 by the Kocher method, and 27 by Matsen's traction-countertraction method. All patients' demographic data were recorded. Dislocation number, reduction time, the interval between dislocation and reduction, and associated complications in the pre- and post-reduction periods were recorded prospectively. No anesthetic method was used for the reduction. All of the methods included traction and some external rotation. The Chair method had the shortest reduction time. All surgeons involved in the study agreed that the Kocher and Matsen methods needed more force for the reduction, because patients could contract their muscles in response to the pain. The Spaso method includes flexion of the shoulder, which partly blocks muscle contraction. The Chair method was found to be the easiest because patients could not contract their muscles while sitting on a chair with the affected arm at their side. We suggest that the Chair method is an effective and fast reduction maneuver that may be an alternative for the treatment of anterior shoulder dislocations. Further prospective studies with larger sample sizes are needed to compare the safety of the different reduction techniques.

  6. Shock simulations of a single-site coarse-grain RDX model using the dissipative particle dynamics method with reactivity

    NASA Astrophysics Data System (ADS)

    Sellers, Michael S.; Lísal, Martin; Schweigert, Igor; Larentzos, James P.; Brennan, John K.

    2017-01-01

    In discrete particle simulations, when an atomistic model is coarse-grained, a tradeoff is made: a boost in computational speed for a reduction in accuracy. The Dissipative Particle Dynamics (DPD) methods help to recover lost accuracy of the viscous and thermal properties, while giving back a relatively small amount of computational speed. Since its initial development for polymers, one of the most notable extensions of DPD has been the introduction of chemical reactivity, called DPD-RX. In 2007, Maillet, Soulard, and Stoltz introduced implicit chemical reactivity in DPD through the concept of particle reactors and simulated the decomposition of liquid nitromethane. We present an extended and generalized version of the DPD-RX method, and have applied it to solid hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX). Demonstration simulations of reacting RDX are performed under shock conditions using a recently developed single-site coarse-grain model and a reduced RDX decomposition mechanism. A description of the methods used to simulate RDX and its transition to hot product gases within DPD-RX is presented. Additionally, we discuss several examples of the effect of shock speed and microstructure on the corresponding material chemistry.

  7. A robust sparse-modeling framework for estimating schizophrenia biomarkers from fMRI.

    PubMed

    Dillon, Keith; Calhoun, Vince; Wang, Yu-Ping

    2017-01-30

    Our goal is to identify the brain regions most relevant to mental illness using neuroimaging. State of the art machine learning methods commonly suffer from repeatability difficulties in this application, particularly when using large and heterogeneous populations for samples. We revisit both dimensionality reduction and sparse modeling, and recast them in a common optimization-based framework. This allows us to combine the benefits of both types of methods in an approach which we call unambiguous components. We use this to estimate the image component with a constrained variability, which is best correlated with the unknown disease mechanism. We apply the method to the estimation of neuroimaging biomarkers for schizophrenia, using task fMRI data from a large multi-site study. The proposed approach yields an improvement in both robustness of the estimate and classification accuracy. We find that unambiguous components incorporate roughly two thirds of the same brain regions as sparsity-based methods LASSO and elastic net, while roughly one third of the selected regions differ. Further, unambiguous components achieve superior classification accuracy in differentiating cases from controls. Unambiguous components provide a robust way to estimate important regions of imaging data. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Corporate ergonomics programme at automobiles Peugeot-Sochaux.

    PubMed

    Moreau, M

    2003-01-01

    An ergonomic assessment tool for design procedures, exclusive to Peugeot-Citroën and called ECM, was developed and applied at the design stage by methods technicians in the 1990s. It generates data that are followed up by the project leader of a new model and by ergonomists until two years before each launch. During this time, vehicle design is subject to modification to adapt to ergonomic demands. Simplified methods (DACORS and METEO) were also developed to assess workstations on the shop floor in trim and final plants. Assessments were used to grade the workstations into four profiles linked to physical and static requirements. Production technicians are responsible for applying these local methods on the shop floor. The management of these centres aimed to reduce the risk of musculoskeletal disorders by reducing the number of stations with heavy profiles. New cases of musculoskeletal disorders, surveyed by the company doctor among workers on the assembly lines, had decreased since 1996. In 1999, the incidence increased again, despite the continued use of ergonomic methods. This increase in musculoskeletal disorders was above all linked to a major reorganisation of work conditions, including a reduction in the cycle time on the assembly line, and to a move into a new workshop.

  9. Density-cluster NMA: A new protein decomposition technique for coarse-grained normal mode analysis.

    PubMed

    Demerdash, Omar N A; Mitchell, Julie C

    2012-07-01

    Normal mode analysis has emerged as a useful technique for investigating protein motions on long time scales. This is largely due to the advent of coarse-graining techniques, particularly Hooke's Law-based potentials and the rotational-translational blocking (RTB) method for reducing the size of the force-constant matrix, the Hessian. Here we present a new method for domain decomposition for use in RTB that is based on hierarchical clustering of atomic density gradients, which we call Density-Cluster RTB (DCRTB). The method reduces the number of degrees of freedom by 85-90% compared with the standard blocking approaches. We compared the normal modes from DCRTB against standard RTB using 1-4 residues in sequence in a single block, with good agreement between the two methods. We also show that Density-Cluster RTB and standard RTB perform well in capturing the experimentally determined direction of conformational change. Significantly, we report superior correlation of DCRTB with B-factors compared with 1-4 residue per block RTB. Finally, we show significant reduction in computational cost for Density-Cluster RTB that is nearly 100-fold for many examples. Copyright © 2012 Wiley Periodicals, Inc.

  10. Incorporating biological information in sparse principal component analysis with application to genomic data.

    PubMed

    Li, Ziyi; Safo, Sandra E; Long, Qi

    2017-07-11

    Sparse principal component analysis (PCA) is a popular tool for dimensionality reduction, pattern recognition, and visualization of high dimensional data. It has been recognized that complex biological mechanisms occur through concerted relationships of multiple genes working in networks that are often represented by graphs. Recent work has shown that incorporating such biological information improves feature selection and prediction performance in regression analysis, but there has been limited work on extending this approach to PCA. In this article, we propose two new sparse PCA methods called Fused and Grouped sparse PCA that enable incorporation of prior biological information in variable selection. Our simulation studies suggest that, compared to existing sparse PCA methods, the proposed methods achieve higher sensitivity and specificity when the graph structure is correctly specified, and are fairly robust to misspecified graph structures. Application to a glioblastoma gene expression dataset identified pathways that are suggested in the literature to be related with glioblastoma. The proposed sparse PCA methods Fused and Grouped sparse PCA can effectively incorporate prior biological information in variable selection, leading to improved feature selection and more interpretable principal component loadings and potentially providing insights on molecular underpinnings of complex diseases.
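    The generic idea of sparse PCA, principal-component loadings that are exactly zero on most variables, can be illustrated with truncated power iteration. This does not implement the Fused or Grouped penalties proposed in the paper; the covariance matrix and sparsity level below are invented for illustration:

```python
import math

def sparse_pc(cov, k, iters=100):
    """Leading sparse principal component via truncated power iteration:
    multiply by the covariance, keep only the k largest-magnitude
    loadings (hard thresholding), renormalize, and repeat."""
    n = len(cov)
    v = [1.0 / math.sqrt(n)] * n
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(n)) for i in range(n)]
        keep = sorted(range(n), key=lambda i: -abs(w[i]))[:k]
        w = [w[i] if i in keep else 0.0 for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# A covariance with a strongly correlated 2-variable block: the sparse
# component should load only on variables 0 and 1.
cov = [[2.0, 1.8, 0.0, 0.0],
       [1.8, 2.0, 0.0, 0.0],
       [0.0, 0.0, 1.0, 0.1],
       [0.0, 0.0, 0.1, 1.0]]
v = sparse_pc(cov, k=2)
print(v)  # loads only on the first two variables
```

    The paper's Fused and Grouped penalties replace the plain cardinality constraint above with graph-informed penalties, so that variables connected in a known biological network are selected or excluded together.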

  11. Stable orthogonal local discriminant embedding for linear dimensionality reduction.

    PubMed

    Gao, Quanxue; Ma, Jingjie; Zhang, Hailin; Gao, Xinbo; Liu, Yamin

    2013-07-01

    Manifold learning is widely used in machine learning and pattern recognition. However, manifold learning considers only the similarity of samples belonging to the same class and ignores the within-class variation of the data, which impairs the generalization and stability of the algorithms. To address this, we construct an adjacency graph to model the intraclass variation that characterizes the most important properties, such as diversity of patterns, and then incorporate the diversity into the discriminant objective function for linear dimensionality reduction. Finally, we introduce an orthogonality constraint on the basis vectors and propose an orthogonal algorithm called stable orthogonal local discriminant embedding. Experimental results on several standard image databases demonstrate the effectiveness of the proposed dimensionality reduction approach.

  12. Applying a Consumer Behavior Lens to Salt Reduction Initiatives

    PubMed Central

    Potvin Kent, Monique; Raats, Monique M.; McConnon, Áine; Wall, Patrick; Dubois, Lise

    2017-01-01

    Reformulation of food products to reduce salt content has been a central strategy for achieving population level salt reduction. In this paper, we reflect on current reformulation strategies and consider how consumer behavior determines the ultimate success of these strategies. We consider the merits of adopting a ‘health by stealth’, silent approach to reformulation compared to implementing a communications strategy which draws on labeling initiatives in tandem with reformulation efforts. We end this paper by calling for a multi-actor approach which utilizes co-design, participatory tools to facilitate the involvement of all stakeholders, including, and especially, consumers, in making decisions around how best to achieve population-level salt reduction. PMID:28820449

  13. Aging and Visual Impairment.

    ERIC Educational Resources Information Center

    Morse, A. R.; And Others

    1987-01-01

    Eye diseases of the aged include diabetic retinopathy, senile cataracts, senile macular degeneration, and glaucoma. Environmental modifications such as better levels of illumination and reduction of glare can enhance an individual's ability to function. Programs to screen and treat visual problems in elderly persons are called for. (Author/JDD)

  14. Rethinking the Business Model: Responsibilities of Governing Boards

    ERIC Educational Resources Information Center

    Trusteeship, 2012

    2012-01-01

    Colleges and universities are thinking strategically about their business models. Reductions in state and federal appropriations, endowment volatility, fundraising uncertainties, and limits on tuition increases are creating persistent shortfalls in operating budgets. This all comes when institutions are being called upon to enroll and graduate…

  15. Inverse modeling methods for indoor airborne pollutant tracking: literature review and fundamentals.

    PubMed

    Liu, X; Zhai, Z

    2007-12-01

    Reduction in indoor environment quality calls for effective control and improvement measures. Accurate and prompt identification of contaminant sources ensures that they can be quickly removed and contaminated spaces isolated and cleaned. This paper discusses the use of inverse modeling to identify potential indoor pollutant sources with limited pollutant sensor data. The study reviews various inverse modeling methods for advection-dispersion problems and summarizes the methods into three major categories: forward, backward, and probability inverse modeling methods. The adjoint probability inverse modeling method is indicated as an appropriate model for indoor air pollutant tracking because it can quickly find source location, strength and release time without prior information. The paper introduces the principles of the adjoint probability method and establishes the corresponding adjoint equations for both multi-zone airflow models and computational fluid dynamics (CFD) models. The study proposes a two-stage inverse modeling approach integrating both multi-zone and CFD models, which can provide a rapid estimate of indoor pollution status and history for a whole building. Preliminary case study results indicate that the adjoint probability method is feasible for indoor pollutant inverse modeling. The proposed method can help identify contaminant source characteristics (location and release time) with limited sensor outputs. This will ensure an effective and prompt execution of building management strategies and thus achieve a healthy and safe indoor environment. The method can also help design optimal sensor networks.

  16. Is the phone call the most effective method for recall in cervical cancer screening?--results from a randomised control trial.

    PubMed

    Abdul Rashid, Rima Marhayu; Mohamed, Majdah; Hamid, Zaleha Abdul; Dahlui, Maznah

    2013-01-01

    To compare the effectiveness of different methods of recall for repeat Pap smear among women who had normal smears in the previous screening. Prospective randomized controlled study. All community clinics in Klang under the Ministry of Health Malaysia. Women of Klang who attended cervical screening and had a normal Pap smear in the previous year, and were due for a repeat smear, were recruited and randomly assigned to one of four recall methods for repeat smear: postal letter, registered letter, short message by phone (SMS), or phone call. The outcomes were the number and percentage of women who responded to the recall within 8 weeks of receiving it, irrespective of whether a Pap test was conducted, and the number of women in each recall arm who attended for a repeat Pap smear. The rates of recall messages reaching the women by letter, registered letter, SMS and phone call were 79%, 87%, 66% and 68%, respectively. However, the positive responses to recall by letter, registered letter, SMS and phone call were 23.9%, 23.0%, 32.9% and 50.9%, respectively (p<0.05). Furthermore, more women who received recall by phone call had been screened (p<0.05) compared with those who received recall by postal letter (OR=2.38, CI=1.56-3.62). Both ordinary letters and registered letters had a higher chance of reaching patients than the phone, whether used for messages or calls. The response to recall and the uptake of repeat smears, however, were highest via phone call, indicating the importance of direct communication.

  17. Influence of atmospheric properties on detection of wood-warbler nocturnal flight calls

    NASA Astrophysics Data System (ADS)

    Horton, Kyle G.; Stepanian, Phillip M.; Wainwright, Charlotte E.; Tegeler, Amy K.

    2015-10-01

    Avian migration monitoring can take on many forms; however, monitoring active nocturnal migration of land birds is limited to a few techniques. Avian nocturnal flight calls are currently the only method for describing migrant composition at the species level. However, as this method develops, more information is needed to understand the sources of variation in call detection. Additionally, few studies examine how detection probabilities differ under varying atmospheric conditions. We use nocturnal flight call recordings from captive individuals to explore the dependence of flight call detection on atmospheric temperature and humidity. Height or distance from origin had the largest influence on call detection, while temperature and humidity also influenced detectability at higher altitudes. Because flight call detection varies with both atmospheric conditions and flight height, improved monitoring across time and space will require correction for these factors to generate standardized metrics of songbird migration.

  18. Masking as an effective quality control method for next-generation sequencing data analysis.

    PubMed

    Yun, Sajung; Yun, Sijung

    2014-12-13

    Next-generation sequencing produces base calls with low quality scores that can affect the accuracy of identifying simple nucleotide variation calls, including single nucleotide polymorphisms and small insertions and deletions. Here we compare the effectiveness of two data preprocessing methods, masking and trimming, and the accuracy of simple nucleotide variation calls on whole-genome sequence data from Caenorhabditis elegans. Masking substitutes low quality base calls with 'N's (undetermined bases), whereas trimming removes low quality bases, resulting in shorter read lengths. We demonstrate that masking is more effective than trimming in reducing the false-positive rate in single nucleotide polymorphism (SNP) calling. However, neither preprocessing method had a statistically significant effect on the false-negative rate in SNP calling compared with analysis without preprocessing. False-positive and false-negative rates for small insertions and deletions did not differ between masking and trimming. We recommend masking over trimming as the more effective preprocessing method for next-generation sequencing data analysis, since masking reduces the false-positive rate in SNP calling without sacrificing the false-negative rate, even though trimming is currently more common in the field. The perl script for masking is available at http://code.google.com/p/subn/. The sequencing data used in the study were deposited in the Sequence Read Archive (SRX450968 and SRX451773).
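The masking operation described above is simple to state; a minimal sketch with our own helper names (the paper's Perl script at the Google Code link is the authoritative implementation, and real trimming tools usually clip from read ends rather than filtering internal bases as the simplified variant here does):

```python
def phred_from_ascii(qual_string, offset=33):
    """Decode a FASTQ quality string (Sanger / Illumina 1.8+ use an
    ASCII offset of 33)."""
    return [ord(c) - offset for c in qual_string]

def mask_low_quality(seq, quals, threshold=20):
    """Masking: replace base calls whose Phred quality is below
    `threshold` with 'N', preserving read length."""
    return "".join(b if q >= threshold else "N" for b, q in zip(seq, quals))

def trim_low_quality(seq, quals, threshold=20):
    """Trimming (here, a simplified filter variant): drop low-quality
    bases, shortening the read."""
    return "".join(b for b, q in zip(seq, quals) if q >= threshold)

read = "ACGTACGT"
quals = phred_from_ascii("IIII##II")     # 'I' = Phred 40, '#' = Phred 2
masked = mask_low_quality(read, quals)   # "ACGTNNGT"
trimmed = trim_low_quality(read, quals)  # "ACGTGT"
```

The contrast the abstract draws is visible directly: masking keeps the read length (and hence alignment coordinates) intact, while trimming shortens the read.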

  19. Balancing quality and cost for Adult Tobacco Telephone Surveys.

    PubMed

    Fernandez, Barbara M; Hannah, Kristie M; Wallack, Randal S Zu; Hicks, Jennifer K D; Gorrigan, Anne M; Mariolis, Peter

    2007-01-01

    To demonstrate the ability to cost-effectively coordinate Adult Tobacco Survey stakeholder interests while reducing the risk of potential bias. Key smoking indicators were compared across 2 surveys and analyzed based on modifications to calling protocols. Mixed results were found when comparing smoking rates across 2 surveys, by early, mid, and late respondents, and by the number of refusals. Significant cost savings can be obtained by reducing the number of telephone call attempts. Few significant differences may encourage reductions in protocol, but this must be weighed against the possibility of cost-saving measures resulting in biased estimates.

  20. Efficient model reduction of parametrized systems by matrix discrete empirical interpolation

    NASA Astrophysics Data System (ADS)

    Negri, Federico; Manzoni, Andrea; Amsallem, David

    2015-12-01

    In this work, we apply a Matrix version of the so-called Discrete Empirical Interpolation (MDEIM) for the efficient reduction of nonaffine parametrized systems arising from the discretization of linear partial differential equations. Dealing with affinely parametrized operators is crucial in order to enhance the online solution of reduced-order models (ROMs). However, in many cases such an affine decomposition is not readily available, and must be recovered through (often) intrusive procedures, such as the empirical interpolation method (EIM) and its discrete variant DEIM. In this paper we show that MDEIM represents a very efficient approach to deal with complex physical and geometrical parametrizations in a non-intrusive, efficient and purely algebraic way. We propose different strategies to combine MDEIM with a state approximation resulting either from a reduced basis greedy approach or Proper Orthogonal Decomposition. A posteriori error estimates accounting for the MDEIM error are also developed in the case of parametrized elliptic and parabolic equations. Finally, the capability of MDEIM to generate accurate and efficient ROMs is demonstrated on the solution of two computationally-intensive classes of problems occurring in engineering contexts, namely PDE-constrained shape optimization and parametrized coupled problems.
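The discrete empirical interpolation step that MDEIM builds on can be sketched in a few lines of NumPy; a toy version of the greedy point selection of standard DEIM applied to a nonaffine parametrized function (MDEIM's application to vectorized system matrices and the a posteriori error estimates are beyond this sketch, and the test function is our own):

```python
import numpy as np

def deim_indices(U):
    """Greedy DEIM interpolation-point selection from a basis U (n x m).
    MDEIM applies the same selection to vectorized system matrices."""
    n, m = U.shape
    p = [int(np.argmax(np.abs(U[:, 0])))]
    for l in range(1, m):
        # Interpolate column l at the points chosen so far; the largest
        # residual entry gives the next interpolation point.
        c = np.linalg.solve(U[p][:, :l], U[p, l])
        r = U[:, l] - U[:, :l] @ c
        p.append(int(np.argmax(np.abs(r))))
    return p

def deim_reconstruct(U, p, f):
    """Recover an approximation of f from its entries at the points p."""
    return U @ np.linalg.solve(U[p, :], f[p])

# Toy nonaffine parametrized function f(x; mu) = exp(-mu * x).
x = np.linspace(0.0, 1.0, 200)
snaps = np.column_stack([np.exp(-mu * x) for mu in np.linspace(1.0, 5.0, 30)])
U = np.linalg.svd(snaps, full_matrices=False)[0][:, :5]  # 5 POD modes

p = deim_indices(U)
f_new = np.exp(-2.7 * x)                     # unseen parameter value
err = np.linalg.norm(f_new - deim_reconstruct(U, p, f_new)) \
      / np.linalg.norm(f_new)
```

Only 5 of the 200 entries of `f_new` are ever evaluated, which is the source of the online speedup the abstract describes.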

  1. 2014 Summer Series - Mark Jacobson - Roadmaps for Transitioning All 50 US States to Wind, Water and Solar Power

    NASA Image and Video Library

    2014-07-08

    Global warming, air pollution, and energy insecurity are three of the most significant problems facing the world today. This talk discusses the development of technical and economic plans to convert the energy infrastructure of each of the 50 United States to those powered by 100% wind, water, and sunlight (WWS) for all purposes, namely electricity, transportation, industry, and heating/cooling, after energy efficiency measures have been accounted for. The plans call for all new energy to be WWS by 2020, ~80% conversion of existing energy by 2030, and 100% by 2050 through aggressive policy measures and natural transition. Resource availability, footprint and spacing areas required, jobs created, energy costs, avoided costs from air pollution mortality and morbidity and climate damage, methods of ensuring reliability of the grid, and impacts of offshore wind farms on hurricane dissipation are discussed. Air pollution reductions alone due to the plan would eliminate ~60,000 U.S. premature mortalities, avoiding costs equivalent to 3.2% of the United States GDP. Climate cost reductions are of similar order. The plans stabilize energy prices because fuel costs are zero.

  2. Youth alcohol use and risky sexual behavior: evidence from underage drunk driving laws.

    PubMed

    Carpenter, Christopher

    2005-05-01

    Recent research calls into question previous methods for estimating the relationship between alcohol use and risky sexual behavior among youths [Rashad, I., Kaestner, R., 2004. Teenage sex, drugs and alcohol use: problems identifying the cause of risky behaviors. Journal of Health Economics 23, 493-503]. This paper provides new evidence on this question by using reductions in heavy alcohol use among underage males induced by state adoption of very strict age-targeted "Zero Tolerance" drunk driving laws. I estimate reduced form models of the effects of Zero Tolerance laws on state gonorrhea rates by age group and race over the period 1981-2000, controlling for state and year fixed effects and state-specific time trends. I find that adoption of a Zero Tolerance law was associated with a significant reduction in gonorrhea rates among 15-19-year-old white males, with no effect for slightly older males age 20-24 whose drinking behavior was unaffected by the tougher policies. I find mixed effects for white females and no significant effects for blacks. While not conclusive, these results suggest an important role for alcohol use in risky sexual behavior among young men.

  3. A pilot-scale study of wet torrefaction treatment for upgrading palm oil empty fruit bunches as clean solid fuel

    NASA Astrophysics Data System (ADS)

    Gusman, M. H.; Sastroredjo, P. N. E.; Prawisudha, P.; Hardianto, T.; Pasek, A. D.

    2017-05-01

    Empty fruit bunch (EFB) is seldom used as a solid biofuel because its high alkali content can cause ash deposition, known as slagging and fouling. This phenomenon could harm biomass-fired power plant equipment, so some pre-treatment of EFB is needed to reduce its ash deposit potential. Laboratory-scale studies have shown that wet torrefaction pre-treatment decreases slagging and fouling potential while increasing the calorific value of EFB enough to meet clean solid fuel criteria. This research focuses on a wet torrefaction process conducted at pilot scale with a capacity of 250 liters. It was found that the wet torrefaction process can improve the product's calorific value by up to 9.41% while reducing its ash content to 1.01%, compared with raw EFB. The reduction in ash content also reduces the slagging and fouling tendency, expressed in terms of the alkali index. The alkali index is a quantitative measure calculated from the metal oxide fractions of the solid fuel, which can be obtained by energy-dispersive X-ray spectroscopy.
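The alkali index mentioned at the end of the abstract is a short calculation once the oxide fractions and heating value are known; a hedged sketch with made-up numbers (the input values below are NOT the paper's data, and the screening thresholds cited in the comments are the commonly quoted biomass values, which may differ from the paper's criteria):

```python
def alkali_index(ash_fraction, k2o_in_ash, na2o_in_ash, hhv_mj_per_kg):
    """Alkali index: kg of alkali oxides (K2O + Na2O) per GJ of fuel
    energy. Inputs: ash mass fraction of the fuel, K2O and Na2O mass
    fractions of the ash, and higher heating value in MJ/kg."""
    alkali_per_kg_fuel = ash_fraction * (k2o_in_ash + na2o_in_ash)
    return alkali_per_kg_fuel / (hhv_mj_per_kg / 1000.0)  # kg/GJ

# Hypothetical raw vs. wet-torrefied EFB values (illustrative only).
raw = alkali_index(0.060, 0.45, 0.02, 17.0)
treated = alkali_index(0.010, 0.30, 0.02, 18.6)
# Commonly cited screening thresholds: above ~0.17 kg/GJ slagging and
# fouling are probable; above ~0.34 kg/GJ they are virtually certain.
```

Lowering the ash fraction while raising the heating value, as wet torrefaction does, drives the index down on both counts.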

  4. Decreasing Postanesthesia Care Unit to Floor Transfer Times to Facilitate Short Stay Total Joint Replacements.

    PubMed

    Sibia, Udai S; Grover, Jennifer; Turcotte, Justin J; Seanger, Michelle L; England, Kimberly A; King, Jennifer L; King, Paul J

    2018-04-01

    We describe a process for studying and improving baseline postanesthesia care unit (PACU)-to-floor transfer times after total joint replacements. Quality improvement project using lean methodology. Phase I of the investigational process involved collection of baseline data. Phase II involved developing targeted solutions to improve throughput. Phase III measured project sustainability. Phase I investigations revealed that patients spent an additional 62 minutes waiting in the PACU after being designated ready for transfer. Five to 16 telephone calls were needed between the PACU and the unit to facilitate each patient transfer. The most common reason for delay was unavailability of the unit nurse, who was attending to another patient (58%). Phase II interventions resulted in transfer times decreasing to 13 minutes (79% reduction, P < .001). Phase III recorded sustained transfer times of 30 minutes, a net 52% reduction (P < .001) from baseline. Lean methodology resulted in an immediate decrease of PACU-to-floor transfer times by 79%, with a 52% sustained improvement. Our methods can also be used to improve efficiencies of care at other institutions. Copyright © 2016 American Society of PeriAnesthesia Nurses. Published by Elsevier Inc. All rights reserved.

  5. 76 FR 47539 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-05

    ... of information under the provisions of the Paperwork Reduction Act (44 U.S.C. chapter 35). Agency: U... critical decision-making on a variety of issues including market trends, analysis, and segmentation. Each... proposal can be obtained by calling or writing Diana Hynek,

  6. Psycho-Cultural Analysis of Disaster Risk Attitudes in Situation Awareness

    DTIC Science & Technology

    2013-09-01

    the American Geophysical Union called for greater resilience in facing such hazards which would curb damages and economic losses (Lewis, 2012). Most of...Reduction and Climate Change Adaptation. Rugby, UK: Practical Action Publishing, ISBN 978-1-85339-786-8. UNISDR (2009). Terminology on Disaster Risk

  7. Cloudy outlook. Supercommittee failure leaves healthcare providers questioning future cuts, impact on hospitals.

    PubMed

    Zigmond, Jessica

    2011-11-28

    President Barack Obama responded to the failure of the so-called supercommittee with a message that he won't allow Congress to water down the automatic cuts triggered under the August deficit law, which would include a 2% reduction in Medicare payments.

  8. Climate change and frog calls: long-term correlations along a tropical altitudinal gradient

    PubMed Central

    Narins, Peter M.; Meenderink, Sebastiaan W. F.

    2014-01-01

    Temperature affects nearly all biological processes, including acoustic signal production and reception. Here, we report on advertisement calls of the Puerto Rican coqui frog (Eleutherodactylus coqui) that were recorded along an altitudinal gradient and compared these with similar recordings along the same altitudinal gradient obtained 23 years earlier. We found that over this period, at any given elevation, calls exhibited both significant increases in pitch and shortening of their duration. All of the observed differences are consistent with a shift to higher elevations for the population, a well-known strategy for adapting to a rise in ambient temperature. Using independent temperature data over the same time period, we confirm a significant increase in temperature, the magnitude of which closely predicts the observed changes in the frogs’ calls. Physiological responses to long-term temperature rises include reduction in individual body size and concomitantly, population biomass. These can have potentially dire consequences, as coqui frogs form an integral component of the food web in the Puerto Rican rainforest. PMID:24718765

  9. Climate change and frog calls: long-term correlations along a tropical altitudinal gradient.

    PubMed

    Narins, Peter M; Meenderink, Sebastiaan W F

    2014-05-22

    Temperature affects nearly all biological processes, including acoustic signal production and reception. Here, we report on advertisement calls of the Puerto Rican coqui frog (Eleutherodactylus coqui) that were recorded along an altitudinal gradient and compared these with similar recordings along the same altitudinal gradient obtained 23 years earlier. We found that over this period, at any given elevation, calls exhibited both significant increases in pitch and shortening of their duration. All of the observed differences are consistent with a shift to higher elevations for the population, a well-known strategy for adapting to a rise in ambient temperature. Using independent temperature data over the same time period, we confirm a significant increase in temperature, the magnitude of which closely predicts the observed changes in the frogs' calls. Physiological responses to long-term temperature rises include reduction in individual body size and concomitantly, population biomass. These can have potentially dire consequences, as coqui frogs form an integral component of the food web in the Puerto Rican rainforest.

  10. A Multiple Period Problem in Distributed Energy Management Systems Considering CO2 Emissions

    NASA Astrophysics Data System (ADS)

    Muroda, Yuki; Miyamoto, Toshiyuki; Mori, Kazuyuki; Kitamura, Shoichi; Yamamoto, Takaya

    Consider a special district (group) which is composed of multiple companies (agents), where each agent responds to an energy demand and has a CO2 emission allowance imposed. A distributed energy management system (DEMS) optimizes the energy consumption of a group through energy trading within the group. In this paper, we extended the energy distribution decision and optimal planning problem in DEMSs from a single-period problem to a multiple-period one. The extension enabled us to consider more realistic constraints such as demand patterns, start-up costs, and minimum running/outage times of equipment. First, we extended the market-oriented programming (MOP) method for deciding energy distribution to the multiple-period problem. The bidding strategy of each agent is formulated as a 0-1 mixed non-linear programming problem. Secondly, we proposed decomposing the problem into a set of single-period problems in order to solve it faster. To decompose the problem, we proposed a CO2 emission allowance distribution method, called the EP method. Computational experiments confirmed that the proposed method produces solutions whose group costs are close to the lower-bound group costs. In addition, we verified that the EP method reduces computational time without losing solution quality.

  11. Replica exchange with solute tempering: A method for sampling biological systems in explicit water

    NASA Astrophysics Data System (ADS)

    Liu, Pu; Kim, Byungchan; Friesner, Richard A.; Berne, B. J.

    2005-09-01

    An innovative replica exchange (parallel tempering) method called replica exchange with solute tempering (REST) for the efficient sampling of aqueous protein solutions is presented here. The method bypasses the poor scaling with system size of standard replica exchange and thus reduces the number of replicas (parallel processes) that must be used. This reduction is accomplished by deforming the Hamiltonian function for each replica in such a way that the acceptance probability for the exchange of replica configurations does not depend on the number of explicit water molecules in the system. For proof of concept, REST is compared with standard replica exchange for an alanine dipeptide molecule in water. The comparisons confirm that REST greatly reduces the number of CPUs required by regular replica exchange and increases the sampling efficiency. This method reduces the CPU time required for calculating thermodynamic averages and for the ab initio folding of proteins in explicit water. Author contributions: B.J.B. designed research; P.L. and B.K. performed research; P.L. and B.K. analyzed data; and P.L., B.K., R.A.F., and B.J.B. wrote the paper. Abbreviations: REST, replica exchange with solute tempering; REM, replica exchange method; MD, molecular dynamics. P.L. and B.K. contributed equally to this work.
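The scaling argument in the abstract can be made concrete at the level of the pair-exchange acceptance test: in standard replica exchange the extensive water-water energy enters the Metropolis exponent, so its fluctuations (which grow with the number of waters) suppress acceptance, whereas REST's deformed Hamiltonian is built so that this term cancels. A schematic sketch (the energy decomposition, the numbers, and the simplified REST exponent follow our reading of the method and are illustrative):

```python
import math

def accept_prob(delta):
    """Metropolis acceptance probability for a proposed replica swap."""
    return min(1.0, math.exp(delta))

def standard_delta(bi, bj, Ei, Ej):
    """Standard replica exchange: the full potential energies enter,
    including the large, extensive water-water contribution."""
    return (bi - bj) * (Ei - Ej)

def rest_delta(bi, bj, Epp_i, Epw_i, Epp_j, Epw_j):
    """REST-style exponent: only protein-protein and (scaled)
    protein-water terms appear; the water-water term cancels by
    construction. Simplified form, after our reading of the paper."""
    return (bi - bj) * ((Epp_i - Epp_j) + 0.5 * (Epw_i - Epw_j))

bi, bj = 1.0 / 2.5, 1.0 / 3.0                 # inverse temperatures
Epp_i, Epw_i = -50.0, -20.0                   # illustrative energies
Epp_j, Epw_j = -45.0, -18.0

# In standard exchange, water-water energy fluctuations between the two
# replicas grow with the number of waters, suppressing acceptance.
std_accept = []
for n_waters, eww_fluct in ((100, 3.0), (10000, 30.0)):
    Ei = Epp_i + Epw_i - 10.0 * n_waters
    Ej = Epp_j + Epw_j - 10.0 * n_waters + eww_fluct
    std_accept.append(accept_prob(standard_delta(bi, bj, Ei, Ej)))

# The REST exponent never sees the water-water term at all.
rest_accept = accept_prob(rest_delta(bi, bj, Epp_i, Epw_i, Epp_j, Epw_j))
```

Because `rest_accept` is independent of the water count, the replica ladder no longer has to grow with system size, which is exactly the reduction in replicas the abstract claims.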

  12. An Energy-Efficient Multi-Tier Architecture for Fall Detection on Smartphones

    PubMed Central

    Guvensan, M. Amac; Kansiz, A. Oguz; Camgoz, N. Cihan; Turkmen, H. Irem; Yavuz, A. Gokhan; Karsligil, M. Elif

    2017-01-01

    Automatic detection of fall events is vital to providing fast medical assistance to the casualty, particularly when the injury causes loss of consciousness. Optimization of the energy consumption of mobile applications, especially those which run 24/7 in the background, is essential for longer use of smartphones. In order to improve energy-efficiency without compromising on the fall detection performance, we propose a novel 3-tier architecture that combines simple thresholding methods with machine learning algorithms. The proposed method is implemented in a mobile application, called uSurvive, for Android smartphones. It runs as a background service and monitors the activities of a person in daily life and automatically sends a notification to the appropriate authorities and/or user-defined contacts when it detects a fall. The performance of the proposed method was evaluated in terms of fall detection performance and energy consumption. Real-life performance tests conducted on two different models of smartphone demonstrate that our 3-tier architecture with feature reduction could save up to 62% of energy compared to machine-learning-only solutions. In addition to this energy saving, the hybrid method has 93% accuracy, which is superior to thresholding methods and better than machine-learning-only solutions. PMID:28644378
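The energy saving in such tiered designs comes from ordering the tiers by cost; a schematic of a thresholding-then-classifier pipeline (the threshold values, window format, and stub classifier below are our own illustrations, not uSurvive's):

```python
def tier1_screen(acc_mags, g=9.81, freefall=0.6, impact=2.0):
    """Tier 1: cheap screen on accelerometer magnitudes (m/s^2).
    A fall typically shows near-free-fall followed by a large impact;
    the threshold multipliers here are illustrative."""
    return (any(a < freefall * g for a in acc_mags)
            and any(a > impact * g for a in acc_mags))

class CountingClassifier:
    """Stub for the expensive ML tier; counts how often it is invoked."""
    def __init__(self):
        self.calls = 0

    def __call__(self, window):
        self.calls += 1
        return max(window) > 25.0   # placeholder decision rule

def detect_fall(window, ml_tier):
    """Windows failing the cheap tier never wake the ML tier --
    that gating is where the energy saving comes from."""
    if not tier1_screen(window):
        return False
    return ml_tier(window)

ml = CountingClassifier()
walking = [9.5, 10.2, 9.8, 10.0, 9.7]   # ordinary activity, no trigger
fall = [9.8, 3.0, 1.5, 28.0, 12.0]      # free-fall dip, then impact spike
res_walk = detect_fall(walking, ml)
res_fall = detect_fall(fall, ml)
```

In a day of mostly ordinary activity, almost every window is rejected by the cheap tier, so the classifier (and its feature extraction) runs only rarely.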

  13. What is nature capable of? Evidence, ontology and speculative medical humanities.

    PubMed

    Savransky, Martin; Rosengarten, Marsha

    2016-09-01

    Expanding on the recent call for a 'critical medical humanities' to intervene in questions of the ontology of health, this article develops what we call a 'speculative' orientation to such interventions in relation to some of the ontological commitments on which contemporary biomedical cultures rest. We argue that crucial to this task is an approach to ontology that treats it not as a question of first principles, but as a matter of the consequences of the images of nature that contemporary biomedical research practices espouse when they make claims to evidence, as well as the possible consequences of imagining different worlds in which health and disease processes partake. By attending to the implicit ontological assumptions involved in the method par excellence of biomedical research, namely the randomised controlled trial (RCT), we argue that the mechanistic ontology that tacitly informs evidence-based biomedical research simultaneously authorises a series of problematic consequences for understanding and intervening practically in the concrete realities of health. As a response, we develop an alternative ontological proposition that regards processes of health and disease as always situated achievements. We show that, without disqualifying RCT-based evidence, such a situated ontology enables one to resist the reduction of the realities of health and disease to biomedicine's current forms of explanation. In so doing, we call for medical humanities scholars to actively engage in the speculative question of what nature may be capable of. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  14. Beluga whale (Delphinapterus leucas) vocalizations and call classification from the eastern Beaufort Sea population.

    PubMed

    Garland, Ellen C; Castellote, Manuel; Berchok, Catherine L

    2015-06-01

    Beluga whales, Delphinapterus leucas, have a graded call system; call types exist on a continuum, making classification challenging. A description of vocalizations from the eastern Beaufort Sea beluga population during its spring migration is presented here, using both a non-parametric classification tree analysis (CART) and a Random Forest analysis. Twelve frequency and duration measurements were made on 1019 calls recorded over 14 days off Icy Cape, Alaska, resulting in 34 identifiable call types with 83% agreement in classification for both CART and Random Forest analyses. This high level of agreement in classification, with an initial subjective classification of calls into 36 categories, demonstrates that the methods applied here provide a quantitative analysis of a graded call dataset. Further, as calls cannot be attributed to individuals using single sensor passive acoustic monitoring efforts, these methods provide a comprehensive analysis of data where the influence of pseudo-replication of calls from individuals is unknown. This study is the first to describe the vocal repertoire of a beluga population using a robust and repeatable methodology. A baseline eastern Beaufort Sea beluga population repertoire is presented here, against which the call repertoire of other seasonally sympatric Alaskan beluga populations can be compared.

  15. Model reduction methods for control design

    NASA Technical Reports Server (NTRS)

    Dunipace, K. R.

    1988-01-01

    Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.

  16. Screw-Wire Osteo-Traction: An Adjunctive or Alternative Method of Anatomical Reduction of Multisegment Midfacial Fractures? A Description of Technique and Prospective Study of 40 Patients

    PubMed Central

    O'Regan, Barry; Devine, Maria; Bhopal, Sats

    2013-01-01

    Stable anatomical fracture reduction and segment control before miniplate fixation can be difficult to achieve in comminuted midfacial fractures. Fracture mobilization and reduction methods include Gillies elevation, malar hook, and Dingman elevators. No single method is used universally. Disadvantages include imprecise segment alignment and poor segment stability/control. We have employed screw-wire osteo-traction (SWOT) to address this problem. A literature review revealed two published reports. The aims were to evaluate the SWOT technique effectiveness as a fracture reduction method and to examine rates of revision fixation and plate removal. We recruited 40 consecutive patients requiring open reduction and internal fixation of multisegment midfacial fractures (2009–2012) and employed miniplate osteosynthesis in all patients. SWOT was used as a default reduction method in all patients. The rates of successful fracture reduction achieved by SWOT alone or in combination and of revision fixation and plate removal, were used as outcome indices of the reduction method effectiveness. The SWOT technique achieved satisfactory anatomical reduction in 27/40 patients when used alone. Other reduction methods were also used in 13/40 patients. No patient required revision fixation and three patients required late plate removal. SWOT can be used across the midface fracture pattern in conjunction with other methods or as a sole reduction method before miniplate fixation. PMID:24436763

  17. A Novel Coarsening Method for Scalable and Efficient Mesh Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, A; Hysom, D; Gunney, B

    2010-12-02

In this paper, we propose a novel mesh coarsening method called the brick coarsening method. The proposed method can be used in conjunction with any graph partitioner and scales to very large meshes. This method reduces the problem space by decomposing the original mesh into fixed-size blocks of nodes called bricks, layered in a way similar to conventional brick laying, and then assigning each node of the original mesh to the appropriate brick. Our experiments indicate that the proposed method scales to very large meshes while allowing a simple RCB partitioner to produce higher-quality partitions with significantly fewer edge cuts. Our results further indicate that the proposed brick-coarsening method allows more complicated partitioners like PT-Scotch to scale to very large problem sizes while still maintaining good partitioning performance with a relatively good edge-cut metric. Graph partitioning is an important problem that has many scientific and engineering applications in such areas as VLSI design, scientific computing, and resource management. Given a graph G = (V,E), where V is the set of vertices and E is the set of edges, the (k-way) graph partitioning problem is to partition the vertices of the graph (V) into k disjoint groups such that each group contains a roughly equal number of vertices and the number of edges connecting vertices in different groups is minimized. Graph partitioning plays a key role in large-scale scientific computing, especially in mesh-based computations, as it is used as a tool to minimize the volume of communication and to ensure a well-balanced load across computing nodes. The impact of graph partitioning on the reduction of communication can be easily seen, for example, in different iterative methods to solve a sparse system of linear equations. 
Here, a graph partitioning technique is applied to the matrix, which is basically a graph in which each edge is a non-zero entry in the matrix, to allocate groups of vertices to processors in such a way that much of the matrix-vector multiplication can be performed locally on each processor, minimizing communication. Furthermore, a good graph partitioning scheme ensures an equal amount of computation on each processor. Graph partitioning is a well-known NP-complete problem, and thus the most commonly used graph partitioning algorithms employ some form of heuristics. These algorithms vary in terms of their complexity, partition generation time, and the quality of partitions, and they tend to trade off these factors. A significant challenge we are currently facing at the Lawrence Livermore National Laboratory is how to partition very large meshes on massive-size distributed memory machines like IBM BlueGene/P, where scalability becomes a big issue. For example, we have found that ParMetis, a very popular graph partitioning tool, can only scale to 16K processors. An ideal graph partitioning method in such an environment should be fast and scale to very large meshes, while producing high-quality partitions. This is an extremely challenging task: to scale to that level, the partitioning algorithm should be simple and be able to produce partitions that minimize inter-processor communication and balance the load imposed on the processors. Our goals in this work are two-fold: (1) to develop a new scalable graph partitioning method with good load balancing and communication reduction capability; and (2) to study the performance of the proposed partitioning method on very large parallel machines using actual data sets and compare the performance to that of existing methods. The proposed method achieves the desired scalability by reducing the mesh size. 
For this, it coarsens an input mesh into a smaller mesh by coalescing the vertices and edges of the original mesh into a set of mega-vertices and mega-edges. A new coarsening method called the brick algorithm is developed in this research. In the brick algorithm, the zones in a given mesh are first grouped into fixed-size blocks called bricks. These bricks are then laid in a way similar to the conventional brick-laying technique, which reduces the number of neighboring blocks each block needs to communicate with. The contributions of this research are as follows: (1) we have developed a novel method that scales to very large problem sizes while producing high-quality mesh partitions; (2) we measured the performance and scalability of the proposed method on a machine of massive size using a set of actual large complex data sets, scaling to a mesh with 110 million zones. To the best of our knowledge, this is the largest complex mesh to which a partitioning method has been successfully applied; and (3) we have shown that the proposed method can reduce the number of edge cuts by as much as 65%.
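    The brick grouping and the coarse mega-vertex/mega-edge graph described above can be sketched as follows. This is a toy illustration on a structured 2-D grid, not the authors' code: the function name `brick_coarsen`, the grid/brick sizes, and the row-offset rule mimicking brick laying are all assumptions made for the example. Edge bricks may be partial rather than exactly fixed-size.

    ```python
    # Toy sketch of brick-style coarsening on a structured 2-D grid mesh:
    # nodes are grouped into fixed-size "bricks" (mega-vertices), and mesh
    # edges crossing brick boundaries become weighted mega-edges of a much
    # smaller coarse graph, which a partitioner like RCB would then split.
    from collections import defaultdict

    def brick_coarsen(nx, ny, bx, by, offset_rows=True):
        """Assign each node (i, j) of an nx-by-ny grid to a brick.

        offset_rows shifts every other brick row by half a brick width,
        mimicking conventional brick laying, which reduces the number of
        neighboring bricks each brick touches.
        """
        def brick_id(i, j):
            row = j // by
            shift = (bx // 2) if (offset_rows and row % 2 == 1) else 0
            return ((i + shift) // bx, row)

        assignment = {(i, j): brick_id(i, j)
                      for i in range(nx) for j in range(ny)}

        # Build the coarse graph: each mega-edge weight counts the mesh
        # edges cut between a pair of bricks.
        coarse = defaultdict(int)
        for i in range(nx):
            for j in range(ny):
                for di, dj in ((1, 0), (0, 1)):  # right and up neighbors
                    ni, nj = i + di, j + dj
                    if ni < nx and nj < ny:
                        a, b = assignment[(i, j)], assignment[(ni, nj)]
                        if a != b:
                            coarse[tuple(sorted((a, b)))] += 1
        return assignment, dict(coarse)

    assignment, coarse = brick_coarsen(nx=12, ny=12, bx=4, by=4)
    print(len(set(assignment.values())), "bricks,", len(coarse), "mega-edges")
    ```

    A partitioner then operates on the handful of mega-vertices instead of the full mesh, which is what buys the scalability discussed in the abstract.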

  18. EV Everywhere Grand Challenge Road to Success

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    2014-01-31

Initial progress report for EV Everywhere. The report highlights the significant cost reduction in batteries in 2014, which will enable increased PEV affordability for consumers. It also describes efforts to increase the convenience of PEVs through the Workplace Charging Challenge, which called on U.S. employers to help develop the nation's charging infrastructure.

  19. Educational States of Suspension

    ERIC Educational Resources Information Center

    Lewis, Tyson E.; Friedrich, Daniel

    2016-01-01

    In response to the growing emphasis on learning outcomes, life-long learning, and what could be called the learning society, scholars are turning to alternative educational logics that problematize the reduction of education to learning. In this article, we draw on these critics but also extend their thinking in two ways. First, we use Giorgio…

  20. What Is the True Nitrogenase Reaction? A Guided Approach

    ERIC Educational Resources Information Center

    Ipata, Piero L.; Pesi, Rossana

    2015-01-01

Only diazotrophic bacteria, called "Rhizobia," living as symbionts in the root nodules of leguminous plants and certain free-living prokaryotic cells can fix atmospheric N[subscript 2]. In these microorganisms, nitrogen fixation is carried out by the nitrogenase protein complex. However, the reduction of nitrogen to ammonia has an…

  1. 75 FR 44758 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-29

    ... the above information collection proposal can be obtained by calling or writing Diana Hynek... submit to the Office of Management and Budget (OMB) for clearance the following proposal for collection of information under the provisions of the Paperwork Reduction Act (44 U.S.C. Chapter 35). Agency...

  2. 38 CFR 21.7135 - Discontinuance dates.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...; Pub. L. 98-525) (c) Divorce. If the veteran becomes divorced, the effective date of reduction of his or her educational assistance is the last day of the month in which the divorce occurs. (Authority...) If the veteran or servicemember, for reasons other than being called or ordered to active duty...

  3. 75 FR 70929 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-19

    ... Project Data Calls for the Laboratory Response Network--Existing collection in use without an OMB Control... Response Network (LRN) was established by the Department of Health and Human Services, Centers for Disease... LRN's mission is to maintain an integrated national and international network of laboratories that can...

  4. Defense Threat Reduction Agency > Careers > Strategic Recruiting Programs

    Science.gov Websites

Scholarships and fellowships are offered to graduate science, mathematics and engineering students. For applicants with disabilities, please call (703) 767-4451. Workforce Recruitment Program for College Students with Disabilities.

  5. Consumption of honey, sucrose, and high fructose corn syrup produce similar metabolic effects in glucose tolerant and glucose intolerant individuals

    USDA-ARS?s Scientific Manuscript database

Background: Current public health recommendations call for reduction of added sugars; however, controversy exists over whether all nutritive sweeteners produce similar metabolic effects. Objective: To compare effects of chronic consumption of three nutritive sweeteners (honey, sucrose and high fructo...

  6. Waking Up to Waste

    ERIC Educational Resources Information Center

    Vrdlovcova, Jill

    2005-01-01

    All homes and schools produce waste. Children may have been astonished at how much people throw away, and this could be the "wake-up call" that arouses their interest. At Carymoor Environmental Centre (an Eco-Centre in South Somerset) getting children involved in active waste reduction and recycling is a priority. Carymoor tries to model…

  7. Honduran-U.S. Relations

    DTIC Science & Technology

    2010-02-01

    government resources to finance poverty-reduction programs. Nonetheless, Honduras continues to face a poverty rate of nearly 70%, in addition to widespread...d’état” in Honduras (H.Res. 630, Delahunt) and calling upon the Micheletti government to end its “illegal seizure of power”(H.Res. 620, Serrano...2 Micheletti Government

  8. Government Accountability Reports and Public Education Policy: Studying Political Actors' Decision-Making

    ERIC Educational Resources Information Center

    Salazar, Timothy Ross

    2013-01-01

    This study asks how government accountability reports are used to influence public education policy. Government accountability reports, called "audits" in Utah, prove to be useful tools for examining education policy. Using a collective case study design examining Utah's Class Size Reduction (CSR) policy, government accountability…

  9. 76 FR 18165 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-01

    ... from remote locations by calling into a central phone number. DATES: The ACEHR will hold a meeting via... phone number is (301) 975-5640. SUPPLEMENTARY INFORMATION: The Committee was established in accordance... Harman's e-mail address is [email protected] , and her phone number is 301-975-5324. Approximately...

  10. Management of occult stress urinary incontinence with prolapse surgery.

    PubMed

    Al-Mandeel, H; Al-Badr, A

    2013-08-01

Pelvic organ prolapse (POP) and stress urinary incontinence (SUI) are two common health-related conditions, each affecting up to 50% of women worldwide. Stress urinary incontinence that is observed only after reduction of co-existent prolapse is called occult SUI (OSUI), and is found in up to 80% of women with advanced POP. Although there is no consensus on how to diagnose OSUI, several methods have been reported to improve its detection. The importance of counseling symptomatically continent women with POP concerning the potential risk of developing SUI postoperatively cannot be overstated. Evidence suggests that a positive OSUI finding in symptomatically continent women who are planning to have POP repair is associated with a high risk of postoperative SUI (POSUI); furthermore, adding a continence procedure is found to reduce postoperative SUI. Therefore, adding continence surgery at the time of POP surgery in patients who are found to have OSUI preoperatively is advocated.

  11. On Cross-talk Correction of Images from Multiple-port CCDs

    NASA Astrophysics Data System (ADS)

    Freyhammer, L. M.; Andersen, M. I.; Arentoft, T.; Sterken, C.; Nørregaard, P.

Multi-channel CCD read-out, which is an option offered at most optical observatories, can significantly reduce the time spent on reading the detector. The penalty of using this option is the so-called amplifier cross-talk, which causes contamination across the output amplifiers, typically at the level of 1:10 000. This can be a serious problem for applications where high precision and/or high contrast is of importance. We present an analysis of amplifier cross-talk for two instruments - FORS1 at the ESO VLT telescope Antu (Paranal) and DFOSC at the Danish 1.54 m telescope (La Silla) - and a post-processing method for removing the imprint of cross-talk. It is found that cross-talk may significantly contaminate high-precision photometry in crowded fields, but it can be effectively eliminated during data reduction.
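    The post-processing idea can be illustrated with a minimal NumPy sketch, assuming a linear cross-talk model: each observed channel is the true channel plus tiny scaled imprints of the others. The coefficients, image sizes, and the "bright star" here are synthetic assumptions, not measurements from FORS1 or DFOSC; at the ~1:10 000 level a first-order subtraction of the scaled companion channels already removes the imprint almost entirely.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "true" images from a 2-amplifier CCD readout.
    true = rng.uniform(100.0, 200.0, size=(2, 64, 64))
    true[1, 30:34, 30:34] = 50000.0  # a bright source in channel 1

    # Cross-talk: each output channel picks up a ~1:10000 imprint
    # of the other channel (coefficients assumed for this demo).
    c = np.array([[0.0,    1.2e-4],
                  [0.8e-4, 0.0   ]])
    observed = true + np.einsum('ij,jxy->ixy', c, true)

    # First-order correction: subtract the scaled companion channels.
    # Exact inversion would apply inv(I + c); at the 1e-4 level the
    # first-order form leaves only an O(c**2) residual.
    corrected = observed - np.einsum('ij,jxy->ixy', c, observed)

    resid_before = np.max(np.abs(observed - true))
    resid_after = np.max(np.abs(corrected - true))
    print(resid_before, resid_after)
    ```

    In practice the coefficients would first be measured from calibration frames containing a bright source in one channel and blank sky in the others.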

  12. Simulation study on compressive laminar optical tomography for cardiac action potential propagation

    PubMed Central

    Harada, Takumi; Tomii, Naoki; Manago, Shota; Kobayashi, Etsuko; Sakuma, Ichiro

    2017-01-01

To measure the activity of tissue at the microscopic level, laminar optical tomography (LOT), a microscopic form of diffuse optical tomography, has been developed. However, obtaining sufficient recording speed to capture rapidly changing dynamic activity remains a major challenge. To achieve a high frame rate for the reconstructed data, we propose a new LOT method based on compressed sensing theory, called compressive laminar optical tomography (CLOT), in which novel digital-micromirror-device-based illumination and data reduction in a single reconstruction are applied. In simulation experiments, the reconstructed volumetric images of action potentials acquired from 5 measured images with random patterns resolved the wave border to a depth of at least 2.5 mm. Consequently, CLOT was shown to have the potential to reach the over 200 fps required for cardiac electrophysiological phenomena. PMID:28736675
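    The compressed sensing principle behind recovering a signal from far fewer measurements than unknowns can be sketched generically. This is not the authors' CLOT pipeline (which reconstructs volumetric optical data under structured illumination); it is a standard toy example, with assumed problem sizes, of recovering a sparse vector from random linear measurements via iterative soft-thresholding (ISTA).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # A k-sparse signal of length n, observed through only m < n
    # random linear measurements (the compressed sensing setting).
    n, m, k = 100, 40, 4
    x_true = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    x_true[support] = rng.choice([-1.0, 1.0], size=k) * rng.uniform(1.0, 2.0, size=k)

    A = rng.normal(size=(m, n)) / np.sqrt(m)
    y = A @ x_true

    # ISTA: gradient step on ||y - A x||^2 followed by soft-thresholding,
    # which promotes sparsity (solves the lasso problem).
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    lam = 0.05
    x = np.zeros(n)
    for _ in range(3000):
        z = x - A.T @ (A @ x - y) / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

    rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
    print(rel_err)
    ```

    The same logic scales to imaging: the 5 random-pattern measurements in the abstract play the role of the rows of `A`, and the reconstruction exploits sparsity of the underlying activity.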

  13. Quadratic String Method for Locating Instantons in Tunneling Splitting Calculations.

    PubMed

    Cvitaš, Marko T

    2018-03-13

The ring-polymer instanton (RPI) method is an efficient technique for calculating approximate tunneling splittings in high-dimensional molecular systems. In the RPI method, tunneling splitting is evaluated from the properties of the minimum action path (MAP) connecting the symmetric wells, whereby the extensive sampling of the full potential energy surface of the exact quantum-dynamics methods is avoided. Nevertheless, the search for the MAP is usually the most time-consuming step in the standard numerical procedures. Recently, nudged elastic band (NEB) and string methods, originally developed for locating minimum energy paths (MEPs), were adapted for the purpose of MAP finding with great efficiency gains [ J. Chem. Theory Comput. 2016 , 12 , 787 ]. In this work, we develop a new quadratic string method for locating instantons. The Euclidean action is minimized by propagating the initial guess (a path connecting two wells) over the quadratic potential energy surface approximated by means of updated Hessians. This allows the algorithm to take many minimization steps between the potential/gradient calls with further reductions in the computational effort, exploiting the smoothness of the potential energy surface. The approach is general, as it uses Cartesian coordinates, and widely applicable, with the computational effort of finding the instanton usually lower than that of determining the MEP. It can be combined with expensive potential energy surfaces or on-the-fly electronic-structure methods to explore a wide variety of molecular systems.
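    The string-method machinery the abstract builds on can be illustrated in its simplest zeroth-order form: evolve a discretized path downhill in the gradient and redistribute the images at equal arc length, with the endpoints pinned in the wells. This toy sketch finds the MEP of an assumed 2-D double well; the quadratic method of the paper replaces the plain gradient step with steps on a Hessian-based quadratic model, which is not reproduced here.

    ```python
    import numpy as np

    def V_grad(p):
        """Gradient of the model double well V(x, y) = (x**2 - 1)**2 + 2*y**2."""
        x, y = p[:, 0], p[:, 1]
        return np.stack([4.0 * x * (x**2 - 1.0), 4.0 * y], axis=1)

    def reparametrize(path):
        """Redistribute the images at equal arc length along the path."""
        seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
        s = np.concatenate([[0.0], np.cumsum(seg)])
        s_new = np.linspace(0.0, s[-1], len(path))
        return np.stack([np.interp(s_new, s, path[:, d]) for d in range(2)],
                        axis=1)

    # Initial guess: a bowed path between the wells at (-1, 0) and (1, 0).
    n_img = 21
    t = np.linspace(0.0, 1.0, n_img)
    path = np.stack([2.0 * t - 1.0, 0.8 * np.sin(np.pi * t)], axis=1)

    for _ in range(2000):
        step = 0.01 * V_grad(path)      # steepest-descent displacement
        step[0] = step[-1] = 0.0        # endpoints stay in the wells
        path = reparametrize(path - step)

    print(np.max(np.abs(path[:, 1])))   # path relaxes onto the y = 0 MEP
    ```

    For instanton searches the quantity minimized is the discretized Euclidean action rather than the potential along the path, but the update-then-reparametrize structure is the same.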

  14. Assets as a Socioeconomic Status Index: Categorical Principal Components Analysis vs. Latent Class Analysis.

    PubMed

    Sartipi, Majid; Nedjat, Saharnaz; Mansournia, Mohammad Ali; Baigi, Vali; Fotouhi, Akbar

    2016-11-01

Some variables, like socioeconomic status (SES), cannot be directly measured; instead, such so-called 'latent variables' are measured indirectly through tangible items. There are different methods for measuring latent variables, such as data reduction methods, e.g., Principal Components Analysis (PCA), and Latent Class Analysis (LCA). The purpose of our study was to measure an assets index - as a representative of SES - through two methods, Non-Linear PCA (NLPCA) and LCA, and to compare them in order to choose the most appropriate model. This was a cross-sectional study in which 1995 respondents in Tehran filled in questionnaires about their assets. The data were analyzed with SPSS 19 (CATPCA command) and SAS 9.2 (PROC LCA command) to estimate socioeconomic status. The results were compared based on the Intra-class Correlation Coefficient (ICC). The 6 classes derived from LCA based on BIC were highly consistent with the 6 classes from CATPCA (Categorical PCA) (ICC = 0.87, 95%CI: 0.86 - 0.88). There is no gold standard for measuring SES, so it is not possible to say definitively that one method is better than another. LCA is a complicated method that presents detailed information about latent variables and requires one assumption (local independence), while NLPCA is a simpler method that requires more assumptions. Generally, NLPCA seems to be an acceptable method of analysis because of its simplicity and high agreement with LCA.
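    The data-reduction side of this comparison can be sketched with plain NumPy: score households on the first principal component of their asset indicators, then cut the score into classes. This is an illustrative linear-PCA stand-in on synthetic data, not the study's CATPCA/PROC LCA pipeline; the latent-SES generative model, the eight "assets," and their difficulty ordering are all assumptions of the demo.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic households: a latent SES level drives asset ownership
    # through a logistic model; "harder" assets need higher SES.
    n = 1995                                 # same sample size as the study
    ses = rng.normal(size=n)
    difficulty = np.linspace(-1.5, 1.5, 8)
    p_own = 1.0 / (1.0 + np.exp(-(ses[:, None] - difficulty[None, :])))
    assets = (rng.uniform(size=(n, 8)) < p_own).astype(float)

    # First principal component of the standardized indicator matrix.
    Z = (assets - assets.mean(axis=0)) / assets.std(axis=0)
    _, _, vt = np.linalg.svd(Z, full_matrices=False)
    score = Z @ vt[0]
    if np.corrcoef(score, assets.sum(axis=1))[0, 1] < 0:
        score = -score                       # fix the arbitrary PC sign

    # Cut the score into 6 groups, mirroring the study's 6-class solutions.
    cuts = np.quantile(score, np.linspace(0.0, 1.0, 7)[1:-1])
    classes = np.digitize(score, cuts)
    print(np.corrcoef(score, ses)[0, 1])
    ```

    A latent class analysis would instead fit a finite mixture over the binary items; the study's point is that, on real data, the two routes sorted households into largely the same six groups.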

  15. Decoupling Identification for Serial Two-Link Two-Inertia System

    NASA Astrophysics Data System (ADS)

    Oaki, Junji; Adachi, Shuichi

The purpose of our study is to develop a precise model by applying the technique of system identification for the model-based control of a nonlinear robot arm, taking joint elasticity into consideration. We previously proposed a systematic identification method, called “decoupling identification,” for a “SCARA-type” planar two-link robot arm with elastic joints caused by the Harmonic-drive® reduction gears. The proposed method serves as an extension of conventional rigid-joint-model-based identification. The robot arm is treated as a serial two-link two-inertia system with nonlinearity. The decoupling identification method, which uses link-accelerometer signals, enables the serial two-link two-inertia system to be divided into two linear one-link two-inertia systems. MATLAB®'s commands for state-space model estimation are utilized in the proposed method. Physical parameters such as motor inertias, link inertias, joint-friction coefficients, and joint-spring coefficients are estimated from the identified one-link two-inertia systems using a gray-box approach. This paper evaluates the accuracy of the decoupling identification method on the two-link arm when closed-loop-controlled elements are introduced and the amplitude setup of the identification input is varied. Experimental results show that the identification method also works with closed-loop-controlled elements. Therefore, the identification method is applicable to a “PUMA-type” vertical robot arm under gravity.

  16. Semisupervised kernel marginal Fisher analysis for face recognition.

    PubMed

    Wang, Ziqiang; Sun, Xia; Sun, Lijun; Huang, Yuchun

    2013-01-01

Dimensionality reduction is a key problem in face recognition due to the high dimensionality of face images. To effectively cope with this problem, a novel dimensionality reduction algorithm called semisupervised kernel marginal Fisher analysis (SKMFA) for face recognition is proposed in this paper. SKMFA can make use of both labeled and unlabeled samples to learn the projection matrix for nonlinear dimensionality reduction. Meanwhile, it successfully avoids the singularity problem by not calculating the matrix inverse. In addition, to make the nonlinear structure captured by the data-dependent kernel consistent with the intrinsic manifold structure, a manifold-adaptive nonparametric kernel is incorporated into the learning process of SKMFA. Experimental results on three face image databases demonstrate the effectiveness of the proposed algorithm.
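    The core semisupervised idea, using labeled samples for class separation and unlabeled samples for a manifold regularizer, can be sketched in a drastically simplified linear form. This is not SKMFA itself (no kernel, no marginal-Fisher graphs): it is a Fisher-style projection where a k-NN graph Laplacian over all samples keeps neighboring points close after projection. All data sizes and the regularization weights are assumptions of the demo.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Two Gaussian classes in 10-D; only a few samples carry labels.
    n_lab, n_unlab, d = 20, 80, 10
    X_lab = np.vstack([rng.normal(0.0, 1.0, (n_lab // 2, d)) + 3.0,
                       rng.normal(0.0, 1.0, (n_lab // 2, d)) - 3.0])
    y_lab = np.array([0] * (n_lab // 2) + [1] * (n_lab // 2))
    X_unlab = np.vstack([rng.normal(0.0, 1.0, (n_unlab // 2, d)) + 3.0,
                         rng.normal(0.0, 1.0, (n_unlab // 2, d)) - 3.0])
    X = np.vstack([X_lab, X_unlab])

    # Labeled part: between-class and within-class scatter (Fisher).
    mu = X_lab.mean(axis=0)
    Sb = np.zeros((d, d)); Sw = np.zeros((d, d))
    for c in (0, 1):
        Xc = X_lab[y_lab == c]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mu, mc - mu)
        Sw += (Xc - mc).T @ (Xc - mc)

    # Unlabeled part: 5-NN graph Laplacian over ALL samples acts as the
    # manifold regularizer on the projection.
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros_like(D2)
    for i in range(len(X)):
        for j in np.argsort(D2[i])[1:6]:
            W[i, j] = W[j, i] = 1.0
    L = np.diag(W.sum(1)) - W

    # Maximize w'Sb w / w'(Sw + mu_reg X'LX + eps I) w; the ridge term
    # eps*I sidesteps the singularity of Sw for few labeled samples.
    mu_reg, eps = 0.1, 1e-6
    M = np.linalg.solve(Sw + mu_reg * X.T @ L @ X + eps * np.eye(d), Sb)
    vals, vecs = np.linalg.eig(M)
    w = np.real(vecs[:, np.argmax(np.real(vals))])

    proj = X_lab @ w
    print(abs(proj[y_lab == 0].mean() - proj[y_lab == 1].mean()))
    ```

    SKMFA additionally works in a kernel-induced feature space with separate intraclass and marginal graphs, and reformulates the problem to avoid the explicit matrix inverse taken here.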

  17. Prejudice reduction: what works? A review and assessment of research and practice.

    PubMed

    Paluck, Elizabeth Levy; Green, Donald P

    2009-01-01

    This article reviews the observational, laboratory, and field experimental literatures on interventions for reducing prejudice. Our review places special emphasis on assessing the methodological rigor of existing research, calling attention to problems of design and measurement that threaten both internal and external validity. Of the hundreds of studies we examine, a small fraction speak convincingly to the questions of whether, why, and under what conditions a given type of intervention works. We conclude that the causal effects of many widespread prejudice-reduction interventions, such as workplace diversity training and media campaigns, remain unknown. Although some intergroup contact and cooperation interventions appear promising, a much more rigorous and broad-ranging empirical assessment of prejudice-reduction strategies is needed to determine what works.

  18. Silicene catalyzed reduction of nitrobenzene to aniline: A mechanistic study

    NASA Astrophysics Data System (ADS)

    Morrissey, Christopher; He, Haiying

    2018-03-01

The reduction of nitrobenzene to aniline has broad applications in the chemical and pharmaceutical industries. The high reaction temperatures and pressures and the unavoidable hazardous chemicals of current metal catalysts call for more environmentally friendly non-metal catalysts. In this study, the plausibility of silicene as a potential catalyst for nitrobenzene reduction is investigated, with a focus on its distinct reaction mechanism, based on density functional theory. The direct reaction pathway, PhNO2∗ → PhNO∗ → PhNHO∗ → PhNH2O∗ → PhNH2∗, was shown to be distinctly different from the Haber mechanism. The hydroxyl groups remain bound to silicene after aniline is formed and require a high activation barrier to remove.

  19. Genome-wide single-nucleotide polymorphism arrays demonstrate high fidelity of multiple displacement-based whole-genome amplification.

    PubMed

    Tzvetkov, Mladen V; Becker, Christian; Kulle, Bettina; Nürnberg, Peter; Brockmöller, Jürgen; Wojnowski, Leszek

    2005-02-01

    Whole-genome DNA amplification by multiple displacement (MD-WGA) is a promising tool to obtain sufficient DNA amounts from samples of limited quantity. Using Affymetrix' GeneChip Human Mapping 10K Arrays, we investigated the accuracy and allele amplification bias in DNA samples subjected to MD-WGA. We observed an excellent concordance (99.95%) between single-nucleotide polymorphisms (SNPs) called both in the nonamplified and the corresponding amplified DNA. This concordance was only 0.01% lower than the intra-assay reproducibility of the genotyping technique used. However, MD-WGA failed to amplify an estimated 7% of polymorphic loci. Due to the algorithm used to call genotypes, this was detected only for heterozygous loci. We achieved a 4.3-fold reduction of noncalled SNPs by combining the results from two independent MD-WGA reactions. This indicated that inter-reaction variations rather than specific chromosomal loci reduced the efficiency of MD-WGA. Consistently, we detected no regions of reduced amplification, with the exception of several SNPs located near chromosomal ends. Altogether, despite a substantial loss of polymorphic sites, MD-WGA appears to be the current method of choice to amplify genomic DNA for array-based SNP analyses. The number of nonamplified loci can be substantially reduced by amplifying each DNA sample in duplicate.

  20. Conduits to care: call lights and patients’ perceptions of communication

    PubMed Central

    Montie, Mary; Shuman, Clayton; Galinato, Jose; Patak, Lance; Anderson, Christine A; Titler, Marita G

    2017-01-01

Background: Call light systems remain the primary means for hospitalized patients to initiate communication with their health care providers. Although there is a vast amount of literature discussing patient communication with health care providers, few studies have explored patients’ perceptions concerning call light use and communication. The specific aim of this study was to solicit patients’ perceptions regarding their call light use and communication with nursing staff. Methods: Patients invited to this study met the following inclusion criteria: proficient in English, hospitalized for at least 24 hours, aged ≥21 years, and able to communicate verbally (eg, not intubated). Thirty participants provided written informed consent, were enrolled in the study, and completed interviews. Results: Using qualitative descriptive methods, five major themes emerged from patients’ perceptions (namely: establishing connectivity, participant safety concerns, no separation: health care and the call light device, issues with the current call light, and participants’ perceptions of “nurse work”). Multiple minor themes supported these major themes. Data analysis utilized the constant comparative methods of Glaser and Strauss. Discussion: Findings from this study extend the knowledge of patients’ understanding not only of why inconsistencies occur between the call light and their nurses, but also of why the call light is more than merely a device to initiate communication; rather, it is a direct conduit to their health care and its delivery. PMID:29075125

  1. A strategy for improved computational efficiency of the method of anchored distributions

    NASA Astrophysics Data System (ADS)

    Over, Matthew William; Yang, Yarong; Chen, Xingyuan; Rubin, Yoram

    2013-06-01

This paper proposes a strategy for improving the computational efficiency of model inversion using the method of anchored distributions (MAD) by "bundling" similar model parametrizations in the likelihood function. Inferring the likelihood function typically requires a large number of forward model (FM) simulations for each possible model parametrization; as a result, the process is quite expensive. To ease this prohibitive cost, we present an approximation for the likelihood function, called bundling, that relaxes the requirement for high quantities of FM simulations. This approximation redefines the conditional statement of the likelihood function as the probability that a set of similar model parametrizations (a "bundle") replicates field measurements, which we show is neither a model reduction nor a sampling approach to improving the computational efficiency of model inversion. To evaluate the effectiveness of these modifications, we compare the quality of predictions and computational cost of bundling relative to a baseline MAD inversion of 3-D flow and transport model parameters. Additionally, to aid understanding of the implementation, we provide a tutorial for bundling in the form of a sample data set and script for the R statistical computing language. For our synthetic experiment, bundling achieved a 35% reduction in overall computational cost and had a limited negative impact on predicted probability distributions of the model parameters. Strategies for minimizing error in the bundling approximation, for enforcing similarity among the sets of model parametrizations, and for identifying convergence of the likelihood function are also presented.
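    The cost-saving mechanism of bundling can be sketched in miniature: group similar parametrizations, run the expensive forward model once per bundle representative, and evaluate every draw's likelihood from its bundle's prediction. This is a toy NumPy illustration, not the paper's MAD code or its R tutorial; the one-parameter `forward_model`, the binning rule used as the similarity criterion, and the noise level are all assumptions of the demo.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def forward_model(theta):
        """Hypothetical, expensive forward model: predicts one measurement."""
        return np.sin(theta) + 0.1 * theta ** 2

    y_obs, sigma = 0.9, 0.05            # field measurement and noise level

    # Prior draws of the model parametrization.
    draws = rng.uniform(0.0, 3.0, size=2000)

    # Bundle similar parametrizations: here, simple binning of theta.
    n_bundles = 40
    edges = np.linspace(0.0, 3.0, n_bundles + 1)
    bundle_of = np.clip(np.digitize(draws, edges) - 1, 0, n_bundles - 1)
    reps = 0.5 * (edges[:-1] + edges[1:])   # one representative per bundle

    # One FM call per bundle instead of one per draw (40 vs 2000).
    fm_calls = 0
    pred = np.empty(n_bundles)
    for b in range(n_bundles):
        pred[b] = forward_model(reps[b])
        fm_calls += 1

    # Approximate likelihood of each draw from its bundle's prediction.
    lik = np.exp(-0.5 * ((y_obs - pred[bundle_of]) / sigma) ** 2)
    print(fm_calls, "FM calls for", len(draws), "draws")
    ```

    The approximation error is governed by how much the forward model varies within a bundle, which is why the paper emphasizes strategies for enforcing similarity among bundled parametrizations.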

  2. van Manen's method and reduction in a phenomenological hermeneutic study.

    PubMed

    Heinonen, Kristiina

    2015-03-01

To describe van Manen's method and concept of reduction in a study that used a phenomenological hermeneutic approach. Nurse researchers have used van Manen's method in different ways: participants' lifeworlds are described in depth, but descriptions of reduction have been brief. A literature and knowledge review and a manual search of research articles were conducted, using the databases Web of Science, PubMed, CINAHL and PsycINFO without applying a time period, to identify uses of van Manen's method. This paper shows how van Manen's method has been used in nursing research and gives some examples of van Manen's reduction. Reduction enables us to conduct in-depth phenomenological hermeneutic research and understand people's lifeworlds. As there are many variations in adapting reduction, it is complex and confusing. This paper contributes to the discussion of phenomenology, hermeneutic study and reduction, and opens up reduction as a method for researchers to exploit.

  3. Lessons learned from recruiting socioeconomically disadvantaged smokers into a pilot randomized controlled trial to explore the role of Exercise Assisted Reduction then Stop (EARS) smoking.

    PubMed

    Thompson, Tom P; Greaves, Colin J; Ayres, Richard; Aveyard, Paul; Warren, Fiona C; Byng, Richard; Taylor, Rod S; Campbell, John L; Ussher, Michael; Michie, Susan; West, Robert; Taylor, Adrian H

    2015-02-12

Research is needed on what influences recruitment to smoking reduction trials, and how to increase their reach. The present study aimed to i) assess the feasibility of recruiting a disadvantaged population, ii) examine the effects of recruitment methods on participant characteristics, iii) identify resource requirements for different recruitment methods, and iv) qualitatively assess the acceptability of recruitment. This was done as part of a pilot two-arm trial of the effectiveness of a novel behavioral support intervention focused on increasing physical activity and reducing smoking among disadvantaged smokers not wishing to quit. Smokers were recruited through mailed invitations from three primary care practices (62 participants) and one National Health Stop Smoking Service (SSS) database (31 participants). Six other participants were recruited via a variety of other community-based approaches. Data were collected through questionnaires, field notes, work sampling, and databases. Chi-squared and t-tests were used to compare baseline characteristics of participants. We randomized between 5.1 and 11.1% of those invited through primary care and SSS, with the associated researcher time to recruit one participant varying from 18 to 157 minutes depending on the time and intensity invested. Only six participants were recruited through a wide variety of other community-based approaches, with an associated researcher time of 469 minutes to recruit one participant. Targets for recruiting a disadvantaged population were met, with 91% of the sample in social classes C2 to E (NRS social grades, UK), and 41% indicating mental health problems. Those recruited from SSS were more likely to respond to an initial letter, had used cessation aids before, and had attempted to quit in the past year. Overall, initial responders were more likely to be physically active than those who were recruited via follow-up telephone calls. 
No other demographic or behavioral characteristics were associated with recruitment approach or intensity of effort. Qualitative feedback indicated that participants had been attracted by the prospect of support that focused on smoking reduction rather than abrupt quitting. Mailed invitations, and follow-up, from health professionals were an effective method of recruiting disadvantaged smokers into a trial of an exercise intervention to aid smoking reduction. Recruitment via community outreach approaches was largely ineffective. ISRCTN identifier: 13837944, registered on 6 July 2010.

  4. Comparing minimally supervised home-based and closely supervised gym-based exercise programs in weight reduction and insulin resistance after bariatric surgery: A randomized clinical trial.

    PubMed

    Kaviani, Sara; Dadgostar, Haleh; Mazaherinezhad, Ali; Adib, Hanie; Solaymani-Dodaran, Masoud; Soheilipour, Fahimeh; Hakiminezhad, Mahdi

    2017-01-01

    Background: Effectiveness of various exercise protocols in weight reduction after bariatric surgery has not been sufficiently explored in the literature. Thus, in the present study, we aimed at comparing the effect of minimally supervised home-based and closely supervised gym-based exercise programs on weight reduction and insulin resistance after bariatric surgery. Methods: Females undergoing gastric bypass surgery were invited to participate in an exercise program and were randomly allocated into 2 groups using a random number generator in Excel. They were either offered a minimally supervised home-based (MSHB) or closely supervised gym-based (CSGB) exercise program. The CSGB protocol constitutes 2 weekly training sessions under ACSM guidelines. In the MSHB protocol, the participants received a notebook containing a list of recommended aerobic and resistance exercises, a log to record their activity, and a schedule of follow-up phone calls and clinic visits. Both groups received a pedometer. We measured their weight, BMI, lipid profile, FBS, and insulin level at baseline and at 20 weeks after the exercises, the results of which were compared using t test or Mann-Whitney U test at the end of the study. All the processes were observed by 1 senior resident in sport medicine. Results: A total of 80 patients were recruited who were all able to complete our study (MSHB= 38 and CSGB= 42). The baseline comparison revealed that the 2 groups were similar. The mean change (reduction) in BMI was slightly better in CSGB (8.61 95% CI 7.76-9.45) compared with the MSHB (5.18 95% CI 3.91-6.46); p< 0.01. However, the 2 groups did not have a statistically significant difference in the amount of change in the other factors including FBS and Homa.ir. 
Conclusion: Consistent with our expectation of non-inferiority, the results showed that the MSHB and CSGB exercise methods were roughly equally effective in improving lipid profile and insulin resistance, with a slightly better effect on BMI in the CSGB group. Given the considerably lower cost of minimally supervised home-based exercise programs, both methods should be considered when adequate funding is lacking.
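The between-group comparison described in this abstract (a t test for normally distributed outcomes, a Mann-Whitney U test otherwise) can be sketched in pure Python. This is a generic rank-sum implementation of the U statistic for illustration, not the trial's analysis code, and the sample data in the test are invented.

```python
def ranks(values):
    """Assign 1-based ranks to values, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for two independent samples (no p-value)."""
    r = ranks(list(a) + list(b))
    r1 = sum(r[:len(a)])                       # rank sum of sample a
    u1 = r1 - len(a) * (len(a) + 1) / 2
    u2 = len(a) * len(b) - u1
    return min(u1, u2)
```

In practice one would obtain the p-value from the exact U distribution or a normal approximation (e.g., `scipy.stats.mannwhitneyu`); the sketch stops at the statistic itself.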

  5. Reductive capacity measurement of waste forms for secondary radioactive wastes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Um, Wooyong; Yang, Jung-Seok; Serne, R. Jeffrey

    2015-12-01

    The reductive capacities of dry ingredients and final solid waste forms were measured using both the Cr(VI) and Ce(IV) methods, and the results were compared. Blast furnace slag (BFS), sodium sulfide, SnF2, and SnCl2, used as dry ingredients to make various waste forms, showed significantly higher reductive capacities than the other ingredients regardless of which method was used. Although BFS exhibits appreciable reductive capacity, it requires more time to react fully. In almost all cases, the Ce(IV) method yielded larger reductive capacity values than the Cr(VI) method and can be used as an upper bound for the reductive capacity of the dry ingredients and waste forms, because the Ce(IV) method subjects the solids to a strongly acidic (low pH) condition that dissolves much more of the solids. Because the Cr(VI) method relies on a neutral pH condition, it can be used to estimate primarily the surface-related and readily dissolvable reductive capacity of the waste form. However, the Cr(VI) method does not measure the total reductive capacity of the waste form, the long-term reductive capacity afforded by very slowly dissolving solids, or the reductive capacity present in the interior pores and internal locations of the solids.

  6. Lunar Regolith Simulant Feed System for a Hydrogen Reduction Reactor System

    NASA Technical Reports Server (NTRS)

    Mueller, R. P.; Townsend, Ivan I., III

    2009-01-01

    One of the goals of In-Situ Resource Utilization (ISRU) on the moon is to produce oxygen from the lunar regolith, which is present in the form of ilmenite (FeTiO3) and other compounds. A reliable and attainable method of extracting some of the oxygen from the lunar regolith is to use the hydrogen reduction process in a hot reactor to create water vapor, which is then condensed and electrolyzed to obtain oxygen for use as a consumable. One challenge for a production system is to reliably acquire the regolith with an excavator hauler mobility platform and then introduce it into the reactor inlet tube, which is raised from the surface and above the reactor itself. After the reaction, the hot regolith (~1000 °C) must be expelled from the reactor for disposal by the excavator hauler mobility system. In addition, the reactor regolith inlet and outlet tubes must be sealed by valves during the reaction in order to allow collection of the water vapor by the chemical processing sub-system. These valves must be able to handle abrasive regolith passing through them as well as heat conduction from the hot reactor. In 2008, NASA designed and field-tested a hydrogen reduction system called ROxygen to demonstrate the feasibility of extracting oxygen from lunar regolith. The field test was performed with volcanic ash (tephra) on Mauna Kea volcano on the Big Island of Hawai'i. The tephra has properties similar to lunar regolith, so it is regarded as a good simulant for the hydrogen reduction process. This paper discusses the design, fabrication, operation, test results, and lessons learned with the ROxygen regolith feed system as tested on Mauna Kea in November 2008.

  7. A versatile breast reduction technique: Conical plicated central U shaped (COPCUs) mammaplasty

    PubMed Central

    Copcu, Eray

    2009-01-01

    Background There have been numerous studies on reduction mammaplasty and its modifications in the literature. The multitude of modifications of reduction mammaplasty indicates that the ideal technique has yet to be found. There are four reasons for seeking the ideal technique: to preserve the functional features of the breast (breastfeeding and arousal); to achieve the true geometric and aesthetic shape of the breast with the least scarring; to minimize the complications of prior surgical techniques without causing additional ones; and to overcome the limitations of previously described techniques. To these ends, we developed a new versatile reduction mammaplasty technique, which we called conical plicated central U shaped (COPCUs) mammaplasty. Methods We performed central plication to achieve a juvenile look in the superior pole of the breast and to prevent postoperative pseudoptosis, and used a central U shaped flap to achieve maximum NAC safety and to preserve lactation and nipple sensation. The central U flap was 6 cm in width, and the superior conical plication was performed with 2/0 PDS. Preoperative and postoperative standard measures of the breast, including superior pole fullness, were compared. Results Forty-six patients were operated on with the above-mentioned technique. All of the patients were satisfied with the functional and aesthetic results, and none had major complications. There were no changes in nipple innervation. Six patients who became pregnant after surgery did not experience any problems with lactation. None of the patients required scar revision. Conclusion Our technique is a versatile, safe, and reliable technique which creates the least scarring, avoids previously described disadvantages, provides maximum preservation of function, and can be employed in all breasts regardless of size. PMID:19575809

  8. Environmental Cues and Attempts to Change in Daily Cannabis Users: An Intensive Longitudinal Study

    PubMed Central

    Hughes, John R.; Naud, Shelly; Budney, Alan J.; Fingar, James R.; Callas, Peter W.

    2016-01-01

    Introduction We tested whether environmental cues prompt or inhibit quit or reduction attempts among heavy cannabis users. Methods We recruited 196 daily cannabis users who intended to stop or reduce at some point in the next 3 months. Users called an Interactive Voice Response system daily over 3 months to report on cues that might prompt an attempt to quit or reduce (e.g., a request to stop), cues that might inhibit a quit/reduction attempt (e.g., someone offering cannabis), cannabis use, and attempts to stop or reduce cannabis. No treatment was provided. Results Our major findings were that a) cost and health/psychological problems were the most common prompting cues, and seeing others use and being offered cannabis were the most common inhibiting cues; b) the number of different types of prompting cues prospectively predicted an increase in attempts to change in a dose-related manner; c) more proximal cues appeared to be more strongly related to change; d) requests to stop or reduce, and physical or psychological problems from cannabis, best predicted change attempts; and e) inhibiting cues did not consistently predict the probability of an attempt to change. Conclusion These preliminary results suggest that several environmental cues prompt attempts to change cannabis use. Thus, interventions that increase the frequency of these cues, particularly requests to stop or reduce cannabis use, and that reinforce concerns about adverse physical and mental health effects of cannabis may increase cannabis reduction or cessation. PMID:26872879

  9. Effectiveness of contact-based education for reducing mental illness-related stigma in pharmacy students.

    PubMed

    Patten, Scott B; Remillard, Alfred; Phillips, Leslie; Modgill, Geeta; Szeto, Andrew Ch; Kassam, Aliya; Gardner, David M

    2012-12-05

    A strategy for reducing mental illness-related stigma in health-profession students is to include contact-based sessions in their educational curricula. In such sessions, students are able to interact socially with a person who has a mental illness. We sought to evaluate the effectiveness of this strategy in a multi-centre study of pharmacy students. The study was a randomized controlled trial conducted at three sites. Because it was necessary that all students receive the contact-based sessions, the students were randomized to either an early or a late intervention, with the late intervention group not having participated in the contact-based education at the time when the primary outcome was assessed. The primary outcome, stigma, was assessed using an attitudes scale called the Opening Minds Survey for Health Care Providers (OMS-HC). We initially confirmed that outcomes were homogeneous across study centres (centre-by-group interaction, p = 0.76), so the results were pooled across the three study centres. A significant reduction in stigma was observed in association with the contact-based sessions (mean change 4.3 versus 1.5, t = 2.1, p = 0.04). The effect size (Cohen's d) was 0.45. A similar reduction was seen in the control group when they later received the intervention. Contact-based education is an effective method of reducing stigma during pharmacy education. These results add to a growing literature confirming the effectiveness of contact-based strategies for stigma reduction in health-profession trainees.
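The effect size reported in this abstract (Cohen's d = 0.45) is a standardized mean difference. A minimal sketch of the standard pooled-SD formulation follows; the data in the test are invented for illustration and are not the study's measurements.

```python
from math import sqrt
from statistics import mean, variance  # variance() is the sample variance

def cohens_d(x, y):
    """Cohen's d for two independent samples, using the pooled
    sample standard deviation as the denominator."""
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    return (mean(x) - mean(y)) / sqrt(pooled_var)
```

By the usual rough convention, d near 0.2 is a small effect, 0.5 medium, and 0.8 large, which places the reported 0.45 in the small-to-medium range.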

  10. Damage detection of building structures under ambient excitation through the analysis of the relationship between the modal participation ratio and story stiffness

    NASA Astrophysics Data System (ADS)

    Park, Hyo Seon; Oh, Byung Kwan

    2018-03-01

    This paper presents a new approach for the damage detection of building structures under ambient excitation based on the inherent modal characteristics. In this study, without the extraction of modal parameters widely utilized in the previous studies on damage detection, a new index called the modal participation ratio (MPR), which is a representative value of the modal response extracted from dynamic responses measured in ambient vibration tests, is proposed to evaluate the change of the system of a structure according to the reduction of the story stiffness. The relationship between the MPR, representing a modal contribution for a specific mode and degree of freedom in buildings, and the story stiffness damage factor (SSDF), representing the extent of reduction in the story stiffness, is analyzed in various damage scenarios. From the analyses with three examples, several rules for the damage localization of building structures are found based on the characteristics of the MPR variation for the first mode subject to change in the SSDF. In addition, a damage severity function, derived from the relationship between the MPR for the first mode in the lowest story and the SSDF, is constructed to identify the severity of story stiffness reduction. Furthermore, the locations and severities of multiple damages are identified via the superposition of the presented damage severity functions. The presented method was applied to detect damage in a three-dimensional reinforced concrete (RC) structure.

  11. Numerical investigation of tube hydroforming of TWT using the Corner Fill Test

    NASA Astrophysics Data System (ADS)

    Zribi, Temim; Khalfallah, Ali

    2018-05-01

    Tube hydroforming presents a very good alternative to conventional forming processes for obtaining good-quality mechanical parts used in several industrial fields, such as the automotive and aerospace sectors. Research in the field of tube hydroforming aims at improving the formability, stiffness, and weight reduction of parts manufactured by this process. In recent years, a new hydroforming method has appeared, which consists of deforming parts made from welded tubes of different thicknesses. This technique, which contributes to the weight reduction of the hydroformed tubes, is a good alternative to conventional tube hydroforming and makes it possible to build rigid, light structures at reduced cost. However, it is possible to improve the weight reduction further by using dissimilar tailor welded tubes (TWT). This paper is a first attempt to analyze, by numerical simulation, the behavior of TWT hydroformed in square cross-section dies, a configuration commonly called the Corner Fill Test. The tubes considered are composed of two materials assembled by butt welding. The present analysis focuses on the effect of loading paths on the formability of the structure by determining the change in thickness in several sections of the part. A comparison between the results obtained by hydroforming butt-welded tubes made of dissimilar materials and those obtained using a single-material tube is carried out. Numerical calculations show that the bi-material welded tube has better thinning resistance and a more even thickness distribution in the circumferential direction than the single-material tube.

  12. Self-desiccation mechanism of high-performance concrete.

    PubMed

    Yang, Quan-Bing; Zhang, Shu-Qing

    2004-12-01

    Investigations on the effects of W/C ratio and silica fume on the autogenous shrinkage and internal relative humidity of high-performance concrete (HPC), together with analysis of its self-desiccation mechanisms, showed that as the W/C ratio is reduced, the autogenous shrinkage of HPC increases and its internal relative humidity decreases; both effects are amplified by the addition of silica fume. Theoretical analyses indicated that the reduction of RH in HPC was due not to a shortage of water but to the fact that the evaporable water in HPC cannot evaporate freely. The reduction of internal relative humidity, or so-called self-desiccation, of HPC was chiefly caused by the increase in the molar concentration of soluble ions in HPC and by the reduction of pore size, i.e., the increase in the fraction of micro-pore water in the total evaporable water (the T(r)/T(te) ratio).
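The link the abstract draws between pore-size reduction and lower internal RH is conventionally described by the Kelvin equation, which relates the equilibrium RH over a curved water meniscus to the meniscus radius. The abstract does not give this formula, so the sketch below is a generic illustration with textbook constants for water near room temperature, not the authors' model.

```python
from math import exp

def kelvin_rh(r_pore, gamma=0.072, Vm=1.8e-5, R=8.314, T=293.15):
    """Equilibrium relative humidity (as a fraction) over a water
    meniscus of radius r_pore (m), via the Kelvin equation
    RH = exp(-2 * gamma * Vm / (r_pore * R * T)).
    gamma: surface tension of water (N/m); Vm: molar volume (m^3/mol);
    R: gas constant (J/(mol*K)); T: temperature (K)."""
    return exp(-2 * gamma * Vm / (r_pore * R * T))
```

The monotonic behavior matches the mechanism described: a 10 nm meniscus radius already pulls the equilibrium RH to roughly 90%, and finer pores depress it further.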

  13. Biological relevance of CNV calling methods using familial relatedness including monozygotic twins.

    PubMed

    Castellani, Christina A; Melka, Melkaye G; Wishart, Andrea E; Locke, M Elizabeth O; Awamleh, Zain; O'Reilly, Richard L; Singh, Shiva M

    2014-04-21

    Studies involving the analysis of structural variation including Copy Number Variation (CNV) have recently exploded in the literature. Furthermore, CNVs have been associated with a number of complex diseases and neurodevelopmental disorders. Common methods for CNV detection use SNP, CNV, or CGH arrays, where the signal intensities of consecutive probes are used to define the number of copies associated with a given genomic region. These practices pose a number of challenges that interfere with the ability of available methods to accurately call CNVs. It has, therefore, become necessary to develop experimental protocols to test the reliability of CNV calling methods from microarray data so that researchers can properly discriminate biologically relevant data from noise. We have developed a workflow for the integration of data from multiple CNV calling algorithms using the same array results. It uses four CNV calling programs: PennCNV (PC), Affymetrix® Genotyping Console™ (AGC), Partek® Genomics Suite™ (PGS) and Golden Helix SVS™ (GH) to analyze CEL files from the Affymetrix® Human SNP 6.0 Array™. To assess the relative suitability of each program, we used individuals of known genetic relationships. We found significant differences in CNV calls obtained by different CNV calling programs. Although the programs showed variable patterns of CNVs in the same individuals, their distribution in individuals of different degrees of genetic relatedness has allowed us to offer two suggestions. The first involves the use of multiple algorithms for the detection of the largest possible number of CNVs, and the second suggests the use of PennCNV over all other methods when the use of only one software program is desirable.
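The workflow's core idea, accepting a CNV only when more than one calling program reports an overlapping interval, can be sketched as follows. The interval representation and the 50% reciprocal-overlap threshold are assumptions for illustration, not details taken from the paper, and the intervals in the test are invented.

```python
def overlap_frac(a, b):
    """Reciprocal overlap between two intervals given as (start, end):
    the overlap length divided by the LONGER relative extent, i.e. the
    minimum of the two per-interval overlap fractions."""
    inter = min(a[1], b[1]) - max(a[0], b[0])
    if inter <= 0:
        return 0.0
    return min(inter / (a[1] - a[0]), inter / (b[1] - b[0]))

def consensus_calls(callsets, min_support=2, min_overlap=0.5):
    """Keep calls from the first callset that are supported by at least
    min_support callsets (including the first itself) at min_overlap
    reciprocal overlap. Each callset is a list of (start, end) tuples."""
    kept = []
    for call in callsets[0]:
        support = sum(
            any(overlap_frac(call, other) >= min_overlap for other in cs)
            for cs in callsets
        )
        if support >= min_support:
            kept.append(call)
    return kept
```

With per-program callsets (e.g., from PennCNV and the other tools named above) loaded as interval lists, the same filter generalizes to "called by at least k of n programs" by raising `min_support`.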

  14. Effects of implementation of an urgent surgical care service on subspecialty general surgery training

    PubMed Central

    Wood, Leanne; Buczkowski, Andrzej; Panton, Ormond M.N.; Sidhu, Ravi S.; Hameed, S. Morad

    2010-01-01

    Background In July 2007, a large Canadian teaching hospital realigned its general surgery services into elective general surgery subspecialty-based services (SUBS) and a new urgent surgical care (USC) service (also known in the literature as an acute care surgery service). The residents on SUBS had their number of on-call days reduced to enable them to focus on activities related to SUBS. Our aim was to examine the effect of the creation of the USC service on the educational experiences of SUBS residents. Methods We enrolled residents who were on SUBS for the 6 months before and after the introduction of the USC service. We collected data by use of a survey and WEB eVAL, and we recorded attendance at academic half days. Our 2 primary outcomes were residents’ attendance at ambulatory clinics and compliance with the reduction in the number of on-call days. Our secondary outcomes included residents’ time for independent study, attendance at academic half days, operative experience, attendance at multidisciplinary rounds and overall satisfaction with SUBS. Results Residents on SUBS had a decrease in the mean number of on-call days per resident per month from 6.28 to 1.84 (p = 0.006); increases in mean attendance at academic half days from 65% to 87% (p = 0.028), at multidisciplinary rounds (p = 0.002) and at ambulatory clinics; and an increase in independent reading time (p = 0.015); and they reported an improvement in their work environment. There was no change in the amount of time residents spent in the operating room or in their overall satisfaction with SUBS. Conclusion Residents’ education in the SUBS structure was positively affected by the creation of a USC service. Compliance with the readjustment of on-call duties was high and was identified as the single most significant factor in enabling residents to take full advantage of the unique educational opportunities available only while on SUBS. PMID:20334744

  15. Chronic Disease Management for Tobacco Dependence

    PubMed Central

    Joseph, Anne M.; Fu, Steven S.; Lindgren, Bruce; Rothman, Alexander J.; Kodl, Molly; Lando, Harry; Doyle, Brandon; Hatsukami, Dorothy

    2014-01-01

    Background Tobacco dependence disorder is a chronic relapsing condition, yet treatment is delivered in discrete episodes of care that yield disappointing long-term quit rates. Methods We conducted a randomized controlled trial from June 1, 2004, through May 31, 2009, to compare telephone-based chronic disease management (1 year; longitudinal care [LC]) with evidence-based treatment (8 weeks; usual care [UC]) for tobacco dependence. A total of 443 smokers each received 5 telephone counseling calls and nicotine replacement therapy by mail for 4 weeks. They were then randomized to UC (2 additional calls) or LC (continued counseling and nicotine replacement therapy for an additional 48 weeks). Longitudinal care targeted repeat quit attempts and interim smoking reduction for relapsers. The primary outcome was 6 months of prolonged abstinence measured at 18 months of follow-up. Results At 18 months, 30.2% of LC participants reported 6 months of abstinence from smoking, compared with 23.5% in UC (unadjusted, P=.13). Multivariate analysis showed that LC (adjusted odds ratio, 1.74; 95% CI, 1.08–2.80), quit attempts in the past year (1.75; 1.06–2.89), baseline cigarettes per day (0.95; 0.92–0.99), and smoking in the 14- to 21-day interval post-quit (0.23; 0.14–0.38) predicted prolonged abstinence at 18 months. The LC participants who did not quit reduced smoking more than UC participants (significant only at 12 months). The LC participants received more counseling calls than UC participants (mean, 16.5 vs 5.8 calls; P<.001), longer total duration of counseling (283 vs 117 minutes; P<.001), and more nicotine replacement therapy (4.7 vs 2.4 boxes of patches; P<.001). Conclusion A chronic disease management approach increases both short- and long-term abstinence from smoking. Trial Registration clinicaltrials.gov Identifier: NCT00309296 PMID:22123795

  16. SIRE: a MIMO radar for landmine/IED detection

    NASA Astrophysics Data System (ADS)

    Ojowu, Ode; Wu, Yue; Li, Jian; Nguyen, Lam

    2013-05-01

    Multiple-input multiple-output (MIMO) radar systems have been shown to offer significant performance improvements over their single-input multiple-output (SIMO) counterparts. For transmit and receive elements that are collocated, the waveform diversity afforded by this radar is exploited for performance improvements, including but not limited to improved target detection, improved parameter identifiability, and better resolvability. In this paper, we present the Synchronous Impulse Reconstruction (SIRE) ultra-wideband (UWB) radar designed by the Army Research Laboratory (ARL) for landmine and improvised explosive device (IED) detection as a 2-by-16 MIMO radar with collocated antennas. Its improvement over its SIMO counterpart in terms of beampattern and cross-range resolution is discussed and demonstrated using simulated data. The limitations of this radar for radio-frequency interference (RFI) suppression are also discussed. A relaxation method (RELAX) combined with averaging of multiple realizations of the measured data is presented for RFI suppression; results show no noticeable target-signature distortion after suppression. The data-independent back-projection (delay-and-sum) method is used for generating SAR images, and a side-lobe minimization technique called recursive side-lobe minimization (RSM) is discussed for reducing side-lobes in this data-independent approach. We also introduce a data-dependent sparsity-based spectral estimation technique called Sparse Learning via Iterative Minimization (SLIM), as well as a data-dependent CLEAN approach, for generating SAR images for the SIRE radar. These data-adaptive techniques show improved side-lobe reduction and resolution on simulated data for the SIRE radar.
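The back-projection (delay-and-sum) imaging mentioned in this abstract can be sketched as: for each image pixel, sample each receive channel at that pixel's round-trip delay and sum the samples, so that returns from a real scatterer add coherently. This is a generic 2-D sketch, not ARL's implementation; the geometry, sample rate, and signals in the test are invented.

```python
from math import hypot

C = 3e8  # propagation speed, m/s

def backproject(signals, tx, rxs, pixels, fs):
    """Delay-and-sum image formation. signals[i][n] is the sampled
    return at receiver rxs[i]; tx is the transmitter position (x, y);
    pixels is a list of (x, y) image points; fs is the sample rate in
    Hz. Returns one summed intensity per pixel."""
    image = []
    for px in pixels:
        acc = 0.0
        for sig, rx in zip(signals, rxs):
            # two-way path: transmitter -> pixel -> receiver
            path = hypot(px[0] - tx[0], px[1] - tx[1]) + \
                   hypot(px[0] - rx[0], px[1] - rx[1])
            n = round(path / C * fs)       # delay expressed in samples
            if 0 <= n < len(sig):
                acc += sig[n]              # coherent sum across channels
        image.append(acc)
    return image
```

A pixel containing a scatterer accumulates one aligned sample per channel, while empty pixels pick up only whatever energy happens to lie at their (mismatched) delays, which is what produces the side-lobes that RSM and the data-adaptive methods then suppress.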

  17. Effect of introduction of electronic patient reporting on the duration of ambulance calls.

    PubMed

    Kuisma, Markku; Väyrynen, Taneli; Hiltunen, Tuomas; Porthan, Kari; Aaltonen, Janne

    2009-10-01

    We examined the effect of the change from paper records to electronic patient records (EPRs) on ambulance call duration. We retrieved call duration times for 6 months before (group 1) and 6 months after (group 2) the introduction of the EPR. Subgroup analysis of group 2 was performed according to whether the calls occurred during the first or last 3 months after EPR introduction. We analyzed 37 599 ambulance calls (17 950 in group 1 and 19 649 in group 2). The median call duration in group 1 was 48 minutes and in group 2 was 49 minutes (P = .008). In group 2, call duration was longer during the first 3 months after EPR introduction. In multiple linear regression analysis, urgency category (P < .0001), unit level (P < .0001), and transportation decision (P < .0001) influenced call duration. The documentation method was not a significant factor. An electronic patient record system can be implemented in an urban ambulance service in such a way that the documentation method does not become a significant factor in determining call duration in the long run. A temporary performance drop during the first 3 months after introduction was noticed, reflecting the adaptation process to a new way of working.

  18. Transcriptomic SNP discovery for custom genotyping arrays: impacts of sequence data, SNP calling method and genotyping technology on the probability of validation success.

    PubMed

    Humble, Emily; Thorne, Michael A S; Forcada, Jaume; Hoffman, Joseph I

    2016-08-26

    Single nucleotide polymorphism (SNP) discovery is an important goal of many studies. However, the number of 'putative' SNPs discovered from a sequence resource may not provide a reliable indication of the number that will successfully validate with a given genotyping technology. For this it may be necessary to account for factors such as the method used for SNP discovery and the type of sequence data from which it originates, suitability of the SNP flanking sequences for probe design, and genomic context. To explore the relative importance of these and other factors, we used Illumina sequencing to augment an existing Roche 454 transcriptome assembly for the Antarctic fur seal (Arctocephalus gazella). We then mapped the raw Illumina reads to the new hybrid transcriptome using BWA and BOWTIE2 before calling SNPs with GATK. The resulting markers were pooled with two existing sets of SNPs called from the original 454 assembly using NEWBLER and SWAP454. Finally, we explored the extent to which SNPs discovered using these four methods overlapped and predicted the corresponding validation outcomes for both Illumina Infinium iSelect HD and Affymetrix Axiom arrays. Collating markers across all discovery methods resulted in a global list of 34,718 SNPs. However, concordance between the methods was surprisingly poor, with only 51.0% of SNPs being discovered by more than one method and 13.5% being called from both the 454 and Illumina datasets. Using a predictive modeling approach, we could also show that SNPs called from the Illumina data were on average more likely to successfully validate, as were SNPs called by more than one method. Above and beyond this pattern, predicted validation outcomes were also consistently better for Affymetrix Axiom arrays. Our results suggest that focusing on SNPs called by more than one method could potentially improve validation outcomes. 
They also highlight possible differences between alternative genotyping technologies that could be explored in future studies of non-model organisms.
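The concordance figure in this abstract (51.0% of SNPs discovered by more than one method) comes down to counting, for each SNP, how many discovery methods report it. A minimal sketch; the method names echo those in the abstract, but the SNP identifiers in the test are invented.

```python
from collections import Counter

def discovery_overlap(method_calls):
    """method_calls: dict mapping method name -> collection of SNP
    identifiers. Returns (number of unique SNPs across all methods,
    fraction of those called by more than one method)."""
    counts = Counter(
        snp for calls in method_calls.values() for snp in set(calls)
    )
    total = len(counts)
    multi = sum(1 for c in counts.values() if c > 1)
    return total, multi / total
```

The same `counts` table also yields the per-SNP support level that the authors found predictive of validation success (SNPs called by more methods validated better).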

  19. Reinforcement learning for resource allocation in LEO satellite networks.

    PubMed

    Usaha, Wipawee; Barria, Javier A

    2007-06-01

    In this paper, we develop and assess online decision-making algorithms for call admission and routing in low Earth orbit (LEO) satellite networks. It has been shown in a recent paper that, in a LEO satellite system, a semi-Markov decision process formulation of the call admission and routing problem can achieve better performance, in terms of an average revenue function, than existing routing methods. However, the conventional dynamic programming (DP) numerical solution becomes prohibitive as the problem size increases. In this paper, two solution methods based on reinforcement learning (RL) are proposed to circumvent the computational burden of DP. The first is an actor-critic method with temporal-difference (TD) learning. The second is a critic-only method called optimistic TD learning. The algorithms improve on DP in terms of storage requirements, computational complexity, and computation time, and perform well on an overall long-term average revenue function that penalizes blocked calls. Numerical studies are carried out, and the results show that the RL framework can achieve up to 56% higher average revenue than existing routing methods used in LEO satellite networks, with reasonable storage and computational requirements.
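The critic in a TD-based method estimates state values from observed transitions with the TD(0) update V(s) <- V(s) + alpha * (r + gamma * V(s') - V(s)). The sketch below applies that update in a toy single-link admission setting (state = number of busy channels, revenue for each accepted call); it is a generic illustration of the update rule, not the paper's optimistic TD algorithm, and all dynamics are invented.

```python
import random

def td0_admission(episodes, alpha=0.1, gamma=0.95, revenue=1.0, seed=0):
    """Tabular TD(0) value estimation for a toy admission problem:
    a link with 3 channels; each accepted call earns `revenue`.
    Returns the learned value V[s] for each occupancy level s."""
    rng = random.Random(seed)
    capacity = 3
    V = [0.0] * (capacity + 1)          # one value per occupancy level
    for _ in range(episodes):
        s = 0
        for _ in range(50):             # steps per episode
            if rng.random() < 0.6 and s < capacity:   # arrival, accepted
                r, s2 = revenue, s + 1
            elif s > 0 and rng.random() < 0.5:        # a call departs
                r, s2 = 0.0, s - 1
            else:                                     # nothing happens
                r, s2 = 0.0, s
            V[s] += alpha * (r + gamma * V[s2] - V[s])  # TD(0) update
            s = s2
    return V
```

In the paper's setting the state additionally encodes the time-varying LEO topology and the critic feeds an admission/routing policy; the update rule itself is the same one-step bootstrap shown here.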

  20. Sports Training Support Method by Self-Coaching with Humanoid Robot

    NASA Astrophysics Data System (ADS)

    Toyama, S.; Ikeda, F.; Yasaka, T.

    2016-09-01

    This paper proposes a new training support method called self-coaching with humanoid robots. In the proposed method, two small, inexpensive humanoid robots are used because of their availability. One robot, called the target robot, reproduces the motion of a target player, and the other, called the reference robot, reproduces the motion of an expert player. The target player can recognize the target technique from the reference robot and his/her inadequate skill from the target robot. By modifying the motion of the target robot as self-coaching, the target player can gain deeper insight into the skill. Experimental results demonstrate the potential of the new training method and identify issues with the self-coaching interface program as future work.

  1. Effects of a novel method for enteral nutrition infusion involving a viscosity-regulating pectin solution: A multicenter randomized controlled trial.

    PubMed

    Tabei, Isao; Tsuchida, Shigeru; Akashi, Tetsuro; Ookubo, Katsuichiro; Hosoda, Satoru; Furukawa, Yoshiyuki; Tanabe, Yoshiaki; Tamura, Yoshiko

    2018-02-01

    The initial complications associated with infusion of enteral nutrition (EN) for clinical and nutritional care are vomiting, aspiration pneumonia, and diarrhea. There are many recommendations to prevent these complications. A novel method involving a viscosity-regulating pectin solution has been developed. In Japan, this method, along with other so-called "semi-solid EN" approaches, has been widely used in practice. However, there has been no randomized clinical trial to prove the efficiency and safety of a viscosity-regulating pectin solution in EN management. Therefore, we planned and initiated a multicenter randomized controlled trial to determine its efficiency and safety. This study included 34 patients from 7 participating medical institutions. Institutional review board (IRB) approval was obtained from all participating institutions. Patients who required EN management were enrolled and randomly assigned to the viscosity regulation of enteral feeding (VREF) group or the control group. The VREF group (n = 15) was managed with the addition of a viscosity-regulating pectin solution. The control group (n = 12) was managed with conventional EN administration, usually in a gradual step-up method. Daily clinical symptoms of pneumonia, fever, vomiting, and diarrhea; defecation frequency; and stool form were observed over the 2-week trial period. The dose of EN and the duration of infusion were also examined. A favorable trend in clinical symptoms was noticed in the VREF group. No significant differences were observed in episodes of pneumonia, fever, vomiting, and diarrhea between the 2 groups. An apparent reduction in infusion duration and hardening of stool form were noted in the VREF group. The novel method involving a viscosity-regulating pectin solution with EN administration can be performed clinically as safely and efficiently as the conventional method. 
Moreover, there were benefits with the novel method, such as improvement in stool form, a shorter EN infusion time, and a reduction in vomiting episodes. This indicates some potential advantages in quality of life among patients receiving this novel method. Copyright © 2017 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.

  2. Intra- and interspecific responses to Rafinesque’s big-eared bat (Corynorhinus rafinesquii) social calls

    Treesearch

    S. Loeb; E. Britzke

    2010-01-01

    Bats respond to the calls of conspecifics as well as to calls of other species; however, few studies have attempted to quantify these responses or understand the functions of these calls. We tested the response of Rafinesque’s big-eared bats (Corynorhinus rafinesquii) to social calls as a possible method to increase capture success and to understand the function of...

  3. Axisymmetric drop shape analysis for estimating the surface tension of cell aggregates by centrifugation.

    PubMed

    Kalantarian, Ali; Ninomiya, Hiromasa; Saad, Sameh M I; David, Robert; Winklbauer, Rudolf; Neumann, A Wilhelm

    2009-02-18

    Biological tissues behave in certain respects like liquids. Consequently, the surface tension concept can be used to explain aspects of the in vitro and in vivo behavior of multicellular aggregates. Unfortunately, conventional methods of surface tension measurement cannot be readily applied to small cell aggregates. This difficulty can be overcome by an experimentally straightforward method consisting of centrifugation followed by axisymmetric drop shape analysis (ADSA). Since the aggregates typically show roughness, standard ADSA cannot be applied and we introduce a novel numerical method called ADSA-IP (ADSA for imperfect profile) for this purpose. To examine the new methodology, embryonic tissues from the gastrula of the frog, Xenopus laevis, deformed in the centrifuge are used. It is confirmed that surface tension measurements are independent of centrifugal force and aggregate size. Surface tension is measured for ectodermal cells in four sample batches, and varies between 1.1 and 7.7 mJ/m2. Surface tension is also measured for aggregates of cells expressing cytoplasmically truncated EP/C-cadherin, and is approximately half as large. In parallel, such aggregates show a reduction in convergent extension-driven elongation after activin treatment, reflecting diminished intercellular cohesion.

  4. Clickers in the large classroom: current research and best-practice tips.

    PubMed

    Caldwell, Jane E

    2007-01-01

    Audience response systems (ARS) or clickers, as they are commonly called, offer a management tool for engaging students in the large classroom. Basic elements of the technology are discussed. These systems have been used in a variety of fields and at all levels of education. Typical goals of ARS questions are discussed, as well as methods of compensating for the reduction in lecture time that typically results from their use. Examples of ARS use occur throughout the literature and often detail positive attitudes from both students and instructors, although exceptions do exist. When used in classes, ARS clickers typically have either a benign or positive effect on student performance on exams, depending on the method and extent of their use, and create a more positive and active atmosphere in the large classroom. These systems are especially valuable as a means of introducing and monitoring peer learning methods in the large lecture classroom. So that the reader may use clickers effectively in his or her own classroom, a set of guidelines for writing good questions and a list of best-practice tips have been culled from the literature and experienced users.

  5. Clickers in the Large Classroom: Current Research and Best-Practice Tips

    PubMed Central

    2007-01-01

    Audience response systems (ARS) or clickers, as they are commonly called, offer a management tool for engaging students in the large classroom. Basic elements of the technology are discussed. These systems have been used in a variety of fields and at all levels of education. Typical goals of ARS questions are discussed, as well as methods of compensating for the reduction in lecture time that typically results from their use. Examples of ARS use occur throughout the literature and often detail positive attitudes from both students and instructors, although exceptions do exist. When used in classes, ARS clickers typically have either a benign or positive effect on student performance on exams, depending on the method and extent of their use, and create a more positive and active atmosphere in the large classroom. These systems are especially valuable as a means of introducing and monitoring peer learning methods in the large lecture classroom. So that the reader may use clickers effectively in his or her own classroom, a set of guidelines for writing good questions and a list of best-practice tips have been culled from the literature and experienced users. PMID:17339389

  6. Analyzing brain networks with PCA and conditional Granger causality.

    PubMed

    Zhou, Zhenyu; Chen, Yonghong; Ding, Mingzhou; Wright, Paul; Lu, Zuhong; Liu, Yijun

    2009-07-01

Identifying directional influences in anatomical and functional circuits presents one of the greatest challenges for understanding neural computations in the brain. Granger causality mapping (GCM) derived from vector autoregressive models of data has been employed for this purpose, revealing complex temporal and spatial dynamics underlying cognitive processes. However, the traditional GCM methods are computationally expensive, as signals from thousands of voxels within selected regions of interest (ROIs) are individually processed, and being based on pairwise Granger causality, they lack the ability to distinguish direct from indirect connectivity among brain regions. In this work a new algorithm called PCA-based conditional GCM is proposed to overcome these problems. The algorithm implements the following two procedures: (i) dimensionality reduction in ROIs of interest with principal component analysis (PCA), and (ii) estimation of the direct causal influences in local brain networks, using conditional Granger causality. Our results show that the proposed method achieves greater accuracy in detecting network connectivity than the commonly used pairwise Granger causality method. Furthermore, the use of PCA components in conjunction with conditional GCM greatly reduces the computational cost relative to the use of individual voxel time series. Copyright 2009 Wiley-Liss, Inc.
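The two-step pipeline summarized above (PCA to reduce each ROI, then conditional Granger causality to separate direct from indirect influence) can be sketched in numpy. This is a hedged illustration, not the authors' implementation: the VAR order, the synthetic chain z → x → y, and the helper names (`first_pc`, `cond_granger`) are assumptions made for the example.

```python
import numpy as np

def first_pc(voxels):
    # PCA step: reduce an ROI's (time x voxels) matrix to its leading component.
    centered = voxels - voxels.mean(axis=0)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return u[:, 0] * s[0]

def lagged(data, p):
    # Design matrix of p past lags for every column of `data` (T x n).
    T = data.shape[0]
    return np.hstack([data[p - k:T - k] for k in range(1, p + 1)])

def cond_granger(y, x, z, p=2):
    # Conditional Granger causality x -> y given z: log ratio of residual
    # sums of squares without vs. with the past of x in the regression.
    target = y[p:]
    full = lagged(np.column_stack([y, x, z]), p)
    restricted = lagged(np.column_stack([y, z]), p)
    def rss(design):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        resid = target - design @ beta
        return resid @ resid
    return np.log(rss(restricted) / rss(full))

# Toy chain z -> x -> y: pairwise Granger causality would also flag z -> y,
# but conditioning on x reveals that the z -> y link is indirect.
rng = np.random.default_rng(0)
T = 2000
z = rng.standard_normal(T)
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + 0.8 * z[t - 1] + 0.1 * rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

direct = cond_granger(y, x, z)     # large: x -> y is a direct influence
indirect = cond_granger(y, z, x)   # near zero: z -> y vanishes given x
```

In the paper's setting, each of `y`, `x`, `z` would be a leading PCA component of an ROI's voxel time series (as in `first_pc`) rather than a synthetic signal.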

  7. Recognition and characterization of hierarchical interstellar structure. II - Structure tree statistics

    NASA Technical Reports Server (NTRS)

    Houlahan, Padraig; Scalo, John

    1992-01-01

    A new method of image analysis is described, in which images partitioned into 'clouds' are represented by simplified skeleton images, called structure trees, that preserve the spatial relations of the component clouds while disregarding information concerning their sizes and shapes. The method can be used to discriminate between images of projected hierarchical (multiply nested) and random three-dimensional simulated collections of clouds constructed on the basis of observed interstellar properties, and even intermediate systems formed by combining random and hierarchical simulations. For a given structure type, the method can distinguish between different subclasses of models with different parameters and reliably estimate their hierarchical parameters: average number of children per parent, scale reduction factor per level of hierarchy, density contrast, and number of resolved levels. An application to a column density image of the Taurus complex constructed from IRAS data is given. Moderately strong evidence for a hierarchical structural component is found, and parameters of the hierarchy, as well as the average volume filling factor and mass efficiency of fragmentation per level of hierarchy, are estimated. The existence of nested structure contradicts models in which large molecular clouds are supposed to fragment, in a single stage, into roughly stellar-mass cores.

  8. Soft-Bake Purification of SWCNTs Produced by Pulsed Laser Vaporization

    NASA Technical Reports Server (NTRS)

    Yowell, Leonard; Nikolaev, Pavel; Gorelik, Olga; Allada, Rama Kumar; Sosa, Edward; Arepalli, Sivaram

    2013-01-01

    The "soft-bake" method is a simple and reliable initial purification step first proposed by researchers at Rice University for single-walled carbon nanotubes (SWCNT) produced by high-pressure carbon mon oxide disproportionation (HiPco). Soft-baking consists of annealing as-produced (raw) SWCNT, at low temperatures in humid air, in order to degrade the heavy graphitic shells that surround metal particle impurities. Once these shells are cracked open by the expansion and slow oxidation of the metal particles, the metal impurities can be digested through treatment with hydrochloric acid. The soft-baking of SWCNT produced by pulsed-laser vaporization (PLV) is not straightforward, because the larger average SWCNT diameters (.1.4 nm) and heavier graphitic shells surrounding metal particles call for increased temperatures during soft-bake. A part of the technology development focused on optimizing the temperature so that effective cracking of the graphitic shells is balanced with maintaining a reasonable yield, which was a critical aspect of this study. Once the ideal temperature was determined, a number of samples of raw SWCNT were purified using the soft-bake method. An important benefit to this process is the reduced time and effort required for soft-bake versus the standard purification route for SWCNT. The total time spent purifying samples by soft-bake is one week per batch, which equates to a factor of three reduction in the time required for purification as compared to the standard acid purification method. Reduction of the number of steps also appears to be an important factor in improving reproducibility of yield and purity of SWCNT, as small deviations are likely to get amplified over the course of a complicated multi-step purification process.

  9. Spatial-temporal forecasting the sunspot diagram

    NASA Astrophysics Data System (ADS)

    Covas, Eurico

    2017-09-01

Aims: We attempt to forecast the Sun's sunspot butterfly diagram in both space (i.e., in latitude) and time, instead of the usual one-dimensional time series forecasts prevalent in the scientific literature. Methods: We use a prediction method based on the non-linear embedding of data series in high dimensions. We use this method to forecast both in latitude (space) and in time, using a full spatial-temporal series of the sunspot diagram from 1874 to 2015. Results: The analysis of the results shows that it is indeed possible to reconstruct the overall shape and amplitude of the spatial-temporal pattern of sunspots, but that the method in its current form does not have real predictive power. We also apply a metric called structural similarity to compare the forecasted and the observed butterfly cycles, showing that this metric can be a useful addition to the usual root mean square error metric when analysing the efficiency of different prediction methods. Conclusions: We conclude that it is in principle possible to reconstruct the full sunspot butterfly diagram for at least one cycle using this approach and that this method and others should be explored since just looking at metrics such as sunspot count number or sunspot total area coverage is too reductive given the spatial-temporal dynamical complexity of the sunspot butterfly diagram. However, more data and/or an improved approach is probably necessary to have true predictive power.
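As a sketch of how the structural-similarity metric mentioned above can complement RMSE, here is a single-window form of the SSIM index (using the standard C1/C2 stabilizing constants) applied to toy latitude-time maps. The butterfly-shaped test pattern is a made-up stand-in, not the article's data, and the full SSIM index would average this quantity over local windows.

```python
import numpy as np

def ssim_global(a, b, data_range=1.0):
    # Single-window SSIM: luminance, contrast and structure compared over the
    # whole map at once.
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2))

def rmse(a, b):
    return np.sqrt(((a - b) ** 2).mean())

# Toy "butterfly" pattern: activity wings drifting toward the equator in time.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
lat = np.linspace(-40.0, 40.0, 80)
L, T = np.meshgrid(lat, t, indexing="ij")
observed = np.exp(-((np.abs(L) - 30.0 * (1.0 - T)) ** 2) / 50.0)
forecast = 0.8 * observed + 0.05 * rng.standard_normal(observed.shape)
noise = rng.random(observed.shape)
```

A forecast that preserves the wing structure keeps a high SSIM even when its amplitude (and hence its RMSE) is off, while structureless noise scores near zero.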

  10. Measuring the effectiveness of patient-chosen reminder methods in a private orthodontic practice.

    PubMed

    Wegrzyniak, Lauren M; Hedderly, Deborah; Chaudry, Kishore; Bollu, Prashanti

    2018-05-01

To evaluate the effectiveness of patient-chosen appointment reminder methods (phone call, e-mail, or SMS text) in reducing no-show rates. This was a retrospective case study that determined the correlation between patient-chosen appointment reminder methods and no-show rates in a private orthodontic practice. This study was conducted in a single office location of a multioffice private orthodontic practice using data gathered in 2015. The subjects were patients who self-selected the appointment reminder method (phone call, e-mail, or SMS text). Patient appointment data were collected over a 6-month period. Patient attendance was analyzed with descriptive statistics to determine any significant differences among patient-chosen reminder methods. There was a total of 1193 appointments with an average no-show rate of 2.43% across the three reminder methods. No statistically significant differences (P = .569) were observed in the no-show rates between the three methods: phone call (3.49%), e-mail (2.68%), and SMS text (1.90%). The electronic appointment reminder methods (SMS text and e-mail) had lower no-show rates compared with the phone call method, with SMS text having the lowest no-show rate of 1.90%. However, since no significant differences were observed between the three patient-chosen reminder methods, providers may want to allow patients to choose their reminder method to decrease no-shows.
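The comparison described above amounts to a chi-square test of independence on a 3 x 2 table (reminder method vs. show/no-show). The per-method appointment counts below are hypothetical, chosen only to roughly reproduce the published percentages, since the abstract reports just the rates and the 1193-appointment total; the closed-form p-value uses the fact that the chi-square survival function with 2 degrees of freedom is exp(-x/2).

```python
import numpy as np

# Hypothetical per-method counts (phone, e-mail, SMS) approximating the
# reported rates of 3.49%, 2.68% and 1.90% over 1193 appointments.
appointments = np.array([229, 261, 703])
no_shows = np.array([8, 7, 13])

rates = no_shows / appointments
overall = no_shows.sum() / appointments.sum()

# 3x2 chi-square test of independence (method vs. show / no-show).
observed = np.column_stack([no_shows, appointments - no_shows])
expected = np.outer(observed.sum(1), observed.sum(0)) / observed.sum()
stat = ((observed - expected) ** 2 / expected).sum()
p = np.exp(-stat / 2)  # chi-square survival, df = (3 - 1) * (2 - 1) = 2
print(f"rates: {np.round(100 * rates, 2)}%, overall {100 * overall:.2f}%, p = {p:.3f}")
```

With counts of this size, the test is non-significant, consistent with the study's conclusion that no reminder method clearly outperforms the others.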

  11. Spatiotemporal analysis of prior appropriations water calls

    NASA Astrophysics Data System (ADS)

Elbakidze, Levan; Shen, Xiaozhe; Taylor, Garth; Mooney, Siân

    2012-06-01

    A spatiotemporal model is developed to examine prior appropriations-based water curtailment in Idaho's Snake River Plain Aquifer. Using a 100 year horizon, prior appropriations-based curtailment supplemented with optimized water use reductions is shown to produce a spatial distribution of water use reductions that differs from that produced by regulatory curtailment based strictly on initial water right assignments. Discounted profits over 100 years of crop production are up to 7% higher when allocation is optimized. Total pumping over 100 years is 0.3%, 3%, and 40% higher under 1, 10, and 100 year prior appropriations-based regulatory curtailment, respectively.

  12. X-ray Crystal Truncation Rod Studies of Surface Oxidation and Reduction on Pt(111)

    DOE PAGES

    Liu, Yihua; Barbour, Andi; Komanicky, Vladimir; ...

    2016-02-26

Here, we present X-ray crystal truncation rod measurements of the Pt(111) surface under electrochemical conditions. Analyses of the crystal truncation rods reveal that surface oxide formation buckles the top surface layer of platinum to two different heights at a potential (0.95 V vs RHE) below the so-called place-exchange potential. While the anti-Bragg intensity, sensitive to the top surface layer, drops in response to the anodic charge transfers, its responses to the cathodic charge transfers are significantly delayed. Implications for the surface oxidation and reduction behaviors are discussed.

  13. Elliptic Painlevé equations from next-nearest-neighbor translations on the E_8^{(1)} lattice

    NASA Astrophysics Data System (ADS)

    Joshi, Nalini; Nakazono, Nobutaka

    2017-07-01

The well-known elliptic discrete Painlevé equation of Sakai is constructed by a standard translation on the E_8^{(1)} lattice, given by nearest-neighbor vectors. In this paper, we give a new elliptic discrete Painlevé equation obtained by translations along next-nearest-neighbor vectors. This equation is a generic (8-parameter) version of a 2-parameter elliptic difference equation found by reduction from Adler’s partial difference equation, the so-called Q4 equation. We also provide a projective reduction of the well-known equation of Sakai.

  14. 75 FR 58391 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-24

    ... requirement of Section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995 for opportunity for public comment... Brief Description Cyanobacteria (also called blue-green algae) can be found in terrestrial, fresh, brackish, or marine water environments. Some species of cyanobacteria produce toxins that may cause acute...

  15. Application of Local Linear Embedding to Nonlinear Exploratory Latent Structure Analysis

    ERIC Educational Resources Information Center

    Wang, Haonan; Iyer, Hari

    2007-01-01

    In this paper we discuss the use of a recent dimension reduction technique called Locally Linear Embedding, introduced by Roweis and Saul, for performing an exploratory latent structure analysis. The coordinate variables from the locally linear embedding describing the manifold on which the data reside serve as the latent variable scores. We…
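The Locally Linear Embedding technique referenced here follows Roweis and Saul's three steps: find neighbors, solve for local reconstruction weights, and take bottom eigenvectors of the resulting quadratic form. The snippet below is a minimal numpy illustration; the neighborhood size, regularization constant, and the toy 3-D curve are assumptions for the example, not choices from the article. The embedded coordinates play the role of the latent variable scores.

```python
import numpy as np

def lle(X, k=10, d=2, reg=1e-3):
    # Minimal Locally Linear Embedding (Roweis & Saul), numpy only.
    n = X.shape[0]
    # 1. k nearest neighbours of each point (excluding the point itself).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    nbrs = np.argsort(d2, axis=1)[:, 1:k + 1]
    # 2. Reconstruction weights: solve the local Gram system; rows sum to 1.
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[nbrs[i]] - X[i]
        G = Z @ Z.T
        G += reg * np.trace(G) * np.eye(k)  # regularize for stability
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs[i]] = w / w.sum()
    # 3. Embedding: bottom eigenvectors of M = (I - W)^T (I - W),
    #    skipping the constant zero-eigenvalue vector.
    I = np.eye(n)
    M = (I - W).T @ (I - W)
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, 1:d + 1]  # latent "scores" for each observation

# Toy manifold: a noisy one-dimensional curve embedded in 3-D.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 3 * np.pi, 300))
X = np.column_stack([np.cos(t), np.sin(t), t]) + 0.01 * rng.standard_normal((300, 3))
Y = lle(X, k=10, d=1)
```

For this curve the single embedded coordinate tracks position along the manifold, which is exactly the "latent variable score" interpretation used above.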

  16. Wireless Drop Tower for Microgravity Demonstrations. Educational Brief.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Washington, DC.

Microgravity, the absence or reduction of some of the effects of gravity, is an important attribute of free-fall. In microgravity (often incorrectly called zero-g), water no longer flows "downhill," and smoke and steam bubbles no longer rise. This changes a number of chemical and physical activities. Experiments in combustion, fluid behavior,…

  17. Developmental decline in height growth in Douglas-fir.

    Treesearch

    Barbara J. Bond; Nicole M. Czarnomski; Clifton Cooper; Michael E. Day; Michael S. Greenwood

    2007-01-01

    The characteristic decline in height growth that occurs over a tree's lifespan is often called "age-related decline." But is the reduction in height growth in aging trees a function of age or of size? We grafted shoot tips across different ages and sizes of Douglas-fir (Pseudotsuga menziesii (Mirb.) Franco) trees to determine whether...

  18. 76 FR 28792 - Agency Information Collection Activities: Proposed Collection: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Health Resources and Services Administration Agency... States Code, as amended by the Paperwork Reduction Act of 1995, Pub. L. 104-13), the Health Resources and... instruments, e-mail [email protected] or call the HRSA Reports Clearance Officer at (301) 443-1129. Comments...

  19. Are Management-Based Regulations Effective? Evidence from State Pollution Prevention Programs

    ERIC Educational Resources Information Center

    Bennear, Lori Snyder

    2007-01-01

    This paper evaluates a recent innovation in regulating risk called management-based regulation. Traditionally, risk regulation has either specified a particular means of achieving a risk-reduction goal or specified the goal and left the means of achieving that goal up to the regulated entity. In contrast, management-based regulation neither…

  20. 76 FR 22156 - Agency Information Collection Activities: Submission to OMB for Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-20

    ... Management and Budget (OMB) for review and clearance under the Paperwork Reduction Act of 1995 (Public Law... collection of information: Title: Corporate Credit Union Monthly Call Report. OMB Number: 3133-0067. Form.... Description: NCUA utilizes the information to monitor financial conditions in corporate credit unions, and to...

  1. Frank Westheimer's Early Demonstration of Enzymatic Specificity

    ERIC Educational Resources Information Center

    Ault, Addison

    2008-01-01

    In this article I review one of the most significant accomplishments of Frank H. Westheimer, one of the most respected chemists of the 20th century. This accomplishment was a series of stereospecific enzymatic oxidation and reduction experiments that led chemists to recognize what we now call the enantiotopic and diastereotopic relationships of…

  2. Condomless Sex: Gay Men, Barebacking, and Harm Reduction

    ERIC Educational Resources Information Center

    Shernoff, Michael

    2006-01-01

    Social science research as well as a rise in sexually transmitted diseases and new HIV infections among men who have sex with men point to increasing numbers of gay men engaging in unprotected anal intercourse without condoms, a practice called "barebacking." There is some evidence that barebacking is linked to the rise of crystal methamphetamine…

  3. 50 CFR 229.36 - Atlantic Pelagic Longline Take Reduction Plan (PLTRP).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... information on marine mammal interactions, fishing operations, marine mammal life history information, and..., or intend to do so, you must call NMFS Southeast Fisheries Science Center (SEFSC), 1-888-254-2558, at... from the Director, NMFS Southeast Fishery Science Center to use a pelagic longline exceeding 20 nm (37...

  4. 77 FR 52334 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... these requests, call the CDC Reports Clearance Officer at (404) 639-7570 or send an email to [email protected] strategies to reduce teen pregnancy; and (4) supporting the sustainability of the community-wide teen... performance information will be reported to CDC annually. Training and technical assistance needs will be...

  5. 78 FR 40742 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-08

    ... these requests, call the CDC Reports Clearance Officer at (404) 639-7570 or send an email to [email protected] single episode of violence to ongoing battering; many victims do not report IPV to police, friends, or... the public health approach; and sustainability of prevention activities and successes. The DF Survey...

  6. 78 FR 36230 - 60-Day Notice of Proposed Information Collection: FHA-Insured Mortgage Loan Servicing of Payments...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-17

    ... described below. In accordance with the Paperwork Reduction Act, HUD is requesting comment from all... the proposed forms or other available information. Persons with hearing or speech impairments may... with hearing or speech impairments may access this number through TTY by calling the toll-free Federal...

  7. 50 CFR 229.36 - Atlantic Pelagic Longline Take Reduction Plan (PLTRP).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... information on marine mammal interactions, fishing operations, marine mammal life history information, and..., or intend to do so, you must call NMFS Southeast Fisheries Science Center (SEFSC), 1-888-254-2558, at... from the Director, NMFS Southeast Fishery Science Center to use a pelagic longline exceeding 20 nm (37...

  8. 75 FR 67373 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-02

    ... these requests, call the CDC Reports Clearance Officer at (404) 639-5960 or send an e-mail to [email protected] of human disease. The revisions to the ``Application for Permit to Import or Transport Etiologic... day- to-day processing of these forms. The ``Application for Permit to Import or Transport Live Bats...

  9. 75 FR 32749 - Information Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-09

    ... provisions of the Paperwork Reduction Act (44 U.S.C. Chapter 35). A shortened comment period of one week is... response rates require a 2 month field period, and analysis and summary of data requires a month time... Comprehensive Review Working Group, Crystal Mall 2, 1801 S. Bell St., Suite 409, Arlington, VA; or call (703...

  10. 50 CFR 229.36 - Atlantic Pelagic Longline Take Reduction Plan (PLTRP).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... information on marine mammal interactions, fishing operations, marine mammal life history information, and..., or intend to do so, you must call NMFS Southeast Fisheries Science Center (SEFSC), 1-888-254-2558, at... from the Director, NMFS Southeast Fishery Science Center to use a pelagic longline exceeding 20 nm (37...

  11. 50 CFR 229.36 - Atlantic Pelagic Longline Take Reduction Plan (PLTRP).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... information on marine mammal interactions, fishing operations, marine mammal life history information, and..., or intend to do so, you must call NMFS Southeast Fisheries Science Center (SEFSC), 1-888-254-2558, at... from the Director, NMFS Southeast Fishery Science Center to use a pelagic longline exceeding 20 nm (37...

  12. 50 CFR 229.36 - Atlantic Pelagic Longline Take Reduction Plan (PLTRP).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... information on marine mammal interactions, fishing operations, marine mammal life history information, and..., or intend to do so, you must call NMFS Southeast Fisheries Science Center (SEFSC), 1-888-254-2558, at... from the Director, NMFS Southeast Fishery Science Center to use a pelagic longline exceeding 20 nm (37...

  13. Dimension Reduction Near Periodic Orbits of Hybrid Systems: Appendix

    DTIC Science & Technology

    2011-09-07

applicable to a class of non-smooth systems called hybrid dynamical systems. We relegate a formal definition of the class of hybrid systems under...and highly mobile hexapod robot. IJRR, 20(7):616, 2001. [4] S. Kim, J.E. Clark, and M.R. Cutkosky. iSprawl: Design and tuning for high-speed

  14. LINKING CHANGES IN UTILITY NO X EMISSIONS TO CHANGE IN OZONE AIR QUALITY

    EPA Science Inventory

    The NOx State Implementation Plan (SIP) Call was designed to reduce Northeastern U.S. NOx emissions from utilities. With these reductions, it was anticipated that the amount of ozone attributed to transport from other states would in turn be reduced. In th...

  15. 76 FR 51984 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-19

    ... promotional activities and permission to extract de-identified QL call volume data from the National Quitline Data Warehouse (NQDW, OMB No. 0920-0856, exp. 7/31/2012). Information will be transmitted to CDC on a... client intake data, i.e., information obtained from clients when they request tobacco cessation services...

  16. Reduction in bearing size due to superconductors in magnetic bearings

    NASA Technical Reports Server (NTRS)

    Rao, Dantam K.; Lewis, Paul; Dill, James F.

    1991-01-01

A design concept that reduces the size of magnetic bearings is assessed. The small size will enable magnetic bearings to fit into the limited available bearing volume of cryogenic machinery. The design concept, called SUPERC, uses high-Tc superconductors or high-purity aluminum conductors in windings instead of copper. The relatively high current density of these conductors reduces the slot radial thickness for windings, which reduces the size of the bearings. MTI developed a sizing program called SUPERC that translates the high current density of these conductors into smaller-sized bearings. This program was used to size a superconducting bearing to carry a 500 lb. load. The sizes of magnetic bearings needed by various design concepts are as follows: SUPERC design concept = 3.75 in.; magnet-bias design concept = 5.25 in.; and all-electromagnet design concept = 7.0 in. These results indicate that the SUPERC design concept can significantly reduce the size of the bearing. This reduction, in turn, yields a lighter bearing. Since the superconductors have inherently near-zero resistance, they are also expected to considerably reduce the power needed for operation.

  17. The Probability of Hitting a Polygonal Target

    DTIC Science & Technology

    1981-04-01

required for the use of this method for computing the probability of hitting a polygonal target. These functions are 1. PHIT (called by user’s main program...2. FIJ (called by PHIT) 3. FUN (called by FIJ) The user must include all three of these in his main program, but needs only to call PHIT. The
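The report's PHIT routine is FORTRAN; as a hedged modern stand-in, the probability of hitting a polygonal target can be estimated by Monte Carlo with a ray-casting point-in-polygon test. The circular bivariate normal impact distribution and the unit-square target below are illustrative assumptions, not the report's model.

```python
import numpy as np

def point_in_polygon(px, py, poly):
    # Ray-casting test: is the point (px, py) inside the polygon?
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            if px < x1 + (py - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def phit_mc(poly, mean, sigma, n=100_000, seed=0):
    # Monte Carlo P(hit): impact points ~ N(mean, sigma^2 I).
    rng = np.random.default_rng(seed)
    pts = mean + sigma * rng.standard_normal((n, 2))
    hits = sum(point_in_polygon(x, y, poly) for x, y in pts)
    return hits / n

# Unit square centred on the aim point, circular error sigma = 1;
# the exact answer is (2*Phi(0.5) - 1)^2 ~ 0.1466.
square = [(-0.5, -0.5), (0.5, -0.5), (0.5, 0.5), (-0.5, 0.5)]
p = phit_mc(square, mean=np.array([0.0, 0.0]), sigma=1.0)
```

Because the square factorizes along the axes, the estimate can be checked against the closed-form normal-CDF product, which is what makes this a convenient validation case.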

  18. The International Society of Hypertension and World Hypertension League call on governments, nongovernmental organizations and the food industry to work to reduce dietary sodium.

    PubMed

    Campbell, Norman R C; Lackland, Daniel T; Chockalingam, Arun; Lisheng, Liu; Harrap, Stephen B; Touyz, Rhian M; Burrell, Louise M; Ramírez, Agustín J; Schmieder, Roland E; Schutte, Aletta E; Prabhakaran, Dorairaj; Schiffrin, Ernesto L

    2014-02-01

    The International Society of Hypertension and the World Hypertension League have developed a policy statement calling for reducing dietary salt. The policy supports the WHO and the United Nations recommendations, which are based on a comprehensive and up-to-date review of relevant research. The policy statement calls for broad societal action to reduce dietary salt, thus reducing blood pressure and preventing hypertension and its related burden of cardiovascular disease. The hypertension organizations and experts need to become more engaged in the efforts to prevent hypertension and to advocate strongly to have dietary salt reduction policies implemented. The statement is being circulated to national hypertension organizations and to international nongovernmental health organizations for consideration of endorsement. Member organizations of the International Society of Hypertension and the World Hypertension League are urged to support this effort.

  19. Design of real-time voice over internet protocol system under bandwidth network

    NASA Astrophysics Data System (ADS)

    Zhang, Li; Gong, Lina

    2017-04-01

With network bandwidth increasing and network convergence accelerating, VoIP communication across the network is becoming increasingly popular. Real-time identification and analysis of VoIP flows over the backbone network have therefore become an urgent need and a research hotspot in network operations management. On this basis, this paper proposes a VoIP service management system for the backbone network. The system first filters VoIP data streams on the backbone and then parses the call signaling information and the voice media. It can also apply appropriately designed rules to reconstruct and present specific categories of calls in real time. Experimental results show that the system can parse and process backbone VoIP calls in real time and present the results accurately in the management interface, providing the necessary technical support for VoIP-based network traffic management and maintenance.

  20. Shock Simulations of Single-Site Coarse-Grain RDX using the Dissipative Particle Dynamics Method with Reactivity

    NASA Astrophysics Data System (ADS)

    Sellers, Michael; Lisal, Martin; Schweigert, Igor; Larentzos, James; Brennan, John

    2015-06-01

In discrete particle simulations, when an atomistic model is coarse-grained, a trade-off is made: a boost in computational speed for a reduction in accuracy. Dissipative Particle Dynamics (DPD) methods help to recover accuracy in viscous and thermal properties, while giving back a small amount of computational speed. One of the most notable extensions of DPD has been the introduction of chemical reactivity, called DPD-RX. Today, pairing the current evolution of DPD-RX with a coarse-grained potential and its chemical decomposition reactions allows for the simulation of the shock behavior of energetic materials at a timescale faster than an atomistic counterpart. In 2007, Maillet et al. introduced implicit chemical reactivity in DPD through the concept of particle reactors and simulated the decomposition of liquid nitromethane. We have recently extended the DPD-RX method and have applied it to solid hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) under shock conditions using a recently developed single-site coarse-grain model and a reduced RDX decomposition mechanism. A description of the methods used to simulate RDX and its transition to hot product gases within DPD-RX will be presented. Additionally, examples of the effect of microstructure on shock behavior will be shown. Approved for public release. Distribution is unlimited.

  1. ERASE-Seq: Leveraging replicate measurements to enhance ultralow frequency variant detection in NGS data

    PubMed Central

    Kamps-Hughes, Nick; McUsic, Andrew; Kurihara, Laurie; Harkins, Timothy T.; Pal, Prithwish; Ray, Claire

    2018-01-01

    The accurate detection of ultralow allele frequency variants in DNA samples is of interest in both research and medical settings, particularly in liquid biopsies where cancer mutational status is monitored from circulating DNA. Next-generation sequencing (NGS) technologies employing molecular barcoding have shown promise but significant sensitivity and specificity improvements are still needed to detect mutations in a majority of patients before the metastatic stage. To address this we present analytical validation data for ERASE-Seq (Elimination of Recurrent Artifacts and Stochastic Errors), a method for accurate and sensitive detection of ultralow frequency DNA variants in NGS data. ERASE-Seq differs from previous methods by creating a robust statistical framework to utilize technical replicates in conjunction with background error modeling, providing a 10 to 100-fold reduction in false positive rates compared to published molecular barcoding methods. ERASE-Seq was tested using spiked human DNA mixtures with clinically realistic DNA input quantities to detect SNVs and indels between 0.05% and 1% allele frequency, the range commonly found in liquid biopsy samples. Variants were detected with greater than 90% sensitivity and a false positive rate below 0.1 calls per 10,000 possible variants. The approach represents a significant performance improvement compared to molecular barcoding methods and does not require changing molecular reagents. PMID:29630678
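The core idea described above, combining technical replicates against a site-specific background error model so that recurrent signal separates from stochastic error, can be sketched as follows. This is a simplified illustration, not the published ERASE-Seq statistics: the binomial background test, the Fisher combination of replicates, and the error rate, depth, and counts are all assumptions made for the example.

```python
import math

def binom_tail(k, n, p):
    # P(X >= k) for X ~ Binomial(n, p), exact for small k.
    cdf = sum(math.comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k))
    return 1.0 - cdf

def fisher_combine(pvals):
    # Fisher's method: -2 * sum(log p) ~ chi-square with 2k df under the null;
    # the survival function for even df has a closed form.
    x = -2.0 * sum(math.log(max(p, 1e-300)) for p in pvals)
    k = len(pvals)
    term, s = 1.0, 1.0
    for j in range(1, k):
        term *= (x / 2) / j
        s += term
    return math.exp(-x / 2) * s

# Site-specific background error rate (e.g. from a panel of normals) and
# three technical replicates sequenced to ~5000x depth (made-up numbers).
bg_rate, depth = 5e-4, 5000
real_variant = [14, 11, 13]   # ~0.25% allele frequency, seen in every replicate
stochastic_err = [12, 0, 1]   # one replicate spikes, the others do not

p_real = fisher_combine([binom_tail(k, depth, bg_rate) for k in real_variant])
p_err = fisher_combine([binom_tail(k, depth, bg_rate) for k in stochastic_err])
```

A genuine low-frequency variant is improbable under the background model in every replicate, so its combined p-value is many orders of magnitude smaller than that of a one-off error spike, which is the intuition behind using replicates to suppress false positives.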

  2. Probability of detecting band-tailed pigeons during call-broadcast versus auditory surveys

    USGS Publications Warehouse

    Kirkpatrick, C.; Conway, C.J.; Hughes, K.M.; Devos, J.C.

    2007-01-01

    Estimates of population trend for the interior subspecies of band-tailed pigeon (Patagioenas fasciata fasciata) are not available because no standardized survey method exists for monitoring the interior subspecies. We evaluated 2 potential band-tailed pigeon survey methods (auditory and call-broadcast surveys) from 2002 to 2004 in 5 mountain ranges in southern Arizona, USA, and in mixed-conifer forest throughout the state. Both auditory and call-broadcast surveys produced low numbers of cooing pigeons detected per survey route (x̄ ≤ 0.67) and had relatively high temporal variance in average number of cooing pigeons detected during replicate surveys (CV ≥ 161%). However, compared to auditory surveys, use of call-broadcast increased 1) the percentage of replicate surveys on which ≥1 cooing pigeon was detected by an average of 16%, and 2) the number of cooing pigeons detected per survey route by an average of 29%, with this difference being greatest during the first 45 minutes of the morning survey period. Moreover, probability of detecting a cooing pigeon was 27% greater during call-broadcast (0.80) versus auditory (0.63) surveys. We found that cooing pigeons were most common in mixed-conifer forest in southern Arizona and density of male pigeons in mixed-conifer forest throughout the state averaged 0.004 (SE = 0.001) pigeons/ha. Our results are the first to show that call-broadcast increases the probability of detecting band-tailed pigeons (or any species of Columbidae) during surveys. Call-broadcast surveys may provide a useful method for monitoring populations of the interior subspecies of band-tailed pigeon in areas where other survey methods are inappropriate.

  3. The subtle business of model reduction for stochastic chemical kinetics

    NASA Astrophysics Data System (ADS)

    Gillespie, Dan T.; Cao, Yang; Sanft, Kevin R.; Petzold, Linda R.

    2009-02-01

    This paper addresses the problem of simplifying chemical reaction networks by adroitly reducing the number of reaction channels and chemical species. The analysis adopts a discrete-stochastic point of view and focuses on the model reaction set S1⇌S2→S3, whose simplicity allows all the mathematics to be done exactly. The advantages and disadvantages of replacing this reaction set with a single S3-producing reaction are analyzed quantitatively using novel criteria for measuring simulation accuracy and simulation efficiency. It is shown that in all cases in which such a model reduction can be accomplished accurately and with a significant gain in simulation efficiency, a procedure called the slow-scale stochastic simulation algorithm provides a robust and theoretically transparent way of implementing the reduction.
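The reaction set analyzed above can be simulated exactly with Gillespie's direct method before any reduction is applied; a minimal sketch (species counts and rate constants below are illustrative, not values from the paper, though they sit in the fast-reversible/slow-conversion regime where the slow-scale reduction applies):

```python
import random

def ssa_direct(x1, x2, x3, c1, c2, c3, t_end, rng):
    """Exact SSA (Gillespie's direct method) for S1 <-> S2 -> S3.
    Propensities: c1*x1 (S1->S2), c2*x2 (S2->S1), c3*x2 (S2->S3).
    Returns the S3 count at time t_end."""
    t = 0.0
    while True:
        a1, a2, a3 = c1 * x1, c2 * x2, c3 * x2
        a0 = a1 + a2 + a3
        if a0 == 0.0:
            return x3                 # all molecules converted to S3
        t += rng.expovariate(a0)      # exponential waiting time to next reaction
        if t > t_end:
            return x3
        r = rng.random() * a0         # choose a reaction channel
        if r < a1:
            x1, x2 = x1 - 1, x2 + 1
        elif r < a1 + a2:
            x1, x2 = x1 + 1, x2 - 1
        else:
            x2, x3 = x2 - 1, x3 + 1

# Fast reversible exchange (c1, c2) with slow conversion (c3):
x3_final = ssa_direct(100, 0, 0, 2.0, 1.0, 0.1, 50.0, random.Random(1))
```

The reduced model replaces all three channels with a single S3-producing reaction whose effective rate averages over the fast S1⇌S2 exchange; the exact simulation above is the baseline such a reduction must reproduce.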

  4. The subtle business of model reduction for stochastic chemical kinetics.

    PubMed

    Gillespie, Dan T; Cao, Yang; Sanft, Kevin R; Petzold, Linda R

    2009-02-14

    This paper addresses the problem of simplifying chemical reaction networks by adroitly reducing the number of reaction channels and chemical species. The analysis adopts a discrete-stochastic point of view and focuses on the model reaction set S1⇌S2→S3, whose simplicity allows all the mathematics to be done exactly. The advantages and disadvantages of replacing this reaction set with a single S3-producing reaction are analyzed quantitatively using novel criteria for measuring simulation accuracy and simulation efficiency. It is shown that in all cases in which such a model reduction can be accomplished accurately and with a significant gain in simulation efficiency, a procedure called the slow-scale stochastic simulation algorithm provides a robust and theoretically transparent way of implementing the reduction.

  5. 47 CFR 52.21 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...

  6. 47 CFR 52.21 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...

  7. 47 CFR 52.21 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...

  8. 47 CFR 52.21 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... subscriber calls. (e) The term database method means a number portability method that utilizes one or more external databases for providing called party routing information. (f) The term downstream database means a database owned and operated by an individual carrier for the purpose of providing number portability in...

  9. Calling depths of baleen whales from single sensor data: development of an autocorrelation method using multipath localization.

    PubMed

    Valtierra, Robert D; Glynn Holt, R; Cholewiak, Danielle; Van Parijs, Sofie M

    2013-09-01

    Multipath localization techniques have not previously been applied to baleen whale vocalizations due to difficulties in application to tonal vocalizations. Here it is shown that an autocorrelation method coupled with the direct-reflected time-difference-of-arrival localization technique can successfully resolve location information. A derivation was made to model the autocorrelation of a direct signal and its overlapping reflections to illustrate that an autocorrelation may be used to extract reflection information from longer duration signals containing a frequency sweep, such as some calls produced by baleen whales. An analysis was performed to characterize the difference in behavior of the autocorrelation when applied to call types with varying parameters (sweep rate, call duration). The method's feasibility was tested using data from playback transmissions to localize an acoustic transducer at a known depth and location. The method was then used to estimate the depth and range of a single North Atlantic right whale (Eubalaena glacialis) and humpback whale (Megaptera novaeangliae) from two separate experiments.
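The core idea, recovering a reflection delay from the autocorrelation of a frequency-swept call, can be sketched on synthetic data (the sweep parameters, sample rate, and delay below are illustrative, not values from the paper):

```python
import math

def autocorr(x):
    """Biased sample autocorrelation for non-negative lags."""
    n = len(x)
    return [sum(x[i] * x[i + lag] for i in range(n - lag)) for lag in range(n)]

fs = 1000.0                       # sample rate, Hz (illustrative)
delay = 40                        # direct-to-reflected delay, in samples
# Linear frequency sweep, 50 -> 350 Hz over 0.3 s:
sweep = [math.sin(2 * math.pi * (50 * t / fs + 0.5 * 1000 * (t / fs) ** 2))
         for t in range(300)]
signal = [0.0] * 400
for i, s in enumerate(sweep):
    signal[i] += s                # direct arrival
    signal[i + delay] += 0.5 * s  # attenuated surface reflection

ac = autocorr(signal)
# The strongest off-zero autocorrelation peak sits at the reflection delay,
# which with the known geometry yields depth and range.
lag_hat = max(range(10, 200), key=lambda k: ac[k])
```

Because the sweep's autocorrelation is sharply peaked at zero lag, the cross term between the direct and reflected copies produces a clear secondary peak at the delay, which is why the method works for swept calls but not for pure tones.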

  10. Increasing Medicaid child health screenings: the effectiveness of mailed pamphlets, phone calls, and home visits.

    PubMed Central

    Selby-Harrington, M; Sorenson, J R; Quade, D; Stearns, S C; Tesh, A S; Donat, P L

    1995-01-01

    OBJECTIVES. A randomized controlled trial was conducted to test the effectiveness and cost effectiveness of three outreach interventions to promote well-child screening for children on Medicaid. METHODS. In rural North Carolina, a random sample of 2053 families with children due or overdue for screening was stratified according to the presence of a home phone. Families were randomly assigned to receive a mailed pamphlet and letter, a phone call, or a home visit outreach intervention, or the usual (control) method of informing at Medicaid intake. RESULTS. All interventions produced more screenings than the control method, but increases were significant only for families with phones. Among families with phones, a home visit was the most effective intervention but a phone call was the most cost-effective. However, absolute rates of effectiveness were low, and incremental costs per effect were high. CONCLUSIONS. Pamphlets, phone calls, and home visits by nurses were minimally effective for increasing well-child screenings. Alternate outreach methods are needed, especially for families without phones. PMID:7573627

  11. A new randomized Kaczmarz based kernel canonical correlation analysis algorithm with applications to information retrieval.

    PubMed

    Cai, Jia; Tang, Yi

    2018-02-01

    Canonical correlation analysis (CCA) is a powerful statistical tool for detecting the linear relationship between two sets of multivariate variables. Its kernel generalization, kernel CCA, is proposed to describe nonlinear relationships between two variables. Although kernel CCA can achieve dimensionality reduction for high-dimensional feature selection problems, it is also prone to the so-called over-fitting phenomenon. In this paper, we consider a new kernel CCA algorithm via the randomized Kaczmarz method. The main contributions of the paper are: (1) a new kernel CCA algorithm is developed, (2) theoretical convergence of the proposed algorithm is addressed by means of the scaled condition number, (3) a lower bound on the minimum number of iterations is presented. We test on both a synthetic dataset and several real-world datasets in cross-language document retrieval and content-based image retrieval to demonstrate the effectiveness of the proposed algorithm. Numerical results imply the performance and efficiency of the new algorithm, which is competitive with several state-of-the-art kernel CCA methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
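The underlying randomized Kaczmarz iteration, shown here for a plain consistent linear system rather than the paper's kernel CCA formulation, projects the current iterate onto the hyperplane of one randomly chosen row per step:

```python
import random

def randomized_kaczmarz(A, b, iters=2000, seed=0):
    """Randomized Kaczmarz for a consistent system Ax = b: at each step,
    project the iterate onto a random row's hyperplane, sampling rows
    with probability proportional to ||a_i||^2 (Strohmer-Vershynin scheme)."""
    rng = random.Random(seed)
    m, n = len(A), len(A[0])
    norms = [sum(v * v for v in row) for row in A]
    x = [0.0] * n
    for _ in range(iters):
        i = rng.choices(range(m), weights=norms)[0]
        ai = A[i]
        # Signed distance to the hyperplane a_i . x = b_i, scaled by ||a_i||^2:
        r = (b[i] - sum(ai[j] * x[j] for j in range(n))) / norms[i]
        x = [x[j] + r * ai[j] for j in range(n)]
    return x

# Toy system with solution [1.0, 2.0] (illustrative, not the paper's data):
x_hat = randomized_kaczmarz([[2.0, 0.0], [0.0, 3.0], [1.0, 1.0]],
                            [2.0, 6.0, 3.0])
```

Each step touches only one row, which is what makes the method attractive for the large kernel matrices arising in kernel CCA.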

  12. A Low-Cost Method for Multiple Disease Prediction.

    PubMed

    Bayati, Mohsen; Bhaskar, Sonia; Montanari, Andrea

    Recently, in response to the rising costs of healthcare services, employers that are financially responsible for the healthcare costs of their workforce have been investing in health improvement programs for their employees. A main objective of these so-called "wellness programs" is to reduce the incidence of chronic illnesses such as cardiovascular disease, cancer, diabetes, and obesity, with the goal of reducing future medical costs. The majority of these wellness programs include an annual screening to detect individuals with the highest risk of developing chronic disease. Once these individuals are identified, the company can invest in interventions to reduce the risk of those individuals. However, capturing many biomarkers per employee creates a costly screening procedure. We propose a statistical data-driven method to address this challenge by minimizing the number of biomarkers in the screening procedure while maximizing the predictive power over a broad spectrum of diseases. Our solution uses multi-task learning and group dimensionality reduction from machine learning and statistics. We provide empirical validation of the proposed solution using data from two different electronic medical records systems, with comparisons to a statistical benchmark.

  13. Mapping DNA methylation by transverse current sequencing: Reduction of noise from neighboring nucleotides

    NASA Astrophysics Data System (ADS)

    Alvarez, Jose; Massey, Steven; Kalitsov, Alan; Velev, Julian

    Nanopore sequencing via transverse current has emerged as a competitive candidate for mapping DNA methylation without the need for bisulfite treatment, fluorescent tags, or PCR amplification. By eliminating the error-producing amplification step, long read lengths become feasible, which greatly simplifies the assembly process and reduces the time and the cost inherent in current technologies. However, due to the large error rates of nanopore sequencing, single base resolution has not been reached. A very important source of noise is the intrinsic structural noise in the electric signature of the nucleotide arising from the influence of neighboring nucleotides. In this work we perform calculations of the tunneling current through DNA molecules in nanopores using the non-equilibrium electron transport method within an effective multi-orbital tight-binding model derived from first-principles calculations. We develop a base-calling algorithm accounting for the correlations of the current through neighboring bases, which in principle can reduce the error rate below any desired precision. Using this method we show that we can clearly distinguish DNA methylation and other base modifications based on the reading of the tunneling current.

  14. Truncated RAP-MUSIC (TRAP-MUSIC) for MEG and EEG source localization.

    PubMed

    Mäkelä, Niko; Stenroos, Matti; Sarvas, Jukka; Ilmoniemi, Risto J

    2018-02-15

    Electrically active brain regions can be located applying MUltiple SIgnal Classification (MUSIC) on magneto- or electroencephalographic (MEG; EEG) data. We introduce a new MUSIC method, called truncated recursively-applied-and-projected MUSIC (TRAP-MUSIC). It corrects a hidden deficiency of the conventional RAP-MUSIC algorithm that prevents accurate estimation of the true number of brain-signal sources. The correction is done by applying a sequential dimension reduction to the signal-subspace projection. We show that TRAP-MUSIC significantly improves the performance of MUSIC-type localization; in particular, it successfully and robustly locates active brain regions and estimates their number. We compare TRAP-MUSIC and RAP-MUSIC in simulations with varying key parameters, e.g., signal-to-noise ratio, correlation between source time-courses, and initial estimate for the dimension of the signal space. In addition, we validate TRAP-MUSIC with measured MEG data. We suggest that with the proposed TRAP-MUSIC method, MUSIC-type localization could become more reliable and suitable for various online and offline MEG and EEG applications. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Surface shape analysis with an application to brain surface asymmetry in schizophrenia.

    PubMed

    Brignell, Christopher J; Dryden, Ian L; Gattone, S Antonio; Park, Bert; Leask, Stuart; Browne, William J; Flynn, Sean

    2010-10-01

    Some methods for the statistical analysis of surface shapes and asymmetry are introduced. We focus on a case study where magnetic resonance images of the brain are available from groups of 30 schizophrenia patients and 38 controls, and we investigate large-scale brain surface shape differences. Key aspects of shape analysis are to remove nuisance transformations by registration and to identify which parts of one object correspond with the parts of another object. We introduce maximum likelihood and Bayesian methods for registering brain images and providing large-scale correspondences of the brain surfaces. Brain surface size-and-shape analysis is considered using random field theory, and also dimension reduction is carried out using principal and independent components analysis. Some small but significant differences are observed between the patient and control groups. We then investigate a particular type of asymmetry called torque. Differences in asymmetry are observed between the control and patient groups, which add strength to other observations in the literature. Further investigations of the midline plane location in the 2 groups and the fitting of nonplanar curved midlines are also considered.

  16. Match-bounded String Rewriting Systems

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2003-01-01

    We introduce a new class of automated proof methods for the termination of rewriting systems on strings. The basis of all these methods is to show that rewriting preserves regular languages. To this end, letters are annotated with natural numbers, called match heights. If the minimal height of all positions in a redex is h, then every position in the reduct will get height h+1. In a match-bounded system, match heights are globally bounded. Using recent results on deleting systems, we prove that rewriting by a match-bounded system preserves regular languages. Hence it is decidable whether a given rewriting system has a given match bound. We also provide a sufficient criterion for the absence of a match-bound. The problem of existence of a match-bound is still open. Match-boundedness for all strings can be used as an automated criterion for termination, for match-bounded systems are terminating. This criterion can be strengthened by requiring match-boundedness only for a restricted set of strings, for instance the set of right hand sides of forward closures.
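The height-annotation rule can be sketched directly (the string representation and single-rule interface below are assumptions for illustration, not the authors' implementation):

```python
def rewrite_annotated(word, lhs, rhs):
    """One leftmost rewrite step on a height-annotated string, given as a
    list of (letter, height) pairs. Every letter of the inserted reduct
    receives (minimal height occurring in the redex) + 1."""
    letters = [c for c, _ in word]
    for i in range(len(word) - len(lhs) + 1):
        if letters[i:i + len(lhs)] == list(lhs):
            h = min(height for _, height in word[i:i + len(lhs)]) + 1
            return word[:i] + [(c, h) for c in rhs] + word[i + len(lhs):]
    return None   # no redex: the word is in normal form
```

For the rule ab → ba starting from "aab" with all heights 0, two steps reach the normal form "baa" with every height ≤ 1, so this run is match-bounded by 1; the termination criterion asks for such a global bound over all derivations.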

  17. Intelligent Structured Intermittent Auscultation (ISIA): evaluation of a decision-making framework for fetal heart monitoring of low-risk women

    PubMed Central

    2014-01-01

    Background Research-informed fetal monitoring guidelines recommend intermittent auscultation (IA) for fetal heart monitoring for low-risk women. However, the use of cardiotocography (CTG) continues to dominate many institutional maternity settings. Methods A mixed methods intervention study with before and after measurement was undertaken in one secondary level health service to facilitate the implementation of an initiative to encourage the use of IA. The intervention initiative was a decision-making framework called Intelligent Structured Intermittent Auscultation (ISIA) introduced through an education session. Results Following the intervention, medical records review revealed an increase in the use of IA during labour represented by a relative change of 12%, with improved documentation of clinical findings from assessments, and a significant reduction in the risk of receiving an admission CTG (RR 0.75, 95% CI, 0.60 – 0.95, p = 0.016). Conclusion The ISIA informed decision-making framework transformed the practice of IA and provided a mechanism for knowledge translation that enabled midwives to implement evidence-based fetal heart monitoring for low risk women. PMID:24884597

  18. Defining moments in risk communication research: 1996-2005.

    PubMed

    McComas, Katherine A

    2006-01-01

    Ten years ago, scholars suggested that risk communication was embarking on a new phase that would give increased attention to the social contexts that surround and encroach on public responses to risk information. A decade later, many researchers have answered the call, with several defining studies examining the social and psychological influences on risk communication. This article reviews risk communication research appearing in the published literature since 1996. Among studies, social trust, the social amplification of risk framework, and the affect heuristic figured prominently. Also common were studies examining the influence of risk in the mass media. Among these were content analyses of media coverage of risk, as well as investigations of possible effects resulting from coverage. The use of mental models was a dominant method for developing risk message content. Other studies examined the use of risk comparisons, narratives, and visuals in the production of risk messages. Research also examined how providing information about a risk's severity, social norms, and efficacy influenced communication behaviors and intentions to follow risk reduction measures. Methods for conducting public outreach in health risk communication rounded out the literature.

  19. Scenario-Based Case Study Method and the Functionality of the Section Called "From Production to Consumption" from the Perspective of Primary School Students

    ERIC Educational Resources Information Center

    Taneri, Ahu

    2018-01-01

    In this research, the aim was to present students' evaluations of the scenario-based case study method and to show the functionality of the studied section called "from production to consumption". Qualitative research methods and content analysis were used to reveal participants' experiences and meaningful relations regarding…

  20. Selective hydrogenation of phenol to cyclohexanone over Pd@CN (N-doped porous carbon): Role of catalyst reduction method

    NASA Astrophysics Data System (ADS)

    Hu, Shuo; Yang, Guangxin; Jiang, Hong; Liu, Yefei; Chen, Rizhi

    2018-03-01

    Selective phenol hydrogenation is a green and sustainable technology to produce cyclohexanone. The work focused on investigating the role of the catalyst reduction method in the liquid-phase phenol hydrogenation to cyclohexanone over Pd@CN (N-doped porous carbon). A series of reduction methods including flowing hydrogen reduction, in-situ reaction reduction and liquid-phase reduction were designed and performed. The results highlighted that the reduction method significantly affected the catalytic performance of Pd@CN in the liquid-phase hydrogenation of phenol to cyclohexanone, and that liquid-phase reduction with the addition of an appropriate amount of phenol was highly efficient in improving the catalytic activity of Pd@CN. The influence mechanism was explored by a series of characterizations. The results of TEM, XPS and CO chemisorption confirmed that the reduction method mainly affected the size, surface composition and dispersion of Pd in the CN material. The addition of phenol during the liquid-phase reduction could inhibit the aggregation of Pd NPs and promote the reduction of Pd(2+), and thus improved the catalytic activity of Pd@CN. The work would aid the development of high-performance Pd@CN catalysts for selective phenol hydrogenation.

  1. Reconciling NOx emissions reductions and ozone trends in ...

    EPA Pesticide Factsheets

    Dynamic evaluation seeks to assess the ability of photochemical models to replicate changes in air quality as emissions and other conditions change. When a model fails to replicate an observed change, a key challenge is to discern whether the discrepancy is caused by errors in meteorological simulations, errors in emission magnitudes and changes, or inaccurate responses of simulated pollutant concentrations to emission changes. In this study, the Community Multiscale Air Quality (CMAQ) model is applied to simulate the ozone (O3) change after the NOx SIP Call and mobile emission controls substantially reduced nitrogen oxides (NOx) emissions in the eastern U.S. from 2002 to 2006. For both modeled and observed O3, changes in episode average daily maximal 8-h O3 were highly correlated (R2 = 0.89) with changes in the 95th percentile, although the magnitudes of reductions increased nonlinearly at high percentile O3 concentrations. Observed downward changes in mean NOx (−11.6 to −2.5 ppb) and 8-h O3 (−10.4 to −4.7 ppb) concentrations in metropolitan areas in the NOx SIP Call region were under-predicted by 31%–64% and 26%–66%, respectively. The under-predicted O3 improvements in the NOx SIP Call region could not be explained by adjusting for temperature biases in the meteorological input, or by considering uncertainties in the chemical reaction rate constants. However, the under-prediction in O3 improvements could be alleviated by 5%–31% by constraining NO

  2. Transition Manifolds of Complex Metastable Systems: Theory and Data-Driven Computation of Effective Dynamics.

    PubMed

    Bittracher, Andreas; Koltai, Péter; Klus, Stefan; Banisch, Ralf; Dellnitz, Michael; Schütte, Christof

    2018-01-01

    We consider complex dynamical systems showing metastable behavior, but no local separation of fast and slow time scales. The article raises the question of whether such systems exhibit a low-dimensional manifold supporting their effective dynamics. For answering this question, we aim at finding nonlinear coordinates, called reaction coordinates, such that the projection of the dynamics onto these coordinates preserves the dominant time scales of the dynamics. We show that, based on a specific reducibility property, the existence of good low-dimensional reaction coordinates preserving the dominant time scales is guaranteed. Based on this theoretical framework, we develop and test a novel numerical approach for computing good reaction coordinates. The proposed algorithmic approach is fully local and thus not prone to the curse of dimension with respect to the state space of the dynamics. Hence, it is a promising method for data-based model reduction of complex dynamical systems such as molecular dynamics.

  3. Glutathione reduces cytotoxicity of polyethyleneimine coated magnetic nanoparticles in CHO cells.

    PubMed

    Strojan, Klemen; Lojk, Jasna; Bregar, Vladimir B; Veranič, Peter; Pavlin, Mojca

    2017-06-01

    Polyethyleneimine (PEI) is a polycationic compound frequently used as a transfection agent. However, the cytotoxicity of PEI and PEI-coated nanoparticles (PEI NPs) is still a major obstacle to its use. In this study we report a method for reducing the cytotoxicity of PEI NPs by adding glutathione during NP synthesis. Glutathione reduced cytotoxic effects by at least 30% and decreased the observed oxidative stress response compared to the standard formulation. Results showed that the effect was partially due to reduced zeta potential and partially due to the protective antioxidant properties of glutathione. Adding glutathione to the cell culture media with concurrent exposure to PEI NPs proved insufficient for cytotoxicity reduction. Additionally, we compared the internalization pathways of both PEI NPs and GSH NPs. NPs were found only in endosomes, and no NPs were found free in the cytosol, as would be expected according to the so-called proton sponge hypothesis. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. An inexpensive frequency-modulated (FM) audio monitor of time-dependent analog parameters.

    PubMed

    Langdon, R B; Jacobs, R S

    1980-02-01

    The standard method for quantification and presentation of an experimental variable in real time is the use of a visual display on the ordinate of an oscilloscope screen or chart recorder. This paper describes a relatively simple electronic circuit, using commercially available and inexpensive integrated circuits (ICs), which generates an audible tone, the pitch of which varies in proportion to a running variable of interest. This device, which we call an "Audioscope," can accept as input the monitor output from any instrument that expresses an experimental parameter as a dc voltage. The Audioscope is particularly useful in implanting microelectrodes intracellularly. It may also function to mediate the first step in data recording on magnetic tape, and/or data analysis and reduction by electronic circuitry. We estimate that this device can be built, with two-channel capability, for less than $50, and in less than 10 hr by an experienced electronics technician.

  5. Service Modeling for Service Engineering

    NASA Astrophysics Data System (ADS)

    Shimomura, Yoshiki; Tomiyama, Tetsuo

    Intensification of service and knowledge contents within product life cycles is considered crucial for dematerialization, in particular, to design optimal product-service systems from the viewpoint of environmentally conscious design and manufacturing in advanced post-industrial societies. In addition to these environmental limitations, we face social limitations, including the limited capacity of markets to absorb ever-increasing numbers of mass-produced artifacts; together, these environmental and social limitations are restraining economic growth. To attack and remove these problems, we need to reconsider the current mass-production paradigm and give products more added value, largely from knowledge and service contents, to compensate for volume reduction under the concept of dematerialization. Namely, dematerialization of products requires enriching their service contents. However, service was discussed mainly within marketing and has been largely neglected within traditional engineering. Therefore, we need new engineering methods to look at services, rather than just functions, called "Service Engineering." To establish service engineering, this paper proposes a modeling technique of service.

  6. Monte Carlo-based Reconstruction in Water Cherenkov Detectors using Chroma

    NASA Astrophysics Data System (ADS)

    Seibert, Stanley; Latorre, Anthony

    2012-03-01

    We demonstrate the feasibility of event reconstruction---including position, direction, energy and particle identification---in water Cherenkov detectors with a purely Monte Carlo-based method. Using a fast optical Monte Carlo package we have written, called Chroma, in combination with several variance reduction techniques, we can estimate the value of a likelihood function for an arbitrary event hypothesis. The likelihood can then be maximized over the parameter space of interest using a form of gradient descent designed for stochastic functions. Although slower than more traditional reconstruction algorithms, this completely Monte Carlo-based technique is universal and can be applied to a detector of any size or shape, which is a major advantage during the design phase of an experiment. As a specific example, we focus on reconstruction results from a simulation of the 200 kiloton water Cherenkov far detector option for LBNE.

  7. Gyroscope-driven mouse pointer with an EMOTIV® EEG headset and data analysis based on Empirical Mode Decomposition.

    PubMed

    Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos

    2013-08-14

    This paper presents a project on the development of a cursor control emulating the typical operations of a computer-mouse, using gyroscope and eye-blinking electromyographic signals which are obtained through a commercial 16-electrode wireless headset, recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple and quick computational tool, yet effective, aimed to artifact reduction from head movements as well as a method to detect blinking signals for mouse control. Kalman filter is used as state estimator for mouse position control and jitter removal. The detection rate obtained in average was 94.9%. Experimental setup and some obtained results are presented.
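The Kalman-filter smoothing step mentioned above can be sketched as a minimal 1-D filter with a random-walk state model (the noise values q and r below are illustrative, not the values used in the paper):

```python
def kalman_1d(measurements, q=1e-3, r=0.05):
    """Minimal 1-D Kalman filter (random-walk state model) of the kind
    used to smooth a pointer coordinate and remove jitter.
    q: assumed process-noise variance, r: assumed measurement-noise variance."""
    x, p = measurements[0], 1.0   # state estimate and its variance
    out = []
    for z in measurements:
        p += q                    # predict: the state may have drifted
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update toward the new measurement z
        p *= (1.0 - k)            # shrink the posterior variance
        out.append(x)
    return out
```

Running one such filter per axis of the gyroscope-derived position trades a small lag for a large reduction in cursor jitter; the q/r ratio sets that trade-off.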

  8. Gyroscope-Driven Mouse Pointer with an EMOTIV® EEG Headset and Data Analysis Based on Empirical Mode Decomposition

    PubMed Central

    Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos

    2013-01-01

    This paper presents a project on the development of a cursor control emulating the typical operations of a computer-mouse, using gyroscope and eye-blinking electromyographic signals which are obtained through a commercial 16-electrode wireless headset, recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple and quick computational tool, yet effective, aimed to artifact reduction from head movements as well as a method to detect blinking signals for mouse control. Kalman filter is used as state estimator for mouse position control and jitter removal. The detection rate obtained in average was 94.9%. Experimental setup and some obtained results are presented. PMID:23948873

  9. The drive for Aircraft Energy Efficiency

    NASA Technical Reports Server (NTRS)

    James, R. L., Jr.; Maddalon, D. V.

    1984-01-01

    NASA's Aircraft Energy Efficiency (ACEE) program, which began in 1976, has mounted a development effort in four major transport aircraft technology fields: laminar flow systems, advanced aerodynamics, flight controls, and composite structures. ACEE has explored two basic methods for achieving drag-reducing boundary layer laminarization: the use of suction through the wing structure (via slots or perforations) to remove boundary layer turbulence, and the encouragement of natural laminar flow maintenance through refined design practices. Wind tunnel tests have been conducted for wide bodied aircraft equipped with high aspect ratio supercritical wings and winglets. Maneuver load control and pitch-active stability augmentation control systems reduce fuel consumption by reducing the drag associated with high aircraft stability margins. Composite structures yield lighter airframes that in turn call for smaller wing and empennage areas, reducing induced drag for a given payload. In combination, all four areas of development are expected to yield a fuel consumption reduction of 40 percent.

  10. An AIDS risk reduction program for Dutch drug users: an intervention mapping approach to planning.

    PubMed

    van Empelen, Pepijn; Kok, Gerjo; Schaalma, Herman P; Bartholomew, L Kay

    2003-10-01

    This article presents the development of a theory- and evidence-based AIDS prevention program targeting Dutch drug users and aimed at promoting condom use. The emphasis is on the development of the program using a five-step intervention development protocol called intervention mapping (IM). Preceding Step 1 of the IM process, an assessment of the HIV problem among drug users was conducted. The product of IM Step 1 was a series of program objectives specifying what drug users should learn in order to use condoms consistently. In Step 2, theoretical methods for influencing the most important determinants were chosen and translated into practical strategies that fit the program objectives. The main strategy chosen was behavioral journalism. In Step 3, leaflets with role-model stories based on authentic interviews with drug users were developed and pilot tested. Finally, the need for cooperation with program users is discussed in IM Steps 4 and 5.

  11. The Wonderful World of Active Many-Particle Systems

    NASA Astrophysics Data System (ADS)

    Helbing, Dirk

    Since the subject of traffic dynamics has captured the interest of physicists, many astonishing effects have been revealed and explained. Some of the questions now understood are the following: Why are vehicles sometimes stopped by so-called ``phantom traffic jams'', although they all like to drive fast? What are the mechanisms behind stop-and-go traffic? Why are there several different kinds of congestion, and how are they related? Why do most traffic jams occur considerably before the road capacity is reached? Can a temporary reduction of the traffic volume cause a lasting traffic jam? Why do pedestrians moving in opposite directions normally organize in lanes, while nervous crowds are ``freezing by heating''? Why do panicking pedestrians produce dangerous deadlocks? All these questions have been answered by applying and extending methods from statistical physics and non-linear dynamics to self-driven many-particle systems.

  12. Transition Manifolds of Complex Metastable Systems

    NASA Astrophysics Data System (ADS)

    Bittracher, Andreas; Koltai, Péter; Klus, Stefan; Banisch, Ralf; Dellnitz, Michael; Schütte, Christof

    2018-04-01

    We consider complex dynamical systems showing metastable behavior but no local separation of fast and slow time scales. The article raises the question of whether such systems exhibit a low-dimensional manifold supporting their effective dynamics. To answer this question, we aim at finding nonlinear coordinates, called reaction coordinates, such that the projection of the dynamics onto these coordinates preserves the dominant time scales of the dynamics. We show that, based on a specific reducibility property, the existence of good low-dimensional reaction coordinates preserving the dominant time scales is guaranteed. Based on this theoretical framework, we develop and test a novel numerical approach for computing good reaction coordinates. The proposed algorithmic approach is fully local and thus not prone to the curse of dimension with respect to the state space of the dynamics. Hence, it is a promising method for data-based model reduction of complex dynamical systems such as molecular dynamics.

  13. A series connection architecture for large-area organic photovoltaic modules with a 7.5% module efficiency.

    PubMed

    Hong, Soonil; Kang, Hongkyu; Kim, Geunjin; Lee, Seongyu; Kim, Seok; Lee, Jong-Hoon; Lee, Jinho; Yi, Minjin; Kim, Junghwan; Back, Hyungcheol; Kim, Jae-Ryoung; Lee, Kwanghee

    2016-01-05

    The fabrication of organic photovoltaic modules via printing techniques has been the greatest challenge for their commercial manufacture. The current module architecture, which is based on a monolithic geometry consisting of serially interconnected stripe-patterned subcells with finite widths, requires highly sophisticated patterning processes that significantly increase the complexity of printing production lines and cause serious reductions in module efficiency due to so-called aperture loss in series connection regions. Herein we demonstrate an innovative module structure that can simultaneously reduce both the patterning processes and the aperture loss. By using a charge recombination feature that occurs at contacts between electron- and hole-transport layers, we devise a series connection method that facilitates module fabrication without patterning the charge transport layers. With the successive deposition of component layers using slot-die and doctor-blade printing techniques, we achieve a high module efficiency reaching 7.5% with an area of 4.15 cm^2.

  14. The mediating effect of calling on the relationship between medical school students’ academic burnout and empathy

    PubMed Central

    2017-01-01

    Purpose This study aimed to identify the relationships between medical school students' academic burnout, empathy, and calling, and to determine whether calling has a mediating effect on the relationship between academic burnout and empathy. Methods A mixed-methods study was conducted. One hundred twenty-seven medical students completed a survey using scales measuring academic burnout, empathy, and calling. For statistical analysis, correlation analysis, descriptive statistics, and hierarchical multiple regression analyses were conducted. For the qualitative approach, eight medical students participated in a focus group interview. Results The study found that empathy has a statistically significant negative correlation with academic burnout and a significant positive correlation with calling. Sense of calling proved to be an effective mediator of the relationship between academic burnout and empathy. Conclusion This result demonstrates that calling is a key variable that mediates the relationship between medical students' academic burnout and empathy. As such, this study provides baseline data for educational programs that could improve medical students' empathy skills. PMID:28870019

  15. Active Care Management Supported by Home Telemonitoring in Veterans With Type 2 Diabetes

    PubMed Central

    Stone, Roslyn A.; Rao, R. Harsha; Sevick, Mary Ann; Cheng, Chunrong; Hough, Linda J.; Macpherson, David S.; Franko, Carol M.; Anglin, Rebecca A.; Obrosky, D. Scott; DeRubertis, Frederick R.

    2010-01-01

    OBJECTIVE We compared the short-term efficacy of home telemonitoring coupled with active medication management by a nurse practitioner with a monthly care coordination telephone call on glycemic control in veterans with type 2 diabetes and entry A1C ≥7.5%. RESEARCH DESIGN AND METHODS Veterans who received primary care at the VA Pittsburgh Healthcare System from June 2004 to December 2005, who were taking oral hypoglycemic agents and/or insulin for ≥1 year, and who had A1C ≥7.5% at enrollment were randomly assigned to either active care management with home telemonitoring (ACM+HT group, n = 73) or a monthly care coordination telephone call (CC group, n = 77). Both groups received monthly calls for diabetes education and self-management review. ACM+HT group participants transmitted blood glucose, blood pressure, and weight to a nurse practitioner using the Viterion 100 TeleHealth Monitor; the nurse practitioner adjusted medications for glucose, blood pressure, and lipid control based on established American Diabetes Association targets. Measures were obtained at baseline, 3-month, and 6-month visits. RESULTS Baseline characteristics were similar in both groups, with mean A1C of 9.4% (CC group) and 9.6% (ACM+HT group). Compared with the CC group, the ACM+HT group demonstrated significantly larger decreases in A1C at 3 months (1.7 vs. 0.7%) and 6 months (1.7 vs. 0.8%; P < 0.001 for each), with most improvement occurring by 3 months. CONCLUSIONS Compared with the CC group, the ACM+HT group demonstrated significantly greater reductions in A1C by 3 and 6 months. However, both interventions improved glycemic control in primary care patients with previously inadequate control. PMID:20009091

  16. Steepest descent with momentum for quadratic functions is a version of the conjugate gradient method.

    PubMed

    Bhaya, Amit; Kaszkurewicz, Eugenius

    2004-01-01

    It is pointed out that the so-called momentum method, much used in the neural network literature as an acceleration of the backpropagation method, is a stationary version of the conjugate gradient method. Connections with the continuous optimization method known as heavy ball with friction are also made. In both cases, adaptive (dynamic) choices of the so-called learning rate and momentum parameters are obtained using a control Liapunov function analysis of the system.
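The heavy-ball (momentum) iteration on a quadratic can be sketched as follows. The step size `lr` and momentum `beta` are fixed here purely for illustration, whereas the abstract's point is precisely that adaptive choices of these parameters recover the conjugate gradient method.

```python
import numpy as np

def momentum_descent(A, b, x0, lr=0.1, beta=0.9, iters=200):
    """Heavy-ball iteration on f(x) = 0.5 x^T A x - b^T x (gradient A x - b):
    x_{k+1} = x_k - lr * grad f(x_k) + beta * (x_k - x_{k-1})."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        grad = A @ x - b
        x_new = x - lr * grad + beta * (x - x_prev)
        x_prev, x = x, x_new
    return x

# Small symmetric positive definite quadratic; the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)            # exact minimizer
x_mom = momentum_descent(A, b, np.zeros(2))
```

With these fixed parameters the iterates converge to the minimizer geometrically; conjugate gradient would reach it exactly in two steps for this 2x2 system.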

  17. Use of Sodium Dithionite as Part of a More Efficient Groundwater Restoration Method Following In-situ Recovery of Uranium at the Smith-Ranch Highland Site in Wyoming

    NASA Astrophysics Data System (ADS)

    Harris, R.; Reimus, P. W.; Ware, D.; Williams, K.; Chu, D.; Perkins, G.; Migdissov, A. A.; Bonwell, C.

    2017-12-01

    Uranium is primarily mined for nuclear power production using an aqueous extraction technique called in-situ recovery (ISR). ISR can pollute groundwater with residual uranium and other heavy metals. Reverse osmosis and groundwater sweep are currently used to restore groundwater after ISR mining, but are not permanent solutions. Sodium dithionite is being tested as part of a method to more permanently restore groundwater after ISR mining at the Smith-Ranch Highland site in Wyoming. Sodium dithionite is a chemical reductant that can reduce sediments that were oxidized during ISR. The reduced sediments can reduce soluble uranium (VI) in the groundwater to insoluble uranium (IV). Laboratory studies that use sodium dithionite to treat sediments and waters from the site may help predict how it will behave during a field deployment. An aqueous batch experiment showed that sodium dithionite reduced uranium in post-mined untreated groundwater from 38 ppm to less than 1 ppm after 1 day. A sediment reduction batch experiment showed that sodium dithionite-treated sediments were capable of reducing uranium in post-mined untreated groundwater from 38 ppm to 2 ppm after 7 days. One column experiment is showing post-mined sodium dithionite-treated sediments are capable of reducing uranium in post-mined groundwater for over 30 pore volumes past the initial injection. While these results are promising for field deployments of sodium dithionite, another column experiment with sodium dithionite-treated sediments containing uranium rich organic matter is showing net production of uranium instead of uranium uptake. Sodium dithionite appears to liberate uranium from the organic matter. Another sediment reduction experiment is being conducted to further investigate this hypothesis. These experiments are helping guide plans for field deployments of sodium dithionite at uranium ISR mining sites.

  18. Techno-economic Analysis of Evacuated Tube Solar Water Heater using F-chart Method

    NASA Astrophysics Data System (ADS)

    Fayaz, H.; Rahim, N. A.; Saidur, R.; Hasanuzzaman, M.

    2018-05-01

    Solar thermal utilization, especially the application of solar water heater technology, has developed rapidly in recent decades. Solar water heating systems based on a thermal collector alone, or combined with photovoltaics as so-called photovoltaic-thermal (PVT) systems, are practical alternatives to electrical water heaters, but their weather-dependent performance is nonlinear. Accurate performance analysis based on short-term or average weather conditions is therefore difficult. The objective of this paper is to present a thermal and economic analysis of evacuated tube collector solar water heaters. F-Chart analysis shows that an evacuated tube solar water heater achieves a solar fraction of 1, meeting a family's hot water demand of 150 liters or more per day without any auxiliary energy. The evacuated tube solar water heater shows life cycle savings of RM 5200; at a water set temperature of 100°C, RM 12000 is achieved, and the highest life cycle savings of RM 6100 are achieved at an environmental temperature of 18°C. The best thermal and economic performance results in reduced household greenhouse gas emissions, reduced energy consumption, and savings on energy bills.

  19. On Undecidability Aspects of Resilient Computations and Implications to Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S

    2014-01-01

    Future Exascale computing systems with a large number of processors, memory elements and interconnection links, are expected to experience multiple, complex faults, which affect both applications and operating-runtime systems. A variety of algorithms, frameworks and tools are being proposed to realize and/or verify the resilience properties of computations that guarantee correct results on failure-prone computing systems. We analytically show that certain resilient computation problems in the presence of general classes of faults are undecidable, that is, no algorithms exist for solving them. We first show that the membership verification in a generic set of resilient computations is undecidable. We describe classes of faults that can create infinite loops or non-halting computations, whose detection in general is undecidable. We then show certain resilient computation problems to be undecidable by using reductions from the loop detection and halting problems under two formulations, namely, an abstract programming language and Turing machines, respectively. These two reductions highlight different failure effects: the former represents program and data corruption, and the latter illustrates incorrect program execution. These results call for broad-based, well-characterized resilience approaches that complement purely computational solutions using methods such as hardware monitors, co-designs, and system- and application-specific diagnosis codes.

  20. Error reduction and representation in stages (ERRIS) in hydrological modelling for ensemble streamflow forecasting

    NASA Astrophysics Data System (ADS)

    Li, Ming; Wang, Q. J.; Bennett, James C.; Robertson, David E.

    2016-09-01

    This study develops a new error modelling method for ensemble short-term and real-time streamflow forecasting, called error reduction and representation in stages (ERRIS). The novelty of ERRIS is that it does not rely on a single complex error model but runs a sequence of simple error models through four stages. At each stage, an error model attempts to incrementally improve over the previous stage. Stage 1 establishes parameters of a hydrological model and parameters of a transformation function for data normalization, Stage 2 applies a bias correction, Stage 3 applies autoregressive (AR) updating, and Stage 4 applies a Gaussian mixture distribution to represent model residuals. In a case study, we apply ERRIS for one-step-ahead forecasting at a range of catchments. The forecasts at the end of Stage 4 are shown to be much more accurate than at Stage 1 and to be highly reliable in representing forecast uncertainty. Specifically, the forecasts become more accurate by applying the AR updating at Stage 3, and more reliable in uncertainty spread by using a mixture of two Gaussian distributions to represent the residuals at Stage 4. ERRIS can be applied to any existing calibrated hydrological models, including those calibrated to deterministic (e.g. least-squares) objectives.
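The idea behind the Stage 3 autoregressive updating can be illustrated with a toy AR(1) error correction. The fixed coefficient `rho` and the constant-bias example are simplifying assumptions for illustration, not the ERRIS formulation itself (where parameters are estimated and errors are modelled in normalized space).

```python
import numpy as np

def ar1_update(raw_forecasts, observations, rho=0.9):
    """Correct each raw forecast by rho times the most recent known forecast
    error -- a toy stand-in for AR updating; rho is fixed here but would
    normally be estimated from past errors."""
    updated = []
    prev_error = 0.0
    for f, obs in zip(raw_forecasts, observations):
        updated.append(f + rho * prev_error)
        prev_error = obs - f          # raw-model error, known once obs arrives
    return updated

# A model with a persistent bias: AR updating removes most of it.
obs = np.sin(np.linspace(0.0, 6.0, 40))
raw = obs - 0.5                       # biased "hydrological model" output
upd = ar1_update(raw, obs)
```

Because forecast errors here are strongly autocorrelated (a constant bias), the one-step-ahead correction recovers most of the error, which is the mechanism that makes Stage 3 effective.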

  1. Crash probability estimation via quantifying driver hazard perception.

    PubMed

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction in crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. We also establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction in crash probability in a typical rear-end near-crash scenario with a one-second delay in the driver's braking response. These results indicate that CMBS is effective in collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way to evaluate the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
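Time-to-collision, one of the two ingredients of the hazard perception measure above, is conventionally defined as the remaining gap divided by the closing speed; a minimal sketch:

```python
def time_to_collision(gap_m, v_follower, v_leader):
    """Time-to-collision for a rear-end scenario: remaining gap divided by
    the closing speed. Returns infinity when the follower is not closing in
    (a standard convention)."""
    closing_speed = v_follower - v_leader
    if closing_speed <= 0:
        return float("inf")
    return gap_m / closing_speed

# 20 m gap, follower at 20 m/s, leader at 15 m/s -> closing at 5 m/s
print(time_to_collision(20.0, 20.0, 15.0))  # prints 4.0
```

The paper's driver risk response time additionally folds in the driver's braking response; that combination is specific to the study and is not reproduced here.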

  2. GRACE Hydrological estimates for small basins: Evaluating processing approaches on the High Plains Aquifer, USA

    NASA Astrophysics Data System (ADS)

    Longuevergne, Laurent; Scanlon, Bridget R.; Wilson, Clark R.

    2010-11-01

    The Gravity Recovery and Climate Experiment (GRACE) satellites provide observations of water storage variation at regional scales. However, when focusing on a region of interest, limited spatial resolution and noise contamination can cause estimation bias and spatial leakage, problems that are exacerbated as the region of interest approaches the GRACE resolution limit of a few hundred km. Reliable estimates of water storage variations in small basins require compromises between competing needs for noise suppression and spatial resolution. The objective of this study was to quantitatively investigate processing methods and their impacts on bias, leakage, GRACE noise reduction, and estimated total error, allowing solution of the trade-offs. Among the methods tested is a recently developed concentration algorithm called spatiospectral localization, which optimizes the basin shape description, taking into account limited spatial resolution. This method is particularly suited to retrieval of basin-scale water storage variations and is effective for small basins. To increase confidence in derived methods, water storage variations were calculated for both CSR (Center for Space Research) and GRGS (Groupe de Recherche de Géodésie Spatiale) GRACE products, which employ different processing strategies. The processing techniques were tested on the intensively monitored High Plains Aquifer (450,000 km2 area), where application of the appropriate optimal processing method allowed retrieval of water storage variations over a portion of the aquifer as small as ~200,000 km2.

  3. Preference for a Stimulus that Follows a Relatively Aversive Event: Contrast or Delay Reduction?

    ERIC Educational Resources Information Center

    Singer, Rebecca A.; Berry, Laura M.; Zentall, Thomas R.

    2007-01-01

    Several types of contrast effects have been identified including incentive contrast, anticipatory contrast, and behavioral contrast. Clement, Feltus, Kaiser, and Zentall (2000) proposed a type of contrast that appears to be different from these others and called it within-trial contrast. In this form of contrast the relative value of a reinforcer…

  4. 76 FR 5377 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-31

    ... (CDC) publishes a list of information collection requests under review by the Office of Management and... these requests, call the CDC Reports Clearance Officer at (404) 639-5960 or send an e-mail to [email protected]cdc.gov . Send written comments to CDC Desk Officer, Office of Management and Budget, Washington, DC or by...

  5. 77 FR 51809 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-27

    ... (CDC) publishes a list of information collection requests under review by the Office of Management and... these requests, call the CDC Reports Clearance Officer at (404) 639-7570 or send an email to [email protected]cdc.gov . Send written comments to CDC Desk Officer, Office of Management and Budget, Washington, DC or by fax to...

  6. Taken out of Context?: Examining the Influence of Context on Teachers' Written Responses to Student Writing

    ERIC Educational Resources Information Center

    Bowles, Bruce L.

    2016-01-01

    Although response scholarship has continually called for a greater emphasis on context when analyzing instructors' written commentary on student writing, textual analysis of written comments remains a primary direction for response research. Additionally, when context is accounted for, it is oftentimes done so in a rather reductive fashion, with a…

  7. RSV-free formulation of quantum nondemolition theory

    NASA Astrophysics Data System (ADS)

    Lynch, Robert

    1982-10-01

    The entire validity of the “quantum nondemolition” (QND) concept has been called into question because of its deep reliance on “reduction of the state vector” (RSV) in the detailed development of the theory. In this letter QND theory is reformulated without use of RSV, except as found in the overall interpretation of the wave function.

  8. Poverty Diagnostics Using Poor Data: Strengthening the Evidence Base for Pro-Poor Policy Making in Lesotho

    ERIC Educational Resources Information Center

    May, Julian; Roberts, Benjamin

    2005-01-01

    Increasingly national statistical agencies are being called upon to provide high quality data on a regular basis, to be used by governments for evidence-based policy development. Poverty Reduction Strategy Papers (PRSPs) give impetus to this, and bring a prerequisite for comprehensive "poverty diagnosis." Often the data that are required…

  9. 76 FR 13191 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-10

    .... Proposed Project FoodNet Non-O157 Shiga toxin-Producing E. coli Study: Assessment of Risk Factors for... (CDC). Background and Brief Description Each year many Shiga toxin-producing E. coli (STEC) infections... these requests, call the CDC Reports Clearance Officer at (404) 639-5960 or send an e-mail to om[email protected

  10. Origins and scales of hypoxia on the Louisiana shelf: importance of seasonal plankton dynamics and river nutrients and discharge

    EPA Science Inventory

    Management plans for the Mississippi River Basin call for reductions in nutrient concentrations up to 40% or more to reduce hypoxia in the Gulf of Mexico (GOM), while at the same time the government is considering new farm subsidies to promote development of biofuels from corn. T...

  11. Downsizing of Central Office: Does Anyone Care? Pre-Conference Draft.

    ERIC Educational Resources Information Center

    Berg, Judith; Hall, Gene

    Four years ago, the Colorado education system embarked on a course to downsize central offices in response to calls for accountability and site-based decision making. This paper presents findings of a study that examined restructuring and downsizing in four Colorado school districts. One consequence of downsizing was a reduction in force at the…

  12. NIMS Certification Addresses U.S. Need for Skilled Workers

    ERIC Educational Resources Information Center

    Dereu, Brian

    2010-01-01

    The current state of manufacturing in the United States calls for serious attention and action. For many years, U.S. manufacturers have complained of a shortage of skilled workers--and the federal Bureau of Labor Statistics backs up their concerns. One can place blame for reductions in American manufacturing on many things, but certainly the lack…

  13. Old Tails and New Trails in High Dimensions

    ERIC Educational Resources Information Center

    Halevy, Avner

    2013-01-01

    We discuss the motivation for dimension reduction in the context of the modern data revolution and introduce a key result in this field, the Johnson-Lindenstrauss flattening lemma. Then we leap into high-dimensional space for a glimpse of the phenomenon called concentration of measure, and use it to sketch a proof of the lemma. We end by tying…
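The flattening lemma sketched above can be demonstrated with a plain random Gaussian projection; the dimensions below are arbitrary choices for illustration.

```python
import numpy as np

def jl_project(X, k, seed=0):
    """Random Gaussian projection R^d -> R^k, scaled by 1/sqrt(k); the
    Johnson-Lindenstrauss lemma guarantees pairwise distances survive up to
    a (1 +/- eps) factor once k = O(eps^-2 log n)."""
    rng = np.random.default_rng(seed)
    R = rng.normal(size=(X.shape[1], k)) / np.sqrt(k)
    return X @ R

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 1000))       # 20 points in R^1000
Y = jl_project(X, 200)                # the same 20 points in R^200

# distortion of one pairwise distance under the projection
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
```

Concentration of measure is what makes this work: the squared length of each projected difference vector concentrates sharply around its expectation, so all pairwise distances are preserved simultaneously with high probability.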

  14. Campus-Wide Measures Have Greater Potential | Climate Neutral Research

    Science.gov Websites

    Pursuing climate neutrality on research campuses fits into the bigger picture of addressing the impacts of climate change and fossil-fuel depletion. International scientific bodies addressing climate change are calling for reductions of carbon emissions of 80% by 2050. Because of their size and…

  15. On the combinatorics of sparsification.

    PubMed

    Huang, Fenix Wd; Reidys, Christian M

    2012-10-22

    We study the sparsification of dynamic programming-based folding algorithms for RNA structures. Sparsification is a method that significantly improves the computation of minimum free energy (mfe) RNA structures. We provide a quantitative analysis of the sparsification of a particular decomposition rule, Λ∗. This rule splits an interval of RNA secondary and pseudoknot structures of fixed topological genus. Key to quantifying sparsification is the size of the so-called candidate sets. Here we assume mfe-structures to be specifically distributed (see Assumption 1) within arbitrary and irreducible RNA secondary and pseudoknot structures of fixed topological genus. We then present a combinatorial framework which allows, by means of probabilities of irreducible sub-structures, to obtain the expectation of the Λ∗-candidate set w.r.t. a uniformly random input sequence. We compute these expectations for arc-based energy models via energy-filtered generating functions (GF) in the case of RNA secondary structures as well as RNA pseudoknot structures. Furthermore, for RNA secondary structures we also analyze a simplified loop-based energy model. Our combinatorial analysis is then compared to the expected number of Λ∗-candidates obtained from folding mfe-structures. In the case of mfe-folding of RNA secondary structures with a simplified loop-based energy model, our results imply that sparsification provides a significant, constant improvement of 91% (theory), to be compared to a 96% reduction (experimental, simplified arc-based model). However, we do not observe a linear factor improvement. Finally, in the case of the "full" loop-energy model we can report a reduction of 98% (experiment). Sparsification was initially attributed a linear factor improvement. This conclusion was based on the so-called polymer-zeta property, which stems from interpreting polymer chains as self-avoiding walks. Subsequent findings, however, reveal that the O(n) improvement is not correct.
    The combinatorial analysis presented here shows that, assuming a specific distribution (see Assumption 1) of mfe-structures within irreducible and arbitrary structures, the expected number of Λ∗-candidates is Θ(n^2). However, the constant reduction is quite significant, being in the range of 96%. We furthermore show an analogous result for the sparsification of the Λ∗-decomposition rule for RNA pseudoknotted structures of genus one. Finally, we observe that the effect of sparsification is sensitive to the employed energy model.

  16. The assessment of biases in the acoustic discrimination of individuals

    PubMed Central

    Šálek, Martin

    2017-01-01

    Animal vocalizations contain information about individual identity that could potentially be used for the monitoring of individuals. However, the performance of individual discrimination is subject to many biases depending on factors such as the amount of identity information or the methods used. These factors need to be taken into account when comparing results of different studies or selecting the most cost-effective solution for a particular species. In this study, we evaluate several biases associated with the discrimination of individuals. On a large sample of little owl male individuals, we assess how discrimination performance changes with the method of call description, an increasing number of individuals, and the number of calls per male. We also test whether the discrimination performance within the whole population can be reliably estimated from a subsample of individuals in a pre-screening study. Assessment of discrimination performance at the level of the individual and at the level of the call led to different conclusions; hence, studies interested in individual discrimination should optimize methods at the level of individuals. Description of calls by their frequency modulation leads to the best discrimination performance. In agreement with our expectations, discrimination performance decreased with population size. Increasing the number of calls per individual linearly increased the discrimination of individuals (but not the discrimination of calls), likely because it allows distinction between individuals with very similar calls. The available pre-screening index does not allow precise estimation of the population size that could be reliably monitored. Overall, projects applying acoustic monitoring at the individual level in a population need to consider limitations regarding the population size that can be reliably monitored and fine-tune their methods according to their needs and limitations. PMID:28486488

  17. Robust Derivation of Risk Reduction Strategies

    NASA Technical Reports Server (NTRS)

    Richardson, Julian; Port, Daniel; Feather, Martin

    2007-01-01

    Effective risk reduction strategies can be derived mechanically given sufficient characterization of the risks present in the system and the effectiveness of available risk reduction techniques. In this paper, we address an important question: can we reliably expect mechanically derived risk reduction strategies to be better than fixed or hand-selected risk reduction strategies, given that the quantitative assessment of risks and risk reduction techniques upon which mechanical derivation is based is difficult and likely to be inaccurate? We consider this question relative to two methods for deriving effective risk reduction strategies: the strategic method defined by Kazman, Port et al [Port et al, 2005], and the Defect Detection and Prevention (DDP) tool [Feather & Cornford, 2003]. We performed a number of sensitivity experiments to evaluate how inaccurate knowledge of risk and risk reduction techniques affect the performance of the strategies computed by the Strategic Method compared to a variety of alternative strategies. The experimental results indicate that strategies computed by the Strategic Method were significantly more effective than the alternative risk reduction strategies, even when knowledge of risk and risk reduction techniques was very inaccurate. The robustness of the Strategic Method suggests that its use should be considered in a wide range of projects.

  18. Observations and Bayesian location methodology of transient acoustic signals (likely blue whales) in the Indian Ocean, using a hydrophone triplet.

    PubMed

    Le Bras, Ronan J; Kuzma, Heidi; Sucic, Victor; Bokelmann, Götz

    2016-05-01

    A notable sequence of calls was encountered, spanning several days in January 2003, in the central part of the Indian Ocean on a hydrophone triplet recording acoustic data at a 250 Hz sampling rate. This paper presents signal processing methods applied to the waveform data to detect and group the recorded signals and to extract amplitude and bearing estimates for them. An approximate location for the source of the sequence of calls is inferred from the extracted features. As the source approaches the hydrophone triplet, the source level (SL) of the calls is estimated at 187 ± 6 dB re 1 μPa at 1 m in the 15-60 Hz frequency range. The calls are attributed to a subgroup of blue whales, Balaenoptera musculus, with a characteristic acoustic signature. A Bayesian location method using probabilistic models for bearing and amplitude is demonstrated on the call sequence. The method is applied to the case of detection at a single triad of hydrophones and results in a probability distribution map for the origin of the calls. It can be extended to detections at multiple triads and, because of the Bayesian formulation, additional modeling complexity can be built in as needed.
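A single-sensor version of such a Bayesian bearing map can be sketched on a grid. The Gaussian angular-noise model, the math-convention angles, and the noise level `sigma` below are simplifying assumptions, not the paper's model (which also uses amplitude and true compass bearings).

```python
import numpy as np

def bearing_posterior(sensor_xy, observed_bearing, grid_x, grid_y,
                      sigma=np.deg2rad(5.0)):
    """Grid posterior for one bearing observation under Gaussian angular
    noise. Angles use the math convention (counterclockwise from +x), not
    compass bearings; sigma is an assumed noise level."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    angles = np.arctan2(gy - sensor_xy[1], gx - sensor_xy[0])
    # wrap the angular residual to [-pi, pi] before scoring it
    diff = np.angle(np.exp(1j * (angles - observed_bearing)))
    loglik = -0.5 * (diff / sigma) ** 2
    post = np.exp(loglik - loglik.max())
    return post / post.sum()          # normalized probability map

grid = np.linspace(-10.0, 10.0, 101)
post = bearing_posterior((0.0, 0.0), np.deg2rad(45.0), grid, grid)
```

With one sensor the posterior is a ridge along the observed bearing; combining an amplitude model, or detections at multiple triads, would collapse the ridge toward a localized source estimate.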

  19. Interspecific Semantic Alarm Call Recognition in the Solitary Sahamalaza Sportive Lemur, Lepilemur sahamalazensis

    PubMed Central

    Seiler, Melanie; Schwitzer, Christoph; Gamba, Marco; Holderied, Marc W.

    2013-01-01

    As alarm calls indicate the presence of predators, the correct interpretation of alarm calls, including those of other species, is essential for predator avoidance. Conversely, communication calls of other species might indicate the perceived absence of a predator and hence allow a reduction in vigilance. This “eavesdropping” was demonstrated in birds and mammals, including lemur species. Interspecific communication between taxonomic groups has so far been reported in some reptiles and mammals, including three primate species. So far, neither semantic nor interspecific communication has been tested in a solitary and nocturnal lemur species. The aim of this study was to investigate whether the nocturnal and solitary Sahamalaza sportive lemur, Lepilemur sahamalazensis, is able to access semantic information of sympatric species. During the day, this species faces the risk of falling prey to aerial and terrestrial predators and therefore shows high levels of vigilance. We presented alarm calls of the crested coua, the Madagascar magpie-robin and aerial, terrestrial and agitation alarm calls of the blue-eyed black lemur to 19 individual Sahamalaza sportive lemurs resting in tree holes. Songs of both bird species and contact calls of the blue-eyed black lemur were used as a control. After alarm calls of crested coua, Madagascar magpie-robin and aerial alarm of the blue-eyed black lemur, the lemurs scanned up and their vigilance increased significantly. After presentation of terrestrial alarm and agitation calls of the blue-eyed black lemur, the animals did not show significant changes in scanning direction or in the duration of vigilance. Sportive lemur vigilance decreased after playbacks of songs of the bird species and contact calls of blue-eyed black lemurs.
Our results indicate that the Sahamalaza sportive lemur is capable of using information on predator presence as well as predator type of different sympatric species, using their referential signals to detect predators early, and that the lemurs’ reactions are based on experience and learning. PMID:23825658

  20. Delivering data reduction pipelines to science users

    NASA Astrophysics Data System (ADS)

    Freudling, Wolfram; Romaniello, Martino

    2016-07-01

    The European Southern Observatory has a long history of providing specialized data processing algorithms, called recipes, for most of its instruments. These recipes are used both for operational purposes at the observatory sites and for data reduction by scientists at their home institutions. The two applications require substantially different environments for running and controlling the recipes. In this paper, we describe the ESOReflex environment that is used for running recipes on the users' desktops. ESOReflex is a workflow-driven data reduction environment. It allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection of and interaction with the data. It includes fully automatic data organization and visualization, interaction with recipes, and the exploration of the provenance tree of intermediate and final data products. ESOReflex uses a number of innovative concepts that have been described in Ref. 1. In October 2015, the complete system was released to the public. ESOReflex allows highly efficient data reduction, using its internal bookkeeping database to recognize and skip previously completed steps during repeated processing of the same or similar data sets. It has been widely adopted by the science community for the reduction of VLT data.
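    The bookkeeping idea — skip a recipe run when the same inputs and parameters have already been processed — can be sketched with a content-keyed cache. This is a schematic stand-in, not ESOReflex's actual database or API; the class and method names are invented for illustration.

```python
import hashlib
import json

class Bookkeeper:
    """Minimal sketch of pipeline bookkeeping: a step is skipped when the
    same recipe has already been run on the same inputs and parameters."""

    def __init__(self):
        self._done = {}  # key -> cached product

    def _key(self, recipe, inputs, params):
        # Canonical, order-independent key over recipe name, inputs, params.
        payload = json.dumps([recipe, sorted(inputs), params], sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def run(self, recipe, inputs, params, func):
        """Return (product, skipped): execute func only on a cache miss."""
        key = self._key(recipe, inputs, params)
        if key in self._done:
            return self._done[key], True   # previously completed step: skip
        product = func(inputs, params)
        self._done[key] = product
        return product, False
```

    A real implementation would also hash file contents and persist the database across sessions so that repeated reductions of the same data set reuse earlier products.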

  1. Reduction of Residual Stresses and Distortion in Girth Welded Pipes.

    DTIC Science & Technology

    1987-06-01

    a material that is otherwise ductile. [7] Generally, it is believed that pure metals do not crack as a result of stress corrosion. Some alloys are...

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biyikli, Emre; To, Albert C., E-mail: albertto@pitt.edu

    Atomistic/continuum coupling methods combine accurate atomistic methods and efficient continuum methods to simulate the behavior of highly ordered crystalline systems. Coupled methods utilize the advantages of both approaches to simulate systems at a lower computational cost, while retaining the accuracy associated with atomistic methods. Many concurrent atomistic/continuum coupling methods have been proposed in the past; however, their true computational efficiency has not been demonstrated. The present work presents an efficient implementation of a concurrent coupling method called the Multiresolution Molecular Mechanics (MMM) for serial, parallel, and adaptive analysis. First, we present the features of the software implemented along with the associated technologies. The scalability of the software implementation is demonstrated, and the competing effects of multiscale modeling and parallelization are discussed. Then, the algorithms contributing to the efficiency of the software are presented. These include algorithms for eliminating latent ghost atoms from calculations and measurement-based dynamic balancing of parallel workload. The efficiency improvements made by these algorithms are demonstrated by benchmark tests. The efficiency of the software is found to be on par with LAMMPS, a state-of-the-art Molecular Dynamics (MD) simulation code, when performing full atomistic simulations. Speed-up of the MMM method is shown to be directly proportional to the reduction of the number of the atoms visited in force computation. Finally, an adaptive MMM analysis on a nanoindentation problem, containing over a million atoms, is performed, yielding an improvement of 6.3–8.5 times in efficiency, over the full atomistic MD method. For the first time, the efficiency of a concurrent atomistic/continuum coupling method is comprehensively investigated and demonstrated.

  3. Multiresolution molecular mechanics: Implementation and efficiency

    NASA Astrophysics Data System (ADS)

    Biyikli, Emre; To, Albert C.

    2017-01-01

    Atomistic/continuum coupling methods combine accurate atomistic methods and efficient continuum methods to simulate the behavior of highly ordered crystalline systems. Coupled methods utilize the advantages of both approaches to simulate systems at a lower computational cost, while retaining the accuracy associated with atomistic methods. Many concurrent atomistic/continuum coupling methods have been proposed in the past; however, their true computational efficiency has not been demonstrated. The present work presents an efficient implementation of a concurrent coupling method called the Multiresolution Molecular Mechanics (MMM) for serial, parallel, and adaptive analysis. First, we present the features of the software implemented along with the associated technologies. The scalability of the software implementation is demonstrated, and the competing effects of multiscale modeling and parallelization are discussed. Then, the algorithms contributing to the efficiency of the software are presented. These include algorithms for eliminating latent ghost atoms from calculations and measurement-based dynamic balancing of parallel workload. The efficiency improvements made by these algorithms are demonstrated by benchmark tests. The efficiency of the software is found to be on par with LAMMPS, a state-of-the-art Molecular Dynamics (MD) simulation code, when performing full atomistic simulations. Speed-up of the MMM method is shown to be directly proportional to the reduction of the number of the atoms visited in force computation. Finally, an adaptive MMM analysis on a nanoindentation problem, containing over a million atoms, is performed, yielding an improvement of 6.3-8.5 times in efficiency, over the full atomistic MD method. For the first time, the efficiency of a concurrent atomistic/continuum coupling method is comprehensively investigated and demonstrated.
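    The reported scaling — speed-up directly proportional to the reduction in atoms visited during force computation — can be captured by a back-of-the-envelope model in which a fully resolved atomistic region is combined with a coarsened continuum region. The fraction and coarsening factor below are illustrative parameters, not values from the paper.

```python
def visited_atoms(n_total, atomistic_fraction, coarsening):
    """Atoms visited in force computation for a coupled model: every atom
    in the atomistic region plus one representative node per `coarsening`
    atoms of the continuum region (illustrative parameters)."""
    n_atomistic = int(n_total * atomistic_fraction)
    n_continuum_nodes = (n_total - n_atomistic) // coarsening
    return n_atomistic + n_continuum_nodes

def speedup(n_total, atomistic_fraction, coarsening):
    """Speed-up over full atomistic MD, assuming run time is proportional
    to the number of atoms visited (the scaling the abstract reports)."""
    return n_total / visited_atoms(n_total, atomistic_fraction, coarsening)
```

    For a million-atom system with 10% resolved atomistically and 27 atoms per continuum node, this toy model predicts roughly a 7.5x speed-up; setting the atomistic fraction to 1 recovers full MD with no speed-up.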

  4. Noise reduction in optically controlled quantum memory

    NASA Astrophysics Data System (ADS)

    Ma, Lijun; Slattery, Oliver; Tang, Xiao

    2018-05-01

    Quantum memory is an essential tool for quantum communications systems and quantum computers. An important category of quantum memory, called optically controlled quantum memory, uses a strong classical beam to control the storage and re-emission of a single-photon signal through an atomic ensemble. In this type of memory, the residual light from the strong classical control beam can cause severe noise and degrade the system performance significantly. Efficiently suppressing this noise is a requirement for the successful implementation of optically controlled quantum memories. In this paper, we briefly introduce the latest and most common approaches to quantum memory and review the various noise-reduction techniques used in implementing them.

  5. Drag Reduction Devices for Aircraft (Latest Citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The bibliography contains citations concerning the modeling, application, testing, and development of drag reduction devices for aircraft. Slots, flaps, fences, large-eddy breakup (LEBU) devices, vortex generators and turbines, Helmholtz resonators, and winglets are among the devices discussed. Contour shaping to ensure laminar flow, control boundary layer transition, or minimize turbulence is also covered. Applications include the wings, nacelles, fuselage, empennage, and externals of aircraft designed for high-lift, subsonic, or supersonic operation. The design, testing, and development of directional grooves, commonly called riblets, are covered in a separate bibliography. (Contains 50-250 citations and includes a subject term index and title list.)

  6. Science in a Post-Sendai World

    NASA Astrophysics Data System (ADS)

    Brosnan, D. M.

    2015-12-01

    Adopted at the U.N. Conference on March 18, 2015 in Sendai, Japan, the international framework for Disaster Risk Reduction (DRR) will guide how nations across the world address disasters and hazards for the next fifteen years. The agreement, reached after several years of negotiation, marks a shift in thinking and approach to DRR. Traditionally DRR has been the domain of humanitarian responses, and methods have been well honed over the decades. However, a defining element of this agreement is the stronger recognition of the role that science can play in preparing for, managing, and mitigating disasters. The framework identifies four priority areas: understanding disaster risk; strengthening disaster risk governance to manage disaster risk; investing in disaster risk reduction for resilience; and enhancing disaster preparedness for effective response and to "build back better" in recovery, rehabilitation and reconstruction. Science can underpin each one. For example, the first priority, to better understand risks, will require scientific and technological input. In addition, embedded throughout the Framework are calls for several other specific actions, including dedicated scientific and technical work on disaster risk reduction and mobilization. The challenge moving forward will be to move from rhetoric to action. Are governments ready to embrace the scientific community's input or are many still resistant? What, if any, structures are in place to ensure that the necessary science is carried out and then heard by those who can use it? What steps can scientists and scientific organizations take to ensure the role of science and make their efforts effective? How science can respond to the opportunities and challenges in a Post-Sendai world will be discussed in the presentation.

  7. Results of the Medicare Health Support disease-management pilot program.

    PubMed

    McCall, Nancy; Cromwell, Jerry

    2011-11-03

    In the Medicare Modernization Act of 2003, Congress required the Centers for Medicare and Medicaid Services to test the commercial disease-management model in the Medicare fee-for-service program. The Medicare Health Support Pilot Program was a large, randomized study of eight commercial programs for disease management that used nurse-based call centers. We randomly assigned patients with heart failure, diabetes, or both to the intervention or to usual care (control) and compared them with the use of a difference-in-differences method to evaluate the effects of the commercial programs on the quality of clinical care, acute care utilization, and Medicare expenditures for Medicare fee-for-service beneficiaries. The study included 242,417 patients (163,107 in the intervention group and 79,310 in the control group). The eight commercial disease-management programs did not reduce hospital admissions or emergency room visits, as compared with usual care. We observed only 14 significant improvements in process-of-care measures out of 40 comparisons. These modest improvements came at substantial cost to the Medicare program in fees paid to the disease-management companies ($400 million), with no demonstrable savings in Medicare expenditures. In this large study, commercial disease-management programs using nurse-based call centers achieved only modest improvements in quality-of-care measures, with no demonstrable reduction in the utilization of acute care or the costs of care.
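    The difference-in-differences comparison used in the evaluation has a simple closed form: the treated group's pre-to-post change minus the control group's change over the same period, which nets out trends common to both groups. A minimal sketch, with hypothetical numbers:

```python
def mean(xs):
    """Arithmetic mean of a non-empty sequence."""
    return sum(xs) / len(xs)

def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Difference-in-differences effect estimate from four outcome samples:
    (treated change) minus (control change) removes the shared time trend."""
    return ((mean(treat_post) - mean(treat_pre))
            - (mean(control_post) - mean(control_pre)))
```

    In practice the estimate is usually obtained from a regression with group, period, and interaction terms, which also yields standard errors; the arithmetic above is the point estimate only.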

  8. E-cigarettes: a need to broaden the debate.

    PubMed

    Latif, E; Nair, M

    2016-11-01

    The unregulated market for e-cigarettes continues to grow, with debates on their efficacy and impact on global public health. E-cigarettes, or electronic nicotine delivery systems (ENDS), are marketed as a 'safe' alternative to tobacco products and a tool for 'harm reduction'. Some public health experts are calling them a 'game changer' and favour the 'harm reduction' strategy, while others dispute this claim. In our opinion, the debate needs to be broadened to encompass other related concerns and effects on non-users and affected stakeholders. As with tobacco control, a holistic approach is needed to build a raft of policies that effectively address the issue from all angles and look beyond the direct health implications of e-cigarette use to explore the social, economic, political and environmental aspects of this debate, putting 'harm reduction' in context.

  9. Cluster assembly in nitrogenase.

    PubMed

    Sickerman, Nathaniel S; Rettberg, Lee A; Lee, Chi Chung; Hu, Yilin; Ribbe, Markus W

    2017-05-09

    The versatile enzyme system nitrogenase accomplishes the challenging reduction of N2 and other substrates through the use of two main metalloclusters. For molybdenum nitrogenase, the catalytic component NifDK contains the [Fe8S7]-core P-cluster and a [MoFe7S9C-homocitrate] cofactor called the M-cluster. These chemically unprecedented metalloclusters play a critical role in the reduction of N2, and both originate from [Fe4S4] clusters produced by the actions of NifS and NifU. Maturation of P-cluster begins with a pair of these [Fe4S4] clusters on NifDK called the P*-cluster. An accessory protein NifZ aids in P-cluster fusion, and reductive coupling is facilitated by NifH in a stepwise manner to form P-cluster on each half of NifDK. For M-cluster biosynthesis, two [Fe4S4] clusters on NifB are coupled with a carbon atom in a radical-SAM dependent process, and concomitant addition of a 'ninth' sulfur atom generates the [Fe8S9C]-core L-cluster. On the scaffold protein NifEN, L-cluster is matured to M-cluster by the addition of Mo and homocitrate provided by NifH. Finally, matured M-cluster in NifEN is directly transferred to NifDK, where a conformational change locks the cofactor in place. Mechanistic insights into these fascinating biosynthetic processes are detailed in this chapter. © 2017 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.

  10. Potential reduction exposure products and FDA tobacco and regulation: a CNS call to action.

    PubMed

    Heath, Janie; Andrews, Jeannette; Balkstra, Cindy R

    2004-01-01

    A new generation of tobacco harm reduction products is stirring controversy and confusion among healthcare providers. These products, known as "potential reduction exposure products" (PREPs), can be described in terms of reported scientific evidence, as "the good, the bad, and the ugly." On the good side, there is sufficient scientific evidence to support the use of Commit, a new over-the-counter nicotine lozenge PREP, approved for smoking cessation. On the bad side, there is no scientific evidence to support the use of Ariva, another over-the-counter nicotine lozenge PREP, marketed as an alternative to cigarettes when smoking is restricted. On the ugly side, both of these PREPs are nicotine delivery systems with "candy-like" appearances; however, one (Commit) has the Food and Drug Administration (FDA) approval and the other (Ariva) does not. This article provides an overview of PREPs and strategies to help clinical nurse specialists (CNSs) address tobacco harm reduction issues.

  11. Harm reduction at the crossroads: the case of e-cigarettes.

    PubMed

    Maziak, Wasim

    2014-10-01

    The recent popularity of electronic (e)-cigarettes and their rapid uptake by youth has ignited the debate about their role as a harm-reduction strategy. Harm reduction in the context of tobacco control contends that in societies that have achieved considerable success in curbing smoking, leaving the remaining hard-to-quit smokers with an abstinence-only option is unfair, especially when less-harmful choices are available. On one side of the debate are those who call for caution in endorsing such products until critical pieces of evidence about their safety and potential become available, whereas the other side argues that waiting until all questions about e-cigarettes are answered is dogma driven. In this piece, I try to discuss the unresolvable contention between harm-reduction goals of offering safer options to smokers, and those of e-cigarette makers of being commercially viable and profitable. Copyright © 2014 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  12. High-Resolution Wind Measurements for Offshore Wind Energy Development

    NASA Technical Reports Server (NTRS)

    Nghiem, Son V.; Neumann, Gregory

    2011-01-01

    A mathematical transform, called the Rosette Transform, together with a new method, called the Dense Sampling Method, has been developed. The Rosette Transform applies to both the mean part and the fluctuating part of a targeted radar signature, using the Dense Sampling Method to construct the data on a high-resolution grid at 1-km posting for wind measurements over water surfaces such as oceans or lakes.
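    The abstract gives no detail of the Rosette Transform itself, but gridding scattered samples at a fixed posting can be sketched generically: assign each sample to a 1-km cell and average within cells. The binning-and-averaging rule below is an assumption for illustration, not the actual Dense Sampling Method.

```python
from collections import defaultdict

def dense_sample_grid(samples, cell_km=1.0):
    """Bin scattered radar samples onto a regular grid and average per cell.

    samples: iterable of (x_km, y_km, value) tuples.
    Returns {(i, j): mean of samples falling in cell (i, j)}.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for x, y, v in samples:
        cell = (int(x // cell_km), int(y // cell_km))
        sums[cell] += v
        counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}
```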

  13. Passive Acoustic Methods for Tracking Marine Mammals Using Widely-Spaced Bottom-Mounted Hydrophones

    DTIC Science & Technology

    2011-10-26

    standard time-of-arrival (TOA) tracking methods fail. Clicks and long duration calls (whistles or baleen whale calls) were both considered. Methods...Evaluation Center (AUTEC) and the Pacific Missile Range Facility (PMRF). Beaked whales, minke whales, humpback whales, and sperm whales were the main species...of interest. Subject terms: passive acoustic monitoring, localization, tracking, minke whale, beaked whale, sperm whale, humpback whale

  14. 12 CFR 334.25 - Reasonable and simple methods of opting out.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...

  15. 12 CFR 334.25 - Reasonable and simple methods of opting out.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...

  16. 12 CFR 334.25 - Reasonable and simple methods of opting out.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...

  17. 12 CFR 334.25 - Reasonable and simple methods of opting out.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...

  18. 12 CFR 334.25 - Reasonable and simple methods of opting out.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... or processed at an Internet Web site, if the consumer agrees to the electronic delivery of information; (iv) Providing a toll-free telephone number that consumers may call to opt out; or (v) Allowing... by calling a single toll-free telephone number. (2) Opt-out methods that are not reasonable and...

  19. Flexible Delivery as a "Whole-Organisation": What Does This Mean in Practice?

    ERIC Educational Resources Information Center

    Henry, John; Wakefield, Lyn

    A research project called Support Services for Flexible Delivery was commissioned by the Australian organization TAFE (technical and further education) Frontiers. Since 1995, the project has been conducted using a research approach called the Generalizations from Case Studies (GCS) research method. The GCS method was developed, tested, and…

  20. Conduits to care: call lights and patients' perceptions of communication.

    PubMed

    Montie, Mary; Shuman, Clayton; Galinato, Jose; Patak, Lance; Anderson, Christine A; Titler, Marita G

    2017-01-01

    Call light systems remain the primary means for hospitalized patients to initiate communication with their health care providers. Although there is a vast amount of literature discussing patients' communication with their health care providers, few studies have explored patients' perceptions concerning call light use and communication. The specific aim of this study was to solicit patients' perceptions regarding their call light use and communication with nursing staff. Patients invited to this study met the following inclusion criteria: proficient in English, hospitalized for at least 24 hours, aged ≥21 years, and able to communicate verbally (eg, not intubated). Thirty participants provided written informed consent, were enrolled in the study, and completed interviews. Using qualitative descriptive methods, five major themes emerged from patients' perceptions (namely: establishing connectivity, participant safety concerns, no separation: health care and the call light device, issues with the current call light, and participants' perceptions of "nurse work"). Multiple minor themes supported these major themes. Data analysis utilized the constant comparative method of Glaser and Strauss. Findings from this study extend the knowledge of patients' understanding of not only why inconsistencies occur between the call light and their nurses, but also why the call light is more than merely a device to initiate communication; rather, it is a direct conduit to their health care and its delivery.
